
Practical applications for altmetrics in a changing metrics landscape



"Practical applications for altmetrics in a changing metrics landscape" - Sara Rouhi, Altmetric product specialist, and Anirvan Chatterjee, Director Data Strategy for CTSI at UCSF



  1. Practical applications for altmetrics in a changing metrics landscape Sara Rouhi, @RouhiRoo Product Specialist, Altmetric sara@altmetric.com Anirvan Chatterjee, @anirvan Director, Data Strategy, Clinical & Translational Science Institute, UCSF anirvan.chatterjee@ucsf.edu
  2. Today we’ll cover • Need and origins • Definitions • Where you can find altmetrics • How they’re being used at UCSF • How they’re being used at Duke University • What the future may bring…
  3. Like all great movements…
  4. It started with a manifesto… www.altmetrics.org
  5. If metrics are about filtering the good research from the bad, traditional metrics* aren’t working. *Peer review, journal impact factor, citation counting
  6. Global scientific output doubles every nine years.* From submission to publication in as little as six weeks. *http://blogs.nature.com/news/2014/05/global-scientific-output-doubles-every-nine-years.html
  7. Traditional metrics lag behind…
  8. The conversation has moved online: 44K online mentions of scholarly articles every day (1 mention every 2 seconds!), 50K unique articles shared each week, and >3.5M articles with tracked attention data. Source: Altmetric internal data, March 2015
  9. Funders want evidence of societal impact. Grant funders are looking for proof of “broader impacts”, often defined as “an effect, change, or benefit to the economy, society, culture, public policies, health, the environment, etc.” (Research Excellence Framework, http://www.ref.ac.uk/panels/assessmentcriteriaandleveldefinitions/). “Broaden dissemination to enhance scientific and technological understanding, for example, by presenting results of research and education projects in formats useful to students, scientists and engineers, members of Congress, teachers, and the general public.” (http://www.nsf.gov/pubs/2007/nsf07046/nsf07046.jsp)
  10. Expanded administrative remits • Strategic planning • Supervision of academic affairs • Fundraising • Grants administration • Public affairs
  11. So what is Altmetric and what are altmetrics?
  12. Who are we? Altmetric is a data science company that tracks attention to research outputs, delivering output-level metrics via visually engaging, intuitive interfaces. In other words, we help give credit where credit is due.
  13. What are altmetrics? An alternative, more immediate measure of attention, from non-traditional sources (policy documents, blogs, mainstream news, social media), providing a larger context and a multi-faceted picture of engagement. Not a replacement but a complement.
  14. Multi-faceted picture of engagement: Audiences – Practitioners, General Public, Professional Communicators, Interested Parties, Scholars
  15. Multi-faceted picture of engagement: Interaction • Scholars – downloads, citations, bookmarks/saves • Early career researchers – social media, blogs • General public – news, blogs, social media • Practitioners – policy documents, field-specific blogs, social media • Research communicators – news, blogs, social media • Interested parties – policy documents, blogs
  16. Who on campus needs to track this? • Administrators (Grants, Departmental, Institutional) • Library • Marketing/PR/Communications • Research Groups
  17. Why? Administrators • Are we in compliance with grant/govt mandates? • Do our research outputs work toward our group/dept/instit. mission? • Does our campus have global reach? • Does our research influence policy, legislation, best practices?
  18. Why? Libraries • Do our collections reflect where our research gets the most attention (i.e. are we missing anything? Are we purchasing the wrong things?) • Does our OA policy bring more attention to our work? • How does our institutional repository bring attention to campus research?
  19. Why? Marketing/PR/Communications • Is anyone out there getting it wrong? • Have we missed opportunities to get in front of a PR/communications storm? • Can we benchmark our outreach efforts? • Are we reaching the target markets we want? • Are we using the right media?
  20. Why? Research Groups • Are we reaching the audiences we want to see our work? • Is anyone misrepresenting/confused by our work? • How do we demonstrate “broader impact” to grant funders? • How can we reach more people with our research? • Are we engaging unexpected communities?
  21. So where will you find altmetrics?
  22. Where will you see our data? Publisher platforms
  23. Where will you see our data? Books
  24. Where will you see our data? Other metrics providers
  25. Where will you see our data? Author Tools “A CV that documents alternative metrics […] offers a much more compelling argument to a tenure committee of their research impact than a traditional publication list.” - Donald Samulack, Editage
  26. Where will you see our data? Platforms: recommendation engine integration for medical research apps; the Altmetric service integrated into publishing platforms; Altmetric integration for JAMA and others to monitor research impact; Altmetric data integrated for over 1 million articles
  27. Where will you see our data? Institutional repositories/discovery systems • Institutional repository badge embeds • Badge integration with discovery systems
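As a point of reference, the repository badge embeds the slide mentions are typically just a placeholder element plus Altmetric's embed script. The TypeScript sketch below is illustrative only: the script URL, CSS class, and data-* attributes follow Altmetric's published badge-embed snippet at the time, but check Altmetric's current badge documentation before relying on it, and the element ID and DOI are placeholders.

```typescript
// Illustrative sketch of an institutional-repository badge embed (not an official snippet).
// Assumes Altmetric's documented embed pattern: a div with class "altmetric-embed" plus embed.js.

function addAltmetricBadge(container: HTMLElement, doi: string): void {
  // Placeholder element that Altmetric's script scans for and replaces with a badge.
  const badge = document.createElement("div");
  badge.className = "altmetric-embed";
  badge.dataset.badgeType = "donut";      // rendered as data-badge-type="donut"
  badge.dataset.doi = doi;                // rendered as data-doi="…"
  badge.dataset.hideNoMentions = "true";  // hide the badge when no attention has been tracked
  container.appendChild(badge);

  // Load the embed script once per page; it finds all .altmetric-embed elements.
  if (!document.querySelector('script[src$="embed.js"]')) {
    const script = document.createElement("script");
    script.src = "https://d1bxh8uas1mnw7.cloudfront.net/assets/embed.js";
    script.async = true;
    document.head.appendChild(script);
  }
}

// Example: attach a badge to an item landing page, using a placeholder DOI.
addAltmetricBadge(document.getElementById("item-metadata")!, "10.1234/example-doi");
```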
  28. Who uses our data? Research Management Systems
  29. In practice, how do you use this data?
  30. Three Experiments with Altmetric data. April 22, 2015. Anirvan Chatterjee, Director, Data Strategy, Clinical & Translational Science Institute
  31. UCSF Profiles (profiles.ucsf.edu) • Research networking system (like VIVO, Symplectic Elements) • Research profiles for 7,000 people on campus: bios, publications, NIH grants, awards, etc. • Not just a directory: publications automatically kept current • Heavily used — 100,000+ visits per month from on/off campus • Data reuse — APIs used by 25 other campus systems
  32. Why altmetrics? • Show early impacts of research • Attempt to measure/visualize impact, rather than just anecdata • Doesn’t displace traditional metrics of research output (e.g. citations, journal rankings, etc.)
  33. build quick • fail fast
  34. build quick • fail fast: three experiments
  35. 1. Department Data Explorer
  36. Lessons learned • People were interested in seeing what new research was trending • Altmetric badges were easy to integrate
  37. 2. UCSF Profiles publication list
  38. Lessons learned • Altmetrics advocates were supportive • Zero pushback from the campus community • Because of easy Altmetric integration, we could add altmetrics even before we added citation data
  39. 3. Article recommendation engine
  40. Background • Many researchers focus on a handful of key journals, but may miss out on trending stories on non-core topics of interest (e.g. a cardiologist interested in digital health) • We know UCSF researchers’ research topics/interests, both hand-entered and algorithmically derived from publications • Our recommendation engine shares new articles that match researchers’ known areas of interest
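To make the matching step concrete, here is a hypothetical sketch, not UCSF's actual code, of how a recommender like the one described might rank candidate articles: topical overlap with a researcher's known interest terms, weighted by a separately fetched attention score. All names, the scoring formula, and the sample data are assumptions for illustration.

```typescript
// Hypothetical matching step for a recommendation engine like the one described on this slide.
// Not UCSF's implementation; the names, scoring, and sample data are illustrative only.

interface CandidateArticle {
  doi: string;
  title: string;
  abstract: string;
  attentionScore: number; // e.g. an attention score fetched separately (see the API sketch below)
}

// Fraction of the researcher's interest terms that appear in the article text.
function topicalOverlap(article: CandidateArticle, interests: string[]): number {
  const text = `${article.title} ${article.abstract}`.toLowerCase();
  const hits = interests.filter((term) => text.includes(term.toLowerCase())).length;
  return interests.length > 0 ? hits / interests.length : 0;
}

// Rank candidates by topical overlap, boosted by (damped) online attention.
function recommend(articles: CandidateArticle[], interests: string[], limit = 5): CandidateArticle[] {
  return articles
    .map((a) => ({ a, score: topicalOverlap(a, interests) * Math.log1p(a.attentionScore) }))
    .filter((x) => x.score > 0)        // drop articles with no interest match at all
    .sort((x, y) => y.score - x.score) // strongest matches first
    .slice(0, limit)
    .map((x) => x.a);
}

// Example: a cardiologist with a side interest in digital health (the slide's example).
const trendingArticles: CandidateArticle[] = [
  { doi: "10.1234/a", title: "Digital health apps in cardiology", abstract: "…", attentionScore: 120 },
  { doi: "10.1234/b", title: "Soil microbiome dynamics", abstract: "…", attentionScore: 300 },
];
console.log(recommend(trendingArticles, ["cardiology", "digital health"]));
```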
  41. Altmetric API • Details at http://api.altmetric.com/ • Free to use basic data for apps and mashups, with rate limits • Generous free access for noncommercial academic research projects
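A minimal example of what calling that API looks like. The v1 DOI endpoint shown here is the one documented at api.altmetric.com; the response fields below are an assumed subset of what the API actually returns, and the DOI is a placeholder, so check the live documentation (and its rate limits) before building on this.

```typescript
// Minimal sketch of calling the free Altmetric API described on this slide.
// Endpoint per api.altmetric.com (v1); the fields below are an assumed subset of the response.

interface AltmetricSummary {
  title: string;
  score: number;                     // the Altmetric attention score
  cited_by_posts_count?: number;     // total tracked mentions
  cited_by_tweeters_count?: number;  // unique Twitter accounts mentioning the article
  details_url?: string;              // link to the article details page
}

async function fetchAttention(doi: string): Promise<AltmetricSummary | null> {
  const res = await fetch(`https://api.altmetric.com/v1/doi/${doi}`);
  if (res.status === 404) return null; // no attention tracked for this DOI
  if (!res.ok) throw new Error(`Altmetric API returned ${res.status}`); // e.g. rate-limited
  return (await res.json()) as AltmetricSummary;
}

// Example usage with a placeholder DOI.
fetchAttention("10.1234/example-doi").then((summary) => {
  if (summary === null) {
    console.log("No attention data tracked yet.");
  } else {
    console.log(`${summary.title}: score ${summary.score}, ${summary.cited_by_tweeters_count ?? 0} tweeters`);
  }
});
```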
  42. Lessons learned so far (work in progress) • The Altmetric API made it easy to integrate altmetrics data • Among the first round of beta testers: most hadn’t seen the recommended papers; some questions about article-level metrics vs. journal reputation; we need to improve relevance matching; enough positive feedback for us to keep exploring
  43. Takeaways from our three experiments… • When it comes to altmetrics, researchers aren’t monolithic: some are bullish, others guardedly positive, few/none offended • Altmetrics data doesn’t yet solve a burning institutional need: we’re hearing more about altmetrics from early adopters than from leadership • Low barriers to experimentation: it’s very easy to get started and integrate into our processes, and we’re able to keep tossing around ideas to find the best fit
  44. Anirvan Chatterjee, Clinical & Translational Science Institute, UCSF anirvan.chatterjee@ucsf.edu • @anirvan
  45. NSF Broader Impacts Criterion: To what extent will [the research] enhance the infrastructure for research and education, such as facilities, instrumentation, networks, and partnerships? Will the results be disseminated broadly to enhance scientific and technological understanding? What may be the benefits of the proposed activity to society? (http://www.nsf.gov/pubs/2007/nsf07046/nsf07046.jsp)
  46. Have you heard of…
  47. “Try the bookmarklet, it’s ultra cool”
  48. Article by article wasn’t enough
  49. My data from the Altmetric for Institutions summary report
  50. Policy documents? Who knew?
  51. Demonstrating "broader impact" with international news coverage = 60%. (Chart: total number of stories, total number of outlets, and number of international outlets for Articles 1-10.) Data from Article Details Pages
  52. Demonstrating "broader impact" with international blog coverage = 40%. Data from Article Details Pages
  53. New communities, new conversations. From Article Details Pages
  54. Many, many more eyeballs. Twitter reach by article (total upper bound: 5,630,639): Article 1: 3,941,227; Article 2: 634,343; Article 3: 190,593; Article 4: 187,480; Article 5: 263,719; Article 6: 70,617; Article 7: 137,926; Article 8: 115,455; Article 9: 84,158; Article 10: 5,121. Data from Article Details Pages. Even if only 1% click through to the article, that's roughly 56,000 readers who never would have seen it before Twitter.
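For reference, the slide's back-of-the-envelope arithmetic reproduced as code; the 1% click-through rate is the slide's own illustrative assumption, not a measured figure.

```typescript
// The slide's arithmetic: sum the per-article Twitter reach and apply an assumed 1% click-through.
const twitterReachByArticle = [
  3_941_227, 634_343, 190_593, 187_480, 263_719,
  70_617, 137_926, 115_455, 84_158, 5_121,
];

const totalReach = twitterReachByArticle.reduce((sum, n) => sum + n, 0);
console.log(totalReach);                    // 5630639 — the slide's upper bound
console.log(Math.round(totalReach * 0.01)); // 56306 — roughly 56,000 potential readers at 1% click-through
```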
  55. Saved Terrie time; saved her program manager time… NIH Program Manager: “[This Altmetric data is] fantastic information for [our] budget report.”
  56. Before Altmetric data she didn’t know… • How broadly her work was disseminated – news vs. policy vs. the blogosphere • The difference in interest by source – methodology papers via Twitter • That all this data could be aggregated to save time
  57. Recap of where we are… • Education is critical • Tenure/promotion paradigm • “Here one day, gone the next” • Need for sentiment analysis – so it’s not just more numbers • Facilitating industry standards – NISO Altmetrics Whitepaper
  58. Thank you! Questions? Sara Rouhi sara@altmetric.com @RouhiRoo Anirvan Chatterjee anirvan.chatterjee@ucsf.edu @anirvan
  59. Thank you! Questions? Sara Rouhi sara@altmetric.com @RouhiRoo Anirvan Chatterjee anirvan.chatterjee@ucsf.edu @anirvan
  60. Attention exists on a spectrum: tweets/bookmarks, holdings/saves/shares, usage, citations, policy document citations, blog coverage, post-publication peer review, news coverage. At the lighter end: attention is superficial, the article may or may not have been read, many potential readers but few actual, cost-light(er). At the heavier end: the article is more likely to have been read, cost-heavy(ier), readers = practitioners (?), actionable (?).
