
Research impact beyond metrics

Traditional metrics, such as the h-index and journal impact factors, are used to measure the scholarly impact of research. However, in the current climate of accountability by funding providers, fund recipients would benefit from a more comprehensive impact management system (IMS) to facilitate the capture and reporting of narratives (including metrics) about research impact in the academy, on social policy, in industry, and ultimately with the public.

Librarians have always been good at telling and facilitating stories. Research support librarians can use their storytelling skills to contribute to the implementation and administration of an impact management system. Being able to translate research impact into harvestable and reportable metadata is the key.


Transcript

  • 1. Impact beyond metrics: Telling your research impact story. Research Support Community Day, Brisbane, Australia, 11 February 2013. Pat Loria, Research Librarian, University of Southern Queensland.
  • 2. Limitations of traditional metrics. The same acts of engagement have scholarly and public channels: Recommended — Faculty of 1000 (scholarly) vs. popular press (public); Cited — traditional citations vs. Wikipedia; Discussed — scholarly blogs vs. Facebook and Twitter; Saved — Mendeley and CiteULike vs. Delicious; Viewed — PDF views vs. HTML views. Adapted from Jason Priem, Altmetrics and Revolutions: https://docs.google.com/presentation/pub?id=1Y4JnchsmHHiOQdJsEpQr33qmMWqhZJrPTDAg1cZoCcI&start=false&loop=false&delayms=3000#slide=id.i0
  • 4. Three generations of impact metrics: journal-level metrics, author-level metrics, article-level metrics.
  • 5. Different levels of metrics  Journal-level metrics  Journal impact factor  SJR & SNIP  Author-level metrics  Citations per Paper (CPP)  H-index  Article-level metrics  Articles, datasets, blogs, code, artistic creations  Citations, mentions, views, downloads, etc.  Altmetric bookmarklet, ImpactStory, etc.
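Of the author-level metrics listed above, the h-index is simple enough to compute directly from a citation list. A minimal sketch in Python (the definition is the standard one; the function name is ours):

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    h papers each cited at least h times."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank          # this paper still clears the h threshold
        else:
            break             # papers are sorted, so no later one can
    return h

# An author with papers cited [10, 8, 5, 4, 3] has h-index 4:
# four papers each have at least four citations.
print(h_index([10, 8, 5, 4, 3]))  # 4
```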
  • 6. USQ and the NCP calculator  NCP: Normalized Citations per Paper  Developed by USQ (Library and Systems)  Uses Scopus RDCP data to normalize  Normalization creates a level playing field  Citations in low visibility/low citing fields normalized up  Citations in high visibility/high citing fields normalized down  Source code for NCP calculator to be released soon on GitHub
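The NCP calculator's source code had not yet been released at the time of this talk, but the normalization idea on this slide can be sketched. This assumes the Scopus RDCP benchmark reduces to a field-average citations-per-paper figure; the function and parameter names are illustrative, not USQ's actual implementation:

```python
def ncp(citations, papers, field_cpp):
    """Normalized Citations per Paper: the unit's raw citations-per-paper
    divided by the field-average citations-per-paper (the benchmark).
    A value of 1.0 means 'cited at the field average'."""
    return (citations / papers) / field_cpp

# The same raw performance (30 citations over 10 papers) lands
# differently depending on the field's citing behaviour:
at_avg = ncp(30, 10, 3.0)   # field averages 3.0 cites/paper -> 1.0
low = ncp(30, 10, 2.0)      # low-citing field -> 1.5 (normalized up)
high = ncp(30, 10, 6.0)     # high-citing field -> 0.5 (normalized down)
print(at_avg, low, high)    # 1.0 1.5 0.5
```

This is the "level playing field" on the slide: identical raw counts are scaled up in low-citing fields and down in high-citing ones.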
  • 7. NCP calculator Citations: 2010; Papers: 2007-2009
  • 8. Measuring impact beyond academia …impact is defined in a similar way as for the UK REF, i.e. “an effect on, change, benefit to the economy, society, culture, public policy or services, health, the environment or quality of life beyond academia”. ATN and Go8, Guidelines for completion of case studies in ATN/Go8 EIA Impact Assessment Trial: June – August 2012, http://www.atn.edu.au/eia/Docs/EIA_Trial_Guidelines_FINAL.pdf
  • 9. Measuring impact beyond academia. Impact Case Study template (ATN & Go8, similar to REF):  Institution  Title of Case Study  SEO codes (Overheads? Data sources?)  Summary of Case Study Impact  Context  Details of Impact  Research underpinning Impact  Research outputs from research underpinning impact  Additional information: validation of impact, people, investment income http://www.atn.edu.au/eia/Docs/EIA_Trial_Guidelines_FINAL.pdf
  • 10. Alternative metrics or altmetrics [A]ltmetrics is the creation and study of new metrics based on the Social Web for analyzing, and informing scholarship. http://altmetrics.org/about/
  • 11. Altmetric bookmarklet http://www.altmetric.com/bookmarklet.php
  • 12. Altmetric bookmarklet  Free bookmarklet for Chrome, Firefox and Safari  Click it while viewing a paper to see impact data  The Altmetric API can be embedded into second- and third-party platforms, apps and mashups  Only works on PubMed, arXiv, or pages containing a DOI  Only supports publishers who embed Google Scholar-friendly citation metadata  Twitter mentions are only available for articles published since July 2011 Example: Oxidants, antioxidants and the current incurability of metastatic cancers
  • 13. Altmetric bookmarklet http://www.altmetric.com/bookmarklet.php Counts reported: Altmetric score; tweeters; Facebook users; news outlets; science blogs; Google+ users; Mendeley readers; CiteULike readers; Connotea readers. Disadvantages / bugs: the Altmetric score is difficult to explain; the bookmarklet only works on PubMed, arXiv, or pages containing a DOI; only supports publishers who embed Scholar-friendly citation metadata; Twitter mentions are only available for articles published since July 2011.
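The same lookup the bookmarklet performs can be scripted against the Altmetric API mentioned on slide 12. A minimal sketch: the v1 DOI endpoint is the public one, but the JSON field names used here (score, cited_by_tweeters_count, readers) should be verified against the current API documentation before relying on them:

```python
ALTMETRIC_DOI_ENDPOINT = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi):
    """Build the Altmetric v1 details URL for a DOI — the lookup the
    bookmarklet performs behind the scenes."""
    return ALTMETRIC_DOI_ENDPOINT + doi

def summarise(record):
    """Pull a few headline counts out of an Altmetric JSON record.
    Field names here follow the public v1 API but are assumptions to
    check against current docs; missing fields default to zero."""
    return {
        "score": record.get("score", 0),
        "tweeters": record.get("cited_by_tweeters_count", 0),
        "mendeley_readers": record.get("readers", {}).get("mendeley", 0),
    }

# A hand-made sample payload (not real data) to show the shape:
sample = {"score": 12.5, "cited_by_tweeters_count": 8,
          "readers": {"mendeley": 41}}
print(summarise(sample))
```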
  • 14. ImpactStory.org Artefact types: journal article, dataset, software, slides, generic. Sources: Topsy, Science Seeker, PubMed, Dryad, SlideShare, Wikipedia, Figshare, PLoS Search, PLoS ALM, Delicious, Facebook, Scopus, CiteULike, GitHub, Mendeley. Known bugs (12/12): Twitter coverage weak; different results for DOI and URL searches; missing metrics; metrics too low; no link to mentions.
  • 15. ImpactStory.org
  • 16. PlumAnalytics.com  Usage - Downloads, views, book holdings, ILL, document delivery  Captures - Favourites, bookmarks, saves, readers, groups, watchers  Mentions - blog posts, news stories, Wikipedia articles, comments, reviews  Social media - Tweets, +1s, likes, shares, ratings  Citations - Web of Science, Scopus, Google Scholar, Microsoft Academic Search
  • 17. PlumAnalytics.com Artefact types: articles, book chapters, books, clinical trials, datasets, figures, grants, patents, presentations, source code, videos. Notes: greatest range of sources; researcher graph; group metrics; paid service; no API available.
  • 18. Reasons to use altmetrics  Same reasons as using traditional metrics:  Grants, promotions, staff/program review  Review research dissemination strategy  Measure of influence and reach of output  Publishers can add value for authors/readers  Informed decisions by research managers  Comprehensive view of impact for funders  “Citation graph data is like Chekhov’s gun: once on stage, it has to be fired.” (Peter Vinkler, cited by Jason Priem, Altmetrics and Revolutions)
  • 19. Article Level Metrics (ALMs) “Article-Level Metrics are a comprehensive set of impact indicators that enable numerous ways to assess and navigate research most relevant to the field” (http://article-level-metrics.plos.org/alm-info/)
  • 20. PLoS ALMs http://article-level-metrics.plos.org/alm-info/
  • 21. PLoS example
  • 22. Source: http://article-level-metrics.plos.org/researchers/
  • 23. Research profile systems For example, Symplectic Elements:
  • 24. The great objection: “It’s not real impact!” really means “It’s not scholarly impact.” The research impact story spans scholarly, public, government and industry audiences.
  • 25. What is USQ’s research impact story?
  • 26. Impact management system (IMS). A flexible IMS captures reportable metadata from every component of the research profile: impact metrics, research outputs, curricula vitae, esteem measures, human resources, research income, professional activities, and community, industry and policy impact. Metadata is key!
  • 27. Impact metadata. Design according to reporting needs:  Academic: citation, publication, teaching, supervision, research project, research income  Government: application, consultation, social policy, media coverage  Industry: consultation, partnership, patent  Social: altmetric, Facebook, Twitter, Mendeley, Wikipedia, engagement
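The groupings above can be sketched as a metadata record schema. Everything here is illustrative — the categories, kinds and field names mirror this slide, not any real IMS product:

```python
from collections import Counter
from dataclasses import dataclass
from datetime import date

@dataclass
class ImpactRecord:
    """One harvestable, reportable piece of impact evidence.
    A real IMS schema would be designed against the institution's
    actual reporting needs; these fields are a sketch."""
    category: str    # "academic" | "government" | "industry" | "social"
    kind: str        # e.g. "citation", "patent", "tweet", "consultation"
    source: str      # where the evidence lives: Scopus, Twitter, ...
    recorded: date
    narrative: str = ""   # the story behind the number

records = [
    ImpactRecord("academic", "citation", "Scopus", date(2013, 2, 1)),
    ImpactRecord("social", "tweet", "Twitter", date(2013, 2, 5)),
    ImpactRecord("social", "mention", "Wikipedia", date(2013, 2, 8)),
]

# One reporting need served by the schema: evidence counts per category.
by_category = Counter(r.category for r in records)
print(dict(by_category))  # {'academic': 1, 'social': 2}
```

Because each record carries both a count-able kind and a free-text narrative, the same store supports metrics reporting and the storytelling the talk argues for.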
  • 28. Discussions with Symplectic
  • 29. Librarian as storyteller Image: http://kassonpubliclibrary.blogspot.com.au/2010/03/spring-planting-at-library-storytime.html
  • 30. Advantages of IMS  Facilitates storytelling of research impact  Saves time for researchers and managers  Enables internal and external reporting  Evidence base for individual/program review  Accommodates government requirements  Data for case studies and grant applications  Impact captured for longitudinal analysis  Best IMS support transfer of data to other IMS  Managing impact: a new research literacy?
  • 31. Take-aways  Holistic impact monitoring and reporting  Make it easy for academics and managers  Do you need an impact management system?  What is your research impact story? http://jeps.efpsa.org/blog/2012/06/20/maximizing-research-impact/
  • 32. Research impact discussion Pat Loria Email: pat.loria@usq.edu.au Twitter: @pat_loria Image: http://www.duffysrehab.com/blog/want-to-make-an-impact-consider-betty-ford