The Infrastructure for Alternative Metrics

The Infrastructure for Alternative Metrics: What do we need to compare digital apples to digital apples?
Todd Carpenter, Executive Director, NISO
Presented at AAAS Annual Meeting: February 16, 2014
Chicago, IL


The Infrastructure for Alternative Metrics

  1. The Infrastructure for Alternative Metrics: What do we need to compare digital apples to digital apples? Todd Carpenter, Executive Director, NISO. AAAS Annual Meeting, Chicago, IL | February 16, 2014
  2. We hate discussing infrastructure
  3. Photo: Minneapolis College of Art and Design Library
  4. About NISO: • Non-profit industry trade association accredited by ANSI with 190+ members • Mission of developing and maintaining technical standards related to information, documentation, discovery, and distribution of published materials and media • Volunteer-driven organization: 400+ volunteers spread out across the world • Responsible in various ways for standards like ISSN, DOI, Dublin Core metadata, DAISY digital talking books, OpenURL, MARC records, and ISBN
  5. Photo by dumbledad on Flickr http://www.flickr.com/photos/dumbledad/298650884/
  6. But what is the “infrastructure” for assessment?
  7. We stand on the shoulders of giants...
  8. The Mighty Citation
  9. Impact factor
  10. Thomson Reuters IMPACT FACTOR
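For context, the Impact Factor referenced on this slide has a simple arithmetic definition: a journal's Impact Factor for year Y is the count of citations received in Y by items the journal published in Y-1 and Y-2, divided by the number of citable items it published in those two years. A minimal sketch, using invented numbers for a hypothetical journal:

```python
def impact_factor(citations: int, citable_items: int) -> float:
    """Impact Factor for year Y: citations received in Y to items
    published in Y-1 and Y-2, divided by the count of citable items
    the journal published in Y-1 and Y-2."""
    return citations / citable_items

# Hypothetical journal: 210 citations in 2013 to its 2011-2012 papers,
# of which there were 70 citable items.
print(impact_factor(210, 70))  # 3.0
```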
  11. Two Stanford Students
  12. Two Stanford Students
  13. They wrote a paper
  14. They started a company in someone’s garage. Photo: AP
  15. A simple conclusion can be drawn: Citations are awesome, aren’t they? A perfect solution to all the world’s problems, no?
  16. Back to Sir Isaac Newton: “What Des-Cartes did was a good step. You have added much several ways, & especially in taking ye colours of thin plates into philosophical consideration. If I have seen further it is by standing on ye shoulders of Giants.”
  17. Caveat emptor: Citations aren’t always what we think they are
  18. Another problem with citations
  19. So where to from here?
  20. Define altmetrics, please
  21. Social Media Metrics. Image: Danny Brown, The State of Social Media Marketing 2012
  22. Usage-Based Metrics
  23. Click Map Analysis
  24. Network Analysis: Clickstream data yields high-resolution maps of science. Johan Bollen, Herbert Van de Sompel, et al., PLoS ONE, February 2009
  25. Network Analysis: Clickstream data yields high-resolution maps of science. Johan Bollen, Herbert Van de Sompel, et al., PLoS ONE, February 2009
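The Bollen et al. paper cited above builds its maps of science from clickstream logs: consecutive journal views within a user session become weighted, directed edges in a network. The core idea can be sketched with the standard library; the journal names and sessions below are invented for illustration:

```python
from collections import Counter

# Hypothetical reading sessions: ordered lists of journals a user viewed.
sessions = [
    ["PLoS ONE", "Nature", "Science"],
    ["Nature", "Science", "Cell"],
    ["PLoS ONE", "Nature", "Cell"],
]

# Each consecutive pair of views is a directed clickstream transition;
# the transition counts are the edge weights of the network.
edges = Counter()
for session in sessions:
    for src, dst in zip(session, session[1:]):
        edges[(src, dst)] += 1

# The heaviest edges form the backbone of the map.
for (src, dst), weight in edges.most_common():
    print(f"{src} -> {dst}: {weight}")
```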
  26. Behavioral Metrics
  27. Sentiment Analysis
  28. Non-traditional content: Assessing impact of new forms of communication isn’t as simple as one might think. Image: Domenico, Caron, Davis, et al.
  29. Non-traditional content: Assessing impact of new forms of communication isn’t as simple as one might think. Image: Domenico, Caron, Davis, et al.
  30. We’ve got what, now how?
  31. What can be measured? Image: Flickr user karindalziel
  32. How do we measure things?
  33. At what granularity do we measure things?
  34. Consistency in what is counted. Source: Scott Chamberlain, “Consuming Article-Level Metrics: Observations and Lessons from Comparing Aggregator Provider Data,” Information Standards Quarterly, Summer 2013, Vol. 25, Issue 2.
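The Chamberlain article cited on this slide found that aggregators report the same article's metrics under different field names and with different counts. The normalization problem can be sketched as mapping each provider's vocabulary onto shared metric names; the payloads, field names, and counts below are invented for illustration:

```python
# Hypothetical payloads for the same DOI from two aggregators;
# the field names differ, and so can the counts themselves.
provider_a = {"doi": "10.1371/journal.pone.0000000", "tweets": 14, "mendeley_readers": 102}
provider_b = {"DOI": "10.1371/journal.pone.0000000", "twitter": 11, "readers": 98}

# Map each provider's vocabulary onto shared metric names.
FIELD_MAP = {
    "tweets": "twitter", "twitter": "twitter",
    "mendeley_readers": "mendeley", "readers": "mendeley",
}

def normalize(payload: dict) -> dict:
    """Rename a provider's metric fields to the shared vocabulary."""
    return {FIELD_MAP[k]: v for k, v in payload.items() if k in FIELD_MAP}

# Once normalized, apples can be compared to apples.
a, b = normalize(provider_a), normalize(provider_b)
for metric in sorted(set(a) | set(b)):
    print(metric, a.get(metric), b.get(metric))
```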
  35. Which measurements are valuable?
  36. How do we ensure the validity of those measurements?
  37. How to (or even whether we should) deal with other issues such as gaming?
  38. What infrastructure do we need for alternative metrics?
  39. Basic Definitions (so we are all talking about the same thing)
  40. Comparison across providers. Source: Scott Chamberlain, “Consuming Article-Level Metrics: Observations and Lessons from Comparing Aggregator Provider Data,” Information Standards Quarterly, Summer 2013, Vol. 25, Issue 2.
  41. Authorship: Disambiguation & Contributor Roles
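Name-based matching cannot tell two "J. Smith"s apart; a persistent identifier such as an ORCID iD can, and role tags make the nature of each contribution explicit. A toy illustration of the disambiguation point; the records, iDs, and role labels below are invented/example values:

```python
# Hypothetical contributor records: the display names collide, but the
# ORCID iDs disambiguate the people, and role tags capture contributor roles.
contributors = [
    {"name": "J. Smith", "orcid": "0000-0002-1825-0097", "roles": ["Conceptualization"]},
    {"name": "J. Smith", "orcid": "0000-0001-5109-3700", "roles": ["Software", "Writing"]},
]

def same_person(a: dict, b: dict) -> bool:
    # Compare identifiers, not display names.
    return a["orcid"] == b["orcid"]

print(same_person(contributors[0], contributors[1]))  # False
```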
  42. Element Identification
  43. Open exchange of component data
  44. = TRUST
  45. Where does NISO fit in?
  46. Alternative Assessment Initiative, Phase 1 Meetings: October 9, 2013, San Francisco, CA; December 11, 2013, Washington, DC; January 23-24, Philadelphia, PA. Phase 1 report expected in May 2014
  47. Just a few ideas from in-person meetings: Disambiguation problems; Create a central repository for data; Define provenance for metrics; Foster culture of DOI use in mass media; Establish definitions for metric types; Define data extract/gathering methods; Standardize units of measure; Define levels of transparency/privacy
  48. Alternative Assessment Initiative, Phase 2: Presentations of report (June 2014); Prioritization effort (June-Aug 2014); Project approval (Sept 2014); Working group formation (Oct 2014); Consensus development (Nov 2014-Dec 2015); Trial use period (Dec 2015-Mar 2016); Publication of final recommendations (Jun 2016)
  49. May 15, 2013
  50. Thank you! Todd Carpenter, Executive Director | tcarpenter@niso.org | @TAC_NISO | National Information Standards Organization (NISO), 3600 Clipper Mill Road, Suite 302, Baltimore, MD 21211 USA | +1 (301) 654-2512 | www.niso.org