The NISO Update provides the latest news about NISO's current efforts, including standards, recommended practices and community meetings covering many areas of interest to the library community. Working group members will provide updates on projects newly underway or recently completed.
NISO Altmetrics Initiative, Todd Carpenter, Executive Director, NISO
NISO Update June 2014 Assessment Carpenter
1. Comparing digital apples to digital apples: An update on NISO’s Alternative Assessment Initiative
Todd Carpenter
Executive Director, NISO
ALA Annual
June 30, 2014
15. Steering Committee
• Euan Adie, Altmetric
• Amy Brand, Harvard University
• Mike Buschman, Plum Analytics
• Todd Carpenter, NISO
• Martin Fenner, Public Library of Science (PLoS) (Chair)
• Michael Habib, Reed Elsevier
• Gregg Gordon, Social Science Research Network (SSRN)
• William Gunn, Mendeley
• Nettie Lagace, NISO
• Jamie Liu, American Chemical Society (ACS)
• Heather Piwowar, ImpactStory
• John Sack, HighWire Press
• Peter Shepherd, Project COUNTER
• Christine Stohn, Ex Libris
• Greg Tananbaum, SPARC (Scholarly Publishing & Academic Resources Coalition)
16. Alternative Assessment Initiative
Phase 1 Meetings
October 9, 2013 - San Francisco, CA
December 11, 2013 - Washington, DC
January 23-24, 2014 - Philadelphia, PA
Round of 1-on-1 interviews - March/April 2014
Phase 1 report published June 9, 2014
17. Meetings’ General Format
• Co-located with other industry meetings
• Morning: lightning talks, post-it brainstorming
• Afternoon: discussion groups
– X
– Y
– Z
– Report back/react
• Live streamed (video recordings are available)
18. Meeting Lightning Talks
• Expectations of researchers
• Exploring disciplinary differences in the use of social media in scholarly communication
• Altmetrics as part of the services of a large university library system
• Deriving altmetrics from annotation activity
• Altmetrics for Institutional Repositories: Are the metadata ready?
• Snowball Metrics: Global Standards for Institutional Benchmarking
• International Standard Name Identifier
• Altmetric.com, Plum Analytics, Mendeley reader survey
• Twitter Inconsistency
[Image slide: “Lightning” by snowpeak, licensed under CC BY 2.0]
20. SF Meeting Discussions
• Business & Use cases
– Publishers want to serve authors, make money
– People don’t value a standard, they value something that helps them
– … Couldn’t identify a logical standard need that actors in the space would value, and best practices are of interest
• Quality & Data science
– Themes: context, validation, provenance, quality, description & metadata (see the sketch after this slide)
– We'll never get to the point where assessment can be done without a human in the loop, but discovery and recommendation can
• Definitions
– Define “ALM” (article-level metrics) and “Altmetrics”
– Map the landscape
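One recurring thread in the Quality & Data science group was that altmetrics data need context, provenance, and descriptive metadata before anyone can validate them. As a purely illustrative sketch, not anything drafted by the working group, the Python snippet below shows one hypothetical way a single altmetric event could be recorded with those fields attached; the field names, example DOI, and aggregator name are assumptions made up for this example.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AltmetricEvent:
    """One hypothetical altmetric event, carrying provenance so it can be audited later.

    All field names are illustrative assumptions, not definitions from NISO.
    """
    target_doi: str           # the scholarly output the event refers to
    event_type: str           # e.g. "tweet", "mendeley_save", "news_mention"
    source: str               # platform on which the event was observed
    provider: str             # aggregator that collected the event
    occurred_at: datetime     # when the event itself happened
    collected_at: datetime    # when the provider harvested it
    evidence_url: Optional[str] = None          # link back to the raw event, if public
    extra: dict = field(default_factory=dict)   # free-form descriptive metadata

# Example: a single tweet about an article, recorded with provenance and context.
event = AltmetricEvent(
    target_doi="10.1000/example.doi",
    event_type="tweet",
    source="twitter.com",
    provider="example-aggregator.org",
    occurred_at=datetime(2014, 6, 1, 14, 5, tzinfo=timezone.utc),
    collected_at=datetime(2014, 6, 2, 3, 0, tzinfo=timezone.utc),
    evidence_url="https://twitter.com/example/status/123456789",
    extra={"language": "en"},
)
print(event.target_doi, event.event_type, event.provider)
```

Recording the collection time separately from the event time, with a link back to the evidence, is what makes later validation possible, which is where the human-in-the-loop assessment point above picks up.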
21. DC Meeting Discussions
• Business and Use Cases
• Discovery
– metrics only get generated if material is discovered
• Qualitative vs. Quantitative
• Identifying Stakeholders and their Values
– stakeholders in outcomes / stakeholders in process of creating metrics
– shared values but tensions
– branding
• Definitions/Defining Impact
– metrics and analyses
– what led to success of citation?
– how to be certain we are measuring the right things
• Future Proofing
– what won't change
– impact - hard to establish across disciplines
22. Philly Meeting Discussions
• Definitions
– Define life cycle of scholarly output and associated metrics
– Qualitative versus Quantitative aspects - what is possible to define here
– Consider other aspects of these data collections
• Standards
– Develop definitions (what is a download? what is a view? see the sketch after this slide)
– Differentiate between scholarly impact versus popular/social use
– Define sources/characteristics for metrics (social, commercial, scholarly)
• Data Integrity
– Counter biases/gaming
– Association with credible entities - e.g., an ORCID iD vs. a Gmail account
– Reproducibility is key
– Everyone needs to be at the table to establish overall credibility
• Use cases (discussed by three groups)
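The Standards group's "what is a download? what is a view?" question is exactly the kind of definition the initiative was asked to pin down. As a minimal, hypothetical illustration of why it matters, the sketch below counts views and downloads from a raw usage log after collapsing rapid repeat clicks, using a 30-second window in the spirit of COUNTER-style double-click filtering; the log format, the way the window is applied, and all names here are assumptions for this example, not output of the NISO work.

```python
from datetime import datetime, timedelta

# Each raw event: (user_id, item_id, action, timestamp). Under the assumed
# definitions for this sketch, "view" is an HTML page request and "download"
# is a full-text (e.g. PDF) request.
RawEvent = tuple[str, str, str, datetime]

# Hypothetical double-click window, in the spirit of COUNTER-style filtering:
# repeat events by the same user on the same item within 30 seconds collapse
# into a single countable event.
DOUBLE_CLICK_WINDOW = timedelta(seconds=30)

def count_usage(events: list[RawEvent]) -> dict[tuple[str, str], int]:
    """Count de-duplicated events per (item, action) pair."""
    counts: dict[tuple[str, str], int] = {}
    last_counted: dict[tuple[str, str, str], datetime] = {}
    for user, item, action, ts in sorted(events, key=lambda e: e[3]):
        key = (user, item, action)
        prev = last_counted.get(key)
        if prev is not None and ts - prev < DOUBLE_CLICK_WINDOW:
            continue  # inside the double-click window: do not count again
        last_counted[key] = ts
        counts[(item, action)] = counts.get((item, action), 0) + 1
    return counts

# Two rapid "view" clicks by the same user collapse into one view;
# the later download is counted separately.
log = [
    ("u1", "10.1000/example.doi", "view", datetime(2014, 6, 30, 9, 0, 0)),
    ("u1", "10.1000/example.doi", "view", datetime(2014, 6, 30, 9, 0, 10)),
    ("u1", "10.1000/example.doi", "download", datetime(2014, 6, 30, 9, 5, 0)),
]
print(count_usage(log))
# {('10.1000/example.doi', 'view'): 1, ('10.1000/example.doi', 'download'): 1}
```

Two logs that disagree only on whether they filter double clicks will report different "views" for the same article, which is the reproducibility problem the Data Integrity group flagged.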
24. Potential work themes
Definitions
Application to types of research outputs
Discovery implications
Research evaluation
Data quality and gaming
Grouping, aggregating, and granularity
Context
Adoption & Promotion
33. Alternative Assessment Initiative
Phase 2
Presentations of Phase 1 report (June 2014)
Prioritization Effort (June - Aug, 2014)
Project approval (Sept 2014)
Working group formation (Oct 2014)
Consensus Development (Nov 2014 - Dec 2015)
Trial Use Period (Dec 2015 - Mar 2016)
Publication of final recommendations (June 2016)