This document summarizes the progress made by the National Information Standards Organization (NISO) in developing standards for new metrics in scholarship, known as altmetrics. It describes how NISO, a volunteer-driven standards body with more than 400 contributors worldwide, convened community meetings and discussions to gather ideas and build consensus on the key elements needed to establish trust in metrics: defining what is counted, how it is identified, procedures for aggregation, and standards for data exchange. The goal is to establish standardized approaches and definitions that can facilitate consistent measurement and comparison of the broader impacts of scholarly work.
Let’s All Agree on What We’re Counting and How
1. Let's All Agree on What We're Counting and How: Progress on standards for new metrics in scholarship
Todd A. Carpenter
Executive Director, NISO
@TAC_NISO
August 19, 2015
2. About
• Non-profit industry trade association accredited by the American National Standards Institute
• Mission of developing and maintaining technical standards related to information, documentation, discovery, and distribution of published materials and media
• Volunteer-driven organization: 400+ contributors spread out across the world
• Responsible (directly and indirectly) for standards like ISSN, DOI, Dublin Core metadata, DAISY digital talking books, OpenURL, MARC records, and ISBN
4. Since Jason Priem coined the term "altmetrics":
• There have been some 1,170 articles and posts (per Google Scholar) on the topic of altmetrics
• There have been eight altmetrics conferences
• There are now at least four companies focused on providing altmetrics-style analytics
5. When it becomes a standard, it's just not "alternative" any more…
6. But what's the value in counting Tweets and Facebook likes?
7. More than just popularity
Research is pointing to the fact that there is a modest positive correlation between early-signal metrics (altmetrics) and later-signal metrics (citations):
• "Do altmetrics correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective" by Rodrigo Costas, Zohreh Zahedi, Paul Wouters
• "Do Altmetrics Work? Twitter and Ten Other Social Web Services" by Mike Thelwall, Stefanie Haustein, Vincent Larivière, Cassidy R. Sugimoto
• "Earlier web usage statistics as predictors of later citation impact" by Brody T, Harnad S, Carr L (2006)
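The studies above report modest positive correlations between early altmetric counts and later citation counts. As a minimal sketch of how such a comparison is typically run, the snippet below computes a Spearman rank correlation over two made-up count lists; the numbers and the use of scipy.stats.spearmanr are illustrative assumptions, not data or code from the cited papers.

```python
# Minimal sketch: correlating early altmetric counts with later citation
# counts for the same set of articles. The numbers below are made up for
# illustration; they are not data from the studies cited above.
from scipy.stats import spearmanr

# Hypothetical per-article counts, aligned by position.
tweet_counts    = [12, 0, 3, 45, 7, 1, 0, 22, 5, 9]
citation_counts = [ 8, 1, 2, 30, 4, 0, 1, 15, 3, 6]

# Spearman's rho is rank-based, so it tolerates the skewed, heavy-tailed
# distributions typical of both altmetric and citation counts.
rho, p_value = spearmanr(tweet_counts, citation_counts)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```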
9. How do we measure the impact of these different forms of output?
10. Would a researcher focus on only one data source or methodological approach?
(Image: Duke University – Information Initiative at Duke (IID))
11. There aren't metrics and "altmetrics"; there are only metrics!
12. We have been using non-citation-based metrics for decades
18. Different totals aren't necessarily a sign of an error
Simply because two methods of calculation don't yield the same result doesn't mean that something nefarious is going on.
19. Inconsistency on Twitter? Who knew?
For details see Zhiwu Xie's (Virginia Tech) lightning talk: Twitter Inconsistency
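As a minimal sketch of how that principle might be applied in practice, the snippet below compares the same metric as reported by two hypothetical providers and flags only divergences beyond a relative tolerance, rather than treating any mismatch as an error. The provider names, counts, and 20% threshold are all assumptions for illustration.

```python
# Minimal sketch: comparing the same metric reported by two providers and
# flagging only large divergences, rather than treating any mismatch as an
# error. Provider names, counts, and the 20% tolerance are hypothetical.
def relative_difference(a: int, b: int) -> float:
    """Relative difference between two counts, 0.0 when both are zero."""
    if a == 0 and b == 0:
        return 0.0
    return abs(a - b) / max(a, b)

counts = {
    "provider_x": {"tweets": 118, "mendeley_readers": 42},
    "provider_y": {"tweets": 97,  "mendeley_readers": 41},
}

TOLERANCE = 0.20  # divergence above 20% is worth investigating

for metric in counts["provider_x"]:
    a, b = counts["provider_x"][metric], counts["provider_y"][metric]
    diff = relative_difference(a, b)
    status = "investigate" if diff > TOLERANCE else "within tolerance"
    print(f"{metric}: {a} vs {b} ({diff:.0%} apart) -> {status}")
```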
20. What are the elements of metrics that we need to build trust?
22. I often sound like a broken record
• Defining what is to be counted = standards
• How to describe what to count = standards
• Identification of what to count = standards
• Procedures for counting or not = standards
• Aggregating counts from the network = standards
• Exchange of what was counted = standards
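To make those elements concrete, here is a minimal sketch of what a single countable "event" might look like once the items above are pinned down: what is counted, how the target is identified, and a serializable form that could be aggregated and exchanged. The field names are hypothetical and do not come from any NISO specification.

```python
# Minimal sketch of a single countable "event" carrying the pieces the
# slide calls out: what is counted, how the target is identified, and a
# form that can be aggregated and exchanged. The field names are
# hypothetical and do not come from any NISO specification.
from dataclasses import dataclass, asdict
import json

@dataclass
class AltmetricEvent:
    target_id: str       # persistent identifier of the scholarly output
    target_id_type: str  # e.g. "doi", "handle", "isbn"
    source: str          # where the event happened, e.g. "twitter"
    event_type: str      # what is being counted, e.g. "tweet", "bookmark"
    occurred_at: str     # ISO 8601 timestamp, needed for reproducible counts

event = AltmetricEvent(
    target_id="10.1234/example.5678",  # made-up DOI
    target_id_type="doi",
    source="twitter",
    event_type="tweet",
    occurred_at="2015-08-19T14:30:00Z",
)

# Serializing to JSON is one plausible exchange format.
print(json.dumps(asdict(event), indent=2))
```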
26. Phase 1 Meetings
• October 9, 2013 – San Francisco, CA
• December 11, 2013 – Washington, DC
• January 23-24, 2014 – Philadelphia, PA
• Round of 1-on-1 interviews – March/April
• Phase 1 report published in June 2014
27. Meeting Lightning Talks
• Expectations of researchers
• Exploring disciplinary differences in the use of social media in scholarly communication
• Altmetrics as part of the services of a large university library system
• Deriving altmetrics from annotation activity
• Altmetrics for Institutional Repositories: Are the metadata ready?
• Snowball Metrics: Global Standards for Institutional Benchmarking
• International Standard Name Identifier
• Altmetric.com, Plum Analytics, Mendeley reader survey
• Twitter Inconsistency
"Lightning" by snowpeak is licensed under CC BY 2.0
34. Phase 2
• Presentations of Phase 1 report (June 2014)
• Prioritization Effort (June – Aug 2014)
• Project approval (Nov – Dec 2014)
• Working group formation (Jan – March 2015)
• Consensus Development (March 2015 – Feb 2016)
• Trial Use Period (Feb 15 – May 16)
• Publication of final recommendations (Aug 16)
36. 200+ Ideas Highlighting 25 Potential Work Topics
• Definitions
• Application to types of research outputs
• Discovery implications
• Research evaluation
• Data quality and gaming
• Grouping, aggregating, and granularity
• Context
• Adoption
NB: NISO isn't pursuing ALL of these topics
37. [Chart: Community Feedback on Project Idea Themes (n=118); responses on a 0–100% scale from Unimportant to Very important]
38. [Chart, continued: Community Feedback on Project Idea Themes (n=118); same 0–100% scale from Unimportant to Very important]
39. Top-ranked ideas (very important + important ≥ 70%)
• Develop specific definitions for alternative assessment metrics. (87.9%)
• Promote and facilitate use of persistent identifiers in scholarly communications. (82.8%)
• Develop strategies to improve data quality through normalization of source data across providers. (80.8%)
• Identify research output types that are applicable to the use of metrics. (79.8%)
• Define appropriate metrics and calculation methodologies for specific output types, such as software, datasets, or performances. (78.1%)
• Explore creation of standardized APIs or download or exchange formats to facilitate data gathering. (72.5%)
• Research issues surrounding the reproducibility of metrics across providers. (70.7%)
40. Launched three working groups
A) Development of definitions and descriptions of use
B) Definitions for appropriate metrics and calculation methodologies for non-traditional output types; also work toward promotion and facilitation of use of persistent identifiers
C) Development of strategies to improve data quality through source data providers
41. Group A: Development of definitions and descriptions of use
• Conducted research to collect, collate, and expand on existing definitions, concepts, and use cases
• Identified some elements where there is consensus, and others where there may be controversies; working toward agreement
• A second subgroup is organizing and transforming research on a consolidated list of use cases into a formalized set
42. Group B: Non-traditional output types and persistent identifiers
• Reviewing existing work of related projects such as COUNTER, bioCADDIE, RDA data publishing services, and the FORCE11 code citation working group
• Brainstorming on what various non-traditional research outputs exist, to determine what should be in scope for the work
• Arranged a partnership with a similar CASRAI group
• General discussion on identifiers and applications
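One practical reason persistent identifiers matter for aggregation is that the same DOI can arrive from different providers in several surface forms. Below is a minimal sketch, assuming plain DOI strings as input, of normalizing them to one canonical form before counting; it is an illustration only, not a NISO or Crossref recommendation.

```python
# Minimal sketch: normalizing DOI strings to one canonical form so that
# counts referring to the same output can be aggregated. A simplified
# illustration, not a NISO or Crossref recommendation.
def normalize_doi(raw: str) -> str:
    doi = raw.strip()
    # Strip common resolver prefixes so only the bare DOI name remains.
    for prefix in ("https://doi.org/", "http://doi.org/",
                   "https://dx.doi.org/", "http://dx.doi.org/", "doi:"):
        if doi.lower().startswith(prefix):
            doi = doi[len(prefix):]
            break
    # DOI names are case-insensitive, so lowercase them for comparison.
    return doi.lower()

variants = [
    "10.1234/Example.5678",                  # made-up DOI
    "https://doi.org/10.1234/example.5678",
    "doi:10.1234/EXAMPLE.5678",
]
# All three surface forms collapse to a single canonical DOI.
assert len({normalize_doi(v) for v in variants}) == 1
```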
43. Group C: Strategies to improve data quality
Drafting a "Code of Conduct": recommendations focused on how data providers, aggregators, and users should behave in order to introduce transparency and ensure that delivered data is trustworthy, replicable, consistently reported within and across sources, and accurately represents what it intends and/or purports to measure.
45. Alternative Assessments of our Assessment Initiative
• White paper downloaded 7,700 times
• 21 substantive comments received
• 120 in-person and virtual participants at the meetings
• The 3 meetings attracted 400 RSVPs for the live stream
• Goal was to generate about 40 ideas; in total, more than 250 were generated
• Project materials downloaded more than 62,000 times
• 550 direct tweets and 350 RTs using the #NISOALMI hashtag
• Survey ranking of output by 118 people
• Eight articles in traditional news publications
• 18 blog posts about the initiative
• 29 conference presentations about this project