A panel at the 2015 Science of Team Science (SciTS) Conference
Organizers: Daniel S. Katz (U. of Chicago & Argonne National Laboratory), Amy Brand (Digital Science), Melissa Haendel (Oregon Health & Science University), Holly J. Falk-Krzesinski (Elsevier)
Panelists: Robin Champieux (Oregon Health & Science University), Holly Falk-Krzesinski (Elsevier), Daniel S. Katz (U. of Chicago & Argonne National Laboratory), Philippa Saunders (University of Edinburgh)
Abstract: http://bit.ly/scholarly-recognition
1. Panel: Our Scholarly Recognition System Doesn’t Still Work
Organizers: Daniel S. Katz, Amy Brand, Melissa Haendel, Holly J. Falk-Krzesinski
Abstract: http://bit.ly/scholarly-recognition
Panelists:
Robin Champieux (Oregon Health & Science University)
Holly Falk-Krzesinski (Elsevier)
Daniel S. Katz (U. of Chicago & Argonne National Laboratory)
Philippa Saunders (University of Edinburgh)
2. Overall Issue
● Over hundreds of years, the scientific community has (somewhat) figured out how to recognize and reward individual work and accomplishments
● But most work today is not individual; rather, it takes place in the context of a collaborative team
● Understanding the contributions of people in teams is important to science in general, funders, publishers, universities, ...
3. Examples of the problem
● “Author” no longer indicates a person who wrote something; it now has a different meaning, closer to “contributor”
  o But in many fields, only some contributors are seen as significant enough to be listed as “authors”
● Author ordering sometimes has discipline-specific meaning, but in some contexts it is also taken to mean more or less than the “authors” intended
  o For instance, authorship in economics is alphabetical, but the notion of the first author as most important has led to those with earlier letters in the alphabet being more likely to get tenure or win prizes
5. How can we fix the system?
● Five panelists representing projects looking at this will present their ideas
6. Panel Schedule
3:15 Introduction
3:20 Holly Falk-Krzesinski (Elsevier): Team science reward and recognition and publishers’ role in clarifying attribution in a digital world
3:25 Amy Brand (Digital Science): Project CRediT (presentation given by Dan in Amy’s place)
3:30 Daniel S. Katz (University of Chicago & Argonne National Laboratory): Transitive Credit
3:35 Robin Champieux (Oregon Health & Science University): Force11 Attribution Working Group
3:40 Philippa Saunders (University of Edinburgh): The Academy of Medical Sciences Team Science policy project
3:45 Panel members respond to each other
3:55 General discussion with audience
4:30 End
7. REWARDING TEAM SCIENCE
“We will need to find better ways to do team science and reward it if we are to solve large overarching problems. Everybody on the team needs to get the same big gaudy championship ring…”
– A. G. Gilman, “Silver Spoons and Other Personal Reflections,” Annu. Rev. Pharmacol. Toxicol., 2012
8. Go, Hawks, Go!
“Blackhawks' Stanley Cup rings will be handed out to players, coaches, equipment managers, trainers and medical staff…during a private ceremony.”
9. Recognizing Team Science Contribution
• Understanding needs across numerous stakeholders
• Recognizing differences across disciplines and the need for thoughtful input from stakeholders
• Thinking beyond the traditional research article
• Bibliometrics & scientometrics
• Systems & platforms
• Unintended consequences, concerns
10. CONNECT WITH ME
Holly J. Falk-Krzesinski, PhD
Vice President, Strategic Alliances, Global Academic Relations
Elsevier, Chicago, IL, USA
h.falk-krzesinski@elsevier.com
http://www.linkedin.com/in/hollyfk
+1 847-848-2953
17. Development and testing of a taxonomy
Harvard – Wellcome Trust effort
Nature 508, 312–313 (17 April 2014), doi:10.1038/508312a
18. Project CRediT working group
• Liz Allen (Chair), Wellcome Trust
• Amy Brand (Chair), Digital Science
• Micah Altman, MIT Libraries
• Helen Atkins, PLoS
• Monica Bradford, Science/AAAS
• Todd Carpenter, National Information Standards Organization
• Jon Corson-Rikert, Cornell University
• Jeffrey Doyle, Cornell University
• Melissa Haendel, Oregon Health & Science University
• Daniel S. Katz, National Science Foundation
• Veronique Kiermer, Nature Publishing Group
• Nettie Lagace, National Information Standards Organization
• Emilie Marcus, Elsevier Inc
• Walter Schaffer, National Institutes of Health
• Jo Scott, Wellcome Trust
• Gene Sprouse, American Physical Society
• Victoria Stodden, Columbia University
19. Taxonomy terms and definitions (an illustrative machine-readable record follows)
Conceptualization: Ideas; formulation or evolution of overarching research goals and aims.
Methodology: Development or design of methodology; creation of models.
Software: Programming, software development; designing computer programs; implementation of the computer code and supporting algorithms; testing of existing code components.
Validation: Verification, whether as a part of the activity or separate, of the overall replication/reproducibility of results/experiments and other research outputs.
Formal Analysis: Application of statistical, mathematical, computational, or other formal techniques to analyse or synthesize study data.
Investigation: Conducting a research and investigation process, specifically performing the experiments, or data/evidence collection.
Resources: Provision of study materials, reagents, materials, patients, laboratory samples, animals, instrumentation, computing resources, or other analysis tools.
Data Curation: Management activities to annotate (produce metadata), scrub data, and maintain research data (including software code, where it is necessary for interpreting the data itself) for initial use and later re-use.
Writing – Original Draft: Preparation, creation and/or presentation of the published work, specifically writing the initial draft (including substantive translation).
Writing – Review & Editing: Preparation, creation and/or presentation of the published work by those from the original research group, specifically critical review, commentary or revision, including pre- or post-publication stages.
Visualization: Preparation, creation and/or presentation of the published work, specifically visualization/data presentation.
Supervision: Oversight and leadership responsibility for the research activity planning and execution, including mentorship external to the core team.
Project Administration: Management and coordination responsibility for the research activity planning and execution.
Funding Acquisition: Acquisition of the financial support for the project leading to this publication.
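The taxonomy is designed so that these roles can travel with a paper's metadata in machine-readable form. As a purely illustrative sketch, assuming a simple JSON layout of our own invention (CRediT does not mandate these field names, and the names and ORCIDs below are made up), a paper's contributor record might look like:

```json
{
  "title": "An example article",
  "contributors": [
    {
      "name": "A. Researcher",
      "orcid": "http://orcid.org/0000-0000-0000-0001",
      "roles": ["Conceptualization", "Investigation", "Writing – Original Draft"]
    },
    {
      "name": "B. Analyst",
      "orcid": "http://orcid.org/0000-0000-0000-0002",
      "roles": ["Software", "Formal Analysis", "Data Curation", "Visualization"]
    },
    {
      "name": "C. Lead",
      "orcid": "http://orcid.org/0000-0000-0000-0003",
      "roles": ["Supervision", "Funding Acquisition", "Writing – Review & Editing"]
    }
  ]
}
```

Because each role is a controlled term from the taxonomy, funders, publishers, and indexers could aggregate contributions across papers rather than inferring them from author order.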
28. Transitive Credit: Motivation
• Science relies on activities that are not fully recognized
  – Sharing of data; development of common data resources, software, and methodologies; annotation of data and publications; creating education modules & tools
• Accepted problem: many recent reports
• Some partial solutions: e.g., the NSF biosketch now lists “products”, not just publications
• This talk
  – A new idea, transitive credit, to address the issue of crediting indirect contributions
  – Leads to potential solutions to other problems
29. Transitive Credit: Why traditional citation is failing
• New knowledge clearly builds on past knowledge
• Traditionally, an author cites previous papers
• This doesn’t work well for digital products like software
  – Software is often dependent on libraries (assembled software packages), code fragments, and algorithms
  – The identifier that should be cited is not clear
  – If a cited library depends on another library, the contribution of the second library is not captured
• Similarly, data citation should credit the people who gathered & curated the data
  – Hard for a paper author to find these details
30. Transitive Credit: Credit Map Concept
1. Decide what to credit
  – People and things: authors, papers, software, data, systems
    o Traditionally listed in the author list, paper body, acknowledgements, citations, etc.
    o All identified uniquely, using ORCIDs, DOIs, etc.
2. Determine how much credit for each
  – Not straightforward
    o Perhaps hierarchical: determine credit for the authors and how to split it, credit for the software and how to split it, etc.
    o We’ve figured out author ordering in all published papers; we can figure this out too
3. The person who registers the product also registers its credit map
  – Affirmed by the registration agency?
31. Transitive Credit: Example Credit Map
[Diagram: a credit map for a Paper, assigning weighted shares that sum to 1 across Author A, Author B, cited Paper M, Software X, Data K, and other contributors; the weights shown are 0.2, 0.2, 0.2, 0.05, and 0.1]
• Stored in JSON-LD, JavaScript Object Notation for Linked Data, http://json-ld.org/
• An extension of the key-value based JSON document format
• Provides a way of describing machine-readable information with semantic context (a minimal sketch follows)
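To make this concrete, here is a hypothetical credit map for the Paper above, written as JSON-LD. Everything specific here is invented for illustration: the example.org vocabulary, the `label` and `weight` terms, and the ORCIDs and DOIs. The actual encoding is worked out in Katz & Smith, arXiv:1407.5117 (referenced on the Credits slide).

```json
{
  "@context": {"@vocab": "http://example.org/creditmap#"},
  "@id": "http://dx.doi.org/10.9999/the-paper",
  "contributor": [
    {"@id": "http://orcid.org/0000-0000-0000-0001", "label": "Author A",   "weight": 0.2},
    {"@id": "http://orcid.org/0000-0000-0000-0002", "label": "Author B",   "weight": 0.2},
    {"@id": "http://dx.doi.org/10.9999/paper-m",    "label": "Paper M",    "weight": 0.2},
    {"@id": "http://dx.doi.org/10.9999/software-x", "label": "Software X", "weight": 0.05},
    {"@id": "http://dx.doi.org/10.9999/data-k",     "label": "Data K",     "weight": 0.1}
  ]
}
```

Each entry pairs a uniquely identified person or product (an ORCID or DOI) with its fractional share of the credit for the Paper.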
32. Transitive Credit
• Credit maps are related to each other
• This allows weighted credit to flow down and up
• Credit for Software 12 in the Paper is 0.2 * 0.3 (6%); see the sketch below
• Could also look at all papers Software 12 contributes to
[Diagram: the Paper's credit map from the previous slide, linked to a second credit map containing Author 1, Paper 4, and Software 12 with weights 0.1, 0.1, and 0.3; the Paper's 0.2 share for a cited paper, combined with that paper's 0.3 share for Software 12, gives Software 12 the 6% noted above]
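A minimal sketch of how that multiplication generalizes, in Python. This is illustrative only, not the talk's reference implementation: the map contents loosely mirror the slides' diagrams, and linking the Paper's 0.2 share directly to the map that contains Software 12 is an assumption made to reproduce the 0.2 * 0.3 example.

```python
# Each product's credit map assigns fractional weights (summing to 1)
# to the people and products it builds on.
credit_maps = {
    "Paper": {"Author A": 0.2, "Author B": 0.2, "Paper 4": 0.2,
              "Software X": 0.05, "Data K": 0.1, "others": 0.25},
    "Paper 4": {"Author 1": 0.1, "Software 12": 0.3, "others": 0.6},
}

def transitive_credit(product, weight=1.0, totals=None):
    """Propagate `weight` of credit down through linked credit maps."""
    if totals is None:
        totals = {}
    for contributor, fraction in credit_maps.get(product, {}).items():
        share = weight * fraction
        totals[contributor] = totals.get(contributor, 0.0) + share
        # If this contributor is itself a product with a credit map,
        # its share flows onward to that product's own contributors.
        if contributor in credit_maps:
            transitive_credit(contributor, share, totals)
    return totals

credit = transitive_credit("Paper")
print(round(credit["Software 12"], 4))  # 0.06, i.e. 6%, as on the slide
```

Running the flow in the other direction (all papers a product contributes to) would index the same maps by contributor rather than by product.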
33. Transitive Credit: Issues & future work
• The scientific sociotechnical system is moving to make this work
  – Need unique IDs for people & products
    o ORCID & DOIs?
  – Registering credit maps
    o Implement within handle/DOI?
  – Tracking product usage to make generating credit maps easier
    o Provenance systems?
• Standards (e.g., CASRAI, VIVO)?
• Social/cultural acceptance?
• Test in a domain to see what is learned?
34. Transitive Credit: Credits
• Initial discussions about this in a 2010 Institute for Computing in Science (ICiS) workshop breakout session with Jacob Foster (U Chicago) & Robert Stevens (U Manchester)
• Further discussions with David Proctor (NSF) & Ian Foster (U Chicago)
• D. S. Katz, “Citation and Attribution of Digital Products: Social and Technological Concerns,” 1st Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE1), in conjunction with SC13, figshare, DOI: 10.6084/m9.figshare.791606, 2013
• D. S. Katz, “Transitive Credit as a Means to Address Social and Technological Concerns Stemming from Citation and Attribution of Digital Products,” Journal of Open Research Software, v. 2(1): e20, pp. 1–4, 2014, DOI: 10.5334/jors.be
• D. S. Katz and A. M. Smith, “Implementing Transitive Credit with JSON-LD,” 2nd Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE2), in conjunction with SC14, arXiv:1407.5117 [cs.CY], 2014
43. ‘Team Science’ Project
Academy of Medical Sciences Careers Committee Task Group
Professor Philippa Saunders FMedSci
Director, Post Graduate Research Training, The University of Edinburgh, UK
44. The Academy Team Science Project
• We set out to investigate:
  • The benefits and challenges of team science
  • Barriers to reward and recognition for those participating in team science (particularly earlier-career researchers)
• We aim to make recommendations that:
  • Address any challenges or barriers
  • Enhance the career progression of those involved in team science projects
  • Influence the behaviour of researchers as well as the policy and practice of publishers, employers and funders
  • Promote culture change in the longer term
45. Project timeline
• 15 September 2014: project launch; call for written evidence
• 7 November 2014: written evidence closes
• 2 and 23 February 2015: discussion sessions with funders, employers, and publishers
• May 2015: ‘focus group’ workshop with researchers
• Summer 2015: report drafting begins
• September 2015: all-stakeholder workshop
• Early 2016: report published
(Phases: collect information; develop conclusions and recommendations; draft the report)
46. Evidence gathering and stakeholder engagement
• Researchers
  • Local focus groups and online surveys
  • Researcher workshops
• Funders
  • UK Research Councils, clinical funders
  • Premier UK charities – Wellcome, Cancer
• Employers
  • Universities, government institutes, pharma
• Publishers
  • Nature, Science, PLoS, Wiley, Elsevier…
• AND…
47. Findings to date
• Large collaborative projects are increasing, and can result in high-impact work
• Challenges include difficulty finding collaborators, insufficient support for ‘staff scientists’, and inadequate funding and training
• The main disincentive is the risk of inadequate recognition in career development
• Career development and research assessment focus on first- and last-author publications, PI status on grants, and demonstration of ‘leadership’ and ‘independence’ – not ‘team science’
• Not all researchers are adequately recognised through inclusion on publications or grants
48. Our vision for the future
• We want to remove disincentives for individual researchers, particularly earlier-career researchers, to participate in ‘team science’ by:
  • Recognising and rewarding the varied contributions required for effective ‘team science’
  • Improving researchers’ training in team skills
  • Improving the career prospects of key skilled support staff
• We want to use our report and our continued engagement with diverse stakeholders to support changes in:
  • Researcher culture
  • Publications
  • Recruitment, promotion and funding decisions
  • Support from funders and employers