DORA and the reinvention of research assessment

Given at the EIFL General Assembly 2013

Transcript of "DORA and the reinvention of research assessment"

  1. DORA and reinventing research assessment. EIFL, Istanbul, November 2013. Mark Patterson, Executive Director, eLife
  2. Outline • A few words about eLife • Research assessment – old and new approaches • DORA recommendations
  3. Funders taking direct action
  4. eLife: motivations
  5. eLife: motivations • Swift, fair, decisive process • Exploit digital media • Serve science • Open access
  6. [Chart: percentage of PubMed content available as open access in PMC, by year, 2006–2012; y-axis 0–14%] (one way to assemble such a series is sketched after the transcript)
  7. [Diagram: research assessment and its stakeholders: researchers (authors and readers), institutions, librarians, funders, publishers, policy makers, and the public]
  8. What journal? What impact factor?
  9. The impact factor is… • a journal-based metric • proprietary • incomplete http://www.flickr.com/photos/m2w2/191545978/sizes/z/in/photostream/ (a toy calculation of the two-year impact factor follows the transcript)
  10. • a journal-based metric • proprietary • incomplete http://www.flickr.com/photos/m2w2/191545978/sizes/z/in/photostream/
  11. Citations • Policy and practice • Media • Usage • Twitter • Textbooks • Reference managers • Wikipedia
  12. v10
  13. New metrics and indicators of scholarship • From one measure to many • From journal to article • From one output to many (a sketch that pulls article-level indicators for a single DOI follows the transcript)
  14. • Recommendations for publishers, funders, institutions, metrics suppliers, and researchers • >9000 signatories http://www.flickr.com/photos/24736216@N07/7758828268/ (CC BY-NC 2.0)
  15. Three themes • eliminate the use of impact factors in decisions about funding and careers • exploit the opportunities provided by online publication (such as new indicators of significance and impact) • assess research on its own merits
  16. Funding agencies and institutions • be transparent about processes • base judgement on outputs rather than journal names and metrics • consider multiple outputs/outcomes • consider multiple metrics and indicators
  17. Taking steps to improve assessment (1) “Varmus wants to replace a section that now lists major publications with a narrative describing the investigator's five major accomplishments.” Science, 25 October 2013
  18. Taking steps to improve assessment (2) Sandra L. Schmid, chair of the Department of Cell Biology at the University of Texas Southwestern Medical Center: “We believe we can recognize excellence that has been missed by journal editors.” Applicants for faculty positions provide a cover letter describing: (1) their most significant scientific accomplishment as a graduate student; (2) their most significant scientific accomplishment as a postdoc; (3) the overall goals/vision for their research program; (4) the experience and qualifications that make them able to achieve those goals. Science Careers, 3 September 2013
  19. Taking steps to improve assessment (3) No sub-panel will make any use of journal impact factors, rankings, lists or the perceived standing of publishers in assessing the quality of research outputs. An underpinning principle of the REF [the UK Research Excellence Framework] is that all types of research and all forms of research outputs across all disciplines shall be assessed on a fair and equal basis.
  20. Publishers • don’t promote the impact factor (ideally) • provide new metrics and indicators • make reference lists available as open data (a sketch of retrieving open references follows the transcript) • promote responsible authorship and author contributions
  21. Metrics service providers • be transparent about methods • make data open • consider how to respond to gaming (a toy gaming screen follows the transcript)
  22. Metrics service providers
  23. Researchers • challenge the use of the impact factor in research assessment • promote assessment on the basis of research content • use new metrics/indicators to support your own career
  24. Summary • Current methods are poor • New approaches are possible • And steadily, improvements are being made
  25. • Signing DORA is a start, but what happens next? • Further discussion amongst key stakeholders • Identify specific actions • Monitor and report progress http://www.flickr.com/photos/24736216@N07/7758828268/ (CC BY-NC 2.0)
  26. Thank you Mark Patterson m.patterson@elifesciences.org
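
The percentages behind slide 6's chart came from the speaker's own analysis. As a minimal sketch of how one might assemble a similar series today, the snippet below counts records through NCBI's E-utilities; "open access"[filter] is PMC's filter for its open-access subset, but the exact query strings, and the use of PubMed totals as the denominator, are my assumptions rather than the method behind the slide.

```python
# Hedged sketch: per-year share of PubMed-indexed articles that sit in the
# PMC open-access subset. Query strings are assumptions to verify against
# the E-utilities documentation.
import json
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def record_count(db: str, term: str) -> int:
    """Ask E-utilities for the hit count only (retmax=0 skips the records)."""
    query = urllib.parse.urlencode(
        {"db": db, "term": term, "retmax": 0, "retmode": "json"}
    )
    with urllib.request.urlopen(f"{EUTILS}?{query}") as resp:
        return int(json.load(resp)["esearchresult"]["count"])

for year in range(2006, 2013):
    pubmed_total = record_count("pubmed", f"{year}[pdat]")
    pmc_open = record_count("pmc", f'"open access"[filter] AND {year}[pdat]')
    print(f"{year}: {100 * pmc_open / pubmed_total:.1f}% (approximate)")
```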
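Slide 9 calls the impact factor a journal-based metric. The standard two-year calculation makes the point concrete: a journal's 2013 factor is the citations received in 2013 by items it published in 2011 and 2012, divided by the citable items it published in those two years. The numbers below are invented; the takeaway is that every article in the journal, strong or weak, inherits the same single value.

```python
# Toy two-year impact factor with made-up numbers.
cites_in_2013_to_2011_items = 900
cites_in_2013_to_2012_items = 600
citable_items_2011 = 250
citable_items_2012 = 250

impact_factor_2013 = (cites_in_2013_to_2011_items + cites_in_2013_to_2012_items) / (
    citable_items_2011 + citable_items_2012
)
print(impact_factor_2013)  # 3.0, attached to every article in the journal
```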
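Slides 11 and 13 list the article-level signals (usage, Twitter, reference managers, Wikipedia, and so on) that a journal-level average hides. Purely as an illustration, and not a tool from the talk, the sketch below queries Altmetric's free v1 API for one article's signals; the endpoint and field names reflect that public API as I understand it, the DOI is a placeholder, and both should be verified before use.

```python
# Hedged sketch: gather a few article-level signals for one DOI from the
# Altmetric public API (https://api.altmetric.com/v1/doi/<doi>). Field names
# are assumptions to check against Altmetric's documentation; note the API
# answers HTTP 404 for DOIs it has no data on.
import json
import urllib.request

def altmetric_signals(doi: str) -> dict:
    with urllib.request.urlopen(f"https://api.altmetric.com/v1/doi/{doi}") as resp:
        return json.load(resp)

data = altmetric_signals("10.7554/eLife.00001")  # placeholder DOI
print("tweets:", data.get("cited_by_tweeters_count", 0))
print("Wikipedia pages:", data.get("cited_by_wikipedia_count", 0))
print("Mendeley readers:", data.get("readers", {}).get("mendeley", 0))
```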
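Slide 20 asks publishers to make reference lists available as open data. One concrete consequence, assuming a publisher has deposited open references with Crossref: anyone can retrieve them from the works route of the Crossref REST API, which exposes a "reference" array for such records. The route and field are real but should be checked against the current API docs, and the DOI is again a placeholder.

```python
# Hedged sketch: fetch the open reference list (if deposited) for one DOI
# from the Crossref REST API.
import json
import urllib.request

def open_references(doi: str) -> list:
    with urllib.request.urlopen(f"https://api.crossref.org/works/{doi}") as resp:
        message = json.load(resp)["message"]
    return message.get("reference", [])  # empty when references are not open

for ref in open_references("10.7554/eLife.00001")[:5]:  # placeholder DOI
    print(ref.get("DOI") or ref.get("unstructured"))
```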
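Slide 21 tells metrics providers to consider how to respond to gaming, without prescribing a method. As one toy example of the kind of screen a provider could run, the function below flags a journal whose incoming citations are dominated by its own pages, a classic gaming signature; the data and the threshold are invented.

```python
# Toy gaming screen: flag journals with a high self-citation share.
from collections import Counter

def self_citation_rate(citing_journals: list[str], journal: str) -> float:
    """Fraction of a journal's incoming citations that it supplies itself."""
    counts = Counter(citing_journals)
    total = sum(counts.values())
    return counts[journal] / total if total else 0.0

incoming = ["J Foo", "J Foo", "J Bar", "J Foo", "J Baz"]  # invented data
rate = self_citation_rate(incoming, "J Foo")
if rate > 0.5:
    print(f"flag for review: self-citation rate {rate:.0%}")
```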