The pernicious habit of ranking scientists by the journals they publish in
The empirical evidence against the use of journal rank as an evaluation tool and how to fix the scientific infrastructure.
Reader comment (on slide 38, the single-interface proposal): a working demo of making things available via a single interface is the Open Access Media Importer at https://commons.wikimedia.org/wiki/User:Open_Access_Media_Importer_Bot , which collects video and audio files from openly licensed articles indexed in PubMed Central and puts them onto Wikimedia Commons, where they can be accessed in a uniform way via https://commons.wikimedia.org/wiki/Category:Uploaded_with_Open_Access_Media_Importer .
  1. Björn Brembs, Universität Regensburg, http://brembs.net
  2. • Thomson Reuters: Impact Factor • Eigenfactor (now Thomson Reuters) • SCImago Journal Rank (SJR) • Scopus: SNIP (Source Normalized Impact per Paper), SJR
  3. Only read publications from high-ranking journals
  4. (Quoted evaluation criteria, translated from German:) Publications: Complete list of publications, including original research papers as first author and as senior author; impact points in total and over the last 5 years, each broken down by first and senior authorships; personal Science Citation Index (SCI; h-index according to Web of Science) across all publications.
  5. Only read publications from high-ranking journals
  6. Only publish in high-ranking journals
  7. Is journal rank like astrology?
  8. • Who knows what the IF is? • Who uses the IF to pick a journal (rate a candidate, etc.)? • Who knows how the IF is calculated and from what data?
  9. [Diagram: articles A1 and A2 published in years 1 and 2; citation C12 accrues in year 3] Introduced in the 1950s by Eugene Garfield: ISI
  10. [Diagram: 40 + 60 citations in year 3 to the 100 articles published in years 1 and 2] Introduced in the 1950s by Eugene Garfield: ISI
  11. Journal X IF 2012 = (all citations from TR-indexed journals in 2012 to papers published in journal X in 2010/11) ÷ (number of citable articles published in journal X in 2010/11). Subscription rates: €30,000–130,000/year. Covers ~11,500 journals (Scopus covers ~16,500).
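The two-year calculation above can be sketched in a few lines of Python. This is a toy illustration: the function name and sample numbers are mine; only the formula itself comes from the slide (citations received in year Y to items from years Y−1 and Y−2, divided by the "citable" item count from those two years).

```python
def impact_factor(citations_in_year: int, citable_items_prev_two_years: int) -> float:
    """Two-year Journal Impact Factor for year Y:
    citations received in Y to items published in Y-1 and Y-2,
    divided by the number of 'citable' items from Y-1 and Y-2."""
    if citable_items_prev_two_years == 0:
        raise ValueError("no citable items in the two-year window")
    return citations_in_year / citable_items_prev_two_years

# Toy numbers from the earlier slide: 40 + 60 citations in year 3
# to the 100 articles published in years 1 and 2.
print(impact_factor(40 + 60, 100))  # -> 1.0
```

Note that both inputs are negotiable in practice (what counts as "citable" is decided case by case), which is one of the criticisms on the next slide.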
  12. • Negotiable • Irreproducible • Mathematically unsound
  13. • PLoS Medicine: negotiated IF anywhere between 2 and 11; published IF 8.4 (The PLoS Medicine Editors (2006) The Impact Factor Game. PLoS Med 3(6): e291. http://www.plosmedicine.org/article/info:doi/10.1371%2Fjournal.pmed.0030291) • Current Biology IF rose from 7 to 11 in 2003 – bought by Cell Press (Elsevier) in 2001…
  14. • Rockefeller University Press bought their data from Thomson Reuters • Up to 19% deviation from published records • Second dataset still not correct. Rossner M, van Epps H, Hill E (2007): Show me the data. The Journal of Cell Biology 179(6), 1091–1092. http://jcb.rupress.org/cgi/content/full/179/6/1091
  15. • Highly (right-)skewed citation distributions • Weak correlation of individual article citation rates with journal IF. Seglen PO (1997): Why the impact factor of journals should not be used for evaluating research. BMJ 314(7079):497. http://www.bmj.com/cgi/content/full/314/7079/497
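Why a journal-level mean misleads for skewed citation counts can be seen with a toy distribution (the numbers below are invented for illustration): most articles gather few citations, a handful gather many, and the mean (which is what an IF-style average reports) sits far above what a typical article in the journal actually receives.

```python
import statistics

# Hypothetical, heavily right-skewed citation counts for 20 articles
# in one journal: a long tail of two highly cited papers.
citations = [0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 3, 3, 4, 4, 5, 6, 8, 10, 40, 120]

mean = statistics.mean(citations)      # what an IF-style average reports
median = statistics.median(citations)  # what a 'typical' article gets

# The mean (10.65) is several times the median (2.5):
# two outliers dominate the journal-level number.
print(mean, median)
```

This is Seglen's point in miniature: the journal average says almost nothing about any individual article drawn from it.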
  16. Lozano GA, Larivière V, Gingras Y (2012): The weakening relationship between the Impact Factor and papers' citations in the digital age. arXiv:1205.4328
  17. Brown, E. N., & Ramaswamy, S. (2007). Quality of protein crystal structures. Acta Crystallographica Section D Biological Crystallography, 63(9), 941–950. doi:10.1107/S0907444907033847
  18. Munafò, M., Stothart, G., & Flint, J. (2009). Bias in genetic association studies and impact factor. Molecular Psychiatry, 14(2), 119–120. doi:10.1038/mp.2008.77
  19. Brembs, B., Button, K., & Munafò, M. (2013). Deep impact: unintended consequences of journal rank. Frontiers in Human Neuroscience, 7. doi:10.3389/fnhum.2013.00291
  20. Journal rank is a figment of our imagination.
  21. Fang et al. (2012): Misconduct accounts for the majority of retracted scientific publications. PNAS 109(42), 17028–17033
  22. Data from: Fang, F., & Casadevall, A. (2011). Retracted science and the retraction index. Infection and Immunity. doi:10.1128/IAI.05661-11
  23. Lozano GA, Larivière V, Gingras Y (2012): The weakening relationship between the Impact Factor and papers' citations in the digital age. arXiv:1205.4328
  24. “High-Impact” journals attract the most unreliable research
  25. “Do you trust scientists?”
  26. “Who can you trust these days?”
  27. “Politicians? Financial experts? Realtors?”
  28. The disaster of our digital infrastructure
  29. Journal rank is a relic of the print era
  30. [Chart: % change from 1986 to 2008 in journal subscription prices vs. CPI/inflation vs. number of journals purchased] Modified from ARL: http://www.arl.org/bm~doc/arlstats06.pdf, http://www.arl.org/bm~doc/arlstat08.pdf
  31. Institutions produce publications, data and software
  32. Dysfunctional scholarly literature
  33. • No scientific impact analysis • Limited access • No global search • No functional hyperlinks • No flexible data visualization • No submission standards • (Almost) no statistics • No text/data-mining • No effective way to sort, filter and discover • No networking features • etc. …it’s like the web in 1995!
  34. Scientific data in peril
  35. Non-existent software archives
  36. • Institutional email • Institutional webspace • Institutional blog • Library access card • Open access repository • No archiving of publications • No archiving of software • No archiving of data
  37. Science, tear down this paywall!
  38. • Harvest all Open Access publications – accessible via a single interface – not just from green repositories – everything not obviously illegal • Integrate the resulting database – PubMed – Google Scholar • Plug the gaps:
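In practice, the harvesting step on slide 38 would lean on the OAI-PMH protocol that green repositories already expose. Below is a minimal, self-contained sketch of the parsing half only: the XML is a hand-made sample in the shape of an OAI-PMH `ListRecords` response with Dublin Core metadata, not output from any real repository, and a real harvester would fetch such documents over HTTP from each repository's OAI endpoint.

```python
import xml.etree.ElementTree as ET

# Hand-made sample of an OAI-PMH ListRecords response (Dublin Core).
# A real harvester would fetch this from a repository's OAI endpoint
# and page through it via resumption tokens.
SAMPLE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Deep impact: unintended consequences of journal rank</dc:title>
          <dc:identifier>doi:10.3389/fnhum.2013.00291</dc:identifier>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def extract_records(xml_text: str) -> list:
    """Pull title/identifier pairs out of a ListRecords response,
    i.e. the rows a single search interface would be built over."""
    root = ET.fromstring(xml_text)
    records = []
    for rec in root.findall(".//oai:record", NS):
        records.append({
            "title": rec.findtext(".//dc:title", namespaces=NS),
            "identifier": rec.findtext(".//dc:identifier", namespaces=NS),
        })
    return records

print(extract_records(SAMPLE))
```

Merging such records from many repositories into one deduplicated database is exactly the "accessible via a single interface" idea; the hard parts left out here are paging, deduplication and identifier normalization.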
  39. • Global search and access for all literature, software and data • Intelligent sort, filter and discover functionalities • Scientific, evidence-based reputation system • Authoring tool for collaborative writing and single-click submission • Orders of magnitude cheaper: US$90/paper (e.g. SciELO) vs. US$4,800/paper (subscription)
