
Fixing the infrastructure for open science



Slides quickly thrown together for a Knowledge Exchange meeting in Berlin.



  1. Björn Brembs, Universität Regensburg, http://brembs.net
  2. Institutions produce publications, data and software
  3. • Institutional email • Institutional webspace • Institutional blog • Library access card • Open access repository; but: • No archiving of publications • No archiving of software • No archiving of data
  4. Dysfunctional scholarly literature
  5. • Limited access • No global search • No functional hyperlinks • No flexible data visualization • No submission standards • (Almost) no statistics • No text/data-mining • No effective way to sort, filter and discover • No scientific impact analysis • No networking feature • etc. …it’s like the web in 1995!
  6. Scientific data in peril
  7. Non-existent software archives
  8. Technically feasible today (almost): • No more corporate publishers – libraries archive everything and make it publicly accessible according to a world-wide standard • Single semantic, decentralized database of literature, data and software
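A minimal sketch of what one entry in such a unified database could look like, assuming a simple record type in Python; every field name here is an illustrative assumption, not part of any existing standard:

```python
from dataclasses import dataclass, field

# Illustrative sketch only: one record type spanning literature, data and
# software, as the "single semantic, decentralized database" slide implies.
@dataclass
class ScholarlyRecord:
    identifier: str                # e.g. a DOI
    kind: str                      # "article", "dataset" or "software"
    title: str
    authors: list[str]
    links: dict[str, str] = field(default_factory=dict)  # typed links to related records

# A hypothetical article linked to its dataset and analysis code
# (all identifiers below use the 10.5555 DOI test prefix and are invented).
article = ScholarlyRecord(
    identifier="10.5555/example-article",
    kind="article",
    title="Example article",
    authors=["B. Brembs"],
    links={"dataset": "10.5555/example-data", "software": "10.5555/example-code"},
)
print(article)
```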
  9. Software to control the experiment and save the data
  10. Software to analyze and visualize the data
  11. However, a version is already available
  12. Same type of experiments → same script. Default: same categories, same tags, same authors, same links, same description → one complete article, in one click. Update the figure: a higher sample size is published directly while being analysed; your boss may see the results before you do! (Or you may see the results of your student before they do.) Possibility to make it public and citable in one click, or directly from the R code.
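The workflow on this slide can be pictured as a tiny script; the sketch below uses hypothetical placeholder functions (nothing here is a real figshare or repository API):

```python
# Sketch of the "one complete article, in one click" workflow from the slide.
# Every function name is a hypothetical placeholder, not a real API.

def run_analysis(data_path: str) -> dict:
    # Same script every time, so categories, tags, authors, links and
    # description stay identical; only the data (sample size) changes.
    return {"figure": data_path + ".png", "n": 42}

def publish(results: dict, public: bool = False) -> str:
    # A real version would deposit figure + metadata with a repository and
    # return the DOI it mints; this stub only fakes an identifier.
    return "10.5555/hypothetical-doi" if public else "private-draft"

results = run_analysis("experiment.csv")  # higher sample size? just re-run
doi = publish(results, public=True)       # public and citable in one click
print(doi)
```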
  13. http://dx.doi.org/10.6084/m9.figshare.97792
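That DOI can also be dereferenced for machine-readable metadata via DOI content negotiation, a real mechanism supported by DataCite (which registers figshare's DOIs). A minimal sketch, assuming the third-party `requests` library is installed:

```python
import requests

# Ask the doi.org resolver for CSL-JSON metadata instead of the landing page.
doi = "10.6084/m9.figshare.97792"
resp = requests.get(
    "https://doi.org/" + doi,
    headers={"Accept": "application/vnd.citationstyles.csl+json"},
    timeout=30,
)
resp.raise_for_status()
meta = resp.json()
print(meta.get("title"))
print(meta.get("author"))
```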
  14. One person is not an institutional infrastructure!
  15. • Thomson Reuters: Impact Factor • Eigenfactor (now Thomson Reuters) • SCImago Journal Rank (SJR) • Scopus: SNIP (Source Normalized Impact per Paper), SJR
  16. Publications (from a German application template, translated): complete list of publications, including original research papers as first author and as senior author; impact points in total and over the last 5 years, with first and senior authorships marked separately in each case; personal Scientific Citations Index (SCI, h-index according to Web of Science) across all publications.
  17. Lies, damn lies and bibliometrics
  18. Introduced in the 1960s by Eugene Garfield (ISI): citations in 2010 to articles published in 2008 and 2009. IF = 5 means articles published in 08/09 were cited an average of 5 times in 2010.
  19. Journal X IF 2010 = (all citations from TR-indexed journals in 2010 to papers in journal X) / (number of citable articles published in journal X in 2008/09). Subscription rates of €30,000–130,000/year. Covers ~11,500 journals (Scopus covers ~16,500).
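The impact factor is thus a plain ratio; a toy calculation with invented numbers:

```python
# Toy impact-factor arithmetic for a hypothetical "Journal X" (made-up numbers).
citations_2010_to_2008_09 = 1250  # citations in 2010 to items published 2008/09
citable_items_2008_09 = 250       # "citable" articles published in 2008/09

if_2010 = citations_2010_to_2008_09 / citable_items_2008_09
print(if_2010)  # 5.0: articles from 08/09 were cited 5 times on average in 2010
```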
  20. • Negotiable • Irreproducible • Mathematically unsound
  21. • PLoS Medicine: IF between 2 and 11 (8.4). (The PLoS Medicine Editors (2006) The Impact Factor Game. PLoS Med 3(6): e291. http://www.plosmedicine.org/article/info:doi/10.1371%2Fjournal.pmed.0030291) • Current Biology: IF from 7 to 11 in 2003 – bought by Cell Press (Elsevier) in 2001…
  22. • Rockefeller University Press bought their data from Thomson Reuters • Up to 19% deviation from published records • Second dataset still not correct. Rossner M, van Epps H, Hill E (2007): Show me the data. The Journal of Cell Biology 179(6), 1091–1092. http://jcb.rupress.org/cgi/content/full/179/6/1091
  23. • Highly skewed citation distributions (most articles are cited far less often than the journal mean) • Weak correlation of individual article citation rate with journal IF. Seglen PO (1997): Why the impact factor of journals should not be used for evaluating research. BMJ 314(7079): 497. http://www.bmj.com/cgi/content/full/314/7079/497
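Seglen's point can be illustrated with a quick simulation: when citation counts follow a long-tailed distribution, the journal mean (which is what the IF reflects) sits far above what a typical article receives. The lognormal below is only a convenient stand-in for real citation data:

```python
import random
import statistics

random.seed(1)

# Simulated long-tailed citation counts: most papers are cited rarely,
# a few are cited very often.
citations = [int(random.lognormvariate(mu=1.0, sigma=1.2)) for _ in range(1000)]

print("mean   (what the IF tracks):", statistics.mean(citations))
print("median (a typical article): ", statistics.median(citations))
# The few blockbuster papers drag the mean far above the median, so the
# journal-level average is a poor predictor of any individual article.
```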
  24. Relation of journal impact factor to retractions for fraud or suspected fraud, error, and plagiarism or duplicate publication. Fang FC et al., PNAS 2012;109:17028–17033. © 2012 by National Academy of Sciences.
  25. (A) Number of retracted articles for specific causes, by year of retraction. Fang FC et al., PNAS 2012;109:17028–17033. © 2012 by National Academy of Sciences.
