Love for science or 'Academic Prostitution' - ERC talk version



Invited talk at the European Research Council, Brussels (Scientific Seminar, 12 April 2013): "Love for Science or 'Academic Prostitution'". In this talk I present a personal review (and sometimes my own vision) of some issues that I consider key for doing science. It was tailored to the expected audience: mainly Scientific Officers with backgrounds in different fields of science and scholarship, but also Agency staff.
Abstract: A recent Special Issue of Nature on science metrics claimed that "Research reverts to a kind of 'academic prostitution' in which work is done to please editors and referees rather than to further knowledge." If this is true, funding agencies should try to avoid falling into the trap of their own system. By perpetuating this 'prostitution' they risk funding not the best research but the best-sold research.
In the current epoch of economic crisis, where in the quest for funds researchers are forced into a competitive game of pandering to panelists, it seems a good time for deep reflection on the entire scientific system.
With this talk I aim to provoke extra critical thinking among the committees who select evaluators, and among the evaluators themselves, who in turn must demand critical thinking from the candidates when selecting excellent science.
I will present some initiatives (e.g. new tracers of impact for the Web era, 'altmetrics') and ongoing projects (e.g. how to move from publishing advertising to publishing knowledge) that might enable us to favor science over marketing.



  1. Qetesh, 1120 BC; Neith, 1250 BC. Love for Science or 'Academic Prostitution'? Lourdes Verdes-Montenegro, Instituto de Astrofísica de Andalucía (CSIC). ERC Scientific Seminar, Brussels, 12/04/2013
  2. NGC 5216: Keenan's System, by Winder/Hager
  3. Environment and galaxies: large sample; can't reproduce! Sharing; reinvent? e-Science; efficient search; different wavelengths; what to publish? Analysis tools. NGC 5216: Keenan's System, by Winder/Hager
  4. Outline: • Some warnings: marketing, citations, tricks • Economy? • New tools to measure impact • New publication methods • Reproducibility: data + methods • Then what?
  5. "Research reverts to a kind of academic prostitution, in which work is done to please editors and referees rather than to further knowledge."
  6. "Research reverts to a kind of academic prostitution, in which work is done to please editors and referees rather than to further knowledge." "Academia is to knowledge what prostitution is to love; close enough on the surface but, to the nonsucker, not exactly the same thing." Nassim Nicholas Taleb, The Bed of Procrustes: Philosophical and Practical Aphorisms
  7. ... "Science is being killed by numerical ranking" [...] Ranking systems lure scientists into pursuing high rankings first and good science second.
  8. Evaluator of the yearly review of an FP7 EC STREP project: "There are people who are paying other researchers to get their papers cited, so as to increase their h-index"
  9. Marketing for Scientists is a Facebook group, a blog, a workshop, and a book published by Island Press, meant to help scientists build the careers they want and restore science to its proper place in society. Sometimes, unlocking the mysteries of the universe just isn't enough.
  10. Difficult to learn from mistakes made in evaluations of tenure promotions and grants, because the decision-making processes are rarely transparent
  11. ... an author's h-index can reflect longevity as much as quality, and can never go down with age, even if a researcher drops out of science altogether.
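The slide's observation, that the h-index never decreases, follows directly from its definition as a maximum over a cumulative citation record. A minimal sketch in Python (illustrative only, not any agency's official metric):

```python
def h_index(citations):
    """Largest h such that the author has h papers with >= h citations each.

    Computed over the whole career record, so adding papers or citations
    can only raise it -- which is exactly the longevity bias noted above.
    """
    ranked = sorted(citations, reverse=True)  # most-cited paper first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still clears the bar
        else:
            break
    return h

# A researcher who stops publishing keeps the same index forever:
print(h_index([25, 8, 5, 3, 3]))        # 3
print(h_index([25, 8, 5, 3, 3, 1, 0]))  # still 3
```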
  12. Citations: • Simple way to denote influence • Hard to compare between fields or career stages. Impact factor: • In 2005, 89% of Nature's impact factor was generated by 25% of the articles
  13. Is peer review any good? (Casati et al.) • Rankings of the review process vs impact (citations): very little correlation • Peer review filters out papers that are most likely to have impact: not confirmed. Exploring and Understanding Scientific Metrics in Citation Networks (Krapivin et al.) [Figure: PaperRank vs citation counts]
  14. Reputation and Impact in Academic Careers, Alexander M. Petersen. Goal: to better understand the role of social ties, author reputation, and the citation life cycle of individual papers. • Author reputation dominates in the initial phase of a paper's citation life cycle --> papers gain a significant early citation advantage if written by authors who already have high reputations in the scientific community.
  15. CITATIONS. "Remains of Holocene giant pandas from Jiangdong Mountain (Yunnan, China) and their relevance to the evolution of Quaternary environments in south-western China" (by Jablonski et al., published in Historical Biology). "A quick look at the actual conversations about the paper reveals that it was Figure 7, not the research content of the paper, that attracted all of the attention" Jean Liu, 2013, Who Loves Pandas?
  17. CITATIONS. Robert Antonucci, Nature, 495, 165
  18. ECONOMY? "What has economics to do with science? Economics is about understanding how human beings behave when one or more resources are scarce" Blog of M. Nielsen, 2008. People pushed to apply for grants
  19. ECONOMY? Examples of advice to improve the chances of getting a grant: • the title of the project counts 50% • proposals circulated in the institute. OK, that sounds fun, but what does it reflect? "Evaluators don't have time to read proposals in detail" "Evaluators are not experts, so if your full institute can follow it and find it attractive, a typical evaluator will"
  20. ECONOMY? R. Brooks (Univ. New South Wales). Economy has a bad influence on: • Candidates: pushed to get funds • Funders: expensive to get enough experts for enough time; hence on Science
  21. ECONOMY? "Opting for open access means considering costs, journal prestige and career implications" Stephen Pincock, 2013. Nature, 495, 539. • Senior advice to young scientists: go to the most prestigious journal
  22. ECONOMY?
  23. IMPACT. Citations represent less than 1% of usage for an article. PLOS (Public Library of Science) (November 2012). Richard Cave at the Charleston Conference 2012, Charleston
  24. IMPACT. Altmetrics is the creation and study of new metrics, based on the Social Web, for analyzing and informing scholarship.
  25. IMPACT. Indicators for funding bodies of recent research (a large number of downloads, views, plays...): how open and accessible scientists are making their research. Strongly recommend that altmetrics be considered not as a replacement but as a supplement to careful expert evaluation: to highlight research products that might otherwise go unnoticed. Alternative metrics are thought to free researchers from conventional measures of prestige. Stephen Pincock, 2013. Nature, 495, 539
  26. IMPACT & ECONOMY
  27. IMPACT. Head of digital services at the Wellcome Trust Library (one of the world's major resources for the study of medical history): Policies of the UK programme for assessing research quality, the Research Excellence Framework: no grant-review sub-panel "will make any use of journal impact factors, rankings, lists or the perceived standing of publishers in assessing the quality of research outputs"
  28. IMPACT. Not only a solution: it is just happening. In the next ten years, most scholars will join such networks, driven by both the value of improved networking and the fear of being left out of important conversations. The flow of scholarly information is expanding by orders of magnitude, swamping our paper-based filtering system. J. Priem, 2013. Nature, 495, 437
  29. IMPACT. In the Web era, scholarship leaves footprints. The editors and reviewers employed as proxy community assessors will be replaced by the aggregated, collective judgements of communities themselves. J. Priem, 2013. Nature, 495, 437. Altmetrics is the creation and study of new metrics, based on the Social Web, for analyzing and informing scholarship.
  31. ALTMETRICS = NEW PUBLICATION METHODS. Authority and expertise are as central in the Web era as they were in the journal era. The difference is that whereas the paper-based system used subjective criteria to identify authoritative voices, the Web-based one assesses authority recursively from the entire community. J. Priem, 2013. Nature, 495, 437
  32. ALTMETRICS = NEW PUBLICATION METHODS. Reputation: a (very) rough measurement of how much the MathOverflow community trusts you; never given, earned by convincing other users that you know what you're talking about. • good question or helpful answer, voted up by peers: +10 points • off-topic or incorrect, voted down: -2 points. ✴10 = make community wiki posts ✴100 = vote down ✴250 = vote to close or reopen your questions ✴2000 = edit other people's posts ...
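As a hedged illustration of how such a threshold-based trust system works (the point values and privilege thresholds are taken from the slide; the function names and exact mechanics are my own simplification, not MathOverflow's actual implementation):

```python
UPVOTE_POINTS = 10    # good question or helpful answer
DOWNVOTE_POINTS = -2  # off-topic or incorrect

# Privilege thresholds as listed on the slide
PRIVILEGES = {
    10: "make community wiki posts",
    100: "vote down",
    250: "vote to close or reopen your questions",
    2000: "edit other people's posts",
}

def reputation(upvotes, downvotes):
    """Reputation is earned, never given: a running tally of peer votes."""
    return max(0, upvotes * UPVOTE_POINTS + downvotes * DOWNVOTE_POINTS)

def earned_privileges(rep):
    """Privileges unlock as the community's accumulated trust grows."""
    return [name for threshold, name in sorted(PRIVILEGES.items()) if rep >= threshold]

rep = reputation(upvotes=12, downvotes=5)  # 12*10 - 5*2 = 110
print(rep, earned_privileges(rep))
```

The point of the sketch is that evaluation here is recursive and communal: the same peers whose work is being assessed are the ones granting the authority to assess.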
  33. ALTMETRICS = NEW PUBLICATION METHODS. • Journals adopting an open/collaborative process of review/evaluation: arXiv, Nature Precedings, PLOS ONE • Social bookmarking and tagging: Connotea, CiteULike, BibSonomy
  34. NEW PUBLICATION METHODS. NANOPUBLICATIONS: the smallest unit of publishable information. • Triplet (subject, predicate, object): UNIPROT 05067 is a protein • Uniquely identified and attributed to its author • Can be serialized using existing ontologies and RDF • Machine readable: knowledge exchange assisted by computers • Administered by the Concept Web Alliance • Based on open standards • Tweeted nano-publication assessed by 1000 experts
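A minimal sketch of the triplet idea in Python, using the slide's "UNIPROT 05067 is a protein" example (the prefixes, the ORCID and the serialization helper are illustrative placeholders, not real resolvable identifiers or any official nanopublication library):

```python
from collections import namedtuple

# A nanopublication couples one machine-readable assertion with its provenance.
Triple = namedtuple("Triple", ["subject", "predicate", "object"])

assertion = Triple("uniprot:05067", "rdf:type", "core:Protein")

nanopub = {
    "assertion": assertion,
    "provenance": {
        "author": "orcid:0000-0000-0000-0000",  # placeholder attribution
        "created": "2013-04-12",
    },
}

def serialize(triple):
    """Write the assertion as one N-Triples-style line: subject predicate object ."""
    return f"{triple.subject} {triple.predicate} {triple.object} ."

print(serialize(nanopub["assertion"]))  # uniprot:05067 rdf:type core:Protein .
```

Because each assertion is a single attributed statement, computers can exchange, deduplicate and cite knowledge at this granularity rather than at the level of whole narrative papers.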
  35. A blog that reports on retractions of scientific papers (Ivan Oransky, executive editor of Reuters Health, and Adam Marcus, managing editor of Anesthesiology News). Aim: to increase the transparency of the retraction process: retractions of papers generally are not announced, and the reasons for retractions are not publicized.
  36. Open access, peer-reviewed, promotes discussion of results: • unexpected, controversial, provocative and/or negative • that challenge current models, tenets or dogmas • that illustrate how commonly used methods and techniques are unsuitable for studying a particular phenomenon. Not all will turn out to be of such groundbreaking significance. However, we strongly believe that such "negative" observations and conclusions, based on rigorous experimentation and thorough documentation, ought to be published in order to be discussed, confirmed or refuted by others.
  37. NEW PUBLICATION METHODS. A MOVEMENT TO PUBLISH RESEARCH IN REAL TIME. The journal Push lets scholars build journal articles incrementally, with each version tracked and open online, available for collaboration and comment throughout.
  38. Publishing articles like software releases: versioned DOIs, formal
  39. It is NOT a "release early instead of peer review" model. Treat research as software: release notes & version management
  40. ATTENTION TO PUBLISHING DATA AND METHODS. Many scientists are too busy or lack the knowledge to tackle data management on their own. R. Monastersky, 2013. Nature, 495, 430
  41. ATTENTION TO PUBLISHING DATA AND METHODS. Abelard and Héloise: Why Data and Publications Belong Together. Eefke Smit (International Association of STM Publishers: members collectively publish nearly 66% of all journal articles) • Journals to require availability of underlying research material as an editorial policy • Ensure data is stored, curated and preserved in trustworthy places • Ensure (bi-directional) links and persistent identifiers between data and publications • Establish uniform citation practices for data
  42. ATTENTION TO PUBLISHING DATA AND METHODS. MOVING FROM NARRATIVES (LAST 300 YRS) TO THE ACTUAL OUTPUT OF RESEARCH. How to measure science output: • data in any format (tables, images, etc.) • algorithms • analysis tools. NSF example: Chapter II.C.2.f(i)(c), Biographical Sketch(es), has been revised to rename the "Publications" section to "Products" and amend terminology and instructions accordingly. This change makes clear that products may include, but are not limited to, publications, data sets, software, patents, and copyrights. To make it count, however, it needs to be both citable and accessible.
  43. ATTENTION TO PUBLISHING DATA AND METHODS. DataONE (US NSF funded): preservation of, and access to, multi-scale, multi-discipline, and multi-national science data: biological data from the genome to the ecosystem, and environmental data from atmospheric, ecological, hydrological, and oceanographic sources. The Collage Authoring Environment (Nowakowski et al.): a software infrastructure which enables domain scientists to collaboratively develop and publish their work in the form of executable papers. Paper Mâché: Creating Dynamic Reproducible Science (Brammer et al. 2011): a paper management system using virtual environments, so that the full experiment is packaged with a virtual machine.
  44. Wf4Ever (Workflows Forever) project, Astronomy Use Case: preservation of the methods. • Investigates and develops technological infrastructure for the preservation and efficient retrieval and reuse of scientific workflows • Introduced the concept of a Research Object, containing the artefacts needed to interpret or reconstruct research. - High investment in data infrastructures - Exploitation is usually an issue - Workflows as live tutorials. EU-funded FP7 STREP project, December 2010 - December 2013
  45. Abelard and Héloise: Why Data and Publications Belong Together. Eefke Smit (International Association of STM Publishers: members collectively publish nearly 66% of all journal articles) • Journals to require availability of underlying research material as an editorial policy • Ensure data is stored, curated and preserved in trustworthy places • Ensure (bi-directional) links and persistent identifiers between data and publications • Establish uniform citation practices for data. Astronomy: ADS has been linking papers with VizieR data. Now observing proposals, telescopes and software are also being referenced. Collaboration with Wf4Ever started to transform these into Research Objects.
  46. THEN WHAT?
  47. (Carole Goble, Beyond the PDF 2013)
  48. [Image slide; rotated text not recoverable] (Carole Goble, Beyond the PDF 2013)
  49. SOME IDEAS • Give the committees the scientometrics, so they know that their role is not, de facto, counting numbers? • Include someone with a basic understanding of scientometrics • Allow tenure candidates to submit just their top few papers for evaluation (K. Shaw, Scientific Method blog) • In an evaluation, researchers have to show that at least one of their Research Objects has been used by someone else. Maybe cited. Preferably used. (Carole Goble, Beyond the PDF 2013) • Each review needs to be the subject of evaluation, just as with scientific publications.
  50. SOME REFLECTIONS. Can excellence kill Science? • Science works through micro-improvements and multiple errors and failures until something finally works • We've become paralyzed by the notion that showing incremental improvements and corrections hurts, rather than helps, our personal careers and science. Who Killed the PrePrint, and Could It Make a Return? By Jason Hoyt and Peter Binfield
  51. SOME REFLECTIONS. Shift the balance to the Methodology: • clear hypothesis • data • formula • methods. Is it reproducible? Then it is Science. Give less weight to the results: better quality
  52. Understanding metrics, reducing reliance on rankings, and suggesting new ways to evaluate scientists are only the beginning; it's going to take a sea change and lots of cooperation among scientists, journals, and academic and government institutions to banish the "publish or perish" mentality. (K. Shaw, Scientific Method blog). The aim doesn't justify the means: (Scientific) Method. Robert Antonucci, Nature, 495, 165