Love for science or 'Academic Prostitution' - IAA version


This is an updated version of an invited talk I presented at the European Research Council-Brussels (Scientific Seminar): "Love for Science or 'academic prostitution'".
It has been updated to be presented at my home institution (Instituto de Astrofísica de Andalucía - CSIC) in a scientific seminar (14 June 2013).
I have included some new slides and revised others.

I present a personal review (at times my own vision) of some issues that I consider key to doing Science. The talk was tailored to the expected audience, mainly Scientific Officers with backgrounds in different fields of science and scholarship, but also Agency staff.
Abstract: In a recent Special Issue of Nature on science metrics it was claimed that "Research reverts to a kind of 'academic prostitution' in which work is done to please editors and referees rather than to further knowledge." If this is true, funding agencies should try to avoid falling into the trap of their own system. By perpetuating this 'prostitution' they risk funding not the best research but the best-sold research.
Given the current epoch of economic crisis, in which the quest for funds forces researchers into a competitive game of pandering to panelists, it seems a good time for deep reflection on the entire scientific system.
With this talk I aim to provoke extra critical thinking among the committees who select evaluators, and among the evaluators themselves, who in turn demand critical thinking from candidates when selecting excellent science.
I will present some initiatives (e.g. new tracers of impact for the Web era, 'altmetrics') and ongoing projects (e.g. how to move from publishing advertising to publishing knowledge) that might enable us to favor Science over marketing.

Comments
  • Thanks to you, and I fully agree with your comment. I was discussing a related topic with a colleague today, i.e. how much one can trust citizen science, and my reaction was: how much can we trust scientists' science? We might be better trained, but we are also more biased. The advantage of citizens is that, in general, they don't have a preferred result.
  • Thanks for the insightful topic! May I also suggest that there is another type of prostitution of science that is also very prevalent: scientism/humanism, or the treatment of science as a fashionable pseudo-scientific religious philosophy instead of a practical learning tool.


Love for science or 'Academic Prostitution' - IAA version

  1. Love for Science or 'Academic Prostitution'? Qetesh, 1120 BC. Neith, 1250 BC. Lourdes Verdes-Montenegro, Instituto de Astrofísica de Andalucía (CSIC). ERC Scientific Seminar, Brussels, 12/04/2013 ---> IAA Seminar 13/06/2013
  2. NGC 5216: Keenan's System, by Winder/Hager
  3. NGC 5216: Keenan's System, by Winder/Hager. Reinvent? Environment and galaxies. Large sample. Different wavelengths. Efficient search. Can't reproduce! Analysis tools. What should we publish? Sharing. e-Science
  4. Outline: • Some warnings: marketing, citations, tricks • Economy? • New tools to measure impact • New publication methods • Reproducibility: data + methods • Then what? The fine art of salami publishing.
  5. Research reverts to a kind of academic prostitution, in which work is done to please editors and referees rather than to further knowledge.
  6. Research reverts to a kind of academic prostitution, in which work is done to please editors and referees rather than to further knowledge. Academia is to knowledge what prostitution is to love; close enough on the surface but, to the nonsucker, not exactly the same thing. Nassim Nicholas Taleb, The Bed of Procrustes: Philosophical and Practical Aphorisms
  7. ... "Science is being killed by numerical ranking," [...] Ranking systems lure scientists into pursuing high rankings first and good science second.
  8. The misuse of the journal impact factor is highly destructive, inviting a gaming of the metric that can bias journals against publishing important papers in fields [...] that are much less cited than others. And it wastes the time of scientists by overloading highly cited journals such as Science with inappropriate submissions from researchers who are desperate to gain points from their evaluators.
  9. Evaluator of yearly review of FP7 EC STREP project: "There are people who are paying other researchers to get their papers cited, so as to increase their h-index."
  10. Marketing for Scientists is a Facebook group, a blog, a workshop, and a book published by Island Press, meant to help scientists build the careers they want and restore science to its proper place in society. Sometimes, unlocking the mysteries of the universe just isn't enough.
  11. Difficult to learn from mistakes made in evaluations of tenure promotions and grants, because the decision-making processes are rarely transparent.
  12. ... an author's h-index can reflect longevity as much as quality, and can never go down with age, even if a researcher drops out of science altogether.
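
For readers unfamiliar with it, the h-index is the largest number h such that the author has h papers with at least h citations each. A minimal sketch (my own illustration, not part of the talk) shows why it can only grow or stay flat:

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

career = [25, 18, 12, 7, 3, 1, 0, 0]
assert h_index(career) == 4
# Years of inactivity add nothing, but they can never lower the index.
assert h_index(career + [0, 0, 0]) == 4
```
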
  13. Citations: • Simple way to denote influence • Hard to compare between fields or career stages. Impact factor: • In 2005, 89% of Nature's impact factor was generated by 25% of the articles.
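
The impact factor behind that 89%/25% skew is just an arithmetic mean: citations received in a given year by the items a journal published in the two previous years, divided by the number of those items. A small synthetic example (invented numbers, purely to illustrate the skew, not Nature's actual distribution):

```python
# Two-year impact factor: mean citations per citable item.
# Synthetic, illustrative numbers only.
citations = [120] * 25 + [4] * 75   # 25% highly cited articles, 75% rarely cited

impact_factor = sum(citations) / len(citations)
top_quarter_share = sum(sorted(citations, reverse=True)[:25]) / sum(citations)

print(f"impact factor = {impact_factor:.1f}")                            # 33.0
print(f"share contributed by the top 25% of articles = {top_quarter_share:.0%}")  # 91%
```
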
  14. Is peer review any good? (Casati et al.) • Rankings of the review process vs impact (citations): very little correlation • Peer review filters out the papers that are most likely to have impact: not confirmed. Exploring and Understanding Scientific Metrics in Citation Networks (Krapivin et al.). [Figure: citation counts vs. PaperRank]
  15. Reputation and Impact in Academic Careers, Alexander M. Petersen. Goal: to better understand the role of social ties, author reputation, and the citation life cycle of individual papers. • Author reputation dominates in the initial phase of a paper's citation life cycle --> papers gain a significant early citation advantage if written by authors already having high reputations in the scientific community.
  16. CITATIONS. "Remains of Holocene giant pandas from Jiangdong Mountain (Yunnan, China) and their relevance to the evolution of quaternary environments in south-western China" (by Jablonski et al., published in Historical Biology). "A quick look at the actual conversations about the paper reveal that it was Figure 7, not the research content of the paper, that attracted all of the attention." Jean Liu, 2013, Who Loves Pandas?
  17. CITATIONS. "A quick look at the actual conversations about the paper reveal that it was Figure 7, not the research content of the paper, that attracted all of the attention." Jean Liu, 2013, Who Loves Pandas?
  18. CITATIONS. Robert Antonucci, Nature, 495, 165
  19. ECONOMY? "What has economics to do with science? Economics is about understanding how human beings behave when one or more resources are scarce." Blog of M. Nielsen, 2008. People pushed to apply for grants.
  20. ECONOMY? Examples of advice on improving the chances of getting a grant: • the title of the project counts 50% • proposals circulated in the institute. OK, that sounds fun, but what does it reflect? "Evaluators don't have time to read proposals in detail." "Evaluators are not experts, so if your whole institute can follow it and find it attractive, a typical evaluator will."
  21. ECONOMY? R. Brooks (Univ. New South Wales). Economy has a bad influence on: • candidates: pushed to get funds • funders: expensive to get enough experts for enough time. Hence on Science.
  22. ECONOMY? • Senior advice to young scientists: go to the most prestigious journal. "OPTING FOR OPEN ACCESS MEANS CONSIDERING COSTS, JOURNAL PRESTIGE AND CAREER IMPLICATIONS." Stephen Pincock, 2013. Nature, 495, 539
  23. ECONOMY?
  24. IMPACT. PLOS (Public Library of Science), November 2012. Richard Cave at the Charleston Conference 2012, Charleston. Citations represent less than 1% of usage for an article.
  25. IMPACT. Altmetrics is the creation and study of new metrics based on the Social Web for analyzing, and informing scholarship.
  26. IMPACT. Indicators for funding bodies of recent research (a large number of downloads, views, plays...): how open and accessible scientists are making their research. Strongly recommend that altmetrics be considered not as a replacement but as a supplement to careful expert evaluation: to highlight research products that might otherwise go unnoticed. Alternative metrics are thought to free researchers from conventional measures of prestige. Stephen Pincock, 2013. Nature, 495, 539
  27. IMPACT & ECONOMY
  28. IMPACT. Head of digital services at the Wellcome Trust Library (one of the world's major resources for the study of medical history): Policies of the UK programme for assessing research quality, the Research Excellence Framework: no grant-review sub-panel "will make any use of journal impact factors, rankings, lists or the perceived standing of publishers in assessing the quality of research outputs"
  29. IMPACT. Not only a solution: it is just happening. In the next ten years, most scholars will join such networks, driven by both the value of improved networking and the fear of being left out of important conversations. The flow of scholarly information is expanding by orders of magnitude, swamping our paper-based filtering system. J. Priem, 2013. Nature, 495, 437
  30. IMPACT. In the Web era, scholarship leaves footprints. The editors and reviewers employed as proxy community assessors will be replaced by the aggregated, collective judgements of communities themselves. Altmetrics is the creation and study of new metrics based on the Social Web for analyzing, and informing scholarship. J. Priem, 2013. Nature, 495, 437
  31. IMPACT. In the Web era, scholarship leaves footprints. The editors and reviewers employed as proxy community assessors will be replaced by the aggregated, collective judgements of communities themselves. Altmetrics is the creation and study of new metrics based on the Social Web for analyzing, and informing scholarship. J. Priem, 2013. Nature, 495, 437
  32. ALTMETRICS = NEW PUBLICATION METHODS. Authority and expertise are as central in the Web era as they were in the journal era. The difference is that whereas the paper-based system used subjective criteria to identify authoritative voices, the Web-based one assesses authority recursively from the entire community. J. Priem, 2013. Nature, 495, 437
  33. ALTMETRICS = NEW PUBLICATION METHODS. Reputation: a (very) rough measurement of how much the MathOverflow community trusts you; never given, earned by convincing other users that you know what you're talking about. • Good question or helpful answer, voted up by peers: 10 points • Off topic or incorrect, voted down: -2 points. Privileges: ✴ 10 = make community wiki posts ✴ 100 = vote down ✴ 250 = vote to close or reopen your questions ✴ 2000 = edit other people's posts ...
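
To make the scoring concrete, here is a minimal sketch of the rules quoted on the slide (my own simplification for illustration, not MathOverflow's actual implementation):

```python
# Point values and privilege thresholds are those quoted on the slide.
def reputation(upvotes_received, downvotes_received):
    """10 points per up-voted question or answer, -2 per down-voted post."""
    return 10 * upvotes_received - 2 * downvotes_received

PRIVILEGES = {
    10: "make community wiki posts",
    100: "vote down",
    250: "vote to close or reopen your questions",
    2000: "edit other people's posts",
}

rep = reputation(upvotes_received=30, downvotes_received=5)   # 290
earned = [name for threshold, name in PRIVILEGES.items() if rep >= threshold]
print(rep, earned)   # privileges unlocked so far
```
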
  34. ALTMETRICS = NEW PUBLICATION METHODS. • Journals adopting an open/collaborative process of review/evaluation: arXiv, Nature Precedings, PLoS ONE • Social bookmarking and tagging: Connotea, CiteULike, Del.icio.us, BibSonomy
  35. NEW PUBLICATION METHODS. NANOPUBLICATIONS: the smallest unit of publishable information (nanopub.org). • Triplet - subject, predicate, object: UNIPROT 05067 is a protein • Uniquely identified and attributed to its author • Can be serialized using existing ontologies and RDF • Machine readable: knowledge exchange assisted by computers • Administered by the Concept Web Alliance • Based on open standards • Twittered nano-publication assessed by 1000 experts
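
As a rough sketch of the assertion part of such a nanopublication, expressed as a single RDF triple with the rdflib Python library (the URIs are placeholders chosen for illustration, not the official nanopublication schema; a real nanopublication also carries provenance and publication-info graphs, see nanopub.org):

```python
from rdflib import Graph, URIRef
from rdflib.namespace import RDF

# Placeholder URIs, for illustration only.
protein = URIRef("http://purl.uniprot.org/uniprot/05067")
protein_class = URIRef("http://example.org/ontology/Protein")

g = Graph()
g.add((protein, RDF.type, protein_class))   # "UNIPROT 05067 is a protein"

print(g.serialize(format="turtle"))         # machine-readable serialization
```
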
  36. On September 18, 2007, a few dozen neuroscientists, psychiatrists, and drug-company executives gathered in a hotel conference room in Brussels to hear some startling news. It had to do with a class of drugs known as atypical or second-generation antipsychotics, which came on the market in the early nineties. The drugs, sold under brand names such as Abilify, Seroquel, and Zyprexa, had been tested on schizophrenics in several large clinical trials, all of which had demonstrated a dramatic decrease in the subjects' psychiatric symptoms. As a result, second-generation antipsychotics had become one of the fastest-growing and most profitable pharmaceutical classes. By 2001, Eli Lilly's Zyprexa was generating more revenue than Prozac. It remains the company's top-selling drug. But the data presented at the Brussels meeting made it clear that something strange was happening: the therapeutic power of the drugs appeared to be steadily waning. A recent study showed an effect that was less than half of that documented in the first trials, in the early nineteen-nineties. Many researchers began to argue that the expensive pharmaceuticals weren't any better than first-generation antipsychotics, which have been in use since the fifties. "In fact, sometimes they now look even worse," John Davis, a professor of psychiatry at the University of Illinois at Chicago, told me.
  39. Reproducible experiments. Good statistics.
  40. Retraction Watch: a blog that reports on retractions of scientific papers (Ivan Oransky, executive editor of Reuters Health, and Adam Marcus, managing editor of Anesthesiology News). Aim: to increase the transparency of the retraction process: retractions of papers generally are not announced, and the reasons for retractions are not publicized.
  41. Open access, peer-reviewed, promotes discussion of results: • that are unexpected, controversial, provocative and/or negative • that challenge current models, tenets or dogmas • that illustrate how commonly used methods and techniques are unsuitable for studying a particular phenomenon. Not all will turn out to be of such groundbreaking significance. However, we strongly believe that such "negative" observations and conclusions, based on rigorous experimentation and thorough documentation, ought to be published in order to be discussed, confirmed or refuted by others.
  42. NEW PUBLICATION METHODS. • Several journals have adopted a practice of automatically rejecting any manuscript that has received two critical reports. Unfortunately, such a policy virtually ensures that important new ideas are rejected, whereas innovative papers are just the sort that we should most want to publish. Helmut A. Abt, June 2013. ApJ Editor-in-Chief for 28 years, until 1999.
  43. NEW PUBLICATION METHODS. A MOVEMENT TO PUBLISH RESEARCH IN REAL TIME. The journal Push lets scholars build journal articles incrementally, with each version tracked and open online, available for collaboration and comment throughout (see http://push.cwcon.org).
  44. Publishing articles like software releases: formal versioned DOIs.
  45. It is NOT a "release early instead of peer review" model. Treat research as software: release notes & version management.
  46. ATTENTION TO PUBLISHING DATA AND METHODS. Many scientists are too busy or lack the knowledge to tackle data management on their own. R. Monastersky, 2013. Nature, 495, 430
  47. ATTENTION TO PUBLISHING DATA AND METHODS. Abelard and Héloise: Why Data and Publications Belong Together. Eefke Smit (International Association of STM Publishers, whose members collectively publish nearly 66% of all journal articles). • Journals to require availability of underlying research material as an editorial policy • Ensure data is stored, curated and preserved in trustworthy places • Ensure (bi-directional) links and persistent identifiers between data and publications • Establish uniform citation practices for data
  48. ATTENTION TO PUBLISHING DATA AND METHODS. MOVING FROM NARRATIVES (LAST 300 YEARS) TO THE ACTUAL OUTPUT OF RESEARCH. • How to measure science output: data in any format (tables, images, etc.), algorithms, analysis tools • NSF example: Chapter II.C.2.f(i)(c), Biographical Sketch(es), has been revised to rename the "Publications" section to "Products" and amend terminology and instructions accordingly. This change makes clear that products may include, but are not limited to, publications, data sets, software, patents, and copyrights. • To make it count, however, it needs to be both citable and accessible. http://datapub.cdlib.org
  49. ATTENTION TO PUBLISHING DATA AND METHODS. DataONE (US NSF funded): preservation of and access to multi-scale, multi-discipline, and multi-national science data: biological data from the genome to the ecosystem, plus environmental data available from atmospheric, ecological, hydrological, and oceanographic sources. The Collage Authoring Environment (Nowakowski et al.): a software infrastructure which enables domain scientists to collaboratively develop and publish their work in the form of executable papers. Paper Mâché: Creating Dynamic Reproducible Science (Brammer et al. 2011): a paper management system using virtual environments so that the full experiment is packaged with a virtual machine.
  50. Wf4Ever (Workflows Forever) project: EU-funded FP7 STREP project, December 2010 - December 2013. Astronomy use case. Preservation of the methods. • Investigates and develops technological infrastructure for the preservation and efficient retrieval and reuse of scientific workflows • Introduced the concept of a Research Object, containing the artefacts needed to interpret or reconstruct research. - High investment in data infrastructures - Exploitation is usually an issue - Workflows as live tutorials
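
As a loose illustration of that idea (my own minimal sketch with invented file names, not the Wf4Ever specification, which defines formal vocabularies for aggregation, annotation and provenance), a Research Object can be pictured as a bundle that aggregates everything needed to rerun or interpret a study:

```python
# Hypothetical, minimal research-object-style bundle; identifiers and paths are invented.
research_object = {
    "id": "ro:example-hi-galaxy-study",
    "aggregates": [
        {"type": "data",     "path": "data/hi_profiles.fits"},
        {"type": "workflow", "path": "workflows/stack_spectra.t2flow"},
        {"type": "software", "path": "env/requirements.txt"},
        {"type": "paper",    "path": "manuscript/preprint.pdf"},
    ],
    "annotations": [
        {"about": "workflows/stack_spectra.t2flow",
         "body": "Steps needed to reproduce Figure 3 of the paper."},
    ],
}
```
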
  51. Canube: Ciencia Abierta en la Nube (Open Science in the Cloud). • The Canube project is an Open Science project from the II Call for Projects of the Campus of International Excellence BioTIC of the University of Granada (approved March 2013). • The project proposal is published as "Ciencia Abierta en la Nube: CANUBE" under a CC-BY-SA licence. • CANUBE proposes spreading open-science models across the CEI-BioTIC and the other partners of this project, as well as bringing cloud-computing concepts into it. These concepts will be applied to different lines of research and development, of both UGR researchers and the affiliated partners.
  52. Astronomy: ADS has been linking papers with VizieR data. Now observing proposals, telescopes and software are also being referenced. A collaboration with Wf4Ever has started to transform these into Research Objects.
  53. THEN WHAT?
  54. (Carole Goble, Beyond the PDF 2013)
  55. (Carole Goble, Beyond the PDF 2013) Hence funding agencies seem to have a positive effect on Science ... right before review deadlines!
  56. SOME IDEAS. • Give the committees the scientometric data so they know de facto that their role is not counting numbers? • Include someone with a basic understanding of scientometrics • Allow tenure candidates to submit just their top few papers for evaluation (K. Shaw, Scientific Method blog) • In an evaluation, researchers have to show that at least one of their Research Objects has been used by someone else. Maybe cited. Preferably used. (Carole Goble, Beyond the PDF 2013) • Each review needs to be the subject of evaluation, just as with scientific publications.
  57. SOME REFLECTIONS. Who Killed the PrePrint, and Could It Make a Return? by Jason Hoyt and Peter Binfield: • Science works through micro-improvements and multiple errors and failures until something finally works • We've become paralyzed with the notion that showing incremental improvements and corrections hurts, rather than helps, our personal careers and science. Can excellence kill Science? Such metrics further block innovation because they encourage scientists to work in areas of science that are already highly populated, as it is only in these fields that large numbers of scientists can be expected to reference one's work, no matter how outstanding. Science Editorial, 17 May 2013, by Bruce Alberts, Science Editor-in-Chief.
  58. SOME REFLECTIONS. • Clear hypothesis • Data • Formula • Methods. Is it reproducible? Then it is Science. Give less weight to the results: better quality. Shift the balance to the methodology.
  59. Understanding metrics, reducing reliance on rankings, and suggesting new ways to evaluate scientists are only the beginning; it's going to take a sea change and lots of cooperation among scientists, journals, and academic and government institutions to banish the "publish or perish" mentality. (K. Shaw, Scientific Method blog) The aim doesn't justify the mean (Scientific) Method. Robert Antonucci, Nature, 495, 165
