Academic incentives and science
Gerry Carter
(informal seminar)
1. Academic incentives and science (Gerry Carter)
2. DO ACADEMIC INCENTIVES HURT SCIENCE?
3. YES.
4.  1. YES
    2. Maybe: what's the argument?
    3. Huh, what are you talking about?
    4. NO
5-6. What is science?
   • Wikipedia: systematic enterprise that builds and organizes knowledge in the form of testable explanations and predictions
   • Dictionary: systematic knowledge of the physical world gained through observation and experimentation
   • The most rigorous way of gaining new knowledge about nature (whatever that may be), and the sum of that knowledge
   • "When you are adhering to the highest standards of logic and evidence, you are thinking scientifically. (And when you're not, you're not.)" (Sam Harris)
7. Science
   • Rigor
   • Falsifiability
   • Repeatability
   • Measures of uncertainty
8-9. What is academia?
   • Wikipedia: the internationally recognized establishment of professional scholars and students, usually centered around colleges and universities, who are engaged in higher education and research.
   • The environment or community concerned with the pursuit of research, education, and scholarship.
   • The "job" of being a scholar at a university, and the community of those scholars.
   • By "academia", I don't mean teaching (vs. research), and I don't mean the humanities. I mean the aspects that affect scientific research.
10. How is scientific success different from academic success?
11. What is success?
   Science = being correct, falsifiable, and repeatable
   Academia = having influence and impact, being hired, getting funding
12-16. My argument
   1. Incentive structures of science are philosophical. Examples: double-blind design, replication, and peer review. They are typically designed to aid scientific goals.
   2. Incentive structures in academia are economic. Examples: tenure, h-index, and journal impact factors. They are not designed to meet scientific goals. They emerge as byproducts of human goals and economic competition within and among social groups. (Economic competition exists between publishing companies, between individuals, between labs, between departments, between universities, and between academia and other state-funded institutions.)
   3. Therefore, many academic incentives → bad science.
   4. Question: How can we design better academic incentive structures so that they serve the interests of science?
17. Science and academia have many shared incentives
   To be a good scientist and a good academic…
   • Do good research
   • Make important discoveries
   • Publish good papers
   • Produce public knowledge
   • Teach science to students well
18-19. As a reviewer, when do you reject a paper for not enough significance?
   What makes a paper good enough?
   • Science: good enough to be useful evidence for readers
   • Academia: good enough to be an accolade for the writer
20. The academic filter in biology
   • 16,000 new US biology PhD students per year
   • Of those recruits, 63% stay until they graduate
   • Average time to degree = 7 years
   • Of those graduates, 70% get a postdoc
   • Of those postdocs, 30% do 1+ more postdocs
   • Average postdoc time = ? (but more than 4 years)
   • Within 6 years, 15% of postdocs get a tenure-track job
   • We have a 6-7% chance of getting the job we were trained for.
   Source: ascb.org, "Where will a biology PhD take you?"
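The 6-7% figure follows from multiplying the completion, postdoc, and tenure-track rates on this slide. A minimal check of the arithmetic, using the slide's approximate numbers:

```python
# Check of the "academic filter" arithmetic, using the slide's approximate rates.
phd_completion = 0.63   # of new PhD recruits, the fraction that graduate
postdoc_rate   = 0.70   # of graduates, the fraction that get a postdoc
faculty_rate   = 0.15   # of postdocs, the fraction with a tenure-track job within 6 years

p_tenure_track = phd_completion * postdoc_rate * faculty_rate
print(f"Chance that a new PhD student reaches a tenure-track job: {p_tenure_track:.1%}")
# -> 6.6%, i.e. the slide's "6-7% chance of getting the job we were trained for"
```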
21. Mentors: old vs. young
   BUT…
   "Not exactly the sort of approach likely to reap accolades today."
   "I doubt that I would fare particularly well in today's academic environment."
22-28. Divergent interests (1 of 2)
   To be a good scientist… | To be a good academic…
   Be skeptical of your results | "Sell" your results
   Interpret conclusions carefully | Highlight/exaggerate importance
   Publish negative results | Publish "strategically"
   Ignore social prestige | Use impact factors to make writing decisions
   Challenge authority | Cite authority. Make friends.
   Replicate. Replicate. Replicate. | …if you must
   Novel exciting results are less likely to be true. Double-check them. | Publish novel exciting results before you get scooped.
29-35. Divergent interests (2 of 2)
   To be a good scientist… | To be a good academic…
   P = 0.06 | P = 0.04 (Masicampo & Lalande 2012, Quart. J. Exp. Psychol.)
   Help others replicate and find flaws in your work | Have a "territory"
   Be clear | Sound smart
   "Science is a marathon, not a sprint" | Funded work must be "transformative"
   Natural history is crucial for biological understanding | Natural history is "low impact"
   Long-term goals for important truths | Short-term results with high impact
   Build a consensus. | Hot controversial topics are higher impact.
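The Masicampo & Lalande citation refers to their finding that published p-values are unusually common just below .05. As an illustration of the mechanism only (not a reproduction of their analysis), here is a toy simulation, with assumed sample sizes and stopping rules, of how a little analytic flexibility nudges "p = 0.06" results under the 0.05 threshold even when there is no true effect:

```python
# Toy simulation of selective flexibility; numbers and rules are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, extra, trials = 30, 10, 10_000
naive_hits = flexible_hits = 0

for _ in range(trials):
    a, b = rng.normal(size=n), rng.normal(size=n)   # no true effect
    p = stats.ttest_ind(a, b).pvalue
    naive_hits += p < 0.05
    if 0.05 <= p < 0.10:                            # "almost significant": collect a bit more data
        a = np.concatenate([a, rng.normal(size=extra)])
        b = np.concatenate([b, rng.normal(size=extra)])
        p = min(p, stats.ttest_ind(a, b).pvalue)    # keep whichever analysis looks better
    flexible_hits += p < 0.05

print(f"False-positive rate, report as-is:       {naive_hits / trials:.3f}")   # ~0.05
print(f"False-positive rate, with a second look: {flexible_hits / trials:.3f}") # somewhat higher
```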
36. What is academic impact?
   • Influence on your field
   • Usually involves counting citations, regardless of whether you are being cited for being right or being wrong.
   • Impact is measured at the level of journals (impact factor) and authors (h-index).
   • But actual scientific progress is made by articles, methods, hypotheses, theories, and often teams of researchers.
37. What is academic impact?
   A fundamental problem:
   • Over time, the scientific validity and importance of an article both increase and decrease.
   • Measures of total impact can only increase.
   • Papers that are influentially wrong have higher impact metrics than papers that are correct.
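For reference, the h-index named above is simple to state: the largest h such that an author has h papers with at least h citations each. A minimal sketch, with made-up citation counts:

```python
def h_index(citations):
    """h-index: the largest h such that the author has h papers
    with at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Made-up citation counts. Note that the metric never asks whether the
# citations are for being right or for being influentially wrong.
print(h_index([120, 45, 30, 8, 6, 5, 2, 0]))   # -> 5
```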
38-39. A survey: philosophers and scientists
   From Daniel Dennett:
   • (A) You solve the major problem of your choice so conclusively that there is nothing left to say (thanks to you, part of the field closes down forever, and you get a footnote in history).
   • (B) You write an article or book of such tantalizing perplexity and controversy that it stays on the required reading list for centuries to come.
   Which would you choose? Scientists choose A. Philosophers often choose B.
40. Some of the most influential scholars said little that was correct
41. Jacques Derrida
   "Brilliant and controversial, Jacques Derrida is one of the most important thinkers of our time, his considerable body of work having ineradicably altered the landscape of thought in the 20th and 21st centuries."
   • Wikipedia: best known for developing a form of semiotic analysis known as deconstruction
   • One of the major figures associated with post-structuralism and postmodern philosophy
   • Published more than 40 books and hundreds of essays
   • Significant influence upon the humanities and social sciences, including (in addition to philosophy and literature) law, anthropology, historiography, linguistics, sociolinguistics, psychoanalysis, political theory, feminism, and gay and lesbian studies… also influenced architecture (in the form of deconstructivism), music, art, and art critics.
   • Derrida was said to "leave behind a legacy of himself as the 'originator' of deconstruction."
42. Deconstruction denies the possibility of a "pure presence": "the present or presence of sense to a full and primordial intuition". It thus denies the possibility of essential or intrinsic and stable meaning and of unmediated access to "reality".
   Dictionary: a method of critical analysis of philosophical and literary language that emphasizes the internal workings of language and conceptual systems, the relational quality of meaning, and the assumptions implicit in forms of expression.
43. "Pluralistic ignorance"
   • No one actually believes it, but everyone believes that everyone else believes it (The Emperor's New Clothes).
44. The anti-academic culture of science
   Richard Feynman on his Nobel Prize: http://www.liveleak.com/view?i=e5a_1185599458
45. How should we change academia?
46. Towards a better academia…
   1. Make publishing trivial. Make peer review rigorous.
   2. Encourage replication.
   3. Decentralize funding, rather than centralized, trickle-down funding.
47. Publishing
   • Make publishing trivial. Make peer review rigorous. Article-level metrics.
   • Get rid of impact factors, and then journals.
48-49. Journals make money from nothing
50-52. Journals are brands
   In theory, the whole point of a brand is quality control. But in practice, brands manipulate consumers into seeing high quality regardless of actual quality.
   Journal brands serve as third-party assessment, so that readers "can judge an article by its cover" and can outsource their evaluation of papers to 1 editor and 2+ reviewers. But this is better accomplished by a rating system, like the ones we use for judging movies and books.
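One way to picture the proposed rating system is an article-level record that keeps soundness and perceived importance as separate scores instead of one journal brand. A hypothetical sketch; the record layout, field names, and 1-5 scales are assumptions, not an existing system:

```python
# Hypothetical article-level rating record: reviewers score soundness explicitly
# rather than outsourcing judgment to a journal brand.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Review:
    reviewer: str
    soundness: int      # 1-5: are the methods and statistics adequate?
    importance: int     # 1-5: subjective, expected to change over time

@dataclass
class Article:
    doi: str
    reviews: list[Review] = field(default_factory=list)

    def soundness_score(self) -> float:
        return mean(r.soundness for r in self.reviews)

    def importance_score(self) -> float:
        return mean(r.importance for r in self.reviews)

paper = Article("10.0000/example")          # made-up DOI
paper.reviews += [Review("R1", 5, 2), Review("R2", 4, 3)]
print(paper.soundness_score(), paper.importance_score())   # 4.5 2.5
```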
53-54. Publishing
   This is how it should work: Data → Conclusions → Paper (public can read) → Picked up by media → Overhyped bullshit.
      Paper: "Experimental treatments affected gut biota (p=0.01) and were correlated with longevity (p=0.03)"
      Media: "Gut microbiota is linked to longevity in male mice"
      Overhyped: "Ten reasons why your microbiome is the key to health and long life"
   But this is how it often works: Data → SPIN → slightly hyped paper, written for high impact → SPIN (journal demands brevity and certainty) → Journal, university access only → SPIN → Overhyped bullshit picked up by media.
55-57. Publishing
   • Make publishing trivial. Make peer review rigorous. Article-level metrics.
   • Get rid of journals, impact factors, and academic publishing as a business.
   • Build a central repository of open-access, peer-reviewed articles (and their reviews). Magazines can then draw stories from those articles with varying degrees of rigor (Nature, New Scientist, Fox News). This is already basically how arXiv and PLOS ONE work.
   • Publish negative results and datasets.
   • Publish peer reviews. Make them objective and open access. Focus on soundness of results, not subjective importance (which can be assessed over time). Have standardized peer review checklists.
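A standardized peer review checklist could be as simple as a fixed list of soundness questions scored the same way for every submission. A hypothetical sketch; these items and the scoring are illustrative, not an established standard:

```python
# Hypothetical standardized checklist focused on soundness, not subjective importance.
SOUNDNESS_CHECKLIST = [
    "Hypotheses and predictions stated before results",
    "Sample sizes and exclusions reported and justified",
    "Statistical tests appropriate for the design",
    "Effect sizes and uncertainty (not just p-values) reported",
    "Data and analysis code openly available",
    "Conclusions limited to what the results support",
]

def review_score(answers: dict[str, bool]) -> float:
    """Fraction of checklist items satisfied; importance is left out on purpose."""
    return sum(answers.get(item, False) for item in SOUNDNESS_CHECKLIST) / len(SOUNDNESS_CHECKLIST)

print(review_score({item: True for item in SOUNDNESS_CHECKLIST[:4]}))   # -> 0.666...
```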
58. Replication
   • Encourage replication.
   • Open data.
   • Metrics to identify what is worth replicating.*
   • Publish replications.
59. A replication crisis: social psychology, psychology, biomedicine
60. Funding
   • Decentralize funding, rather than centralized, trickle-down funding.
   • One idea: most labs receive baseline funding, and some percentage of all funds must be invested in another lab. (Bollen et al. 2014. From funding agencies to scientific agency: collective allocation of science funding as an alternative to peer review. EMBO Reports.)
61. Scientists spend about 40% of their time chasing grants (2007 U.S. Govt study).
   An experimental physicist at Columbia University did the calculations: some grants have a net negative value, because they do not pay for the time that applicants and peer reviewers spend on them. (Dr. No Money: The Broken Science Funding System. 2011. Scientific American.)
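The "net negative value" point is a simple expected-value calculation. A sketch with placeholder numbers; none of these figures come from the 2007 study or the Scientific American piece:

```python
# Back-of-the-envelope version of the "net negative value" claim.
# All numbers below are made-up placeholders for illustration.
award          = 50_000     # grant size ($)
success_rate   = 0.15       # funding rate
hours_to_apply = 170        # applicant time per proposal
review_hours   = 15         # total reviewer/panel time per proposal
cost_per_hour  = 60         # fully loaded cost of a scientist's hour ($)

expected_return = success_rate * award
time_cost = (hours_to_apply + review_hours) * cost_per_hour
print(f"Expected return per application: ${expected_return:,.0f}")
print(f"Time cost per application:       ${time_cost:,.0f}")
print(f"Net expected value:              ${expected_return - time_cost:,.0f}")
# With these placeholder numbers the net value is negative, the scenario
# described for some small grants.
```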
62. Funding
   • Decentralize funding, rather than centralized, trickle-down funding.
   • Use peer-review-like "voting" for funding.
   • One idea: most labs receive baseline funding, and 30% of all funds must be invested in another lab, justified by merit or promise. (Bollen et al. 2014. From funding agencies to scientific agency: collective allocation of science funding as an alternative to peer review. EMBO Reports.)
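A toy simulation of the collective-allocation idea, simplified from Bollen et al. 2014 and using the slide's 30% figure; the lab names, peer choices, and dollar amounts are made up:

```python
# Toy sketch: every lab gets equal baseline funding and must pass a fixed share
# on to other labs it considers meritorious. Illustrative assumptions throughout.
import random

random.seed(1)
labs = [f"lab{i}" for i in range(10)]
baseline, share = 100_000, 0.30          # slide's example: 30% must go to another lab

funds = {lab: baseline for lab in labs}
received = {lab: 0.0 for lab in labs}

# Hypothetical peer esteem: each lab splits its outgoing share across 3 other labs.
for lab in labs:
    outgoing = funds[lab] * share
    recipients = random.sample([other for other in labs if other != lab], k=3)
    for r in recipients:
        received[r] += outgoing / len(recipients)

final = {lab: funds[lab] * (1 - share) + received[lab] for lab in labs}
for lab, total in sorted(final.items(), key=lambda kv: -kv[1]):
    print(f"{lab}: ${total:,.0f}")
# Labs that many peers choose to fund end up above the baseline; the rest stay near it.
```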
63. In summary
   • Academic incentives can erode good scientific practices.
   • We can change academic practices. However, there needs to be a cooperative consensus, because this is a public-goods dilemma: like a boycott, it's only cost-effective if everyone does it.
   • It would be useful to enact policy changes that offset the poor incentive structures of academia.
64. What do you think? Thanks for the discussion.
