Science Exchange - Measuring Research Efficiency & Quality

Presentation by Science Exchange Founder & CEO Elizabeth Iorns, Ph.D. at the Altmetrics 2012 Workshop covering the company's plan to address problems with the efficiency and quality of scientific research.



    1. Measuring Efficiency & Quality As Novel Altmetrics
    2. Why/how do we measure impact?
       Need to assess the impact of research funding
       ➡ traditionally measured by impact factor, more recently by article-level metrics like citations and downloads
       These metrics are not associated with quality
       ➡ retraction rates correlate strongly with impact factor
       ➡ citations don't stop even after a retraction
       Metrics to measure the impact of research funding
       ➡ quality: high-quality, reproducible research
       ➡ efficiency: how funding is spent
    3. The Problem: Efficiency & Quality
    4. Problem 1: Efficiency
       Research is not efficient due to individuality and competition:
       ➡ Research is increasingly specialized; it is not possible for researchers to do everything themselves ➭ enter core facilities
       ➡ It is not possible or efficient for every research institute to have every core facility
       Implications for efficiency:
       ➡ Need a central open system to enable easy access
       ➡ Need a community-driven platform, not a listing directory
    5. Problem 2: Quality
       Quality of academic research is under scrutiny:
       ➡ 47 of 53 "landmark" oncology studies not reproduced (Amgen)¹
       ➡ 43 of 67 cardiology/oncology publications were contradictory (Bayer)²
       ➡ 431 of 432 publications' claims not reproduced (Ioannidis)³
       Implications for quality:
       ➡ Lack of academic reproducibility results in a lack of new therapies
       ➡ Bayer halted 65% of target validation projects in 2011
       ➡ Drug development estimated at ~$4B per new molecular entity (NME)
       1. Begley CG, Ellis LM. Drug development: Raise standards for preclinical cancer research. Nature. 2012; 483(7391):531-3.
       2. Mullard A. Reliability of new drug target claims called into question. Nat Rev Drug Discov. 2011; 10(9):643-4.
       3. Patsopoulos NA, Tatsioni A, Ioannidis JP. Claims of sex differences: an empirical assessment in genetic associations. JAMA. 2007; 298(8):880-93.
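The replication figures cited on the slide above can be turned into a simple quantitative altmetric. The sketch below is illustrative only: the "reproducibility rate" function and the data structure are assumptions for this example, not a Science Exchange method; the counts themselves come from the three studies cited on the slide.

```python
# Illustrative sketch (not a Science Exchange API): computing a
# "reproducibility rate" altmetric from the replication surveys
# cited on the slide. "Reproduced" counts are derived from the
# slide's "X of Y not reproduced / contradictory" figures.

replication_surveys = [
    # (source, studies_reproduced, studies_attempted)
    ("Amgen (Begley & Ellis, 2012)", 53 - 47, 53),
    ("Bayer (Mullard, 2011)", 67 - 43, 67),
    ("Ioannidis et al. (2007)", 432 - 431, 432),
]

def reproducibility_rate(reproduced: int, attempted: int) -> float:
    """Fraction of attempted replications that succeeded."""
    return reproduced / attempted

for source, reproduced, attempted in replication_surveys:
    rate = reproducibility_rate(reproduced, attempted)
    print(f"{source}: {reproduced}/{attempted} reproduced ({rate:.0%})")
```

Tracked over time per lab or per field, a rate like this is the kind of quality altmetric the deck argues should complement citation counts.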
    6. The Cause: Academic Incentives & Metrics
    7. Cause: Academic Incentives
       Academic incentives are not aligned with efficient, high-quality research
       ➡ Incentives to compete over publications, not collaborate / share
       ➡ Incentives to acquire publications, not validate results / findings
       Need: A platform to track altmetrics of efficiency and quality in academic research
    8. Our Solution: A Marketplace for Science
    9. Science Exchange
       Mission: "Improve the efficiency of scientific research by making it easy for researchers to access a global network of scientific resources & expertise."
       COMPANY VITALS
       Inception: May 2011
       Headquarters: Palo Alto, CA
       Investors: Andreessen Horowitz and others
       COMPANY METRICS
       +800 experiment types
       +1000 providers
       +250,000 visits since launch
    10. An Online Marketplace
        Science Exchange provides access to a network of experts who operate outside the current academic incentive structure
        ➡ Fee-for-service model with a reputation / rating system: incentive only to produce high-quality data
        ➡ 1000+ expert providers, including academic core facilities & commercial vendors
        ➡ 800+ experiment types accessible from expert providers
        Key Value: A central portal for efficient access to specialized core facilities, which can be used to validate academic research
    11. The Reproducibility Fund
        $1M pilot fund to support reproducibility
        ➡ Researchers submit studies for validation through Science Exchange's expert providers
        ➡ 40 studies funded in the initial pilot
        ➡ Shows proof of principle to the NIH & funding agencies
    12. The Reproducibility Fund
        Improved incentives for academic researchers
        ➡ No costs: academic studies funded by the Reproducibility Fund
        ➡ Fast-track publication: original results expedited for publication & 'badged' as reproducible
        ➡ 2-for-1 publication: replicated results published
        ➡ Awareness: top-quality research is rewarded
        [Diagram: funding and publishing reproducibility partners provide free credits for research, increasing reproducibility and reducing drug failure; academic researchers use the SciEx platform to validate research studies]
    13. Our Vision: Improved Efficiency & Quality in Science
    14. Short-Term: Altmetric of Efficiency
        Promote open access to expert providers
        Improves collaboration with experts at core facilities & CROs:
        ➡ No requirement to purchase duplicate equipment
        ➡ No need to learn highly specialized one-off techniques
        Provides one central, efficient solution:
        ➡ Ensures researchers know where to go
        ➡ Tracks usage & efficiency of research spending as an important altmetric
    15. Short-Term: Altmetric of Quality
        Promote high-quality, reproducible research
        Leverages expert providers on Science Exchange:
        ➡ Improves independent validation of results
        ➡ Improves discovery of robust, translatable drug targets
        ➡ Improves tracking / feedback to ensure quality
        Provides an incentive system for quality:
        ➡ Reproducibility Fund
        ➡ Tracks reproducibility as an important altmetric
    16. Long-Term Objectives
        Promote a cultural change towards efficient and high-quality academic research
        Advocate for funding agencies to:
        ➡ Promote efficiency through open access to shared resources
        ➡ Reward efficient use of grant money by researchers
        ➡ Fund validation as well as novel studies to promote reproducibility
        End Goal: A cultural shift towards altmetrics of efficiency & quality over novelty