Methods Pyramids as a Knowledge Organizing
Structure for Evidence-Based Medicine
Jodi Schneider
School of Information Sciences
University of Illinois at Urbana-Champaign
Keynote for JCDL 2020 Workshop on Conceptual Modeling, Virtual, 2020-08-01
These are some preliminary thoughts on
METHODS as a meso-level organizing structure in
science.
Macro Organization of Science
Topical Organization of a Field
http://scimaps.org/mapdetail/being_a_map_of_physi_171
http://scimaps.org/mapdetail/map_of_complexity_sc_154/
Topic evolution of a field over time
Micro Organization of Science
Paper Structure
Introduction
Methods
Results and/or Analysis
Discussion
Teufel, S. (2014). Scientific Argumentation Detection as Limited-domain Intention Recognition. In
ArgNLP 2014 Frontiers and Connections between Argumentation Theory and Natural Language
Processing. http://ceur-ws.org/Vol-1341/paper14.pdf
Rhetorical Goals
Argument Modeling
Fu & Schneider, JCDL 2020
Meso Organization of Science
Argument Modeling
Fu & Schneider, JCDL 2020
Methods in Evidence-Based Medicine
Figure credit: Duke University Medical Center Library. Introduction to
Evidence-based Practice. What is Evidence-Based Practice (EBP)?
http://guides.mclibrary.duke.edu/c.php?g=158201&p=1036021
“Best Research Evidence”
in Evidence Based Practice
Hierarchy of Evidence
“The Evidence Pyramid”
Figure credit: SUNY Downstate Medical Center. Medical Research Library of Brooklyn.
Evidence Based Medicine Course. A Guide to Research Methods: The Evidence
Pyramid: http://library.downstate.edu/EBM2/2100.htm
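For concreteness, the hierarchy can be written down as an ordered scale. A minimal Python sketch follows, assuming one common textbook ordering of levels; the labels and ordering are illustrative only, not the exact levels of the SUNY Downstate figure.

# A minimal sketch of one common evidence-pyramid ordering, from weakest
# (index 0) to strongest. Labels and ordering are illustrative; published
# pyramids differ in how many levels they show and how they name them.
EVIDENCE_PYRAMID = [
    "expert opinion / editorials",
    "case reports and case series",
    "case-control studies",
    "cohort studies",
    "randomized controlled trials",
    "systematic reviews and meta-analyses",
]

def evidence_level(design: str) -> int:
    """Position of a study design in the pyramid (higher = stronger evidence)."""
    return EVIDENCE_PYRAMID.index(design)

assert evidence_level("randomized controlled trials") > evidence_level("cohort studies")

Encoding the hierarchy as a single ordered list also makes the later "complications" explicit: the circle of methods and the wavy-lined pyramid below both push back on the idea that one total ordering like this is adequate.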
“Research design pyramid”
Steele & Tiffin. “‘Personalised evidence’ for personalised healthcare: integration of a clinical librarian into mental health services – a feasibility study.” The Psychiatric Bulletin 38(1) (2014): 29–35. doi:10.1192/pb.bp.112.042382
“Research design pyramid, with highlighted levels indicating the designs which featured in studies synthesised by the clinical librarian to provide an answer to evidence request 6 (broader clinical).”
“Circle of methods”
Tugwell, Peter, and J. André Knottnerus. 2015. “Is the ‘Evidence-Pyramid’ Now Dead?”
Journal of Clinical Epidemiology 68 (11): 1247–50.
https://doi.org/10.1016/j.jclinepi.2015.10.001
“Circle of methods. Experimental methods that test specifically for efficacy (upper half of the circle) have to be complemented by observational, non-experimental methods (lower half of the circle) that are more descriptive in nature and describe real-life effects and applicability. Shading indicates the complementarity of experimental and quasi-experimental methods, of internal and external validity.”
“wavy lines” & “a lens” for viewing evidence
Murad, Hassan, Alsawas, and Alahdab. 2016. “New Evidence Pyramid.”
BMJ Evidence-Based Medicine 21 (4): 125–27.
https://doi.org/10.1136/ebmed-2016-110401
“Revising the pyramid: (1) lines separating the study designs become wavy (Grading of Recommendations Assessment, Development and Evaluation), (2) systematic reviews are ‘chopped off’ the pyramid. (C) The revised pyramid: systematic reviews are a lens through which evidence is viewed (applied).”
Methods as an organizing structure for DOING
clinical science research
• Epidemiology textbooks classify clinical studies as either
observational or experimental and as either prospective or
retrospective.
• Specific study designs are widely agreed upon too: case reports
and case series, case-control studies, cohort studies, and
randomized controlled trials.
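As an illustration of these two axes, here is a minimal sketch; the design names and attribute assignments follow common textbook usage rather than a specific source, and real studies can deviate (e.g., retrospective cohort studies).

# Illustrative two-axis classification: observational vs. experimental,
# prospective vs. retrospective. Typical assignments only; exceptions exist.
STUDY_DESIGNS = {
    "case report / case series":   {"experimental": False, "prospective": False},
    "case-control study":          {"experimental": False, "prospective": False},
    "cohort study":                {"experimental": False, "prospective": True},
    "randomized controlled trial": {"experimental": True,  "prospective": True},
}

def describe(design: str) -> str:
    d = STUDY_DESIGNS[design]
    axis1 = "experimental" if d["experimental"] else "observational"
    axis2 = "prospective" if d["prospective"] else "retrospective"
    return f"{design}: {axis1}, {axis2}"

print(describe("cohort study"))  # cohort study: observational, prospective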
Methods as an organizing structure for REPORTING
clinical science research
• Study designs frequently appear in research paper titles and are the
subject of reporting guidelines:
• Trials: CONSORT (Begg et al. 1996).
• Observational studies: STROBE (Vandenbroucke et al. 2007).
Methods as an organizing structure for SEARCHING
clinical science research
• Each design has a Medical Subject Heading (MeSH) that can be
used for retrieval (a minimal retrieval sketch follows this list).
• Information retrieval groups make search filters (McKibbon et al.
2009 compared 38 randomized controlled trial filters).
• Current work in my group is evaluating machine learning tools for
30+ medical publication types, with training data from MEDLINE/PubMed.
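As a minimal illustration of design-based retrieval, the sketch below queries PubMed through the NCBI E-utilities and restricts results by publication type with the [pt] field tag. The clinical query terms are hypothetical examples, and validated search filters (such as the 38 RCT filters compared by McKibbon et al. 2009) are far more elaborate than this single clause.

import requests

# Minimal sketch: fetch PubMed IDs for a hypothetical clinical question,
# restricted to one study design via the publication type field [pt].
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
params = {
    "db": "pubmed",
    "term": 'warfarin AND aspirin AND "randomized controlled trial"[pt]',
    "retmode": "json",
    "retmax": 20,
}
resp = requests.get(ESEARCH, params=params, timeout=30)
pmids = resp.json()["esearchresult"]["idlist"]
print(f"{len(pmids)} PMIDs retrieved, e.g. {pmids[:5]}")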
Some of our best-performing models
model AUC
BIOGRAPHY 1.00
SYSTEMATIC REVIEW 1.00
CASE REPORTS 0.99
CONSENSUS DEVELOPMENT CONFERENCE 0.99
CROSS OVER STUDY 0.99
DOUBLE BLIND METHOD 0.99
FOCUS GROUPS 0.99
GENOME WIDE ASSOCIATION STUDY 0.99
HUMAN EXPERIMENTATION 0.99
INTERVIEW 0.99
META ANALYSIS 0.99
PORTRAITS 0.99
PRACTICE GUIDELINE 0.99
TWIN STUDY 0.99
AUTOBIOGRAPHY 0.98
Aaron Cohen, Neil Smalheiser, Jodi Schneider –
Text Mining Pipeline to Accelerate Systematic Reviews in Evidence-Based Medicine
AUC = area under the ROC curve: the probability that a classifier will rank a randomly chosen positive instance higher than a randomly chosen negative one.
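To make that interpretation concrete, the sketch below uses made-up scores (not output from our models) and checks that scikit-learn's roc_auc_score equals the fraction of positive/negative pairs in which the positive instance receives the higher score, counting ties as one half.

import numpy as np
from sklearn.metrics import roc_auc_score

# Toy labels and scores for 4 positive and 4 negative instances (illustrative only).
y_true  = np.array([1, 1, 1, 1, 0, 0, 0, 0])
y_score = np.array([0.9, 0.8, 0.6, 0.4, 0.7, 0.3, 0.2, 0.1])

# AUC as a pairwise ranking probability: P(score_pos > score_neg), ties count 0.5.
pos, neg = y_score[y_true == 1], y_score[y_true == 0]
pairs = [(p > n) + 0.5 * (p == n) for p in pos for n in neg]
pairwise_auc = sum(pairs) / len(pairs)

assert np.isclose(roc_auc_score(y_true, y_score), pairwise_auc)
print(pairwise_auc)  # 0.875 for these toy scores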
Taking it further
Boyce, R.D.: A Draft Evidence Taxonomy and Inclusion Criteria for the Drug Interaction
Knowledge Base (DIKB),
http://purl.net/net/drug-interaction-knowledge-base/evidence-types-and-inclusion-criteria
Finer grained evidence hierarchies
Work in progress detecting finer grained methods
Hoang, Boyce, Brochhausen, Utecht, Schneider. “A proposal for determining the evidence types of biomedical documents using a
drug-drug interaction ontology and machine learning.” Poster at AAAI 2019 Spring Symposium on Combining Machine Learning
with Knowledge Engineering (AAAI-MAKE 2019).
Connecting all this to Knowledge Organization
(explicitly)
Integrative Levels
Kleineberg, Michael. 2017. “Integrative Levels.” Knowledge Organization 44 (5):
349–79. https://doi.org/10.5771/0943-7444-2017-5-349
Domains are constituted of…
Hjørland, Birger, and Jenna Hartel. 2003. “Afterword: Ontological, Epistemological
and Sociological Dimensions of Domains.” Knowledge Organization 30 (3/4): 239–45.
(1) ontological theories and concepts about the objects of human activity;
(2) epistemological theories and concepts about knowledge and the ways to obtain knowledge, implying methodological principles about the ways objects are investigated; and
(3) sociological concepts about the groups of people concerned with the objects.
Discussion Points
• Methods are a key part of the Knowledge Organizing Structure for
Evidence-Based Medicine.
• Methods relate to how we GENERATE evidence.
• Different methods generate evidence of different kinds and strength.
• I believe Methods can be useful in mining claims and arguments from
papers: methods AUTHORIZE claims.
• More specialized hierarchies of evidence can be found in medicine.
• Various groups are complicating the “evidence pyramid” hierarchy of
evidence.
Conceptual Modeling in the Main Conference!
• Are papers citing a retracted paper
necessarily wrong?
• Does it matter when citing authors make
use of a paper whose findings are no longer
considered valid?
• When DOES the citation matter?
• Could we selectively alert authors who cite a
retracted or abandoned paper?
Yuanxi Fu, Jodi Schneider. “Towards knowledge maintenance in scholarly digital libraries with keystone citations.”
doi:10.1145/3383583.3398514 Preprint: http://jodischneider.com/pubs/jcdl2020.pdf
Under our framework:
1) A scientific research paper puts forward at
least one main finding, along with a
logical argument, giving reasons and
evidence to support the main finding.
2) The main finding is accepted (or not) on
the basis of the logical argument.
3) Evidence from earlier literature may be
incorporated into the argument by citing a
paper and presenting it as support, using a
citation context.
[Figure: in the citing article, the Main Finding is supported by Arguments, which are in turn supported by Data, Methods, Citations, etc.; a citation context links the citing article to the cited article. Example citation context from our paper: “Many papers with known validity problems are highly cited [3],” where [3] = Bar-Ilan, J. and Halevi, G. 2018. Temporal characteristics of retracted articles.]
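One way to make this framework concrete is to sketch it as data structures. The class and field names below are illustrative shorthand for the boxes in the diagram (main finding, argument, data, methods, citation context); they are not a schema defined in the Fu & Schneider paper.

from dataclasses import dataclass, field
from typing import List

@dataclass
class CitationContext:
    cited_paper: str   # e.g. a reference key such as "[3]" or "Bar-Ilan & Halevi 2018"
    quote: str         # the sentence presenting the cited evidence as support

@dataclass
class Argument:
    data: List[str] = field(default_factory=list)
    methods: List[str] = field(default_factory=list)
    citations: List[CitationContext] = field(default_factory=list)

@dataclass
class Paper:
    main_finding: str  # accepted (or not) on the basis of the argument
    argument: Argument

# Hypothetical instance echoing the example in the diagram.
example = Paper(
    main_finding="(the citing paper's main finding goes here)",
    argument=Argument(citations=[CitationContext(
        cited_paper="Bar-Ilan & Halevi 2018, Temporal characteristics of retracted articles",
        quote="Many papers with known validity problems are highly cited [3].",
    )]),
)
print(example.argument.citations[0].quote)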
References
Begg, C., M. Cho, S. Eastwood, R. Horton, D. Moher, I. Olkin, R. Pitkin, et al. “Improving the Quality of Reporting of Randomized Controlled Trials. The CONSORT Statement.” JAMA 276, no. 8 (August 28,
1996): 637–39. https://doi.org/10.1001/jama.276.8.637
Boyce, Richard. “A Draft Evidence Taxonomy and Inclusion Criteria for the Drug Interaction Knowledge Base (DIKB),” April 26, 2015. https://web.archive.org/web/20150426205701/http://dbmi-icode-01.dbmi.pitt.edu/dikb-evidence/just-inclusion-criteria/just-inclusion-criteria.html
Brochhausen, Mathias, William R Hogan, Philip E Empey, Jodi Schneider, and Richard D Boyce. “Adding Evidence Type Representation to DIDEO.” In Proceedings of the Joint International Conference on
Biological Ontology and BioCreative Corvallis, Oregon, United States, August 1-4, 2016, 2, 2016. http://ceur-ws.org/Vol-1747/IP02_ICBO2016.pdf
Fu & Schneider. “Towards knowledge maintenance in scholarly digital libraries with keystone citations.” In JCDL 2020. Pages 217–226. https://doi.org/10.1145/3383583.3398514
Grizzle, Amy J., John Horn, Carol Collins, Jodi Schneider, Daniel C. Malone, Britney Stottlemyer, and Richard David Boyce. “Identifying Common Methods Used by Drug Interaction Experts for Finding
Evidence about Potential Drug-Drug Interactions: Web-Based Survey.” Journal of Medical Internet Research 21, no. 1 (2019): e11182. https://doi.org/10.2196/11182
Hjørland, Birger, and Jenna Hartel. 2003. “Afterword: Ontological, Epistemological and Sociological Dimensions of Domains.” Knowledge Organization 30 (3/4): 239–45.
Hoang, Boyce, Brochhausen, Utecht, Schneider. “A proposal for determining the evidence types of biomedical documents using a drug-drug interaction ontology and machine learning.” Poster at AAAI 2019
Spring Symposium on Combining Machine Learning with Knowledge Engineering (AAAI-MAKE 2019). http://jodischneider.com/pubs/aaaimake2019.pdf
Kleineberg, Michael. 2017. “Integrative Levels.” Knowledge Organization 44 (5): 349–79. https://doi.org/10.5771/0943-7444-2017-5-349
Murad, Hassan, Alsawas, and Alahdab. 2016. “New Evidence Pyramid.” BMJ Evidence-Based Medicine 21 (4): 125–27. https://doi.org/10.1136/ebmed-2016-110401
Romagnoli, Katrina M., Scott D. Nelson, Lisa Hines, Philip Empey, Richard D. Boyce, and Harry Hochheiser. “Information Needs for Making Clinical Recommendations about Potential Drug-Drug Interactions:
A Synthesis of Literature Review and Interviews.” BMC Medical Informatics and Decision Making 17, no. 1 (February 22, 2017): 21. https://doi.org/10.1186/s12911-017-0419-3
Schneider, Jodi, and Sally Jackson. “Modeling the Invention of a New Inference Rule: The Case of ‘Randomized Clinical Trial’ as an Argument Scheme For Medical Science.” Argument & Computation 9, no. 2
(January 1, 2018): 77–89. https://doi.org/10.3233/AAC-180036
Steele & Tiffin. “‘Personalised evidence’ for personalised healthcare: integration of a clinical librarian into mental health services – a feasibility study.” The Psychiatric Bulletin 38, no. 1 (2014): 29–35.
https://doi.org/10.1192/pb.bp.112.042382
Teufel, S. (2014). Scientific Argumentation Detection as Limited-domain Intention Recognition. In ArgNLP 2014 Frontiers and Connections between Argumentation Theory and Natural Language Processing.
http://ceur-ws.org/Vol-1341/paper14.pdf
Tugwell, Peter, and J. André Knottnerus. 2015. “Is the ‘Evidence-Pyramid’ Now Dead?” Journal of Clinical Epidemiology 68 (11): 1247–50. https://doi.org/10.1016/j.jclinepi.2015.10.001
Vandenbroucke, Jan P., Erik von Elm, Douglas G. Altman, Peter C. Gøtzsche, Cynthia D. Mulrow, Stuart J. Pocock, Charles Poole, James J. Schlesselman, and Matthias Egger. “Strengthening the Reporting of
Observational Studies in Epidemiology (STROBE): Explanation and Elaboration.” Annals of Internal Medicine 147, no. 8 (2007): W–163.


Editor's Notes

  • #5 Topical organization “X.1 Being a Map of Physics This map is the culmination of a six-year-long labor of love by noted physicist, visual artist, poet, and peace activist Bernard H. Porter. Porter began compiling the historical data upon which the map is based in 1932 while a fellow in radioactive research at Brown University. He then took most of the summer of 1933, working out of his parent’s home in Houlton, Maine, to compose the map’s striking visuals. The following years were spent circulating the map among prominent physicists and historians of science to verify its accuracy. The end result is a rich geography of a scientific field, one that uses mapping conventions to make understandable the way ideas move and develop over time. Ambitious in scope, the map traces the history of physics from the 6th century B.C. to the present day. Key theoretical starting points such as ‘Mechanics,’ ‘Sound,’ ‘and Light’ appear as water sources from which streams of thought emerge. Located alongside these rivers are “villages” representing figures like Isaac Newton, Alessandro Volta, Werner Heisenberg, and other major contributors to the development of physics. Surrounding it all is the map’s border, which is decorated with 24 diagrams that frequently appear in the work of physicists.  References: Porter, Bernard. 1939. Being a Map of Physics. Courtesy of Maine State Library and Mark Melnicove. In "10th Iteration (2014): The Future of Science Mapping," Places & Spaces: Mapping Science, edited by Katy Börner and Samuel Mills. http://scimaps.org “
  • #16 Is Wikipedia a gateway to biomedical research? (Lauren Maggio)
  • #17 Image credit: Mellis, Craig. "Evidence-based Medicine: What Has Happened in the Past 50 Years?" J Paediatr Child Health 51, no. 1 (2015): doi:10.1111/jpc.12800.
  • #28 “As is commonly the case, different evidence is used by each lab - either because certain data were not accessible, or some labs judged certain data to be unreliable or irrelevant to the claim, or some labs interpreted the same data in different ways. SEPIO translates this scenario into the following narrative and set of instances to be represented in its formal modeling of the data.”