Standard models in evidence synthesis work well in settings characterized by a large evidence base, the absence of effect modifiers, and connected networks. Handling sparse data, substantial between-study heterogeneity and disconnected studies, however, poses challenges to researchers and requires advanced methodology.
In the absence of head-to-head studies, evidence synthesis is a well-established technique to indirectly compare novel and established interventions in various disease areas. In standard settings, the most established methods for various outcome types work well and result in realistic effect estimates. However, there are a variety of situations when standard methods may no longer be sufficient:
- if there is only a sparse network of evidence
- if there is a large amount of between-study heterogeneity
- if the network is disconnected
Key Topics Include:
- General introduction into the objectives of conducting evidence synthesis
- Description of typical situations of “non-standard” data, including sparse networks of evidence, a large amount of between-study heterogeneity, or disconnected networks
- Advanced methods to address non-standard data, including the use of informative priors, subgroup analyses, meta-regression and multi-level meta regression, and matching-adjusted indirect comparisons (MAICs)
- Case studies illustrating how these advanced methods of evidence synthesis are applied on actual data
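The advanced approaches listed above all extend the same standard building block: inverse-variance pooling of study-level effect estimates. As a minimal sketch of that building block, the fixed-effect pooled estimate can be computed as follows; the log odds ratios and standard errors are invented for illustration only.

```python
# Minimal fixed-effect inverse-variance meta-analysis sketch.
# The per-study log odds ratios and standard errors are hypothetical.
import math

log_or = [0.30, 0.15, 0.45]   # per-study log odds ratios (illustrative)
se     = [0.20, 0.25, 0.30]   # per-study standard errors (illustrative)

weights = [1 / s**2 for s in se]  # inverse-variance weights
pooled = sum(w * y for w, y in zip(weights, log_or)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"pooled log OR = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

Random-effects models, meta-regression, and network meta-analysis all generalize this weighting scheme, e.g., by inflating the per-study variances with a between-study heterogeneity term.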
Introduction to meta-analysis (1612_MA_workshop) | Ahmed Negida
Chapter 1: Introduction to Meta-analysis
- From the 1612 MA Workshop held on 11 December 2016 at Dokki, Giza, Egypt
- Workshop instructor: Mr. Ahmed Negida, MBBCh candidate
Explains how to select a statistical test suitable for your hypothesis. Suggests points to consider before deciding about a test. Gives a list of commonly used parametric and non-parametric tests with their purposes of use.
An introduction to how to go about a meta-analysis, primarily designed for people with a non-statistical background. It borrows heavily from the Cochrane Handbook for Systematic Reviews of Interventions.
Clinical Research Statistics for Non-Statisticians | Brook White, PMP
Through real-world examples, this presentation teaches strategies for choosing appropriate outcome measures, methods for analysis and randomization, and sample sizes as well as tips for collecting the right data to answer your scientific questions.
What’s Next in US Payor Communications: The Impact of FDA's Proposed Guidance... | Nathan White, CPC
The recent enactment of the 21st Century Cures Act has profound immediate and long-term implications for development and communication of HEOR/RWE in the US, particularly in relation to communications with payors about healthcare economic information (HCEI). In January, the FDA released draft guidance for public comment to outline its thinking around communication to payors of HCEI, but there are still unanswered questions to be addressed in the final guidance. Industry will need to quickly establish new policies and procedures to maintain compliance with the new regulations, especially in relation to OPDP submission requirements – a steep transition from a space that has largely been unregulated.
Basics of Systematic Review and Meta-analysis: Part 3 | Rizwan S A
A 4 part lecture series on the basics of Systematic Review and Meta-analysis, Part 3 discusses the software needed and analytical techniques used for this purpose.
Bayesian estimations of strong toxic signals [compatibility mode] | Bhaswat Chakraborty
“Signals” of adverse drug reactions are, according to WHO, “reported information on a possible causal relationship between an adverse event and a drug, the relationship being unknown or incompletely documented previously. Usually more than a single report is required to detect a signal, depending on the seriousness of the event and the quality of the information.” Once a signal is detected, one can then analyze and confirm it. In detecting signals from large adverse drug reaction (ADR) databases, however, one has to use a procedure that is both sensitive (low false negativity) and specific (low false positivity). A whole range of statistical methods has been applied for data mining and signal detection (SD) in pharmacovigilance (PV). My talk will be on Bayesian methods for SD.
The US FDA uses a Bayesian data mining approach developed by William DuMouchel called the Multi-item Gamma Poisson Shrinker (MGPS). WHO also uses a Bayesian method (due to Andrew Bate) based on a Bayesian confidence propagation neural network (BCPNN). These estimators shrink the observed-to-expected ratio of ADR counts toward the null; examples include the empirical Bayes geometric mean (EBGM) and the information component (IC). These Bayesian estimators are robust measures of drug-ADR association.
Bayesian approaches are intuitively appealing when very small numbers are involved and where there is a need for continuous reassessment of the probability of association as new data are acquired over time. Bayesian estimates such as the EBGM stay close to the null hypothesis of independence even when data are scarce. For example, if the EBGM is 5 for a drug-renal toxicity combination, then this drug-event combination occurred, on average, 5 times more frequently than expected in the data set. Several examples of Bayesian SD will be given from current research projects.
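The observed-to-expected ratio at the core of these estimators can be illustrated with a toy spontaneous-report calculation. The counts below are invented, and this sketch only computes the raw ratio; the actual MGPS additionally fits an empirical-Bayes gamma-Poisson mixture that shrinks this ratio toward the null.

```python
# Toy disproportionality calculation underlying EBGM-style estimators.
# All counts are hypothetical; real MGPS shrinks O/E via an
# empirical-Bayes gamma-Poisson model rather than reporting the raw ratio.
n_drug_event = 20       # reports mentioning both the drug and the event
n_drug       = 200      # reports mentioning the drug
n_event      = 500      # reports mentioning the event
n_total      = 100_000  # all reports in the database

# Expected drug-event count if drug and event were independent
expected = n_drug * n_event / n_total
rr = n_drug_event / expected  # observed-to-expected ratio ("O/E")
print(f"expected = {expected:.1f}, O/E = {rr:.1f}")
# An O/E of 5 would mean the pair is reported 5x more often than expected.
```

With sparse counts the raw O/E is unstable, which is exactly why the Bayesian shrinkage described above is applied before ranking drug-event pairs.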
How do you scientifically conduct a professional clinical research trial? In the current era of "collaborate or perish", we need to keep this design in mind.
Enjoy
@copyLeft
Evaluating a practice guideline is essential given their rapid proliferation in recent times. Some general principles of guideline evaluation are described here, using an Australian guideline for panic disorder as an example.
Meta-analysis in epidemiology:
- Is a useful tool for epidemiological studies that investigate the relationships between risk factors and disease.
- Is a useful tool to improve animal well-being and productivity.
- Is relatively underutilized in animal and veterinary science, despite a wealth of suitable studies.
- Can provide reliable results about disease occurrence, patterns, and impact in livestock.
It is essential to take advantage of this statistical tool to produce more reliable estimates of the effects of concern in animal and veterinary science data.
Professor Martin Wiseman's presentation on The Continuous Update Project: a novel approach to reviewing mechanistic evidence on diet, nutrition, physical activity and cancer, given at the FENS European Nutrition Conference, 20-23 October 2015, Berlin, Germany.
Biostatistics in clinical research involves the application of statistical methods to analyze and interpret data from clinical trials. It plays a crucial role in study design, sample size determination, data analysis, and result interpretation. Biostatisticians ensure that clinical research findings are valid, reliable, and meaningful, contributing to evidence-based medicine. Their expertise helps researchers make informed decisions, assess treatment efficacy, and draw accurate conclusions about the safety and effectiveness of interventions.
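One concrete instance of the sample-size determination mentioned above is the standard normal-approximation formula for comparing two proportions. The response rates, significance level, and power below are illustrative assumptions, not values from any real trial.

```python
# Approximate per-group sample size for comparing two proportions
# (normal approximation, two-sided alpha = 0.05, power = 0.80).
# The assumed response rates are illustrative, not from any real trial.
import math

p1, p2 = 0.40, 0.55  # assumed response rates, control vs. treatment
z_alpha = 1.96       # two-sided 5% significance level
z_beta = 0.84        # 80% power

variance = p1 * (1 - p1) + p2 * (1 - p2)
n_per_group = math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)
print(f"approximately {n_per_group} participants per group")
```

In practice a biostatistician would also adjust for expected dropout and, for more complex designs, verify the calculation with dedicated software or simulation.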
Next-Generation Safety Assessment Tools for Advancing In Vivo to In Vitro Tra... | InsideScientific
Join Prof. Victoria Hutter and Dr. Louis Scott as they showcase the application of high-content imaging and advanced cell lines for drug safety assessment.
Safety concerns are a significant cause of candidate compounds failing in the later stages of drug development. Establishing the connection between in vitro endpoints and human health outcomes is essential.
In this webinar, Prof. Victoria Hutter and Dr. Louis Scott present a novel tool for in vitro safety assessment in drug development. The morph_ONE™ assay provides a human-centric approach to potentially fill specific regulatory gaps concerning safety issues. This tool is capable of profiling both human and rat alveolar macrophages, offering valuable insights for hazard identification and toxicity assessments. By bridging the divide between cellular effects and overall risk, it has the potential to enhance our understanding of safety-related aspects in drug development.
Key Topics Include:
- Explore distinct in vitro screening techniques for evaluating the safety of emerging inhaled products, facilitating early and informed decisions in compound selection and development.
- How high-content image analysis (HCIA) cell painting assays can be used as a forward-looking high-throughput screening tool, distinguishing unique response patterns in alveolar macrophages.
- Understand the use of the ImmuPHAGE™ and ImmuLUNG™ models in conducting customized evaluations focused on inhalation safety.
A Ready-to-Analyze High-Plex Spatial Signature Development Workflow for Cance... | InsideScientific
In this webinar, Aditya Pratapa and Lorcan Sherry present a new workflow for analyzing multiplex immunofluorescence images.
Spatial Signatures are a new class of highly predictive biomarkers that measure the interactions and cellular densities of tumor and immune cells that compose the tumor microenvironment. Based on multiplex immunofluorescence, spatial signatures provide a deeper understanding of complex interactions between tumors and the immune system, enabling improved patient stratification for immunotherapies. A significant hurdle to date has been in developing a data analysis workflow that is straightforward and user-friendly to transform the data rich images into meaningful quantitative spatial signatures.
In this webinar, Aditya and Lorcan review the key features of the new PhenoImager HT 2.0 data analysis workflow. This workflow introduces a simplified framework from scanning to analyzing spectrally unmixed multiplex immunofluorescence images generated on the PhenoImager HT platform. The ready-to-analyze data can be directly imported into image analysis software such as Visiopharm. This presentation covers key aspects of data analysis elements such as image QC, segmentation, phenotyping, and verification – all essential for creating outputs that support the development of a spatial signature.
Key Topics Include:
- Understand Akoya’s new HT 2.0 data analysis workflow
- The challenges in multiplex immunofluorescence analysis and the use of AI and cell lineage segmentation considerations
- Explore OracleBio’s image analysis workflow incorporating Visiopharm
- Evaluation of analysis data to facilitate spatial profiling and interpretation
Molecule Transport across Cell Membranes: Electrochemical Quantification at t... | InsideScientific
In this webinar, Dr. Sabine Kuss will discuss the importance of transmembrane molecule exchange and how to detect and quantify membrane transport of molecules in cells.
Complex biological processes, such as the transport of molecules across cell membranes, are difficult to understand using purely biological methodologies. Investigating cellular transport processes is challenging, because of the highly complex chemical composition of cells and the diffusion of molecules in and around cells at low concentrations. The development and advancement of electroanalytical methods over the last two decades has enabled the monitoring of living cells and their interaction with the environment, including external stimuli, such as pharma-molecules.
This presentation emphasizes electrochemical and electrophysiological methods of detection and quantification but also makes a comparison to other bioanalytical approaches. Join us to discover a substantial diversity in methods used to monitor the transport of cell metabolites, crucial for cell survival, and pharmaceutical compounds, involved in cell characteristics such as drug resistance.
Key Topics Include:
- Understanding transmembrane molecule transport through bioanalytical methods
- Electrochemical approaches to monitor molecule transport across cell membranes
- What bioanalytical and especially electrochemical approaches can reveal
- Challenges associated with instrument limitations
Exploring Predictive Biomarkers and ERK1/2 Phosphorylation: A New Horizon in ... | InsideScientific
In this webinar, Dr. Victor Arrieta highlights the link between p-ERK activation and improved survival in rGBM patients using anti-PD-1 immunotherapy.
Recurrent glioblastoma (rGBM) has displayed a varied response to anti-PD-1 immunotherapy, necessitating the identification of predictive biomarkers. Through extensive analyses and 3 clinical studies, we have identified that activation of the MAPK/ERK signaling pathway, particularly ERK1/2 phosphorylation (p-ERK), is associated with longer overall survival (OS) in rGBM patients receiving PD-1 blockade. Initially, enrichment of BRAF/PTPN11 mutations was reported in 30% of responsive rGBM patients, prompting the investigation of p-ERK as a potential marker beyond these mutations.
Our research has unraveled an association between p-ERK abundance and better clinical outcomes following PD-1 blockade, with p-ERK mainly localized in tumor cells. Notably, high p-ERK GBMs contained unique microglia and macrophage phenotypes with elevated MHC class II expression, suggesting a novel interplay between MAPK activation and the tumor immune microenvironment.
While these insights establish a pivotal role for p-ERK in predicting PD-1 blockade response in rGBM, the implementation in clinical settings calls for further validation and accuracy. Nonetheless, these findings pave the way for more personalized and effective immunotherapy strategies, emphasizing the significance of the tumor microenvironment and its interaction with therapeutic interventions in GBM.
Key Topics Include:
- The activation of the MAPK signaling pathway, specifically ERK1/2 phosphorylation (p-ERK), is identified as a predictive biomarker for longer overall survival in recurrent glioblastoma (rGBM) patients undergoing PD-1 blockade
- High p-ERK tumors in rGBM present a distinct myeloid cell phenotype with elevated MHC class II expression, signifying a connection between MAPK pathway activation and the immune microenvironment
- The implementation of p-ERK as a predictive biomarker in clinical settings requires further validation and exploration of variables impacting its evaluation
Exploring Estrogen’s Role in Metabolism and the Use of 13C-Labeled Nutrients ... | InsideScientific
Dr. Reilly Enos and Dr. Eran Levin discuss estrogen's metabolic impact and how isotopic labeling and 13C-labeled nutrients can be used for animal physiology and nutrition research.
Reilly Enos, PhD – Harnessing the power of estrogen to regulate metabolic processes
Dr. Reilly Enos’ research focuses on the role that sex steroids and their receptors play in regulating metabolic processes, particularly in the setting of obesity. In this webinar, Dr. Enos will discuss his research on tissue-specific fluctuations of sex steroids throughout the estrous cycle in mice, provide insights into the importance of the quantity of estrogen necessary to impact physiological processes, as well as an understanding of the central versus peripheral effects of estrogen action.
Eran Levin, PhD – Unlocking Insights: Utilizing 13C Labeled Nutrients for Cutting-Edge Physiology and Nutrition Research
Dr. Eran Levin will discuss the potential of using 13C-labeled nutrients in physiology and nutrition research in animal models. Specifically, he will share practical tips for designing and conducting experiments using isotopic labeling techniques and demonstrate how they can provide unprecedented insights into metabolic pathways, nutrient utilization, and behaviors in both vertebrate and invertebrate models including insects, reptiles, and mammals.
Key Topics Include:
- The role that estrogen plays in regulating metabolic and behavioral processes in males and females
- The tissue-specific fluctuations of sex steroids throughout the estrous cycle
- Insight into the importance of tissue-specificity in developing hormonal therapies
- The importance of estrogen quantity in regulating physiological processes
- Understand the diverse range of 13C labeled nutrients available
- Specific applications of labeled amino acids in studies of protein metabolism, cellular signaling, and typical nutrient utilization
- How to integrate 13C labeling techniques with respirometry for a comprehensive assessment of metabolic processes, energy expenditure, and substrate utilization in animal models
- How to calculate metabolic rates in free-flying animals using 13C bicarbonate
Longitudinal Plasma Samples: Paving the Way for Precision Oncology | InsideScientific
Experts present a cell-free plasma biobank and describe the role of longitudinal plasma samples for cancer research, disease monitoring, and biomarker development.
Through liquid biopsies, it is now possible to repeatedly and non-invasively interrogate the molecular landscape of solid tumors via a blood draw over the whole treatment course. To date, liquid biopsies have been used for screening, disease monitoring, and prognosis. Circulating tumor DNA (ctDNA) and circulating tumor cells (CTCs) have so far been the most explored targets for commercial applications of this technology.
In collaboration with a continuously expanding oncology network, Indivumed Services has established a unique high-quality cell-free plasma biobank that is exclusively focused on collecting longitudinal whole blood samples from cancer patients. This allows molecular insight by providing quick access to longitudinal plasma from cancer patients that have undergone treatment. ctDNA can then be isolated from longitudinal cell-free plasma to allow for monitoring of disease progression by providing diagnostic and prognostic information, potentially in real time.
Key Topics Include:
- Gain insights into Indivumed Services’ longitudinal plasma collection process
- Understand the advantages and benefits of utilizing longitudinal plasma samples for cancer research
- Explore applications of longitudinal plasma samples for biomarker research and development of companion diagnostics
Fully Characterized, Standardized Human Induced Pluripotent Stem Cell Line an... | InsideScientific
In this webinar, experts present a standardized stem cell line and its differentiation into neural cells for disease modeling and assay development.
Reproducible research with human induced pluripotent stem cells (iPSCs) depends on thoroughly characterized and quality-controlled cell lines. In this webinar, Dr. Andrew Gaffney and Dr. Erin Knock from STEMCELL Technologies describe the generation of a standardized induced pluripotent stem cell (iPSC) line. Developed with the upcoming ISSCR Standards Initiative characterization guidelines in mind, this highly characterized line is karyotypically stable, demonstrates trilineage differentiation potential, and expresses undifferentiated cell markers. Further, STEMCELL has developed a highly pure, ready-to-use neural progenitor cell product expressing PAX6 and SOX1 over multiple passages.
Dr. Knock shows how these multipotent cells are suitable for customized downstream differentiation to various CNS cell types, such as forebrain neurons, midbrain neurons, and astrocytes. These progenitor cells are the ideal controls for standardizing downstream differentiation protocols, modeling diseases, and assay development.
Key Topics Include:
- Discover how STEMCELL’s induced pluripotent stem cell lines are derived and characterized
- Learn how to differentiate induced pluripotent stem cell lines into all three germ layers
- Explore the features of STEMCELL’s neural progenitor cell product
- Differentiate neural progenitor cells into a variety of neural cell types, including neurons and glia
How to Create CRISPR-Edited T Cells More Efficiently for Tomorrow's Cell Ther... | InsideScientific
Ian Foster and Steven Loo-Yong-Kee discuss Artisan Bio's STAR-CRISPR system for optimized gene editing in cell therapy, with a focus on the genetic modification of T cells for cancer immunotherapy.
Cell therapy is an emerging field with great promise for the treatment of various diseases. One of the most exciting areas of cell therapy is the use of T cells that have been genetically modified to recognize and kill cancer cells. While the use of T cells for cancer immunotherapy has tremendous promise, there is still room for improvement. The efficiency, expansion, and functionality of T cells can be enhanced by genetic modification using the STAR-CRISPR system.
Artisan Bio is a biotechnology company focused on developing a CRISPR-mediated editing platform to improve the efficacy and safety of cell therapy products. In this webinar, we will provide a comprehensive overview of Artisan Bio’s STAR-CRISPR system, which is designed to improve the specificity and efficiency of gene editing for cell therapies. We will explain the system’s key components and how we are using a risk-based approach to optimize and validate the editing platform. The webinar will focus on Artisan Bio’s approach to building T cell OS/APPS through iterative improvements to achieve best-in-class editing capabilities and improved cell health metrics.
Key Topics Include:
- Learn about Artisan Bio’s proprietary high-performance STAR-CRISPR system for improving the specificity and efficiency of gene editing for cell therapies
- Explore Artisan Bio’s risk-based, systems approach to technology development, including how to implement Design of Experiments (DoE) and Quality by Design (QbD) principles to optimize and validate any process
- Case study of the application of QbD to Artisan Bio’s STAR-CRISPR platform to edit T cells for cancer immunotherapy with preliminary data showing improved efficacy, expansion, and functionality
Peripheral and Cerebral Vascular Responses Following High-Intensity Interval ... | InsideScientific
Dr. Bert Bond and Max Weston present an overview of their study investigating the effects of high-intensity interval exercise on cerebrovascular health.
Physical activity reduces the risk of developing cardiovascular diseases (CVD) and dementia. This benefit cannot be explained by changes in traditional CVD risk factors alone, and direct improvements in vascular health are thought to play a key role. However, our understanding of how exercise can be optimized for improvements in blood-vessel health is limited.
High-intensity interval exercise (HIIE) is known to improve peripheral vascular function, and there is a growing interest in the effects of HIIE on cerebrovascular health. However, it is not clear whether the acute improvements in peripheral vascular function following HIIE are also seen in the major blood-vessels of the brain.
In the Bond lab’s study, 30 minutes of HIIE completed at both 75% and 90% V̇O2max improved peripheral vascular function 1 and 3h following exercise in healthy young adults, compared with work-matched continuous moderate-intensity exercise and a sedentary control condition. By contrast, cerebrovascular function was unchanged following all conditions. This is the first study to identify that acute improvements in peripheral vascular function following high-intensity interval exercise are not mirrored by improvements in cerebrovascular function in healthy young adults.
Leveraging Programmable CRISPR-Associated Transposases for Next-Generation Ge... | InsideScientific
Dr. Sam Sternberg discusses a novel CRISPR-Cas system using a programmable, RNA-guided transposase, and highlights its implications for kilobase-scale genome engineering in cell and gene therapies.
The utility of programmable, RNA-guided CRISPR-Cas systems in genome engineering continues to evolve. Nature has afforded scientists novel and diverse gene editing functionality, from nuclease-dependent CRISPR-Cas9 to second-generation base and prime editors that do not produce double-strand breaks.
In this webinar, Dr. Sam Sternberg describes a new CRISPR-Cas paradigm relying on nuclease-deficient bacterial transposons that catalyze RNA-guided integration of mobile genetic elements into the genome. The discovery of a fully programmable, RNA-guided transposase lays the foundation for kilobase-scale genome engineering with broad applications for developing cell and gene therapies.
Key Topics Include:
- The basics of first- and second-generation CRISPR-Cas technologies from a scientist at the forefront of their development
- Mechanisms, accommodation, and cell type diversity of CRISPR-Cas programmable transposition
- How transposase factor coordination enables highly specific, genome-wide DNA integration to target sites
- Implications of programmable transposases that obviate the need for DNA double-strand breaks and homologous recombination
Simple Tips to Significantly Improve Rodent Surgical Outcomes | InsideScientific
Dr. Marcel Perret-Gentil presents six simple-to-implement techniques to significantly improve surgical outcomes.
You may feel proficient, even confident, in performing rodent surgery; however, you may be surprised at how simple improvements can have a huge impact on your animals' recovery and your data. The presentation is designed for individuals with minimal or no rodent surgical skills, but it is also a great opportunity for those with considerable experience who want to improve outcomes or teach these key principles.
Key Topics Include:
- Improve surgical outcomes that will lessen post-op morbidity and mortality
- Improve data yield after rodent surgery
- Implementation of key principles into a rodent surgical program
Cardiovascular Autonomic Dysfunction in the Post-COVID Landscape: Detection a... | InsideScientific
The worldwide spread of the novel Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) triggered a pandemic and has generated over 600 million reported cases around the globe. A substantial portion of patients who suffered Coronavirus Disease 2019 (COVID-19) have not recovered completely but continue to experience lingering symptoms for months to years. This novel clinical syndrome has been termed Long COVID or Post-acute Sequelae of COVID-19 (PASC).
Observational studies have indicated that in about one third of cases, PASC is associated with cardiovascular (CV) autonomic dysfunction, including postural orthostatic tachycardia syndrome, inappropriate sinus tachycardia, orthostatic hypotension, reflex syncope, and microvascular dysfunction. Detecting CV autonomic dysfunction in PASC is important because, although frequently overlooked, it can be effectively treated, in contrast to many other Long COVID-related symptoms.
This webinar highlights CV dysautonomia as a specific sequela of acute COVID-19 and guides the audience through the diagnostic work-up of PASC patients with suspected cardiovascular complications.
Creating Better Gene-Edited Cell Lines with the FAST-HDR System | InsideScientific
Cell lines are the core of biological research. Scientists need cell lines for drug development, basic biology research, safety testing, and biologic therapeutic production. Since the 1980s, genetic manipulation has allowed researchers to tailor cell lines to the experiment or production purpose. Over time, the requirements for these cell lines have risen. In many cases, the cells require multiple genetic edits and must produce data that passes FDA review. Moreover, the current funding environment often requires rapid delivery of these cells so scientists can produce data to support further budget and/or investment. This is particularly acute for knock-in cell lines. Current technologies may take months to complete a cell line, allow a limited number of edits, and often have off-target effects that are not suitable for FDA filings. ExpressCells uses its patented FAST-HDR plasmid, along with CRISPR, to address these problems. The FAST-HDR process can precisely knock in multiple genes (while supporting other types of genetic modifications), ensure precise placement of these edits, and deliver them months faster than competing technologies.
This webinar will discuss the basis of the FAST-HDR technology and illustrate several uses. The first part is a presentation by Oscar Perez-Leal, MD, the inventor of the technology. Oscar will discuss the problems he faced as a researcher and how FAST-HDR was designed to address them. He will outline the details of the technology, the history of its development, and several examples where he used FAST-HDR. The second part is a conversation with Jon Weidanz, PhD. Jon will outline the challenges he faced at AbeXXa and how he selected a FAST-HDR custom cell line for his project. He'll outline the learnings from using this cell line, some of which were unexpected, but valuable to future development.
By attending this program, attendees will:
- Understand the current challenges in creating custom gene-edited cell lines
- Know the technology underlying the FAST-HDR gene-editing system, including its use with CRISPR
- Be able to describe the advantages of the FAST-HDR system
- Learn about several case studies using gene-edited cell lines
Functional Recovery of the Musculoskeletal System Following Injury - Leveragi...InsideScientific
Watch Dr. Sarah Greising discuss the current pathophysiologic understanding of the skeletal muscle remaining following traumatic musculoskeletal injuries.
Volumetric muscle loss (VML) injuries result in the abrupt loss of skeletal muscle fibers, causing chronic functional disability in part due to limited muscle regeneration and vast co-morbidities. With a focus on clinically relevant outcome measurements for skeletal muscle function in both small and large animal models of VML injury, this webinar presents various near-term interventions for the restoration of tissue function following complex injuries. Interventions evaluated focus on regenerative rehabilitation approaches using regenerative pharmaceuticals to correct underlying muscle pathophysiology.
Designing Causal Inference Studies Using Real-World DataInsideScientific
In this webinar, experts provide an overview of causal inference, along with step-by-step guidance to designing these studies using real-world healthcare data.
Causal inference is used to answer cause and effect research questions and yield estimates of effect. Causal study design considerations and statistical methods address the effects of confounding variables and other potential biases and allow researchers to answer questions such as, “Does treatment A produce better patient outcomes compared to Treatment B?”
Causal study interpretations have traditionally been restricted to randomized controlled trials; however, causal inference applied to observational healthcare data is growing in importance, driven by the need for generalizable and rapidly delivered real-world evidence to inform regulatory, payer, and patient/provider decision making. The application of causal inference methods leads to stronger and more powerful evidence. When these techniques are applied to observational data, the results generated are both from and for the real world.
Presenters walk through several real-world case studies including the PCORI-funded BESTMED study and a collaborative study with a prominent pharmacy payer.
Social Media Data: Opportunities and Insights for Clinical ResearchInsideScientific
Many new data sources have emerged in recent years: real-time data collected through digital health technologies (including apps and wearables), monitoring data, social media data, public datasets, and patient organization data, in addition to primary and secondary datasets.
Real life data are highly informative and can be used to address a range of challenges throughout the product life cycle. Data from social media can generate valuable insights as patients often gather in digital communities to get answers and share their experiences. Conversations on social networks merit special consideration as they can have real world influence over treatment management decisions.
Social media data can reveal the motivations that impact patient healthcare decisions and behaviors through each stage of the care pathway. These data provide both the patient and caregiver perspectives at the same time. For this reason, conversations on social networks offer an opportunity to deepen our understanding of:
- The fears and hopes associated with patient treatments
- Daily needs and difficulties patients are facing in managing their disease
- The impact of disease on patient health related quality of life
- Identification in real life of the stages of the care pathway and patient perceptions
- Reactions to health policies
Watch this webinar for insights on how to collect, use, analyze, and interpret social media data in different contexts. Our experts share knowledge from over fifteen years of successfully developing and adapting algorithms to treat this kind of data.
We Are More Than What We Eat Dietary Interventions Depend on Sex and Genetic ...InsideScientific
To learn more visit: https://insidescientific.com/webinar/we-are-more-than-what-we-eat-dietary-interventions-depend-on-sex-and-genetic-background/
Despite evidence that sex and genetic background are key factors in the response to diet, most studies of how diet regulates metabolic health and even longevity in mice examine only a single strain and sex.
Using multiple strains and both male and female mice, Dr. Lamming's team has found that improvements in metabolic health and in longevity in response to reduced levels of protein or specific amino acids strongly depend on sex and strain. While some phenotypes were conserved across strains and sexes, including increased glucose tolerance and energy expenditure, they observed high variability in adiposity, insulin sensitivity, and circulating hormones. Using a multi-omics approach, they identified mega-clusters of differentially expressed hepatic genes, metabolites, and lipids associated with each phenotype, gaining new insight into the role of the energy balance hormone FGF21 in the response to protein restriction.
Antibody Discovery by Single B Cell Screening on Beacon®InsideScientific
Amy Sheng, PhD provides an overview of antibody screening platforms and presents applications and case studies using the Beacon® platform for antibody discovery.
Single B cell screening is a powerful and efficient strategy for generating antigen-specific monoclonal antibodies. In contrast to fluorescence-activated B cell sorting, the Beacon® platform is based on plasma cell screening, making it easier to obtain antibody genes.
The Beacon® single-cell optofluidic system combines a unique optoelectro positioning (OEP) technology with novel microfluidic technology. It can be used to accurately select single cells on a chip, perform multiple single-cell assays, and export target cells based on specific results. The Beacon® optofluidic platform preserves the diversity of B cells, generating high-quality positive hits at an early stage of discovery and avoiding the loss of “good clones”.
Key Topics Include:
- B cell differentiation and development
- Pros and cons of mainstream antibody screening platforms
- Mechanisms, applications, and case studies using the Beacon® platform for antibody screening
- Sino Biological’s capacity using the Beacon® platform
Experimental Design Considerations to Optimize Chronic Cardiovascular Telemet...InsideScientific
Phil Griffiths, PhD, presents a summary of chronic cardiovascular telemetry studies and considerations for experimental design.
Ensuring you collect the best and most physiologically accurate data from your chronic telemetry experiments requires careful planning and experimental design. This webinar will give an insight into the practical aspects of designing chronic animal experiments to set you on the best path for success. The benefits of chronic studies, how to select the most appropriate sample size for your study, some basic tips and tricks for data acquisition and handling, and how to ensure high animal welfare are discussed.
Key Topics Include:
- What are the benefits of chronic over acute studies?
- How to decide the best sample sizes and the length of experiments?
- Basic tips for data acquisition and handling
- How to maintain high animal welfare standards
Strategic Approaches to Age-Related Metabolic Insufficiency and Transition in...InsideScientific
In this webinar, Dr. Dennis Turner delves into dementia syndrome, the metabolic changes that occur, and the importance of proper physiological monitoring of animal models.
Brain metabolism transforms with normal aging, and transient, dynamic metabolic insufficiency may underlie critical progression from aging into dementia syndrome and Alzheimer’s disease (AD). Age-related brain metabolism balances vascular-related substrate supply and transport mechanisms into extracellular space to neurons with cellular metabolic needs and utilization. Dynamic metabolic insufficiency can occur when there is intermittent supply-demand mismatch.
Dr. Turner's lab studied the adequacy of neurovascular coupling to provide sufficient cerebral blood flow (CBF) to meet neuronal demand in vivo in a mouse AD model, compared to aged controls. They analyzed the response to maximal neuronal metabolic demands, spreading depression and anoxia, using imaging, CBF measurements, and oxygen and glucose levels. These in vivo studies require human-like anesthesia conditions, with monitoring of temperature, blood pressure/pulse oximetry, and respiration to maintain homeostasis. The lab confirmed abnormal neurovascular coupling in a mouse model of AD in response to these metabolic challenges, showing disruption much earlier in dementia than in equivalently aged individuals. Chronic metabolic treatments could influence dementia syndrome progression.
Richard's adventures in two entangled wonderlandsRichard Gill
Since the loophole-free Bell experiments of 2020 and the Nobel prizes in physics of 2022, critics of Bell's work have retreated to the fortress of super-determinism. Now, super-determinism is a derogatory word - it just means "determinism". Palmer, Hance and Hossenfelder argue that quantum mechanics and determinism are not incompatible, using a sophisticated mathematical construction based on a subtle thinning of allowed states and measurements in quantum mechanics, such that what is left appears to make Bell's argument fail, without altering the empirical predictions of quantum mechanics. I think however that it is a smoke screen, and the slogan "lost in math" comes to my mind. I will discuss some other recent disproofs of Bell's theorem using the language of causality based on causal graphs. Causal thinking is also central to law and justice. I will mention surprising connections to my work on serial killer nurse cases, in particular the Dutch case of Lucia de Berk and the current UK case of Lucy Letby.
This presentation explores a brief idea about the structural and functional attributes of nucleotides, the structure and function of genetic materials along with the impact of UV rays and pH upon them.
What are greenhouse gases, and how many gases affect the Earth?moosaasad1975
What are greenhouse gases, how do they affect the Earth and its environment, and what do they mean for the future of the Earth, its weather, and its climate?
The thematic appreciation test is a psychological assessment tool used to measure an individual's appreciation and understanding of specific themes or topics. This test helps to evaluate an individual's ability to connect different ideas and concepts within a given theme, as well as their overall comprehension and interpretation skills. The results of the test can provide valuable insights into an individual's cognitive abilities, creativity, and critical thinking skills.
Deep Behavioral Phenotyping in Systems Neuroscience for Functional Atlasing a...Ana Luísa Pinho
Functional Magnetic Resonance Imaging (fMRI) provides means to characterize brain activations in response to behavior. However, cognitive neuroscience has been limited to group-level effects referring to the performance of specific tasks. To obtain the functional profile of elementary cognitive mechanisms, the combination of brain responses to many tasks is required. Yet, to date, both structural atlases and parcellation-based activations do not fully account for cognitive function and still present several limitations. Further, they do not adapt overall to individual characteristics. In this talk, I will give an account of deep-behavioral phenotyping strategies, namely data-driven methods in large task-fMRI datasets, to optimize functional brain-data collection and improve inference of effects-of-interest related to mental processes. Key to this approach is the employment of fast multi-functional paradigms rich in features that can be well parametrized and, consequently, facilitate the creation of psycho-physiological constructs to be modelled with imaging data. Particular emphasis will be given to music stimuli when studying high-order cognitive mechanisms, due to their ecological nature and their capacity to enable complex behavior compounded by discrete entities. I will also discuss how deep-behavioral phenotyping and individualized models applied to neuroimaging data can better account for the subject-specific organization of domain-general cognitive systems in the human brain. Finally, the accumulation of functional brain signatures brings the possibility to clarify relationships among tasks and create a univocal link between brain systems and mental functions through: (1) the development of ontologies proposing an organization of cognitive processes; and (2) brain-network taxonomies describing functional specialization.
To this end, tools to improve commensurability in cognitive science are necessary, such as public repositories, ontology-based platforms and automated meta-analysis tools. I will thus discuss some brain-atlasing resources currently under development, and their applicability in cognitive as well as clinical neuroscience.
DERIVATION OF MODIFIED BERNOULLI EQUATION WITH VISCOUS EFFECTS AND TERMINAL V...Wasswaderrick3
In this book, we use conservation of energy techniques on a fluid element to derive the Modified Bernoulli equation of flow with viscous or friction effects. We derive the general equation of flow/velocity and then from this we derive the Poiseuille flow equation, the transition flow equation and the turbulent flow equation. In the situations where there are no viscous effects, the equation reduces to the Bernoulli equation. From experimental results, we are able to include other terms in the Bernoulli equation. We also look at cases where pressure gradients exist. We use the Modified Bernoulli equation to derive equations of flow rate for pipes of different cross sectional areas connected together. We also extend our techniques of energy conservation to a sphere falling in a viscous medium under the effect of gravity. We demonstrate Stokes' equation of terminal velocity and the turbulent flow equation. We look at a way of calculating the time taken for a body to fall in a viscous medium. We also look at the general equation of terminal velocity.
Toxic effects of heavy metals : Lead and Arsenicsanjana502982
Heavy metals are naturally occurring metallic chemical elements that have relatively high density and are toxic at even low concentrations. All toxic metals are termed heavy metals irrespective of their atomic mass and density, e.g. arsenic, lead, mercury, cadmium, thallium, chromium, etc.
Observation of Io’s Resurfacing via Plume Deposition Using Ground-based Adapt...Sérgio Sacani
Since volcanic activity was first discovered on Io from Voyager images in 1979, changes on Io's surface have been monitored from both spacecraft and ground-based telescopes. Here, we present the highest spatial resolution images of Io ever obtained from a ground-based telescope. These images, acquired by the SHARK-VIS instrument on the Large Binocular Telescope, show evidence of a major resurfacing event on Io's trailing hemisphere. When compared to the most recent spacecraft images, the SHARK-VIS images show that a plume deposit from a powerful eruption at Pillan Patera has covered part of the long-lived Pele plume deposit. Although this type of resurfacing event may be common on Io, few have been detected due to the rarity of spacecraft visits and the previously low spatial resolution available from Earth-based telescopes. The SHARK-VIS instrument ushers in a new era of high resolution imaging of Io's surface using adaptive optics at visible wavelengths.
Salas, V. (2024) "John of St. Thomas (Poinsot) on the Science of Sacred Theol...Studia Poinsotiana
I Introduction
II Subalternation and Theology
III Theology and Dogmatic Declarations
IV The Mixed Principles of Theology
V Virtual Revelation: The Unity of Theology
VI Theology as a Natural Science
VII Theology’s Certitude
VIII Conclusion
Notes
Bibliography
All the contents are fully attributable to the author, Doctor Victor Salas. Should you wish to get this text republished, get in touch with the author or the editorial committee of the Studia Poinsotiana. Insofar as possible, we will be happy to broker your contact.
Evidence Synthesis for Sparse Evidence Base, Heterogeneous Studies, and Disconnected Networks
1. Copyright 2022. All Rights Reserved. Contact Presenter for Permission
Evidence Synthesis for Sparse Evidence Base, Heterogeneous Studies, and Disconnected Networks
Matthias Hunger, MSc, Dr. rer. biol. hum., Lead Epidemiologist, Global HEOR & Epidemiology, ICON
Nathan Green, PhD, Senior Research Fellow, Department of Statistical Science, University College London
Katrin Haeussler, MSc, PhD, Senior Health Economist, Global HEOR & Epidemiology, ICON
2. Agenda
- Introduction to evidence synthesis
- Heterogeneity
- Case study on choosing suitable priors
- Matching-adjusted indirect comparison (MAIC)
- Multilevel network meta-regression (ML-NMR)
4. What is evidence synthesis?
- All relevant randomized controlled trials (RCTs) of high quality in a medical area are identified through a systematic review of the literature
- These RCTs are all put together "in one melting pot"
- Statistical methodology is used to calculate a pooled treatment effect (e.g. a relative effect in terms of an odds ratio or relative risk, or an absolute effect in terms of a risk difference)
- This pooled treatment effect can help to draw conclusions on the comparative efficacy and safety of interventions of interest
Image source: Irish Medieval Food - Pottage, Lora O'Brien - Irish Author & Guide
5. Why do we conduct evidence synthesis?
- One RCT alone is not enough evidence to come to a final conclusion on the most effective and safest treatment
- RCTs often result in contradictory conclusions
- Quality assessment can help investigate which RCTs are of the best design and therefore most reliable
- Evidence synthesis of high-quality RCTs can help overcome contradictory conclusions of individual RCTs
Drawing by Maki Naro, source: https://slate.com/technology/2015/04/vaccines-and-autism-a-new-study-shows-no-connection.html
6. Which methods of evidence synthesis exist?
- Head-to-head studies are available comparing the experimental intervention (Tx A) to the comparator of interest (Tx B)
  - Simplest approach: pairwise meta-analysis
  - Weights are assigned to individual studies based on variance
- No head-to-head studies are available
  - More complex approaches involving indirect comparisons
  - Bucher method based on simple equations
  - Network meta-analysis (NMA) based on generalized linear models
  - Bayesian and frequentist approaches available
[Diagram: direct comparison of Tx A vs. Tx B (Studies A-E) and an indirect comparison of A vs. B via a common placebo (PBO) comparator (Studies G-L)]
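The two standard approaches named on this slide can be sketched in a few lines. The numbers below are hypothetical log odds ratios and standard errors, not data from the deck:

```python
# Sketch of fixed-effect pairwise meta-analysis (inverse-variance weighting)
# and the Bucher indirect comparison; all inputs are hypothetical.
import math

def pooled_fixed_effect(estimates, ses):
    """Fixed-effect pairwise meta-analysis: weights = 1/variance."""
    weights = [1.0 / se**2 for se in ses]
    pooled = sum(w * y for w, y in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

def bucher_itc(d_a_pbo, se_a_pbo, d_b_pbo, se_b_pbo):
    """Bucher indirect comparison of A vs. B via a common placebo anchor:
    d_AB = d_A,PBO - d_B,PBO, with the variances summing."""
    d_ab = d_a_pbo - d_b_pbo
    se_ab = math.sqrt(se_a_pbo**2 + se_b_pbo**2)
    return d_ab, se_ab

# Hypothetical log odds ratios vs. placebo from two sets of trials
d_a, se_a = pooled_fixed_effect([-0.50, -0.62, -0.41], [0.20, 0.25, 0.30])
d_b, se_b = pooled_fixed_effect([-0.30, -0.18], [0.22, 0.28])
d_ab, se_ab = bucher_itc(d_a, se_a, d_b, se_b)
print(f"Indirect log OR A vs B: {d_ab:.3f} (SE {se_ab:.3f})")
```

Note that the Bucher standard error is always larger than either direct standard error, which is why sparse networks relying on indirect evidence give wide intervals.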
7. Comparison of Bayesian and frequentist approaches
Frequentist
- Based on repeated experiments
- In a frequentist NMA, hypothesis testing takes place and results can be interpreted as showing a statistically significant difference or the absence thereof
- The 95% CI for many samples would contain the true population parameter 95% of the time; CIs cannot be interpreted in terms of probabilities
- No additional information can be included; the analysis is based solely on the observed data
- Ranking through P-score
- Can either be based on weighted regression models following Rücker or conducted by means of the Bucher method; the Bucher ITC is based on simple equations
Bayesian
- Formal combination of a prior distribution with the likelihood to obtain a posterior distribution
- Every parameter is defined as a random variable
- No hypothesis testing takes place in a Bayesian NMA; therefore, comparability of treatments can be shown directly and we do not speak of statistical significance; treatments are deemed comparable or one treatment is favorable over another treatment
- Interpretation of 95% credible intervals is not based on repeated experimentation and can therefore be stated as: with 95% probability, a certain value lies within the credible interval
- Additional information can be included in the priors to strengthen the evidence base
- Ranking through SUCRA
- Probabilities of each treatment being better than each comparator can be estimated
- Follows NICE DSU guidelines
Hackenberger, B.K.: Bayes or not Bayes, is this the question? Croat Med J 2019;60(1):50-52.
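The prior-times-likelihood combination described above can be illustrated with the conjugate normal-normal case, where the posterior precision is the sum of the prior and data precisions. The effect estimate and prior settings below are illustrative assumptions, not the deck's data:

```python
# Minimal sketch of a Bayesian update: normal prior x normal likelihood
# for a treatment effect on the log odds ratio scale (numbers hypothetical).
import math

def normal_posterior(prior_mean, prior_sd, est, est_se):
    """Combine a normal prior with a normal likelihood (conjugate case)."""
    w0 = 1.0 / prior_sd**2   # prior precision
    w1 = 1.0 / est_se**2     # data (likelihood) precision
    post_var = 1.0 / (w0 + w1)
    post_mean = post_var * (w0 * prior_mean + w1 * est)
    return post_mean, math.sqrt(post_var)

# Vague prior vs. an informative prior centered on "no effect"
mean_vague, sd_vague = normal_posterior(0.0, 10.0, -0.4, 0.25)
mean_info, sd_info = normal_posterior(0.0, 0.2, -0.4, 0.25)
lo, hi = mean_info - 1.96 * sd_info, mean_info + 1.96 * sd_info
print(f"Vague prior posterior:       {mean_vague:.3f} (SD {sd_vague:.3f})")
print(f"Informative prior posterior: {mean_info:.3f} (SD {sd_info:.3f})")
print(f"Approx. 95% credible interval (informative): [{lo:.3f}; {hi:.3f}]")
```

With the vague prior the posterior essentially reproduces the data; the informative prior pulls the estimate toward its mean and tightens the interval, which is exactly how priors "strengthen the evidence base" in the Bayesian column above.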
10. Similarity assumption
- The key assumption for indirect treatment comparisons is the similarity assumption
- Similarity means that one would expect the relative effect of A vs. PBO to remain unchanged if the study was conducted under the conditions of the B vs. PBO study (and vice versa)
- Similarity means that the pairs of trials analyzed are comparable regarding potential effect modifiers
12. Feasibility of NMA
Do the patients in all trials match the target population for decision/inference?
- Yes, all trials: combine in NMA
- No, only a subset: are there reasons to think that there are differences in treatment effect?
  - No: (1) restrict the NMA to relevant trials
  - Yes: (2) consider subgroup analysis/meta-regression, or (3) use methods addressing effect modification (MAIC, ML-NMR)
13. Networks we like…and do not like…
- Large network of evidence: the standard case; NICE DSU guidelines can be followed straightforwardly; informative priors could be investigated
- Sparse network of evidence: different model assumptions in Bayesian and frequentist approaches can result in different degrees of confidence in the results; in a Bayesian model it is crucial to elicit suitable priors, while a frequentist model uses no priors yet is not as flexible
- Disconnected network of evidence: a MAIC would be a possibility to compare to a disconnected single-arm study; methods addressing effect modification include MAIC and ML-NMR
[Diagrams: example networks of studies illustrating a large, a sparse, and a disconnected network of evidence]
15. Non-informative priors on between-study standard deviation τ – uniform priors
- As per NICE DSU, standard non-informative priors on the between-study standard deviation are usually uniform(0,5) or uniform(0,2)
- These ensure that the highest density of the SD is usually at low values around 0-0.5, with a long tail of the distribution
- These are considered to be non-informative and can therefore lead to unrealistically wide credible interval bounds in the case of a sparse evidence base
- A possible alternative would be to use so-called "weakly informative" priors instead
- Commonly, a variety of half-normal and gamma distributions are used in the literature
Ren S., Oakley J.E., Stevens J.W.: Incorporating Genuine Prior Information about Between-Study Heterogeneity in Random Effects Pairwise and Network Meta-analyses. Medical Decision Making 2018, 38(4):531-542.
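A quick simulation shows why a Uniform(0,5) prior on τ is effectively non-informative compared with a half-normal alternative; the HN(0, 0.5²) scale below is an illustrative choice, not a recommendation:

```python
# Sketch comparing prior mass over the between-study SD tau under a
# Uniform(0,5) prior and an illustrative weakly informative half-normal.
import random
import statistics

random.seed(1)
n = 100_000
uniform_draws = [random.uniform(0.0, 5.0) for _ in range(n)]
halfnormal_draws = [abs(random.gauss(0.0, 0.5)) for _ in range(n)]  # HN(0, 0.5^2)

def q95(xs):
    """Empirical 95th percentile."""
    return sorted(xs)[int(0.95 * len(xs))]

print(f"Uniform(0,5): median {statistics.median(uniform_draws):.2f}, "
      f"95th pct {q95(uniform_draws):.2f}")
print(f"HN(0,0.5^2): median {statistics.median(halfnormal_draws):.2f}, "
      f"95th pct {q95(halfnormal_draws):.2f}")
# The uniform prior places half its mass on tau > 2.5 (extreme heterogeneity),
# while the half-normal keeps tau mostly below ~1, matching the motivation above.
```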
16. Informative priors on between-study standard deviation τ
- Informative priors on the between-study heterogeneity parameter τ are assumed to follow t and log-normal distributions for continuous and binary outcomes, respectively
- If the number of trials is small (4 studies or fewer), the "default" practice of using non-informative priors for the between-trial standard deviation (τ ~ Uni(0,5)) is likely to result in posteriors which allow for unrealistically high levels of heterogeneity (i.e. resulting in extremely wide 95% credible intervals)
- The solution advised by NICE is to use informative priors, based on expert opinion or on meta-epidemiological data (NICE DSU TSD3, p. 16)
- Rhodes et al. present informative priors for respiratory diseases, obtained through predictive distributions for continuous outcomes; the authors found that heterogeneity was substantially lower in meta-analyses related to respiratory diseases and therefore show separate results
- Turner et al. present informative priors for binary outcomes
Rhodes et al.: Predictive distributions were developed for the extent of heterogeneity in meta-analyses of continuous outcome data. J Clin Epidemiol. 2015; 68(1):52-60.
Turner et al.: Predictive distributions for between-study heterogeneity and simple methods for their application in Bayesian meta-analysis. Statist. Med. 2015, 34:984-998.
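To see what such an informative prior implies for τ itself, one can sample from it. The sketch below uses the t(-5.18, 2.47², 5) prior on log(τ²) that appears in the case study later in the deck; the stdlib has no Student-t sampler, so one is built from a normal and a chi-squared draw:

```python
# Sketch: implied prior distribution of tau when log(tau^2) follows a
# t(-5.18, 2.47^2, 5) distribution (the prior quoted in the case study).
import math
import random
import statistics

random.seed(2)

def t_draw(loc, scale, df):
    """Student-t draw: normal divided by a scaled chi, shifted and scaled."""
    z = random.gauss(0.0, 1.0)
    chi2 = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))
    return loc + scale * z / math.sqrt(chi2 / df)

# tau = exp(log(tau^2) / 2)
taus = sorted(math.exp(0.5 * t_draw(-5.18, 2.47, 5)) for _ in range(50_000))
median = statistics.median(taus)
p95 = taus[int(0.95 * len(taus))]
print(f"Implied prior median tau: {median:.3f}, 95th percentile: {p95:.2f}")
# Mass concentrates at small tau (low heterogeneity) but with a heavy upper
# tail, so the data can still pull the posterior toward larger values.
```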
17. Weakly informative priors on between-study standard deviation τ – half-normal priors
"In practice, the half-normal distribution is quite commonly used; the reasons for its popularity are probably its simple and familiar form, its near-uniform behavior at the origin along with a reasonably quickly decaying upper tail, as well as considerations of numerical stability."
Röver C., Bender R., Dias S., Schmid C.H., Schmidli H., Sturtz S., Weber S., Friede T.: On weakly informative prior distributions for the heterogeneity parameter in Bayesian random-effects meta-analysis. Res Syn Meth. 2021;12:448-474.
18. Network of evidence on count outcome
[Diagram: a sparse network connecting Intervention X to Comparators A-I through Studies 1-7]
19. Results using different priors
- Non-informative uni(0,5) prior on τ: DIC -6.71; SD 0.39, 95% CrI [0.01; 2.25]
- Informative t(-5.18, 2.47², 5) prior on log(τ²): DIC -7.23; SD 0.073, 95% CrI [0.003; 0.523]
- Weakly informative HN(0, 0.15²) prior: DIC -5.88; SD 0.38, 95% CrI [0.04; 1.60]
- Weakly informative HN(0, 0.16²) prior: DIC -5.65; SD 0.13, 95% CrI [0.01; 0.35]
[Forest plots: IR Ratio (CI) of Intervention X vs. other comparators, annotated "Favors Intervention X" / "Favors comparator"]
20. Results on weakly informative HN(0, τ²) prior, threshold analysis
τ – Intervention X in favour of comparators:
- 0.11: Comparators G, E, D, C, B
- 0.12: Comparators G, D, B
- 0.13: Comparators G, D, B
- 0.14: Comparators G, B
- 0.15: Comparator G
- 0.16: none
21. Justification of prior selection
- A suitable prior on the between-study heterogeneity τ has to be chosen with care and the selection has to be justified
- One possibility is to focus on informative priors from the literature, such as Turner et al. or Rhodes et al.
- If a particular half-normal, gamma, exponential, ... prior is to be selected, this has to be justified by assessing relative and absolute model fit
  - Relative model fit in terms of the Deviance Information Criterion (DIC)
  - Absolute model fit to the data in terms of posterior predictive checks
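The relative-fit criterion mentioned above is assembled from MCMC output as DIC = Dbar + pD, with pD = Dbar - D(posterior mean). The normal likelihood and the stand-in posterior draws below are illustrative assumptions, not the deck's model:

```python
# Sketch of computing the Deviance Information Criterion from posterior draws.
import math

def deviance(theta, y, se):
    """-2 * log-likelihood of the observed effects under a normal model."""
    return sum(
        ((yi - theta) / si) ** 2 + math.log(2 * math.pi * si**2)
        for yi, si in zip(y, se)
    )

y, se = [-0.5, -0.3, -0.6], [0.2, 0.25, 0.3]         # hypothetical study effects
theta_samples = [-0.48, -0.41, -0.52, -0.45, -0.50]  # stand-in MCMC draws

dbar = sum(deviance(t, y, se) for t in theta_samples) / len(theta_samples)
theta_bar = sum(theta_samples) / len(theta_samples)
pd = dbar - deviance(theta_bar, y, se)               # effective number of parameters
dic = dbar + pd
print(f"Dbar {dbar:.3f}, pD {pd:.3f}, DIC {dic:.3f}")  # lower DIC = better relative fit
```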
22. Posterior predictive checks
- Assessing model fit to the data
  - How much do posterior inferences change when other prior distributions are used?
  - If the model fits, replicated data generated under the model should look similar to the observed data; the observed data should look plausible under the posterior predictive distribution
  - An observed discrepancy can be due to model misfit or chance
  - Simulated values are drawn from the posterior predictive distribution of replicated data and compared to samples of observed data
  - Any systematic differences between the simulations and the data indicate potential failings of the model
- Graphical posterior predictive checks
  - The data are displayed alongside simulated data from the fitted model, and systematic discrepancies between real and simulated data are searched for
  - All data can be displayed directly
  - Data summaries or parameter inferences can be displayed in the case of large datasets
  - Graphs of residuals or other measures of discrepancy between model and data can be shown
- Numerical posterior predictive checks
  - Specify a test quantity and an appropriate predictive distribution
  - The discrepancy between test quantities can be summarized by a p-value
  - If the p-value is in a reasonable range between 0.05 and 0.95, the model fit to the data is deemed acceptable
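A numerical posterior predictive check as described above might look as follows; the data, the stand-in posterior draws, and the choice of the maximum study effect as test quantity are all illustrative assumptions:

```python
# Sketch of a numerical posterior predictive check: compare a test quantity
# on replicated datasets against its observed value (numbers hypothetical).
import random

random.seed(3)
y_obs = [-0.5, -0.3, -0.6, -0.45]  # observed study effects
se = [0.2, 0.25, 0.3, 0.22]
theta_samples = [random.gauss(-0.46, 0.12) for _ in range(2000)]  # stand-in posterior draws

t_obs = max(y_obs)                 # test quantity on the observed data
count = 0
for theta in theta_samples:
    y_rep = [random.gauss(theta, s) for s in se]   # replicated dataset
    if max(y_rep) >= t_obs:
        count += 1

p_value = count / len(theta_samples)
print(f"Posterior predictive p-value for max(y): {p_value:.3f}")
# A p-value far outside [0.05, 0.95] flags a systematic model-data discrepancy.
```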
24. Matching-adjusted indirect comparisons (i)
- Matching-adjusted indirect comparisons (MAIC) can overcome limitations of classical ITCs in the following situations:
  - Important effect modifiers (such as disease severity) bias the indirect comparison of A vs. B (violation of the similarity assumption), and/or
  - Treatments of interest A and B cannot be connected through a common comparator
[Diagram: a disconnected network of studies in which treatments A and B share no common comparator]
25. Matching-adjusted indirect comparisons (ii)
- MAICs can adjust for the effect of treatment effect modifiers in an anchored indirect comparison
- MAICs can also remove some of the biases that unadjusted (naïve) direct comparisons of outcomes can have when a common comparator arm is missing (unanchored comparison)
- MAICs require individual patient data (IPD) from clinical trials for one treatment (typically the company's own treatment), but only published, aggregate data on baseline characteristics and outcomes for the comparator treatment
28. How does MAIC work? – Steps in a nutshell
1. Data collection: individual patient data for treatment A; published aggregate data for treatment B
2. Matching criteria: select patient and disease characteristics known as effect modifiers (and prognostic variables)
3. Matching: apply weighting to patients receiving treatment A to match the characteristics of patients receiving treatment B
4. Recalculating outcomes: compare weighted outcomes for treatment A to observed outcomes for treatment B
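The matching step can be sketched for a single matching variable using Signorovitch-style method-of-moments weights, w_i = exp(alpha * (x_i - x_target)), where alpha is chosen so the weighted IPD mean equals the published aggregate mean. The patient ages and target mean below are hypothetical:

```python
# Sketch of MAIC weight estimation for one matching variable (e.g. age).
import math

ipd_age = [45, 52, 60, 38, 55, 49, 63, 41]  # IPD trial (treatment A), hypothetical
target_mean_age = 56.0                       # published mean for the treatment B trial

centered = [x - target_mean_age for x in ipd_age]

# The balance condition sum_i exp(alpha*c_i)*c_i = 0 is the gradient of the
# convex objective Q(alpha) = sum_i exp(alpha*c_i); minimize by gradient descent.
alpha, lr = 0.0, 1e-4
for _ in range(20_000):
    grad = sum(c * math.exp(alpha * c) for c in centered)
    alpha -= lr * grad

weights = [math.exp(alpha * c) for c in centered]
weighted_mean = sum(w * x for w, x in zip(weights, ipd_age)) / sum(weights)
print(f"Weighted mean age: {weighted_mean:.2f} (target {target_mean_age})")
```

In practice this is done for several matching criteria simultaneously (alpha becomes a vector) and the weighted outcomes, not just the means, are then compared, but the weighting logic is the same.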
29. Matching criteria
29
– Choice of variables to be matched/weighted on should be carefully considered:
– Including too many variables will reduce precision (by reducing the effective sample size)
– Failure to include relevant variables will result in bias
– For anchored comparisons: all effect modifiers but no purely prognostic variables
– For unanchored comparisons: all effect modifiers and prognostic variables
– Evidence that a variable is an effect modifier/prognostic variable for the outcome of
interest should be based on quantitative evidence, expert opinion, or systematic
literature reviews
– Conduct sensitivity analyses using different sets of matching variables
30. Matching
– Matching is accomplished by re-weighting patients in the IPD trial by their odds of
having been enrolled in the comparator trial
– Approach is similar to propensity score
weighting and is performed for all the
selected matching criteria simultaneously
– For anchored comparisons, matching
is also performed for the placebo arms;
it is then possible to compare relative
treatment effects (vs. placebo) between
A and B
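The re-weighting step above can be sketched in a few lines. The sketch below uses the standard method-of-moments formulation (Signorovitch et al. 2010): weights take the form w_i = exp(α′x_i), and α is found by minimising a convex objective whose first-order condition is exactly the requirement that the weighted covariate means of the IPD trial match the published means of the comparator trial. The data are hypothetical (age and a binary sex indicator), chosen only for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def maic_weights(X_ipd, agg_means):
    """Estimate MAIC weights so that the weighted covariate means of the
    IPD trial equal the published means of the comparator trial.

    Weights are w_i = exp(alpha' x_i); alpha minimises
    Q(alpha) = sum_i exp(alpha' (x_i - x_bar_agg)),
    whose gradient being zero is the moment-matching condition.
    """
    X_c = X_ipd - agg_means          # centre IPD covariates on aggregate means

    def Q(alpha):
        return np.sum(np.exp(X_c @ alpha))

    res = minimize(Q, x0=np.zeros(X_c.shape[1]), method="BFGS")
    return np.exp(X_c @ res.x)       # unnormalised weights

# Hypothetical IPD: age and sex for 200 patients on treatment A,
# matched to published comparator means (mean age 50, 60% male)
rng = np.random.default_rng(1)
X = np.column_stack([rng.normal(45, 10, 200), rng.binomial(1, 0.4, 200)])
w = maic_weights(X, agg_means=np.array([50.0, 0.6]))

# After weighting, the IPD covariate means match the aggregate means
matched_means = np.average(X, axis=0, weights=w)
```

Because matching is performed on all selected criteria simultaneously (as on this slide), α is a vector and a single set of weights balances every covariate at once, analogous to propensity-score weighting.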
31. Recalculating Outcomes
– Weighted outcomes can be calculated for any statistics of the outcome of interest
– Ideally, statistical methods that take into account uncertainty around estimated weights
(such as Generalized Estimating Equations) should be used
– After matching, the effective sample size (ESS) can be calculated
– If the populations were balanced, each patient in the index trial would get a weight close to 1 and
the effective sample size would be similar to the original sample size
– Low effective sample size may occur when the populations differ substantially in one or more of
the characteristics matched
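The effective sample size mentioned above has a simple closed form, ESS = (Σw)² / Σw²: it equals n when all weights are equal and collapses when a few patients carry most of the weight. A minimal sketch with made-up weights:

```python
import numpy as np

def effective_sample_size(w):
    """ESS = (sum w)^2 / sum w^2 — equals n for equal weights,
    shrinks as the weights become more variable."""
    w = np.asarray(w, dtype=float)
    return w.sum() ** 2 / (w ** 2).sum()

# Balanced populations: weights all equal, ESS equals the original n
ess_balanced = effective_sample_size(np.ones(200))     # 200.0

# Poor overlap: 5 patients carry almost all the weight, ESS collapses
w_spiky = np.array([10.0] * 5 + [0.1] * 195)
ess_spiky = effective_sample_size(w_spiky)             # ~9.6

# Weighted outcome for treatment A (hypothetical binary responses)
rng = np.random.default_rng(0)
y = rng.binomial(1, 0.55, 200)
w2 = rng.uniform(0.5, 2.0, 200)
p_weighted = np.average(y, weights=w2)
```

A weighted response rate like `p_weighted` is then compared with the observed rate in the comparator trial; as the slide notes, methods that propagate the uncertainty in the estimated weights (e.g. GEE with robust standard errors) are preferred for inference.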
32. Recommendations by NICE DSU
A Technical Support Document (TSD) developed by the NICE Decision Support
Unit (DSU) was released in December 2016 and makes a number of key
recommendations1:
1. Only use unanchored comparisons when a connected network is not available
2. Anchored comparisons must demonstrate that there are effect modifiers and that there is
imbalance in the effect modifiers
3. All effect modifiers but no purely prognostic variables should be matched in an anchored
comparison
4. Unanchored comparisons must include assessments of error due to unaccounted for
covariates
5. Indirect comparisons should be carried out on the linear predictor scale
6. Must explicitly state the target population
1 Phillippo DM et al.: NICE DSU Technical Support Document 18: methods for population-adjusted indirect comparisons in submission to NICE. https://research-information.bris.ac.uk/en/publications/nice-dsu-technical-support-document-18-methods-for-population-adj
33. MAIC Case Study: Secukinumab in AS
– Ankylosing spondylitis (AS) is a form of arthritis causing inflammation of the
spinal joints that can lead to severe, chronic pain and discomfort1
– Secukinumab, a new antibody against interleukin 17A, has shown efficacy for
up to 104 weeks in patients with active ankylosing spondylitis2
OBJECTIVE
– To compare the efficacy of secukinumab with that of adalimumab
using matching-adjusted indirect comparison (MAIC) in patients with active AS
in terms of ASAS203 and ASAS40
1 Spondylitis Association of America. https://www.spondylitis.org/Ankylosing-Spondylitis
2 Baeten D, et al., 2015 N Engl J Med 373:2534–48
3 Mapi Trust. https://eprovide.mapi-trust.org/instruments/assessment-in-ankylosing-spondylitis-response-criteria
34. Case study: Matching approach
Figure from Maksymowych et al.
Presented at AMCP 2017
35. Case study: Baseline characteristics before/after matching
Figure from Maksymowych et al.
Presented at AMCP 2017
36. Case study: Efficacy outcomes after matching (i)
– Placebo-adjusted (anchored)
comparisons were feasible at week 8
and week 12
– There is no evidence that ASAS 20 or
ASAS 40 responses differed significantly
between secukinumab and adalimumab
at week 12
Figure from Maksymowych et al. Presented at AMCP 2017
37. Case study: Efficacy outcomes after matching (ii)
– As the unbiased placebo phase ended
at week 12, comparisons at week 16
and week 24 were unanchored
– There was weak evidence (p=0.047)
that ASAS20 responses were higher
with secukinumab than adalimumab at
week 16
– At week 24, there was up to moderate
evidence (p=0.017/0.012) that ASAS20
and ASAS40 responses were higher
with secukinumab
39. Bias
– Constancy of relative effects
$d_{AB}^{(AB)} = d_{AB}^{(AC)}$ (the relative effect of B vs. A is assumed to be the same in the AB-study population as in the AC-study population)
– Biased if there are
differences in effect modifiers
between studies
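The bias mechanism above can be made concrete with a small sketch. Assume (hypothetically) that the B-vs-A relative effect on the linear predictor scale depends on a binary effect modifier; then the population-average effect differs between two study populations whose effect-modifier prevalences differ, and constancy is violated:

```python
# Hypothetical interaction model: relative effect d = d0 + d1 * EM,
# where EM is a binary effect modifier and d1 its interaction term
d0, d1 = 0.3, 0.4

def population_effect(p_em):
    """Average B-vs-A relative effect (linear predictor scale) in a
    population where a fraction p_em of patients has the effect modifier."""
    return d0 + d1 * p_em

d_AB = population_effect(0.2)   # AB-study population: 20% with the modifier
d_AC = population_effect(0.7)   # AC-study population: 70% with the modifier
# d_AB = 0.38 vs. d_AC = 0.58: constancy of relative effects does not hold
```

Only the difference in the effect-modifier distribution drives the discrepancy; purely prognostic variables would shift absolute outcomes in both arms but leave the relative effect unchanged.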
40. External validity: target population for HTA decision-making
– MAIC and STC are restricted to contrasting treatments in the study B sample
– This sample may not be representative of the target population of patients eligible for study B
– It may also differ from the target population of routine clinical practice in the jurisdiction
– A valid estimate of the treatment effect in one context is not necessarily valid in another
– MAIC and STC were designed for pairwise comparisons only
41. What we would like
– Synthesis of treatment networks of any size
– Avoidance of aggregation bias
– Estimates produced in the target population
42. Multilevel network meta-regression (ML-NMR)
– Population-adjustment methods aim to relax the constancy assumption, using IPD to adjust for differences in effect modifiers between studies
– Ideally, we would have IPD for every study, but more typically it is available for only a subset
– ML-NMR based on Jackson et al. (2006,
2008)
– ML-NMR synthesizes mixtures of IPD
and AgD and performs population
adjustment in networks of any size
1. Define individual-level regression model
– IPD network meta-regression
2. Average (integrate) over the aggregate
study population to form the aggregate-
level model
– Use efficient and general numerical
integration
Phillippo et al (2020) Multilevel Network Meta-Regression for population-adjusted treatment comparisons J R Stat Soc: A 183(3)
47. Equations
– The individual-level model is straightforward in many cases, e.g. sums of Normal or Poisson outcomes
– Aggregate-level integration is easier in some cases, e.g. with an identity link or discrete covariates, but in general numerical integration is used
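The numerical integration step can be sketched as follows. Assume a logistic individual-level model (hypothetical coefficients) and an AgD study that reports only a Normal age distribution; the aggregate-level expected outcome is the average of the individual-level probabilities over that distribution, here approximated with quasi-Monte Carlo (Sobol) draws. Note that this is not the same as plugging the mean covariate into the model — the gap between the two is exactly the aggregation bias ML-NMR avoids.

```python
import numpy as np
from scipy.stats import norm, qmc
from scipy.special import expit

# Hypothetical individual-level model: logit P(y=1) = mu + beta*age + gamma*trt
mu, beta, gamma = -1.0, 0.04, 0.5

# Covariate distribution reported by the AgD study: age ~ Normal(52, 8^2)
sampler = qmc.Sobol(d=1, scramble=True, seed=42)
u = sampler.random(2 ** 12)                 # quasi-random uniforms in (0,1)
age = norm.ppf(u[:, 0], loc=52, scale=8)    # transform to age draws

# Aggregate-level model: integrate the individual-level probabilities
# over the covariate distribution of the AgD study population
p_agg = np.mean(expit(mu + beta * age + gamma))

# Naive plug-in at the mean covariate value — biased under a non-identity link
p_naive = expit(mu + beta * 52 + gamma)
```

With an identity link the two quantities would coincide and integration would be trivial, which is why the slide singles out the identity link and discrete covariates as the easy cases.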
48. Open questions
– Application to disconnected networks (unanchored scenario) unclear
– Extension to survival analysis setting required
– Current implementation targets a conditional treatment effect as opposed to a
marginal
– As per STC, “standardization” step required for population-level
reimbursement decisions
49. Software
– R software package (multinma)
– Active maintenance and development
– Freely available and easy to use
53. MAIC vs. ML-NMR – current situation
– MAIC can only remove bias when the aggregate-data population is entirely contained within the population of the IPD study; ML-NMR's method of integrating over the covariate distribution is more flexible and has conceptual advantages
– MAIC is only applicable to two-study scenarios; ML-NMR generalizes to larger treatment networks
– MAIC is limited to the target population of the aggregate-data trial; ML-NMR comparisons may be provided in any target population given sufficient information on the covariate distribution
– MAIC is well established and well known to decision-makers; decision-makers have limited or no experience with ML-NMR as a new method
– As a weighting method, MAIC is easily applicable to any outcome data, including time-to-event; further research is required to extend ML-NMR to time-to-event data
– MAIC is applicable to unanchored comparisons; ML-NMR is not (yet) able to incorporate data from single-arm studies, but an extension is conceptually possible
54. Conclusion
– Handling sparse data, heterogeneity and disconnected studies in evidence
synthesis poses challenges to researchers and requires advanced
methodology
– If the network is sparse, informative priors on between-study heterogeneity
parameters represent alternatives to the use of non-informative priors in the
standard case
– In pairwise comparisons, MAICs use IPD from one trial to adjust for imbalances in effect modifiers (in the anchored case) or in both effect modifiers and prognostic variables (in the unanchored case)
– ML-NMR is a novel method and a direct extension of the standard network
meta-analysis framework which uses IPD to adjust for differences in effect
modifiers between studies