Jason Hassenstab discusses selecting cognitive tests for a clinical trial involving participants with autosomal dominant Alzheimer's disease (AD). He recommends using measures from the Dominantly Inherited Alzheimer Network (DIAN) that assess six cognitive domains. The Clinical Dementia Rating (CDR) Sum of Boxes score provided the best sensitivity to change. Additional outcome variables could include neuropsychological tests of episodic memory and composite cognitive measures. Other metrics, such as reaction time distributional analysis, may also provide useful data. The goals are to develop tests that capture the pathological course of AD and that are sensitive to treatment effects rather than merely to existing impairment. Tests should demonstrate construct validity and links to neural circuits, cognitive mechanisms, and functional outcomes in AD.
The early diagnosis of neurodegenerative diseases is crucial, as it could lead to earlier treatment and a better chance of halting disease progression. Machine learning algorithms offer an opportunity to diagnose these diseases earlier through automated diagnosis, but applying them in the medical domain is not straightforward. From dataset size to interpretability, we will see why the elegant, trendy, and complex solution that first comes to mind might not be the best one.
This study examined the relationship between working memory and ADHD symptoms in children aged 6-12 years. Working memory was assessed using subscales of the Digit Span and Arithmetic subtests from the WISC-IV. ADHD symptoms were measured using the Conners 3 questionnaire completed by parents. Results showed that the Arithmetic and Digit Span Backward subscales significantly predicted inattentive and hyperactive/impulsive ADHD symptoms. Individual working memory subscales were better predictors of dimensional ADHD symptoms than the overall Working Memory Index score. This suggests differential relationships between specific working memory tasks and ADHD symptom levels.
The document discusses using regression for association testing and prediction. It provides an example analyzing factors associated with memory scores in HIV patients. A linear regression found that larger household size was associated with higher memory scores when controlling for age and clinic. This provides evidence that larger household size may protect against memory decline in HIV patients. The document also discusses using regression for prediction, comparing models, and measures of predictive accuracy such as AUC.
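The adjusted-association analysis described above can be sketched with ordinary least squares on synthetic data. Everything below is illustrative: the variable names, coefficients, and data are invented for the sketch, not taken from the HIV study itself, and the clinic covariate is omitted for brevity.

```python
import random

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved with Gaussian elimination (partial pivoting)."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(k)] for r in range(k)]
    b = [sum(X[i][r] * y[i] for i in range(n)) for r in range(k)]
    for col in range(k):                       # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k                           # back substitution
    for r in reversed(range(k)):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Hypothetical data: memory score rises with household size and falls with age.
random.seed(0)
rows, ys = [], []
for _ in range(500):
    household = random.randint(1, 8)
    age = random.uniform(20, 70)
    memory = 50 + 2.0 * household - 0.3 * age + random.gauss(0, 5)
    rows.append([1.0, household, age])         # intercept, household size, age
    ys.append(memory)

intercept, b_household, b_age = ols(rows, ys)  # b_household recovered near +2.0
```

A positive coefficient on household size after adjusting for age is the pattern the summary describes; a real analysis would also include the clinic variable (e.g. as dummy indicators) and report standard errors.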
Innovative Strategies For Successful Trial Design - Webinar Slides | nQuery
Full webinar available here: https://www.statsols.com/webinar/innovative-strategies-for-successful-trial-design
[Webinar] Innovative Strategies For Successful Trial Design - In this free webinar, you will learn about:
- The challenges facing your trials
- How to calculate the correct sample size
- Worked examples including Mixed/Hierarchical Models
- Posterior Error
- Adaptive Designs For Survival
www.statsols.com
Histogram-weighted cortical thickness networks for the detection of Alzheimer's disease | Pradeep Reddy Raamana
Presentation delivered by Pradeep Reddy Raamana at 2016 international workshop on Pattern Recognition in Neuroimaging on the topic of histogram-weighted cortical thickness networks for the detection of Alzheimer's disease.
neuropredict: a proposal and a tool towards standardized and easy assessment ... | Pradeep Reddy Raamana
Properly applying machine learning to evaluate the predictive accuracy of biomarkers is challenging and error-prone for those without expertise in machine learning or programming. We offer an easy-to-use tool that implements best practices and produces a comprehensive yet clinically relevant report when comparing several biomarkers or different methods/studies. The tool, neuropredict, is open source and applicable to any domain whose biomarkers can be represented numerically.
PROBLEM - Uncertainty about treatment effects impedes accurate clinical trial design and predictions of success.
SOLUTION - A Bayesian approach using East® software allows incorporation of prior information on treatment effects through probability distributions, providing more formal and realistic assessments.
BENEFITS - Bayesian calculations like probability of success and predictive power account for uncertainty, leading to more accurate power and decision-making at design and interim stages compared to point estimates. This improves trial success rates.
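The "probability of success" calculation described here is often called assurance: power averaged over a prior distribution on the treatment effect. The sketch below is a generic Monte Carlo version for a two-sample z-test, not East's actual implementation; the sample size and prior parameters are invented for illustration.

```python
import random
from math import sqrt
from statistics import NormalDist

_nd = NormalDist()
Z_ALPHA = _nd.inv_cdf(0.975)  # two-sided alpha = 0.05

def power_two_sample(d, n_per_group):
    """Power of a two-sample z-test for a standardized effect size d."""
    return 1 - _nd.cdf(Z_ALPHA - d * sqrt(n_per_group / 2))

def assurance(n_per_group, prior_mean, prior_sd, draws=100_000, seed=1):
    """Probability of success: average the power over a normal prior on the effect."""
    rng = random.Random(seed)
    return sum(power_two_sample(rng.gauss(prior_mean, prior_sd), n_per_group)
               for _ in range(draws)) / draws

fixed_power = power_two_sample(0.5, 63)                      # ~0.80 at the point estimate
prob_success = assurance(63, prior_mean=0.5, prior_sd=0.2)   # lower, near 0.71
```

The assurance sits below the point-estimate power because uncertainty about the effect pulls in scenarios where the trial fails; this is exactly the more realistic assessment the slide advertises.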
Why are most clinical research findings not spectacular? | scanFOAM
A presentation by Morten Hylander Møller at the 2017 meeting of the Scandinavian Society of Anaesthesiology and Intensive Care Medicine.
All available content from SSAI2017: https://scanfoam.org/ssai2017/
Delivered in collaboration between scanFOAM, SSAI & SFAI.
2014-10-22 EUGM | WEI | Moving Beyond the Comfort Zone in Practicing Translat... | Cytel USA
1. The document discusses moving beyond conventional practices in translational statistics to obtain more robust and clinically meaningful results from clinical studies.
2. Several methodology issues are discussed, including how to define primary endpoints when there are multiple outcomes, how to handle dropouts and competing risks, and how to quantify treatment contrasts in a model-free way.
3. Alternative approaches are proposed for various types of studies, such as using restricted mean survival times instead of hazard ratios for survival analyses and performing meta-analyses for evaluating safety issues using large amounts of data.
Machine Learning for Preclinical Research | Paul Agapow
This document summarizes a presentation on machine learning for preclinical research. It notes that biomedical data sets are often small and discusses the challenges of applying deep learning and other machine learning techniques with limited data. It proposes combining multiple smaller datasets using shared standards to create larger datasets for analysis. The document also notes issues with noise and bias in biomedical data and proposes careful curation and appropriate analysis methods. In conclusion, it advocates carefully curated combined datasets, integration of different data types and sources, and validated application of machine learning to support preclinical research.
Workshop on "Building Successful Pipelines for Predictive Analytics in Healthcare" delivered by Danielle Belgrave, PhD, Researcher at Microsoft Research, Cambridge, UK.
Introduction to evidence based practice slp6030 | sahughes
This document discusses evidence-based practice in speech-language pathology. It defines evidence-based practice as integrating clinical expertise, patient values, and the best research evidence. Lower levels of research evidence are still useful if they are the best available. Treatment efficacy focuses on controlled studies while effectiveness looks at outcomes under typical clinical conditions. Clinicians should have an open and honest approach when considering different treatment options and be guided by principles of beneficence, autonomy, nonmaleficence, and justice. Forming answerable clinical questions is important to evidence-based practice.
What is the future of personal brain health? | SharpBrains
Accelerating innovation is poised to enable systematic brain health self-monitoring and self-care, which in turn can transform what it means to live healthy and fulfilling lives. What concrete steps can individuals take to manage and enhance brain health and heal illness throughout the various stages of life?
- Chair: Alvaro Fernandez, CEO of SharpBrains, YGL Class of 2012
- Barbara Arrowsmith Young, author of The Woman Who Changed Her Brain
- Alexandra Morehouse, VP Brand Management at Kaiser Permanente
This session took place at the 2013 SharpBrains Virtual Summit: http://sharpbrains.com/summit-2013/agenda/
1. The memorandum summarizes a review of medical records for a client who suffered a closed head injury and vertebral fracture in a 2011 motor vehicle accident.
2. The client was admitted to WakeMed hospital and diagnosed with a closed head injury with severe concussive symptoms and a C6 spinous process fracture, which did not require surgery.
3. Neuropsychological testing a few months post-accident found no significant cognitive deficits, though the client reports ongoing difficulties with mathematical calculations compared to her pre-injury abilities.
Webinar slides: Sample size for survival analysis - a guide to planning succ... | nQuery
Determining the appropriate number of events needed for survival analysis is a complex task as study planners try to predict what sample size will be needed after accounting for the complications of unequal follow-up, drop-out and treatment crossover.
The statistical, logistical and ethical considerations all complicate life for biostatisticians as issues to balance in planning a survival analysis. However, this complexity has created a need for new analyses and procedures to help the planning process for survival analysis trials.
The wider move from fixed to flexible designs has opened up opportunities for advanced methods such as adaptive design and Bayesian analysis to help deal with the unique complications of planning for survival data but these methods have their own complications that need to be explored too.
Cell el company presentation dec17 for website 17 jan18 | Eliyahu Schuman
Cell-El Ltd is developing a diagnostic kit for autism spectrum disorder (ASD) that uses proteomic and genetic biomarkers to enable early, objective diagnosis. The presentation outlines Cell-El's progress in collecting over 100 samples from children with ASD and controls to identify statistically significant biomarkers. If successful, the kit could allow for easier, earlier diagnosis of ASD compared to current behavioral methods, with the goal of facilitating early intervention. The team has published on immunological aspects of ASD and has an international network of experts collaborating on further research.
A Proof-of-Concept Visualization to Increase Comprehension of Personal Medica... | Robin De Croon
In this paper, we investigate how information visualization techniques can be leveraged to increase patient comprehension of personal medication schemes, making it easier for patients to explore, explain, and understand drug information. Using computer vision techniques, our solution recognizes medication boxes, or so-called pharmaceutical packages, laid on an ordinary table. A projector visualizes drug information such as interactions, adverse drug reactions, intolerances, and the dosage regimen around the corresponding boxes. Five prototypes were designed and evaluated following a user-centered, rapid-prototyping methodology. Test participants in our study included both general practitioners (GPs) and patients. Results are promising and clearly indicate that information visualization techniques are an effective means to explore and understand drug information. Although this system was originally envisaged as a means to improve 'therapy dialogue' between GPs and their patients during consultations, our results show that both GPs and patients think it would be highly beneficial if patients were able to use the system at home.
Dose response and efficacy of spinal manipulation for care of chronic low back... | Younis I Munshi
This randomized controlled trial studied the dose-response relationship between spinal manipulation therapy (SMT) sessions and chronic low back pain outcomes. 400 participants with chronic low back pain were randomized to receive 0, 6, 12, or 18 SMT sessions over 6 weeks from a chiropractor, with additional non-SMT light massage sessions to control for provider attention. The primary outcomes of pain and disability were evaluated at 12 and 24 weeks. Results showed modest linear dose-response effects, with 12 SMT sessions producing the greatest reduction in pain and disability at 12 weeks and 18 sessions producing the greatest effects at 52 weeks. Overall, SMT produced clinically meaningful improvements in pain and disability that were sustained to 52 weeks, with 12 visits appearing
Evidence-based Patient Assignments: How Using Automated and Intelligent Softw... | Gene Pinder
Hospitals are under increasing pressure to improve care, lower costs and avoid nurse burnout and turnover. One overlooked area is the patient assignment process, which could benefit from intelligent software. This slide presentation lays out the case for it.
1. The study compared the sensitivity to change of the standard 42-item Obsessive Compulsive Inventory (OCI), the revised 18-item version (OCI-R), and a shorter version focusing on the highest subscale (OCI-R Main) in two cohorts of patients with different OCD severity who received cognitive behavioral therapy.
2. The results showed that the OCI-R is a valid self-report measure for assessing change and is less burdensome for patients than the full OCI. However, questions remain about whether the OCI or OCI-R are sensitive enough to detect changes for service evaluation purposes.
3. All versions of the OCI were less sensitive to changes
8. Calculate sample size for clinical trials (continuous outcome) | Azmi Mohd Tamil
This document discusses calculating sample sizes for studies involving continuous outcome data from two independent groups. It provides the formula for calculating the standardized difference between groups given the clinically relevant difference and population standard deviation. It then shows how to use power, alpha level, and standardized difference to determine sample size using tables or software. Examples are provided to demonstrate calculating the combined standard deviation when two standard deviations are provided, and what to do if prior information is not available to determine sample size.
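As a rough sketch of the calculation described, the usual normal-approximation formula for two independent groups is n per group = 2(z_{1-a/2} + z_{1-b})^2 / d^2, where d is the standardized difference. The numbers below are illustrative, not taken from the document's own worked examples.

```python
from math import ceil, sqrt
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Sample size per group for comparing two means (normal approximation)."""
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)   # two-sided significance level
    z_b = nd.inv_cdf(power)           # desired power
    d = delta / sigma                 # standardized difference
    return ceil(2 * (z_a + z_b) ** 2 / d ** 2)

def pooled_sd(sd1, sd2):
    """Combined SD when each group supplies its own estimate (equal n assumed)."""
    return sqrt((sd1 ** 2 + sd2 ** 2) / 2)

n = n_per_group(delta=5, sigma=10)    # standardized difference d = 0.5 -> 63 per group
```

For a standardized difference of 0.5 at 80% power and two-sided alpha 0.05, this gives 63 per group; published tables sometimes quote 64 because they apply a small-sample correction on top of the normal approximation.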
D6 transforming oncology development with adaptive studies - 2011-04 | therealreverendbayes
This document discusses adaptive clinical trial designs, which allow modifications to trials based on interim data analysis. It provides an example of using a promising zone design for a phase 3 oncology trial testing a new treatment for metastatic non-small cell lung cancer. The design calls for an interim analysis when 50% of required events are reached, with the option to increase the sample size if results fall within a promising zone, defined as a conditional power between 30-90%, indicating the treatment may be beneficial. Simulations show this adaptive approach maintains high power even if the treatment effect is smaller than initially estimated.
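The promising-zone rule turns on conditional power at the interim look. Below is a minimal sketch using the standard B-value formulation with the drift estimated from the current trend; the specific interim numbers are invented, not taken from the trial described.

```python
from math import sqrt
from statistics import NormalDist

def conditional_power(z_interim, info_frac, alpha=0.025):
    """Conditional power under the current trend (B-value formulation)."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha)
    b = z_interim * sqrt(info_frac)        # B-value at the interim analysis
    drift = b / info_frac                  # drift estimated from the observed trend
    return 1 - nd.cdf((z_alpha - b - drift * (1 - info_frac)) / sqrt(1 - info_frac))

cp = conditional_power(z_interim=1.96, info_frac=0.5)   # interim at 50% of events
in_promising_zone = 0.30 <= cp <= 0.90                  # eligible for a sample-size increase
```

A result inside the 30-90% band signals that the treatment may work but the trial is underpowered for the observed effect, which is exactly when increasing the sample size pays off.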
Bob Olsson - Alzforum Live Webinar April 29, 2016 | Alzforum
This document summarizes information from the AlzBiomarker Database on biomarkers for Alzheimer's disease (AD). It describes the large variation seen in biomarker levels between labs, batches, and assays. It then lists various biomarkers related to neurodegeneration, amyloid-beta metabolism, tau tangles, blood-brain barrier function, and glial activation. The inclusion and exclusion criteria for studies in the database are provided. The database contains over 150 comparisons of biomarkers like tau, phospho-tau, and amyloid-beta 42 in the cerebrospinal fluid of AD patients versus controls and mild cognitive impairment patients who progressed to AD versus those who remained stable.
Live discussion intro end slides takaomi higher resolution | Alzforum
The document discusses new APP knock-in mouse models that were presented at an Alzforum event as potentially better models of Alzheimer's disease than previous overexpression models. It notes that the new APP knock-in mice and an updated research models database on the Alzforum website now include interactive diagrams allowing visual comparisons of core phenotypes between different mouse models.
This document summarizes research on mouse models of Alzheimer's disease conducted by the RIKEN Brain Science Institute Proteolytic Neuroscience Laboratory in Japan. It discusses several generations of mouse models that were developed, including those that overexpress mutant amyloid precursor protein (APP) alone or crossed with other transgenic mice, as well as knock-in models incorporating specific APP mutations. The models have provided insights into amyloid-beta deposition and inflammation with aging. Ongoing work aims to further elucidate mechanisms of pathology and evaluate modulating factors like proteases, vaccines, and genes.
Dr. Saido's group generated three lines of mice that do not overproduce APP but overexpress Aβ with distinct primary sequences. These mouse lines can help advance understanding of the temporal and spatial patterns of Aβ aggregation in the living brain, including diffuse versus neuritic plaques. They can also provide insight into the role of different Aβ species on cognition, though initial water maze tests with these lines found no cognitive deficits.
Dominic Walsh - A Critical Appraisal of the Pathogenic Protein Spread Hypothesis - Alzforum
Presentation made April 8, 2016 at the live webinar hosted by Alzforum - http://www.alzforum.org/webinars/webinar-pathogenic-protein-spread-lets-think-again
Patrik Brundin - Are Synucleinopathies Prion Diseases? - Alzforum
Presentation made April 8, 2016 at the live webinar hosted by Alzforum - http://www.alzforum.org/webinars/webinar-pathogenic-protein-spread-lets-think-again
Virginia Lee - Cell-to-Cell Spread of Pathological Tau - Alzforum
Presentation made April 8, 2016 at the live webinar hosted by Alzforum - http://www.alzforum.org/webinars/webinar-pathogenic-protein-spread-lets-think-again
This document outlines how the National Institute for Health and Care Excellence (NICE) uses cost-effectiveness analysis to inform reimbursement decisions in the UK. It discusses NICE's process and how it generally accepts interventions with an incremental cost-effectiveness ratio of less than £20,000-30,000 per quality-adjusted life year (QALY). The document emphasizes the important role of the EQ-5D questionnaire in NICE's decisions by allowing comparison of health outcomes. It addresses issues like collecting EQ-5D data, mapping from other measures, and potential limitations of EQ-5D for certain conditions.
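NICE's threshold is applied to the incremental cost-effectiveness ratio (ICER): the extra cost of the new intervention divided by the extra QALYs it delivers, compared against the £20,000-30,000 per QALY band. A minimal sketch of the arithmetic (the helper names and the use of the £30,000 upper bound as a single cutoff are illustrative; NICE's actual appraisals weigh far more than this ratio):

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost (GBP) per extra
    quality-adjusted life year gained versus the comparator."""
    return delta_cost / delta_qaly

def nice_acceptable(delta_cost, delta_qaly, threshold=30_000):
    """Crude screen against the upper end of NICE's willingness-to-pay
    band (roughly GBP 20,000-30,000 per QALY)."""
    return icer(delta_cost, delta_qaly) <= threshold
```

For instance, an intervention costing £10,000 more and yielding 0.5 extra QALYs has an ICER of £20,000 per QALY and would typically fall inside the band.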
The document discusses how state veterans' homes can use clinical informatics and predictive modeling software to improve resident care, quality management, and regulatory compliance. It highlights how the software (called EQUIP) analyzes MDS data to identify at-risk residents, target interventions, evaluate outcomes, and benchmark performance against appropriate peer facilities, such as other state veterans' homes. The software is presented as helping facilities improve quality of care while reducing costs through preventative, evidence-based approaches.
An overview of clinical healthcare data analytics from the perspective of an interventional cardiology registry. This was initially presented as part of a workshop at the University of Illinois College of Computer Science on April 20, 2017.
This document discusses big data in healthcare and physical therapy. It provides an overview of ATI's use of big data through its large patient outcomes registry, which includes over 800 variables and has been accepted into federal registries. ATI leverages data on patient demographics, referrals, outcomes, satisfaction surveys, and costs to enhance care and outcomes. The challenges of evidence-based medicine in an era of big data are also examined, highlighting the need to reconcile evidence-based and precision approaches through standardized sharing of data.
1) The role of health care data analysts is evolving as the volume of available data grows exponentially. With zettabytes of data being generated, analysts must make sense of both structured and unstructured information.
2) Data analytics can provide insights to improve patient outcomes, lower costs, and enhance the health care experience. Examples show how visualizing data helps health systems better understand utilization and identify at-risk patients.
3) As incentives shift from fee-for-service to value-based models, health systems must transform to focus on population health. Advanced analytics and predictive modeling will be crucial to achieving the goals of better care, lower costs, and improved health.
Choosing an Analytics Solution in Healthcare - Dale Sanders
This document provides guidance on evaluating and choosing an analytics solution for healthcare. It discusses general criteria for assessment, including completeness of vision, ability to execute, culture and values alignment, technology adaptability, total cost of ownership, and company viability. It also frames the analytic environment and needs in healthcare. Key factors are the evolving data ecosystem, analytic motives shifting from billing to quality and prevention, and lessons from EMR adoption. The best solutions will provide a closed-loop analytic experience with integrated knowledge systems, deployment processes, and analytic capabilities.
Webinar slides: how to reduce sample size ethically and responsibly - nQuery
[Webinar] How to reduce sample size...ethically and responsibly | In this free webinar, you will learn various design strategies to help reduce the sample size of your study in an ethical and responsible manner. Practical examples will be used throughout.
Microsoft: A Waking Giant In Healthcare Analytics and Big Data - Health Catalyst
In 2005, Northwestern Memorial Healthcare embarked upon a strategic Enterprise Data Warehousing (EDW) initiative with the Microsoft technology platform as the foundation. Dale Sanders was CIO at Northwestern and led the development of Northwestern’s Microsoft-based EDW. At that time, Microsoft as an EDW platform was not en vogue and there were many who doubted the success of the Northwestern project. While other organizations were spending millions of dollars and years developing EDW’s and analytics on other platforms, Northwestern achieved great and rapid value at a fraction of the cost of the more typical technology platforms. Now, there are more healthcare data warehouses built around Microsoft products than any other vendor. The risky bet on Microsoft in 2005 paid off.
Ten years ago, critics didn’t believe that Microsoft could scale in the second generation of relational data warehouses, but they did. More recently, many of these same pundits have criticized Microsoft for missing the technology wave du jour in cloud offerings, mobile technology, and big data. But, once again, Microsoft has been quietly reengineering its culture and products, and as a result, they now offer the best value and most visionary platform for cloud services, big data, and analytics in healthcare.
In this context, Dale will talk about:
His up-and-down journey with Microsoft as an Air Force and healthcare CIO, and why he is now more bullish on Microsoft than ever before
A quick review of the Healthcare Analytics Adoption Model and Closed Loop Analytics in healthcare, and how Microsoft products relate to both
The rise of highly specialized, cloud-based analytic services and their value to healthcare organizations’ analytics strategies
Microsoft’s transformation from a closed-system, desktop PC company to an open-system consumer and business infrastructure company
The current transition period of enterprise data warehouses between the decline of relational databases and the rise of non-relational databases, and the new Microsoft products, notably Azure and the Analytic Platform System (APS), that bridge the transition of skills and technology while still integrating with core products like Office, Active Directory, and System Center
Microsoft’s strategy with its PowerX product line, and geospatial analysis and machine learning visualization tools
Presented at Artificial Intelligence and Machine Learning for Advanced Drug Discovery & Development 2019 on 28th May 2019 by Dr Ed Griffen of MedChemica Ltd
Android Based Questionnaires Application for Heart Disease Prediction System - ijtsrd
Classification techniques in data mining are widely used for prediction and data exploration. This Heart Disease Prediction System (HDPS) uses Naive Bayesian classification, comparing simple probability estimates with Jelinek-Mercer (JM) smoothing. It is implemented as an Android application: the user answers a questionnaire and can then view the result in several ways, indicating whether heart disease is present along with a predicted level of No, Low, Average, High, or Very High. The system also provides patients with suggestions such as doctor details and medications. The work shows that enhancing Naive Bayes with Jelinek-Mercer smoothing is effective at reducing noise when predicting heart disease, and classifier accuracy is evaluated using precision and recall. Nan Yu Hlaing | Phyu Pyar Moe, "Android Based Questionnaires Application for Heart Disease Prediction System", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3, Issue-5, August 2019. URL: https://www.ijtsrd.com/papers/ijtsrd26750.pdf | Paper URL: https://www.ijtsrd.com/computer-science/data-miining/26750/android-based-questionnaires-application-for-heart-disease-prediction-system/nan-yu-hlaing
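Jelinek-Mercer smoothing, the technique the abstract credits with reducing noise, interpolates each class-conditional probability with the feature's overall (background) probability, so an answer value never seen for a class cannot zero out the whole Naive Bayes product. A minimal sketch for categorical questionnaire features, assuming a maximum-likelihood Naive Bayes baseline (the function name and the lambda weight of 0.7 are illustrative, not taken from the paper):

```python
import numpy as np

def jm_naive_bayes_predict(X_train, y_train, x_new, lam=0.7):
    """Naive Bayes over categorical features with Jelinek-Mercer smoothing:
    each class-conditional probability is interpolated with the feature's
    marginal (background) probability to avoid zero counts."""
    classes = np.unique(y_train)
    n = len(y_train)
    scores = {}
    for c in classes:
        Xc = X_train[y_train == c]
        log_score = np.log(len(Xc) / n)  # class prior
        for j, v in enumerate(x_new):
            p_cond = np.mean(Xc[:, j] == v)       # P(x_j = v | class c), may be 0
            p_marg = np.mean(X_train[:, j] == v)  # background P(x_j = v)
            log_score += np.log(lam * p_cond + (1 - lam) * p_marg)
        scores[c] = log_score
    return max(scores, key=scores.get)
```

Here lambda controls how much the class-specific estimate is trusted over the background rate; the paper's comparison of simple probability versus JM smoothing amounts to choosing between lambda = 1 and lambda < 1.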
Robust Methods for Health-related Quality-of-life Assessment - dylanturner22
This document summarizes a presentation on robust methods for assessing health-related quality of life (HRQoL). It discusses using quality-adjusted life years (QALYs) to collapse multi-dimensional HRQoL data but notes this can result in biased estimates. A two-step methodology is proposed where HRQoL domains are first estimated separately, then coefficients are transformed to the QALY scale based on predicted values. Simulations show the two-step approach provides less biased estimates than existing single-stage methods. The methodology is applied to SF-6D HRQoL data.
Seven Steps to Use Routine Information to Improve HIV/AIDS Programs (Snyder) - CORE Group
This document outlines a 7 step approach to using routine data to improve HIV/AIDS programs. The 7 steps are: 1) identify questions of interest, 2) prioritize key questions, 3) identify data needs and sources, 4) transform data into information, 5) interpret information and draw conclusions, 6) craft solutions and take action, and 7) continue to monitor key indicators. The approach aims to facilitate using existing data to answer important questions and inform decision making through collaborative work between data users and producers. Overall, the 7 step approach provides a framework to strategically use routine monitoring data to strengthen HIV/AIDS programs and policies.
ICU Patient Deterioration Prediction: A Data-Mining Approach - csandit
A huge amount of medical data is generated every day, which presents a challenge in analysing these data. The obvious solution to this challenge is to reduce the amount of data without information loss. Dimension reduction is considered the most popular approach for reducing data size and also to reduce noise and redundancies in data. In this paper, we investigate the effect of feature selection in improving the prediction of patient deterioration in ICUs. We consider lab tests as features. Thus, choosing a subset of features would mean choosing the most important lab tests to perform. If the number of tests can be reduced by identifying the most important tests, then we could also identify the redundant tests. By omitting the redundant tests, observation time could be reduced and early treatment could be provided to avoid the risk. Additionally, unnecessary monetary cost would be avoided. Our approach uses state-of-the-art feature selection for predicting ICU patient deterioration using the medical lab results. We apply our technique on the publicly available MIMIC-II database and show the effectiveness of the feature selection. We also provide a detailed analysis of the best features identified by our approach.
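The abstract does not name its specific feature-selection method, but the core idea, scoring each lab test by how strongly it relates to the deterioration label and keeping only the top-ranked tests, can be sketched with a simple correlation-based filter (illustrative only; `rank_lab_tests` is not the authors' technique):

```python
import numpy as np

def rank_lab_tests(X, y):
    """Filter-style feature ranking: score each lab test (column of X) by
    the absolute correlation between its values and the binary deterioration
    label y, and return column indices from most to least informative."""
    y_c = y - y.mean()
    scores = []
    for j in range(X.shape[1]):
        x_c = X[:, j] - X[:, j].mean()
        denom = np.sqrt((x_c ** 2).sum() * (y_c ** 2).sum())
        scores.append(abs((x_c * y_c).sum() / denom) if denom > 0 else 0.0)
    return np.argsort(scores)[::-1]
```

Filter methods like this are cheap enough to run over every candidate lab test; the top of the ranking suggests which tests to keep, and the tail flags tests that may be redundant.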
This document discusses Roberta Balcytyte's research using machine learning for early detection of rare hereditary diseases from large, imbalanced datasets. Specifically, the research aims to develop models to detect Hereditary Angioedema (HAE) using data on 1,200 HAE cases and 165 million controls. Initial results found Random Forest and AdaBoost classifiers performed best, accurately detecting HAE cases 88-89% of the time on average. The research seeks to supplement medical diagnosis by making rare disease detection faster and more accurate through machine learning.
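With roughly 1,200 HAE cases against 165 million controls, the training set must be rebalanced before classifiers such as Random Forest or AdaBoost can learn anything about the minority class. One standard option, not necessarily the one this research used, is random undersampling of the majority class:

```python
import numpy as np

def undersample_majority(X, y, ratio=1.0, seed=0):
    """Random undersampling for extreme class imbalance: keep all
    minority-class rows and a random subset of majority-class rows,
    `ratio` majority rows per minority row."""
    rng = np.random.default_rng(seed)
    minority = 1 if (y == 1).sum() <= (y == 0).sum() else 0
    min_idx = np.flatnonzero(y == minority)
    maj_idx = np.flatnonzero(y != minority)
    n_keep = min(len(maj_idx), int(ratio * len(min_idx)))
    keep_maj = rng.choice(maj_idx, size=n_keep, replace=False)
    idx = np.concatenate([min_idx, keep_maj])
    rng.shuffle(idx)
    return X[idx], y[idx]
```

At this scale, ensembles are often trained on many such resampled subsets so that no single draw of controls dominates; class weighting inside the classifier is a common alternative.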
AI in Healthcare: Real-World Machine Learning Use Cases - Health Catalyst
Levi Thatcher, PhD, VP of Data Science at Health Catalyst will share practical AI use cases and distill the lessons into a framework you can use when evaluating AI healthcare projects. Specifically, Levi will answer these questions:
What are great healthcare business cases for AI/ML?
What kind of data do you need?
What tools / talent do you need?
How do you integrate AI/ML into the daily workflow?
1. The document describes a proposed medicine recommendation system that would apply data mining techniques to analyze diagnosis data and provide personalized medication recommendations to reduce medical errors.
2. It would consist of modules for a database, recommendation models, model evaluation, and data visualization. Different recommendation algorithms like SVM, neural networks, and decision trees would be investigated and tested on an open diagnosis data set.
3. The goal is to build an accurate and efficient recommendation framework to help doctors prescribe the right medications, especially for inexperienced doctors and rare diseases, by leveraging patterns found in historical medical records and diagnosis data.
First Identification of the Role of TMEM106B in FTD - Alzforum
Presentation made by Rosa Rademakers on the 20th of April, 2017, at the live webinar hosted by Alzforum: http://www.alzforum.org/webinars/webinar-cortex-aging-too-fast-blame-tmem106b-and-progranulin
Presentation made by Jernej Ule on the 20th of April, 2017, at the live webinar hosted by Alzforum: http://www.alzforum.org/webinars/webinar-cortex-aging-too-fast-blame-tmem106b-and-progranulin
Presentation made by Tony Wyss-Coray on the 20th of April, 2017, at the live webinar hosted by Alzforum: http://www.alzforum.org/webinars/webinar-cortex-aging-too-fast-blame-tmem106b-and-progranulin
Presentation made by Abeliovich and Rhinn on the 20th of April, 2017, at the live webinar hosted by Alzforum: http://www.alzforum.org/webinars/webinar-cortex-aging-too-fast-blame-tmem106b-and-progranulin
Marc Dhenain Alzforum Webinar - Dec 7, 2016 - Alzforum
Presentation made at the Alzforum's live webinar of December 5, 2016, titled "Is Alzheimer’s Disease a Uniquely Human Disorder?" - review additional information and recording at www.alzforum.org/
Peter Nelson Alzforum Webinar - Dec 7, 2016 - Alzforum
Presentation made at the Alzforum's live webinar of December 5, 2016, titled "Is Alzheimer’s Disease a Uniquely Human Disorder?" - review additional information and recording at www.alzforum.org/
Elizabeth Head Alzforum Webinar - Dec 7, 2016 - Alzforum
Presentation made at the Alzforum's live webinar of December 5, 2016, titled "Is Alzheimer’s Disease a Uniquely Human Disorder?" - review additional information and recording at www.alzforum.org/
Patrick Hof Alzforum Webinar - Dec 7, 2016 - Alzforum
Presentation made at the Alzforum's live webinar of December 5, 2016, titled "Is Alzheimer’s Disease a Uniquely Human Disorder?" - review additional information and recording at www.alzforum.org/
Presentation made by Dr. Cliff Brangwynne on October 30, 2015 at the Alzforum-hosted live webinar titled "Fluid Business: Could “Liquid” Protein Herald Neurodegeneration?"
More information and the recording of the session available at http://www.alzforum.org/webinars/fluid-business-could-liquid-protein-herald-neurodegeneration
Presentation made by Dr. Paul Taylor on October 30, 2015 at the Alzforum-hosted live webinar titled "Fluid Business: Could “Liquid” Protein Herald Neurodegeneration?"
More information and the recording of the session available at http://www.alzforum.org/webinars/fluid-business-could-liquid-protein-herald-neurodegeneration
Presentation made by Dr. Markus Zweckstetter on October 30, 2015 at the Alzforum-hosted live webinar titled "Fluid Business: Could “Liquid” Protein Herald Neurodegeneration?"
More information and the recording of the session available at http://www.alzforum.org/webinars/fluid-business-could-liquid-protein-herald-neurodegeneration
Presentation made by Dr. Simon Alberti on October 30, 2015 at the Alzforum-hosted live webinar titled "Fluid Business: Could “Liquid” Protein Herald Neurodegeneration?"
More information and the recording of the session available at http://www.alzforum.org/webinars/fluid-business-could-liquid-protein-herald-neurodegeneration
Presentation made by Dr. Peter St. George-Hyslop on October 30, 2015 at the Alzforum-hosted live webinar titled "Fluid Business: Could “Liquid” Protein Herald Neurodegeneration?"
More information and the recording of the session available at http://www.alzforum.org/webinars/fluid-business-could-liquid-protein-herald-neurodegeneration
Nicolas Fawzi - Residue-by-Residue View of the In Vitro FUS Granules - Alzforum
Presentation made by Dr. Nicolas Lux Fawzi on October 30, 2015 at the Alzforum-hosted live webinar titled "Fluid Business: Could “Liquid” Protein Herald Neurodegeneration?"
More information and the recording of the session available at http://www.alzforum.org/webinars/fluid-business-could-liquid-protein-herald-neurodegeneration
Computational modeling was used to integrate data from three clinical studies of MK-8931, an investigational BACE inhibitor for Alzheimer's disease. Modeling of biomarker data from the studies simultaneously predicted changes in amyloid beta 40, amyloid beta 42, and sAPPβ levels following MK-8931 administration. The model predicted patient amyloid beta reduction distributions that informed dose selection for ongoing Phase 2/3 trials testing the amyloid hypothesis. Close collaboration between modelers and experimentalists enhanced understanding of Alzheimer's disease pathophysiology and trial design.
Presentation made May 13, 2015 at live webinar titled Computational Modeling—Will it Rescue AD Clinical Trials? and hosted by Alzforum - http://www.alzforum.org/webinars/computational-modeling-will-it-rescue-ad-clinical-trials
Leaky Blood-Brain Barrier a Harbinger of Alzheimer's Disease - presentation made at Alzforum's live webinar of February 17, 2015. See details at: http://www.alzforum.org/webinars/leaky-blood-brain-barrier-harbinger-alzheimers
Osteoporosis - Definition, Evaluation and Management - Jim Jacob Roy
Osteoporosis is an increasing cause of morbidity among the elderly.
This document gives a brief outline of osteoporosis, including the risk factors for osteoporotic fractures, the indications for testing bone mineral density, and the management of osteoporosis.
Integrating Ayurveda into Parkinson’s Management: A Holistic Approach - Ayurveda ForAll
Explore the benefits of combining Ayurveda with conventional Parkinson's treatments. Learn how a holistic approach can manage symptoms, enhance well-being, and balance body energies. Discover the steps to safely integrate Ayurvedic practices into your Parkinson’s care plan, including expert guidance on diet, herbal remedies, and lifestyle modifications.
These lecture slides, by Dr Sidra Arshad, offer a quick overview of the physiological basis of a normal electrocardiogram.
Learning objectives:
1. Define an electrocardiogram (ECG) and electrocardiography
2. Describe how dipoles generated by the heart produce the waveforms of the ECG
3. Describe the components of a normal electrocardiogram of a typical bipolar lead (limb II)
4. Differentiate between intervals and segments
5. Enlist some common indications for obtaining an ECG
6. Describe the flow of current around the heart during the cardiac cycle
7. Discuss the placement and polarity of the leads of electrocardiograph
8. Describe the normal electrocardiograms recorded from the limb leads and explain the physiological basis of the different records that are obtained
9. Define mean electrical vector (axis) of the heart and give the normal range
10. Define the mean QRS vector
11. Describe the axes of the leads (hexaxial reference system)
12. Comprehend the vectorial analysis of the normal ECG
13. Determine the mean electrical axis of the ventricular QRS and appreciate the mean axis deviation
14. Explain the concepts of current of injury, J point, and their significance
Study Resources:
1. Chapter 11, Guyton and Hall Textbook of Medical Physiology, 14th edition
2. Chapter 9, Human Physiology - From Cells to Systems, Lauralee Sherwood, 9th edition
3. Chapter 29, Ganong’s Review of Medical Physiology, 26th edition
4. Electrocardiogram, StatPearls - https://www.ncbi.nlm.nih.gov/books/NBK549803/
5. ECG in Medical Practice by ABM Abdullah, 4th edition
6. Chapter 3, Cardiology Explained, https://www.ncbi.nlm.nih.gov/books/NBK2214/
7. ECG Basics, http://www.nataliescasebook.com/tag/e-c-g-basics
- Video recording of this lecture in English language: https://youtu.be/Pt1nA32sdHQ
- Video recording of this lecture in Arabic language: https://youtu.be/uFdc9F0rlP0
- Link to download the book free: https://nephrotube.blogspot.com/p/nephrotube-nephrology-books.html
- Link to NephroTube website: www.NephroTube.com
- Link to NephroTube social media accounts: https://nephrotube.blogspot.com/p/join-nephrotube-on-social-media.html
Histology of the Female Reproductive System - AyeshaZaid1
Dive into an in-depth exploration of the histological structure of the female reproductive system with this comprehensive lecture. Presented by Dr. Ayesha Irfan, Assistant Professor of Anatomy, this presentation covers the gross anatomy and functional histology of the female reproductive organs. Ideal for students, educators, and anyone interested in medical science, this lecture provides clear explanations, detailed diagrams, and valuable insights into the female reproductive system. Enhance your knowledge and understanding of this essential aspect of human biology.
Hiranandani Hospital in Powai, Mumbai, is a premier healthcare institution that has been serving the community with exceptional medical care since its establishment. As a part of the renowned Hiranandani Group, the hospital is committed to delivering world-class healthcare services across a wide range of specialties, including kidney transplantation. With its state-of-the-art facilities, advanced medical technology, and a team of highly skilled healthcare professionals, Hiranandani Hospital has earned a reputation as a trusted name in the healthcare industry. The hospital's patient-centric approach, coupled with its focus on innovation and excellence, ensures that patients receive the highest standard of care in a compassionate and supportive environment.
Cell Therapy Expansion and Challenges in Autoimmune DiseaseHealth Advances
There is increasing confidence that cell therapies will soon play a role in the treatment of autoimmune disorders, but the extent of this impact remains to be seen. Early readouts on autologous CAR-Ts in lupus are encouraging, but manufacturing and cost limitations are likely to restrict access to highly refractory patients. Allogeneic CAR-Ts have the potential to broaden access to earlier lines of treatment due to their inherent cost benefits, however they will need to demonstrate comparable or improved efficacy to established modalities.
In addition to infrastructure and capacity constraints, CAR-Ts face a very different risk-benefit dynamic in autoimmune compared to oncology, highlighting the need for tolerable therapies with low adverse event risk. CAR-NK and Treg-based therapies are also being developed in certain autoimmune disorders and may demonstrate favorable safety profiles. Several novel non-cell therapies such as bispecific antibodies, nanobodies, and RNAi drugs, may also offer future alternative competitive solutions with variable value propositions.
Widespread adoption of cell therapies will not only require strong efficacy and safety data, but also adapted pricing and access strategies. At oncology-based price points, CAR-Ts are unlikely to achieve broad market access in autoimmune disorders, with eligible patient populations that are potentially orders of magnitude greater than the number of currently addressable cancer patients. Developers have made strides towards reducing cell therapy COGS while improving manufacturing efficiency, but payors will inevitably restrict access until more sustainable pricing is achieved.
Despite these headwinds, industry leaders and investors remain confident that cell therapies are poised to address significant unmet need in patients suffering from autoimmune disorders. However, the extent of this impact on the treatment landscape remains to be seen, as the industry rapidly approaches an inflection point.
Histopathology of Rheumatoid Arthritis: Visual treat
Hendrix 2015 composite endpoints redacted
1. Construction of Composite Endpoints for Early Stage Trials and Other Ways to Improve Power
CTAD 2015
Suzanne Hendrix, PhD
Pentara Corporation
Disclosure: President and CEO of Pentara Corporation, through which I am a paid consultant for several public, private and non-profit organizations.
2. Disclosures and Thanks
• Rush ROS, ADCS and ADNI for data
• Eisai and API for the first composite projects
• Roche, Janssen, Affiris for additional
composite research support
• Stephanie Stanworth, Noel Ellison and Leah Garriott (Pentara)
• Bruce Brown (BYU) – Data mining tools
3. Outline
• History – ADAS-cog 8, APCC, ADCOMS, Roche composite, RBANS composite, Affiris composite
• Why? What is a composite? Co-primary endpoints
• Type 1 error vs Type 2 error
4. History of Quantitative Composites
(As seen through the eyes of Suzanne Hendrix)
• 2008 and earlier – NTB composite (Bapi / Harrison), Hobart/Cano research showing problems with ADAS-cog in Mild AD, rescoring algorithm from Wouters et al.
• 2008 – MCI – Eisai collaboration – ADCOMS* (Veronika Logovinsky), presented early composite in 2010 (cog only, cog + global, cog + function)
• 2009 – MCI/Mild – ADAS-cog 8* – 8-item ADAS-cog, ADAS-8 + NTB
• 2010 – APCC* – for pre-MCI population, developed with API – presented AAIC 2011, AA Roundtable 2012 (MMSE shows up!)
• 2011 – Pre-MCI Composite* for Colombian cohort
• 2012 – Roche composite* (Glenn Morrison), RBANS composite* (Michael Ropacki), J&J Early Composites (Raghavan), ADNI cognitive core, PROADAS – AZ
• 2013 – PACC presented, Lilly early composites
• 2013 – Mild AD composite developed with Affiris*
• 2014 – Affiris composite outperforms CDR-sb
* Pentara was involved in this project
5. Signal to Noise Improves with Fewer Items, Then Worsens with Too Many
[Chart: best mean-to-standard-deviation ratio (MSDR) for ADAS-cog item subsets; y-axis MSDR from 0 to 0.45, x-axis number of items from 1 to 13]
6. ADAS-cog13 Can Be Improved in MCI and Mild AD by Removing Items

ADAS-cog 13 items for MCI                       Mean to SD ratio
All items                                       0.285
Excludes Word Finding Difficulty                0.357
Also excludes Number Cancellation               0.400
Also excludes Ideational Praxis                 0.409
Also excludes Immediate Word Recall             0.412
Also excludes Word Recognition Task             0.414

ADAS-cog 13 items for Mild AD                   Mean to SD ratio
All items                                       0.714
Excludes Word Recognition                       0.743
Also excludes Remembering Test Instructions     0.779
Also excludes Commands                          0.800
Also excludes Constructional Praxis             0.817
Also excludes Number Cancellation               0.821
ADAS-cog 8 + 4 NTB items (RAVLT, RAVLT
Delayed, Clock Drawing, and Digit Span)         0.946
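The item-removal results above reflect evaluating the composite's mean-to-standard-deviation ratio (MSDR) of change scores as items are dropped one at a time. The selection logic can be sketched as a greedy search; the code below is an illustrative reconstruction on simulated change scores, not Pentara's actual algorithm:

```python
import numpy as np

def msdr(changes):
    """Mean-to-SD ratio of a composite: sum the per-item change scores for
    each subject, then divide the mean total change by its SD."""
    total = changes.sum(axis=1)
    return abs(total.mean()) / total.std(ddof=1)

def greedy_msdr(changes, min_items=1):
    """Greedily drop the item whose removal most improves the composite's
    MSDR; stop when no removal helps (or min_items is reached)."""
    items = list(range(changes.shape[1]))
    while len(items) > min_items:
        best = max(items, key=lambda i: msdr(changes[:, [j for j in items if j != i]]))
        without_best = [j for j in items if j != best]
        if msdr(changes[:, without_best]) <= msdr(changes[:, items]):
            break  # removing anything further only hurts signal-to-noise
        items.remove(best)
    return items
```

In the toy data below, the third "item" is pure noise, so dropping it raises the composite's MSDR, mirroring how removing items such as Number Cancellation improves the ADAS-cog subsets above.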
7. Original Method – Find Good Individual Items and Put Them Together
(MSDRs = 1/CV in Pre-MCI population)
• Logical Memory IIa (Delayed) - 0.128
• Category fluency – Fruits - 0.123
• Logical Memory Ia (Immediate) - 0.110
• Mini-Mental Status Examination - 0.109
• Word list memory (Delayed recall) - 0.104
• Word list recall (Immediate) - 0.102
8. Why Use a Composite?
• It can’t perform worse than the worst component in the composite
• It often performs better than the best component in the composite
• We’re already using composites!
• Do we believe that all of the points on the ADAS-cog or CDR-sb are equivalent? Are they all on the disease trajectory?
• Patients may change more on Cognition than Function or vice versa – a composite allows both changes to be relevant.
11. Continuous Scales Improve Power Over Discrete Scales
• Validation data looks better for ADCOMS than CDR-sb – ADCOMS is a more granular version of CDR-sb
• If you power a study at 80% for CDR-sb, Type 2 error (the chance of failing a good treatment) is 20% with CDR-sb, but 6% with ADCOMS
• P=0.05 on CDR-sb => p=0.0092 on ADCOMS
• P=0.178 on CDR-sb => p=0.05 on ADCOMS
• CDR-sb requires 40% more subjects (wastes 28.6% of subjects)
• Using a discrete scale does NOT ensure clinical relevance. In many situations, it hides it! Make sure you know whether your treatment works or not, then address clinical relevance (several clinically irrelevant effects together might be relevant)
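The subject-count claims on this slide follow from the standard two-sample normal approximation, where required n per arm scales with one over the squared standardized effect. A sketch (the 0.3 effect size is illustrative; the point is that a scale improving the effect-to-noise ratio by about 18% needs roughly 40% fewer subjects, since 1.183 squared is about 1.4):

```python
from math import ceil
from statistics import NormalDist

def n_per_arm(effect_size, alpha=0.05, power=0.8):
    """Two-sample normal-approximation sample size: subjects per arm needed
    to detect a standardized effect (treatment difference / SD) at a
    two-sided alpha with the given power."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_b = NormalDist().inv_cdf(power)          # power quantile
    return ceil(2 * (z_a + z_b) ** 2 / effect_size ** 2)
```

For example, n_per_arm(0.3) gives 175 per arm, while n_per_arm(0.3 * 1.4 ** 0.5) gives 125: a 1.4x difference matching the slide's "40% more subjects."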
12. Why Now?
• Studies are failing – only a 0.4% success rate – lots of equivocal results in Phase 2
• What happened to Type 1 errors? We should be succeeding 5% of the time by chance alone.
• We want to go earlier – it is statistically harder to power studies in this population
• Power is the key to seeing real differences
• We can’t be casual about losing power – it wastes patient and caregiver time and energy
14. Who Are We Fighting Against?
• We have met our enemy and it is us
• Regulatory/clinical concerns vs. power/statistical concerns
• We all want to make good decisions
  – Effective treatments should result in positive trials
  – Ineffective treatments should fail convincingly
  – Equivocal results are the worst outcome
15. Questions
• Does the composite endpoint really measure a disease? Or is it off the primary disease path?
• Are the individual components of the composite endpoint valid, biologically plausible, and important to patients?
• Does a composite reflect “How a Patient Feels, Functions, or Survives”?
16. Ways to Improve Power
• Measure a continuous disease with . . . a continuous outcome!
– Time to event is more powerful only when event rates are very low
• Clinical relevance can’t be addressed by choosing a coarse scale
• Combine endpoints to get one answer by using composites or combined p-values for overall significance
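One standard way to get a single answer from multiple p-values is Fisher’s method; this is a generic combining rule, not necessarily the one the speaker has in mind, and it assumes the endpoints are independent (correlated endpoints need an adjusted reference distribution). For two p-values, the chi-square statistic has 4 degrees of freedom, whose tail has a closed form, so no stats library is needed:

```python
import math

def fisher_combine(p1: float, p2: float) -> float:
    """Fisher's method for two independent p-values.

    X = -2*(ln p1 + ln p2) ~ chi-square with 4 df under the joint null,
    and the chi-square(4) survival function is exp(-x/2) * (1 + x/2).
    """
    x = -2.0 * (math.log(p1) + math.log(p2))
    return math.exp(-x / 2) * (1 + x / 2)

# Two endpoints that are each only a "trend" can still give one clear answer:
print(fisher_combine(0.07, 0.06))
```

In this example the combined p-value falls below 0.05 even though neither endpoint is individually significant, which is the sense in which combining endpoints "gets one answer."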
18. Methods
• Dimension reduction technique applied to:
• Two MCI groups (n=650)
– ADNI I MCI population
– ADCS placebo group (from the donepezil/vitamin E trial)
• Three Mild AD groups (n=320)
– ADNI I Mild AD population
– ADCS NSAID study placebo group
– ADCS Homocysteine study placebo group
22. Comments
• AD is nearly unidimensional: axis 1 = the progression path
• Learning effects and normal aging are opposite ends of the next dimension (axis 2)
• Composites are identifying a latent variable associated with dimension 1 – progression of Alzheimer’s disease
– Factor analysis does not achieve this same goal.
• Cognition represents the core disease symptoms – it is not a biomarker(!)
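The "nearly unidimensional" claim is the kind of structure a dimension-reduction pass makes visible. A toy sketch (synthetic data, not the ADNI/ADCS cohorts; cohort sizes, loadings, and noise level are all made up): when items are generated from a single latent progression variable plus noise, most of the variance lands on the first principal axis.

```python
import numpy as np

rng = np.random.default_rng(0)
n_patients, n_items = 300, 8        # hypothetical sizes, not the real cohorts

progression = rng.normal(size=n_patients)          # single latent disease axis
loadings = rng.uniform(0.6, 1.0, size=n_items)     # each item tracks that axis
X = np.outer(progression, loadings) + 0.5 * rng.normal(size=(n_patients, n_items))

# PCA via SVD on centered data; explained-variance ratio per axis
X = X - X.mean(axis=0)
s = np.linalg.svd(X, compute_uv=False)
evr = s**2 / (s**2).sum()
print(f"axis 1 explains {evr[0]:.0%} of the variance; axis 2: {evr[1]:.0%}")
```

When one axis dominates like this, a composite weighted along that axis tracks the latent progression variable better than any single item.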
24. Composites That Reflect the Disease Pathway Result in Larger Disease-Related Treatment Effects
• Are we treating a “shadow” of the disease?
• We assume proportional treatment effects, but we usually see larger ones for good composites (it hurts us twice)
• If ADLs and cognition are both required and both only correlate with disease progression, then each has weaker power, and significance on both requires a very large effect
• Phase 2 should give more certainty without requiring enormous samples
25. Co-Primary Outcomes
What do they really cost us?

True alpha level, by scenario and between-endpoint correlation:

  Scenario                                   Low (0.35)   Medium (0.40)   High (0.45)
  Co-primaries at 0.05                       0.0057       0.0067          0.0079
  One significant, one a “trend”             0.0095       0.0111          0.0128
  One significant, other “same direction”    0.0280       0.0305          0.0331

• Even in mild disease, where cognition and function are both changing, co-primary endpoints reduce power from 80% to 54% – the chance of failing a successful treatment is more than double (46% instead of 20%)
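The first row of the table is consistent with requiring both two-sided 0.05 tests to be significant in the same direction on correlated endpoints. A Monte Carlo sketch of that reading (my interpretation of how the table was built, not the deck’s stated method):

```python
import numpy as np

rng = np.random.default_rng(42)

def coprimary_alpha(rho: float, n_sim: int = 500_000) -> float:
    """Estimate P(both endpoints significant at two-sided 0.05, same direction)
    under the global null, for endpoint test statistics with correlation rho."""
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n_sim)
    both = (np.abs(z) > 1.96).all(axis=1)             # each |z| > 1.96
    same_sign = np.sign(z[:, 0]) == np.sign(z[:, 1])  # effects agree in direction
    return (both & same_sign).mean()

for rho in (0.35, 0.40, 0.45):
    print(f"rho={rho}: true alpha ~ {coprimary_alpha(rho):.4f}")
```

Under this interpretation the simulated values land near the table’s "Co-primaries at 0.05" row, well below the nominal 0.05, which is exactly the hidden conservatism the slide is pointing at. The "trend" and "same direction" rows would need the additional thresholds spelled out, so they are not modeled here.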
26. Conclusions
• The true alpha with co-primary outcomes is about 0.007
• Co-primaries require many more patients (20% to 80% more)
• Requiring significance on one outcome and a ‘trend’ on the second results in alpha = 0.011
• Traditional scales require many more patients than optimized composites (12% to 327% more)
• Not measuring the true disease trajectory costs at least 28% of our subjects – is it worth it?