This document discusses measuring and controlling variability in a core facility environment. It notes that increased variability decreases statistical power to detect real effects. Common sources of variability include biological differences, sample preparation inconsistencies, and technical issues. The document asks how best to communicate these issues to customers given limited funding to directly measure variability. It discusses challenges like analyzing samples over long periods when preparation methods may differ, having no control over initial sample collection, and potential sources of variability like pipetting errors. Finally, it provides an example method using pooled and individual samples to empirically measure variability introduced through sample processing.
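The pooled-versus-individual design mentioned above lends itself to a simple variance decomposition: aliquots of one pool share the same biology, so their spread estimates processing (technical) variance, while individual samples carry biological plus technical variance. A minimal sketch with invented numbers (the function name and data are illustrative, not from the source):

```python
import statistics

def decompose_variance(pooled_replicates, individual_samples):
    """Estimate technical and biological variance components.

    pooled_replicates: repeated measurements of aliquots from one pool
      (identical biology, so spread reflects processing/technical noise).
    individual_samples: one measurement per distinct subject
      (spread reflects biological plus technical variation).
    """
    technical = statistics.variance(pooled_replicates)
    total = statistics.variance(individual_samples)
    biological = max(total - technical, 0.0)  # clip at zero for small n
    return technical, biological

# Invented example values:
tech, bio = decompose_variance(
    pooled_replicates=[10.1, 9.9, 10.0, 10.2, 9.8],
    individual_samples=[8.0, 12.5, 9.7, 11.2, 10.4],
)
```

If the pooled replicates are spread across the whole processing batch, the technical component also captures run-to-run drift, which is exactly the quantity a core facility wants to report to its customers.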
KCAS' flow cytometry team supports assay development, validation, and sample processing for preclinical and clinical studies. They helped a client design and perform, on a four-week timeline, a pharmacodynamic assay measuring T cell, B cell, and NK cell counts in rats treated with an antibody therapeutic. KCAS recommended CD45 lymphocyte gating and BD Trucount tubes for absolute counts. Preserved samples showed less than 20% variability in cell counts and percentages over 5 days, compared with unpreserved samples. KCAS completed the study in under 4 weeks, as the client requested.
Eddie Halbisch is a senior chemist currently working at Lancaster in Indianapolis, Indiana. He has over 5 years of experience in stability and analytical testing of pharmaceutical products at Lancaster and Pharmaceutical Product Development. His skills include proficiency with HPLC, UPLC, GC, dissolution testing, and other laboratory instruments, and he is proficient with Empower, Word, and Excel. He was recognized as mentor of the year in 2014. He holds a Bachelor of Science in Chemistry from the University of Wisconsin-Oshkosh and has experience as a chemistry/statistics tutor and research lab assistant. References from his past supervisor at Pharmaceutical Product Development are provided.
Metabolomics is the latest hype in the 'omics' family. Although regarded as highly important to current biological research, the analytical science applied in the majority of publications appearing today falls short of acceptable standards. Take a look at our approach to tackling the metabolomics issue, which remains a new name for an old science.
This document discusses concepts related to bioequivalence studies for generic drugs, including:
- Generic drugs must be pharmaceutically equivalent and bioequivalent to the reference brand name drug to be considered substitutable.
- Bioequivalence studies evaluate the rate and extent of absorption of the generic drug compared to the brand name drug and ensure the generic drug performs in the same manner.
- Acceptance criteria for bioequivalence studies specify that the 90% confidence intervals for the test-to-reference ratios of AUC and Cmax must fall within 80-125%.
- Special study designs may be needed for certain drugs based on their pharmacokinetic properties or ability to be measured in plasma.
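The 80-125% acceptance rule in the bullets above can be sketched numerically: the 90% confidence interval is conventionally built on log-transformed data and the bounds exponentiated back to a geometric mean ratio. A hedged sketch with invented data (the function, values, and caller-supplied t critical value are illustrative; real studies use a crossover ANOVA, not a simple paired analysis):

```python
import math
import statistics

def be_90ci(test_vals, ref_vals, t_crit):
    """90% CI for the geometric mean ratio (test/reference) from
    paired, log-transformed values; t_crit is the one-sided 5%
    Student t critical value for n-1 degrees of freedom."""
    log_ratios = [math.log(t / r) for t, r in zip(test_vals, ref_vals)]
    n = len(log_ratios)
    mean = statistics.fmean(log_ratios)
    se = statistics.stdev(log_ratios) / math.sqrt(n)
    lower = math.exp(mean - t_crit * se)
    upper = math.exp(mean + t_crit * se)
    return lower, upper, (0.80 <= lower and upper <= 1.25)

# Invented AUC values for 6 subjects; 2.015 is the one-sided 5%
# t critical value for 5 degrees of freedom.
lo, hi, passes = be_90ci(
    [100, 105, 98, 102, 101, 99],
    [101, 103, 100, 104, 100, 98],
    t_crit=2.015,
)
```

Note how the rule is asymmetric on the arithmetic scale (80-125%) precisely because it is symmetric on the log scale: ln(1.25) = -ln(0.80).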
This document provides guidance on optimizing and validating antibodies for immunohistochemistry in formalin-fixed paraffin-embedded tissues. It discusses preparing for antibody validation including obtaining product information, references, and tissues. It also covers testing different heat induced epitope retrieval solutions, pretreatments like protease digestion, antibody dilutions, detection systems, verification of results, and documentation needed for full validation. Proper validation involving pathologists is emphasized to ensure confidence in test performance and results.
dkNET Webinar: The Mouse Metabolic Phenotyping Centers: Services and Data 01/... (dkNET)
The Mouse Metabolic Phenotyping Centers (MMPC) is a National Institutes of Health-sponsored resource that provides experimental testing services to scientists studying diabetes, obesity, diabetic complications, and other metabolic diseases in mice. Dr. Richard McIndoe will introduce the resources and tools that are available at the MMPC.
Abstract
A common strategy to dissect the etiology, genetics and underlying physiology of a disease is to create mouse models using gene targeting and manipulation techniques. These models are developed by targeting one or more candidate genes or by using a whole-genome mutagenesis strategy. The careful and reproducible characterization of these animal models is important for the advancement of biomedical research, but the expense, expertise and time required to develop state-of-the-art phenotyping technologies are beyond the reach of many investigators. The Mouse Metabolic Phenotyping Centers (MMPC) were created to provide the scientific community with cost-effective, high-quality, standardized metabolic and phenotyping services. The focus of the MMPC is on experiments that characterize living animals, as well as on providing technologies that are important for understanding metabolism and physiology. The MMPC provides state-of-the-art technologies to investigators for a fee; its services include characterization of mouse metabolism, blood composition (including hormones), energy balance, eating and exercise, organ function and morphology, physiology and histology. There are currently five MMPC Centers, located at Vanderbilt University, the University of California Davis, the University of Cincinnati, the University of Massachusetts and the University of Michigan. Investigators using MMPC services agree to release the data generated by the MMPC to the general public via the national website database. This talk will review the structure of the MMPC, the services it provides and the data generated by the consortium for public use.
Presenter: Dr. Richard McIndoe, Professor, College of Graduate Studies and the College of Allied Health Sciences, Medical College of Georgia.
More information: https://dknet.org/about/webinar
This document discusses biological variation in clinical measurements. It aims to identify the nature of biological variation, appreciate its significance, and understand how to determine and apply indices of biological variation. Biological variation refers to components of variance in biochemical measurements determined by a subject's physiology. The sources, quantification, and practical applications of biological variation data are explored. Understanding biological variation is fundamental to developing reference data and interpreting clinical measurements over time.
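One widely used index of biological variation (not spelled out in the summary above) is the reference change value: the smallest difference between two serial results in the same patient that exceeds combined analytical and within-subject variation. A sketch assuming the conventional formula RCV = sqrt(2) × Z × sqrt(CV_A² + CV_I²) from the biological-variation literature:

```python
import math

def reference_change_value(cv_analytical, cv_within_subject, z=1.96):
    """RCV as a percent: two serial results differing by more than this
    are unlikely (at the chosen z) to reflect noise alone.
    z=1.96 corresponds to two-sided 95% significance."""
    return math.sqrt(2) * z * math.sqrt(cv_analytical ** 2 + cv_within_subject ** 2)

# e.g. analytical CV 3%, within-subject CV 4%:
rcv = reference_change_value(3.0, 4.0)  # about 13.9%
```

The sqrt(2) factor arises because the difference between two independent results carries the variance of both measurements.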
Turning up the Compen-DIAL: Rapid Test Methods for Cell & Gene Therapies (Merck Life Sciences)
Watch the presentation of this webinar here: https://bit.ly/3aeCPNB
Find out how we turn up the dial on quality control testing for cell and gene therapies through rapid methods for sterility, mycoplasma, and replication competent virus. We will review the current regulatory expectations as well as the benefits and limitations that come with each method.
Two of the biggest challenges in applying traditional quality control (QC) test methods to cell and gene therapies are time to results, due to short shelf life, and availability of sufficient sample, due to small production volumes.
So how can these challenges be overcome while still meeting regulatory expectations?
In this webinar we will discuss and review suitable methods for rapid testing of short-life cell and gene therapies that may also help conserve limited production material. We will look at benefits, limitations, and regulatory expectations for various QC needs including current and future rapid methods for sterility, mycoplasma and replication competent virus.
In this webinar, you will learn:
• Why the shelf life of a cell or gene therapy product may impact your QC testing strategy
• Current regulatory expectations surrounding rapid methods for sterility, mycoplasma and replication competent virus
• Potential impacts of pursuing a non-optimal QC testing strategy
This document summarizes Radioimmunoassay (RIA) and Enzyme-Linked Immunosorbent Assay (ELISA) techniques. RIA was developed in 1959 and uses radioactive molecules to detect antigens or antibodies in biological samples. It is highly sensitive but requires special safety precautions due to radioactivity. ELISA was developed later and uses enzyme-linked antibodies to detect antigens or antibodies through a color change reaction. It has advantages over RIA like no radioactivity, higher sample throughput, and easier automation. Both techniques are widely used in clinical diagnostics, research, and other applications to detect various molecules.
Bioequivalence, biowaiver and IVIVC studies 2014 (Asra Hameed)
The document discusses bioequivalence and biopharmaceutics classification system. It defines bioequivalence as the absence of a significant difference in the rate and extent to which the active ingredient becomes available at the site of drug action when administered at the same molar dose under similar conditions. It also defines pharmaceutical equivalents and alternatives. The document discusses approaches to determine bioequivalence including in vivo and in vitro methods. It provides details on bioequivalence study design, components, and considerations. Finally, it introduces the biopharmaceutics classification system and criteria for classifying drugs as highly soluble and highly permeable.
Lisa Grimm has over 15 years of experience developing, optimizing, and validating various immunoassays, cell-based assays, and coagulation assays. She has worked in immunology, oncology, haemostasis, and biologics at several pharmaceutical companies. Currently she is a research scientist at Tandem Labs developing and validating immunoassays like ADA and neutralizing antibody assays under GLP regulations.
This document provides an overview of how to use RELMA (Regenstrief LOINC Mapping Assistant) to map local laboratory test names and codes to standardized LOINC codes. It discusses loading a local observation file containing local test data into RELMA, searching for LOINC terms, mapping local terms to LOINC codes, and exporting the mapped terms. The goal is to improve data interoperability, comparability and quality by standardizing test names and codes using the LOINC system.
Physiological Based Biopharmaceutics Modelling in industrial practice. Curren... (PhinC Development)
This document discusses the use of physiological based biopharmaceutics modelling (PBBM) in evaluating drug formulations. It provides examples of how PBBM tools can be used to assess bioequivalence between batches, define acceptable product specifications, and determine the safe operating range for critical material attributes and process parameters. The document also highlights the need for biomarkers to help account for variability between individuals in PBBM simulations. It then presents a case study on the drug Calquence where PBBM and biomarkers were used to link in vitro dissolution to in vivo pharmacokinetics and interpret clinical data. Finally, it provides a vision for further advancing the use of PBBM in areas like personalized medicine.
Bioassay is defined as measuring the biological response of living tissues to determine the potency or concentration of an active principle in a preparation. There are various types of bioassays including quantal assays, graded assays, and multiple point assays. Bioassays can be performed on intact animals, isolated tissues, specific cells, or organisms and are useful for standardizing drugs obtained from natural sources and for measuring the activity of new or undefined substances. While powerful, bioassays can be time-consuming and expensive compared to physico-chemical methods.
Lisa Grimm has over 20 years of experience developing, optimizing, and validating cell-based, immuno, and coagulation assays across various therapeutic areas including immunology, oncology, and haemostasis. She has worked at several contract research organizations and pharmaceutical companies developing assays to evaluate drug candidates and biomarkers. Currently, she is a research scientist at Tandem Labs developing and validating immunoassays including ADA and neutralizing antibody assays under GLP regulations to screen pre-clinical and clinical samples.
The curriculum vitae outlines the professional experience and qualifications of Polagani Srinivasa Rao, who has over 11 years of experience developing and validating analytical methods using LC-MS/MS to quantify pharmaceutical drugs, metabolites, and other compounds in biological samples under GLP and FDA regulations. He currently works as a Manager of Bioanalytical Research at Indoco Remedies, where he leads projects involving method development, validation, sample preparation, and regulatory compliance. Rao also has publications in peer-reviewed journals and presents technical information at meetings and seminars.
LabSolutions offers molecular risk tests that analyze multiple genes associated with increased risk of certain diseases. The tests use next generation sequencing and other techniques to identify variants in these genes. While the tests do not diagnose conditions, positive results may inform medical management and be relevant for relatives. The report should be reviewed with a healthcare provider trained to interpret genetic results.
Bioassay techniques involve measuring the biological response of a test system to determine the potency or concentration of a physical, chemical, or biological substance. There are three main types of bioassay techniques: in vitro, in vivo, and ex vivo. Bioassays can be qualitative, to assess effects, or quantitative to estimate concentration/potency by measuring biological responses. Common bioassay methods include graded response assays, endpoint assays, and multi-point assays using interpolation. ELISA, microbioassays, and radioimmunoassays are also important specialized bioassay techniques.
Bioassay techniques are used to estimate the concentration or potency of substances by measuring biological responses. There are three main types of bioassays: in vitro uses cell cultures, in vivo uses live animals, and ex vivo uses isolated tissues. Bioassays can be qualitative, observing effects, or quantitative, measuring concentrations. Methods include graded response assays, endpoint assays, and multi-point assays using interpolation. Other techniques include ELISA using antibodies, microbioassays on microbes, radioimmunoassay using radiolabeled antigens, and applications of biotechnology in fields like medicine, agriculture, and industry.
Systematic Review Workflows and Semantic Solutions for Integrating Biological... (Michelle Angrish)
You tube video available: https://www.youtube.com/channel/UCrTXH6Yh-djmbmoluzgI_2w
Presentation describing how systematic review workflows, evidence maps, and semantics can be used to explore and evidence base and prioritize information for answering science questions.
USP 621 Allowable Adjustment to Chromatography HPLC Methods (Sandy Simmons)
Effective August 1, 2014, the United States Pharmacopeia (USP) published the latest revision of General Chapter <621>, mapping out the "allowable adjustments" that can be made to USP methods without having to re-validate them. Articles are provided by industry leaders in separation sciences, pharmacology and chemistry.
Bioassay, its types for theory & practical (Heena Parveen)
The document discusses bioassays, which are techniques used to determine the potency or concentration of an active ingredient in a preparation. There are two main types of bioassays: quantal/direct endpoint bioassays, which measure all-or-none biological responses, and graded-response bioassays, which measure graded responses to different doses. Graded-response bioassays include methods like matching, bracketing, and interpolation bioassays, which plot dose-response curves to calculate the potency of a test substance compared to a standard. Multiple-point bioassays that use statistical analysis of responses to multiple doses are typically more precise and reliable.
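The interpolation method described above amounts to reading a test response off the standard's dose-response line. A minimal sketch (least-squares line on log dose, then inverted; the function name and data are illustrative, not from the source):

```python
def equipotent_log_dose(log_doses, responses, test_response):
    """Fit response = slope * log_dose + intercept by least squares
    over the standard's linear range, then invert to find the log dose
    of standard producing the observed test response."""
    n = len(log_doses)
    mx = sum(log_doses) / n
    my = sum(responses) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(log_doses, responses)) \
            / sum((x - mx) ** 2 for x in log_doses)
    intercept = my - slope * mx
    return (test_response - intercept) / slope

# Standard gives responses 10, 20, 30 at log doses 0, 1, 2;
# a test response of 25 interpolates to log dose 1.5.
result = equipotent_log_dose([0.0, 1.0, 2.0], [10.0, 20.0, 30.0], 25.0)
```

Comparing the interpolated log dose with the actual log dose of the test preparation gives the log potency ratio relative to the standard.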
USFDA guidelines for bioanalytical method validation (bhatiaji123)
The document discusses guidelines for bioanalytical method validation from the USFDA. It describes key parameters that must be validated for a bioanalytical method, including selectivity, accuracy, precision, recovery, calibration curves, sensitivity, reproducibility and stability. Accuracy and precision are determined by analyzing quality control samples in replicates across multiple runs. Recovery experiments compare extracted samples to unextracted standards. A calibration curve consisting of multiple concentrations over the expected range must be precise and reproducible.
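The accuracy and precision evaluation described above reduces to simple statistics on replicate QC measurements: mean recovery against nominal for accuracy, %CV for precision. A hedged sketch with invented replicate values (the function name and data are illustrative):

```python
import statistics

def qc_accuracy_precision(measured, nominal):
    """Accuracy as mean recovery (% of nominal) and precision as %CV
    for one QC level analyzed in replicate."""
    mean = statistics.fmean(measured)
    accuracy_pct = mean / nominal * 100.0
    cv_pct = statistics.stdev(measured) / mean * 100.0
    return accuracy_pct, cv_pct

# Five replicates of a QC sample with nominal concentration 10 ng/mL:
accuracy, cv = qc_accuracy_precision([9.8, 10.2, 10.1, 9.9, 10.0], 10.0)
```

In practice the same calculation is repeated at low, mid, and high QC levels across multiple runs to separate within-run from between-run precision.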
Not the best presentation, mostly a stream of consciousness about analyzing samples in a core facility, although I do list some of my favorite methods. So maybe it will be useful to some? I repackaged this into a better talk (see my GenomeWeb 2019 talk on SlideShare).
1) The document describes how to set up a Google Cloud virtual machine to use Prosit, a tool for peptide MS/MS and retention time prediction. It provides step-by-step instructions for installing the necessary software, downloading pre-trained Prosit models, and running examples.
2) Setup is estimated to take around 20 minutes. The document recommends using the cheapest GPU option (Tesla P100) as Prosit does not heavily utilize the GPU during prediction. At least 8 CPU cores and 100GB RAM are suggested.
3) Benchmarking showed that 100,000 peptides can be predicted in 10 minutes on a Tesla P100 VM, while 1 million peptides would take around 100 minutes.
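The benchmark figures above imply roughly linear scaling, about 10,000 peptides per minute on a Tesla P100. A trivial estimator for planning batch sizes, assuming that linearity holds:

```python
def estimated_minutes(n_peptides, peptides_per_minute=10_000):
    """Rough wall-clock estimate from the reported benchmark:
    ~100,000 peptides in 10 minutes on a Tesla P100 VM."""
    return n_peptides / peptides_per_minute

# 1 million peptides comes out to ~100 minutes, matching the reported figure.
minutes_for_million = estimated_minutes(1_000_000)
```

This ignores fixed startup cost (model loading, VM boot), so short runs will underestimate; for large batches the linear term dominates.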
This document summarizes Radioimmunoassay (RIA) and Enzyme-Linked Immunosorbent Assay (ELISA) techniques. RIA was developed in 1959 and uses radioactive molecules to detect antigens or antibodies in biological samples. It is highly sensitive but requires special safety precautions due to radioactivity. ELISA was developed later and uses enzyme-linked antibodies to detect antigens or antibodies through a color change reaction. It has advantages over RIA like no radioactivity, higher sample throughput, and easier automation. Both techniques are widely used in clinical diagnostics, research, and other applications to detect various molecules.
Bioequivalence biowaiver and ivivc studies 2014 newAsra Hameed
The document discusses bioequivalence and biopharmaceutics classification system. It defines bioequivalence as the absence of a significant difference in the rate and extent to which the active ingredient becomes available at the site of drug action when administered at the same molar dose under similar conditions. It also defines pharmaceutical equivalents and alternatives. The document discusses approaches to determine bioequivalence including in vivo and in vitro methods. It provides details on bioequivalence study design, components, and considerations. Finally, it introduces the biopharmaceutics classification system and criteria for classifying drugs as highly soluble and highly permeable.
Lisa Grimm has over 15 years of experience developing, optimizing, and validating various immunoassays, cell-based assays, and coagulation assays. She has worked in immunology, oncology, haemostasis, and biologics at several pharmaceutical companies. Currently she is a research scientist at Tandem Labs developing and validating immunoassays like ADA and neutralizing antibody assays under GLP regulations.
This document provides an overview of how to use RELMA (Regenstrief LOINC Mapping Assistant) to map local laboratory test names and codes to standardized LOINC codes. It discusses loading a local observation file containing local test data into RELMA, searching for LOINC terms, mapping local terms to LOINC codes, and exporting the mapped terms. The goal is to improve data interoperability, comparability and quality by standardizing test names and codes using the LOINC system.
Physiological Based Biopharmaceutics Modelling in industrial practice. Curren...PhinC Development
This document discusses the use of physiological based biopharmaceutics modelling (PBBM) in evaluating drug formulations. It provides examples of how PBBM tools can be used to assess bioequivalence between batches, define acceptable product specifications, and determine the safe operating range for critical material attributes and process parameters. The document also highlights the need for biomarkers to help account for variability between individuals in PBBM simulations. It then presents a case study on the drug Calquence where PBBM and biomarkers were used to link in vitro dissolution to in vivo pharmacokinetics and interpret clinical data. Finally, it provides a vision for further advancing the use of PBBM in areas like personalized medicine.
Bioassay is defined as measuring the biological response of living tissues to determine the potency or concentration of an active principle in a preparation. There are various types of bioassays including quantal assays, graded assays, and multiple point assays. Bioassays can be performed on intact animals, isolated tissues, specific cells, or organisms and are useful for standardizing drugs obtained from natural sources and for measuring the activity of new or undefined substances. While powerful, bioassays can be time-consuming and expensive compared to physico-chemical methods.
Lisa Grimm has over 20 years of experience developing, optimizing, and validating cell-based, immuno, and coagulation assays across various therapeutic areas including immunology, oncology, and haemostasis. She has worked at several contract research organizations and pharmaceutical companies developing assays to evaluate drug candidates and biomarkers. Currently, she is a research scientist at Tandem Labs developing and validating immunoassays including ADA and neutralizing antibody assays under GLP regulations to screen pre-clinical and clinical samples.
The curriculum vitae outlines the professional experience and qualifications of Polagani Srinivasa Rao, who has over 11 years of experience developing and validating analytical methods using LC-MS/MS to quantify pharmaceutical drugs, metabolites, and other compounds in biological samples under GLP and FDA regulations. He currently works as a Manager of Bioanalytical Research at Indoco Remedies, where he leads projects involving method development, validation, sample preparation, and regulatory compliance. Rao also has publications in peer-reviewed journals and presents technical information at meetings and seminars.
LabSolutions offers molecular risk tests that analyze multiple genes associated with increased risk of certain diseases. The tests use next generation sequencing and other techniques to identify variants in these genes. While the tests do not diagnose conditions, positive results may inform medical management and be relevant for relatives. The report should be reviewed with a healthcare provider trained to interpret genetic results.
Bioassay techniques involve measuring the biological response of a test system to determine the potency or concentration of a physical, chemical, or biological substance. There are three main types of bioassay techniques: in vitro, in vivo, and ex vivo. Bioassays can be qualitative, to assess effects, or quantitative to estimate concentration/potency by measuring biological responses. Common bioassay methods include graded response assays, endpoint assays, and multi-point assays using interpolation. ELISA, microbioassays, and radioimmunoassays are also important specialized bioassay techniques.
Bioassay techniques are used to estimate the concentration or potency of substances by measuring biological responses. There are three main types of bioassays: in vitro uses cell cultures, in vivo uses live animals, and ex vivo uses isolated tissues. Bioassays can be qualitative, observing effects, or quantitative, measuring concentrations. Methods include graded response assays, endpoint assays, and multi-point assays using interpolation. Other techniques include ELISA using antibodies, microbioassays on microbes, radioimmunoassay using radiolabeled antigens, and applications of biotechnology in fields like medicine, agriculture, and industry.
Systematic Review Workflows and Semantic Solutions for Integrating Biological… (Michelle Angrish)
YouTube video available: https://www.youtube.com/channel/UCrTXH6Yh-djmbmoluzgI_2w
Presentation describing how systematic review workflows, evidence maps, and semantics can be used to explore an evidence base and prioritize information for answering science questions.
USP 621 Allowable Adjustment to Chromatography HPLC Methods (Sandy Simmons)
Effective August 1st 2014, the United States Pharmacopoeia (USP) published the latest revision to General Chapter <621> mapping out the "allowable adjustments" that can be made to USP methods without having to re-validate these methods. Articles provided by industry leaders in separation sciences, pharmacology and chemistry.
Bioassay, its types for theory & practical (Heena Parveen)
The document discusses bioassays, which are techniques used to determine the potency or concentration of an active ingredient in a preparation. There are two main types of bioassays: quantal/direct endpoint bioassays which measure all-or-none biological responses, and graded response bioassays which measure graded responses to different doses. Graded response bioassays include methods like matching, bracketing, and interpolation bioassays which plot dose-response curves to calculate the potency of a test substance compared to a standard. Multiple point bioassays that use statistical analysis of responses to multiple doses are typically more precise and reliable.
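The interpolation method described above can be made concrete with a minimal Python sketch. Everything here is hypothetical: the doses, responses, and the `estimate_potency` helper are illustrative only, not part of any named assay protocol. The idea is to interpolate the test response onto the standard log(dose)-response curve and report relative potency as the equivalent standard dose divided by the test dose actually given.

```python
import math

def estimate_potency(std_doses, std_responses, test_dose, test_response):
    """Interpolation bioassay sketch: find the standard dose expected to
    produce the observed test response, then report potency as the ratio
    of that equivalent dose to the test dose actually used.
    Assumes responses increase monotonically with log10(dose)."""
    log_doses = [math.log10(d) for d in std_doses]
    # Piecewise-linear interpolation of response -> log10(dose)
    for (r0, r1), (x0, x1) in zip(zip(std_responses, std_responses[1:]),
                                  zip(log_doses, log_doses[1:])):
        if r0 <= test_response <= r1:
            frac = (test_response - r0) / (r1 - r0)
            equiv_dose = 10 ** (x0 + frac * (x1 - x0))
            return equiv_dose / test_dose
    raise ValueError("test response lies outside the standard curve")

# Hypothetical standard curve: graded response vs dose (arbitrary units)
std_doses = [1.0, 2.0, 4.0, 8.0]
std_responses = [10.0, 20.0, 30.0, 40.0]  # linear in log2(dose)

# A 4-unit test dose produced the same response as 2 units of standard,
# so the test preparation is about half as potent as the standard.
potency = estimate_potency(std_doses, std_responses, 4.0, 20.0)
print(round(potency, 2))
```

Working on log(dose) mirrors the usual practice of plotting graded responses against log dose, where the central portion of the curve is approximately linear.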
USFDA guidelines for bioanalytical method validation (bhatiaji123)
The document discusses guidelines for bioanalytical method validation from the USFDA. It describes key parameters that must be validated for a bioanalytical method, including selectivity, accuracy, precision, recovery, calibration curves, sensitivity, reproducibility and stability. Accuracy and precision are determined by analyzing quality control samples in replicates across multiple runs. Recovery experiments compare extracted samples to unextracted standards. A calibration curve consisting of multiple concentrations over the expected range must be precise and reproducible.
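As a rough illustration of how accuracy and precision are summarized from QC replicates, here is a hedged Python sketch. The QC values and the 50 ng/mL nominal concentration are invented, and the usual limits (e.g. within 15% of nominal, CV no greater than 15%) are quoted only as typical figures; the applicable guidance governs real acceptance criteria.

```python
import statistics

def accuracy_and_precision(measured, nominal):
    """Summarize QC replicates: accuracy as the mean expressed as a
    percentage of nominal, precision as the percent coefficient of
    variation (%CV) of the replicate measurements."""
    mean = statistics.mean(measured)
    accuracy_pct = 100.0 * mean / nominal
    cv_pct = 100.0 * statistics.stdev(measured) / mean
    return accuracy_pct, cv_pct

# Hypothetical mid-level QC sample, nominal 50 ng/mL, six replicates
qc = [48.2, 51.0, 49.5, 50.8, 47.9, 52.1]
acc, cv = accuracy_and_precision(qc, nominal=50.0)
print(f"accuracy = {acc:.1f}% of nominal, CV = {cv:.1f}%")
```

In practice these statistics are computed per QC level (low, mid, high) both within a run and across multiple runs, giving intra- and inter-run accuracy and precision.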
Not the best presentation, mostly a stream of consciousness about analyzing samples in a core facility, although I do list some of my favorite methods. So maybe it will be useful to some? I repackaged this into a better talk (see my GenomeWeb 2019 talk on SlideShare)
1) The document describes how to set up a Google Cloud virtual machine to use Prosit, a tool for peptide MS/MS and retention time prediction. It provides step-by-step instructions for installing the necessary software, downloading pre-trained Prosit models, and running examples.
2) Setup is estimated to take around 20 minutes. The document recommends using the cheapest GPU option (Tesla P100) as Prosit does not heavily utilize the GPU during prediction. At least 8 CPU cores and 100GB RAM are suggested.
3) Benchmarking showed that 100,000 peptides can be predicted in 10 minutes on a Tesla P100 VM, while 1 million peptides would take around 100 minutes.
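The benchmark above implies roughly linear scaling, about 10,000 peptides per minute on a Tesla P100. A back-of-the-envelope estimate, assuming that rate holds (an assumption, not a measured guarantee):

```python
def estimate_prediction_minutes(n_peptides, peptides_per_minute=10_000):
    """Back-of-the-envelope runtime estimate assuming the linear scaling
    implied by the benchmark (100,000 peptides in ~10 min on a Tesla P100).
    Ignores model-loading overhead, which dominates for very small batches."""
    return n_peptides / peptides_per_minute

print(estimate_prediction_minutes(100_000))    # 10.0
print(estimate_prediction_minutes(1_000_000))  # 100.0
```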
Phinney 2019 ASMS Proteome Software Users Group Talk (UC Davis)
The document summarizes different methods for optimizing data-independent acquisition (DIA) workflows in a proteomics core facility. It compares the 2x gas-phase fractionation (GPF) method to using a narrow chromatogram library and wide experimental window, finding the 2x GPF method works well for 2-5 samples. An example experiment profiling several proteins across conditions is described. Different permutations are tested on a subset, comparing public libraries to in-house libraries for building chromatogram libraries. The 2x GPF method using public libraries works best for depth and missing values. Challenges with ScaffoldDIA software are noted. In conclusion, DIA provides better coverage than data-dependent acquisition, especially for human samples, and can work…
This document describes the work done at the UC Davis Proteomics Core Facility, which uses suspension trapping and peptide-centric DIA with deep learning to develop more universal proteomics methods. The core analyzes diverse sample types from various species using techniques like AP-MS, TMT, DIA, and PTM analysis. Peptide-centric DIA with ScaffoldDIA software provides better quantification than DDA, especially for human samples. Using Prosit deep learning to generate in silico libraries improves pathogen and protein identification from samples like infected grape sap. The core finds that DIA with gas phase fractionation works well for smaller sample sets and provides advantages over DDA for applications like undepleted serum proteomics.
SDSS1335+0728: The awakening of a ∼10⁶ M⊙ black hole (Sérgio Sacani)
Context. The early-type galaxy SDSS J133519.91+072807.4 (hereafter SDSS1335+0728), which had exhibited no prior optical variations during the preceding two decades, began showing significant nuclear variability in the Zwicky Transient Facility (ZTF) alert stream from December 2019 (as ZTF19acnskyy). This variability behaviour, coupled with the host-galaxy properties, suggests that SDSS1335+0728 hosts a ∼10⁶ M⊙ black hole (BH) that is currently in the process of ‘turning on’. Aims. We present a multi-wavelength photometric analysis and spectroscopic follow-up performed with the aim of better understanding the origin of the nuclear variations detected in SDSS1335+0728. Methods. We used archival photometry (from WISE, 2MASS, SDSS, GALEX, eROSITA) and spectroscopic data (from SDSS and LAMOST) to study the state of SDSS1335+0728 prior to December 2019, and new observations from Swift, SOAR/Goodman, VLT/X-shooter, and Keck/LRIS taken after its turn-on to characterise its current state. We analysed the variability of SDSS1335+0728 in the X-ray/UV/optical/mid-infrared range, modelled its spectral energy distribution prior to and after December 2019, and studied the evolution of its UV/optical spectra. Results. From our multi-wavelength photometric analysis, we find that: (a) since 2021, the UV flux (from Swift/UVOT observations) is four times brighter than the flux reported by GALEX in 2004; (b) since June 2022, the mid-infrared flux has risen more than two times, and the W1−W2 WISE colour has become redder; and (c) since February 2024, the source has begun showing X-ray emission. From our spectroscopic follow-up, we see that (i) the narrow emission line ratios are now consistent with a more energetic ionising continuum; (ii) broad emission lines are not detected; and (iii) the [OIII] line increased its flux ∼3.6 years after the first ZTF alert, which implies a relatively compact narrow-line-emitting region. Conclusions.
We conclude that the variations observed in SDSS1335+0728 could be either explained by a ∼10⁶ M⊙ AGN that is just turning on or by an exotic tidal disruption event (TDE). If the former is true, SDSS1335+0728 is one of the strongest cases of an AGN observed in the process of activating. If the latter were found to be the case, it would correspond to the longest and faintest TDE ever observed (or another class of still unknown nuclear transient). Future observations of SDSS1335+0728 are crucial to further understand its behaviour. Key words. galaxies: active – accretion, accretion discs – galaxies: individual: SDSS J133519.91+072807.4
PPT on Direct Seeded Rice presented at the three-day 'Training and Validation Workshop on Modules of Climate Smart Agriculture (CSA) Technologies in South Asia' on April 22, 2024.
Extraction of Ethylene oxide and 2-Chloroethanol from alternate matrices Li… (LucyHearn1)
How do you know your food is safe?
Last Friday was World Food Safety Day, facilitated by the Food and Agriculture Organization of the United Nations (FAO) and the World Health Organization (WHO), whose slogan rightly says that 'food safety is everyone's business'. Because of this, I thought it would be worth sharing some data that I have worked on in this field!
Working at Markes International has really opened my eyes (and unfortunately those of my friends and family 🤣) to food safety and quality, especially with my recent application work on ethylene oxide and 2-chloroethanol residues in foodstuffs: one of the biggest global food recalls in history, triggered in 2021 by high levels of these carcinogenic compounds, is still being implemented through the Rapid Alert System for Food and Feed (RASFF).
Authoring a personal GPT for your research and practice: How we created the Q… (Leonel Morgado)
Thematic analysis in qualitative research is a time-consuming and systematic task, typically done by teams. Team members must ground their activities on common understandings of the major concepts underlying the thematic analysis, and define criteria for its development. However, conceptual misunderstandings, equivocations, and lack of adherence to criteria are challenges to the quality and speed of this process. Given the distributed and uncertain nature of this process, we wondered if the tasks in thematic analysis could be supported by readily available artificial intelligence chatbots. Our early efforts point to potential benefits: not just saving time in the coding process but better adherence to criteria and grounding, by increasing triangulation between humans and artificial intelligence. This tutorial will provide a description and demonstration of the process we followed, as two academic researchers, to develop a custom ChatGPT to assist with qualitative coding in the thematic data analysis process of immersive learning accounts in a survey of the academic literature: QUAL-E Immersive Learning Thematic Analysis Helper. In the hands-on time, participants will try out QUAL-E and develop their ideas for their own qualitative coding ChatGPT. Participants who have the paid ChatGPT Plus subscription can create a draft of their assistants. The organizers will provide course materials and a slide deck that participants will be able to utilize to continue development of their custom GPT. The paid subscription to ChatGPT Plus is not required to participate in this workshop, just for trying out personal GPTs during it.
Sexuality - Issues, Attitude and Behaviour - Applied Social Psychology - Psyc… (PsychoTech Services)
A proprietary approach developed by bringing together the best of learning theories from Psychology, design principles from the world of visualization, and pedagogical methods from over a decade of training experience, that enables you to: Learn better, faster!
Mending Clothing to Support Sustainable Fashion_CIMaR 2024.pdf (Selcen Ozturkcan)
Ozturkcan, S., Berndt, A., & Angelakis, A. (2024). Mending clothing to support sustainable fashion. Presented at the 31st Annual Conference by the Consortium for International Marketing Research (CIMaR), 10-13 Jun 2024, University of Gävle, Sweden.
JAMES WEBB STUDY THE MASSIVE BLACK HOLE SEEDS (Sérgio Sacani)
The pathway(s) to seeding the massive black holes (MBHs) that exist at the heart of galaxies in the present and distant Universe remains an unsolved problem. Here we categorise, describe and quantitatively discuss the formation pathways of both light and heavy seeds. We emphasise that the most recent computational models suggest that, rather than a bimodal-like mass spectrum with light seeds at one end and heavy seeds at the other, a continuum exists: light seeds are more ubiquitous, and heavier seeds become less and less abundant due to the rarer environmental conditions required for their formation. We therefore examine the different mechanisms that give rise to different seed mass spectra. We show how and why the mechanisms that produce the heaviest seeds are also among the rarest events in the Universe and are hence extremely unlikely to be the seeds for the vast majority of the MBH population. We quantify, within the limits of the current large uncertainties in the seeding processes, the expected number densities of the seed mass spectrum. We argue that light seeds must be at least 10³ to 10⁵ times more numerous than heavy seeds to explain the MBH population as a whole. Based on our current understanding of the seed population, this makes heavy seeds (M_seed > 10³ M⊙) a significantly more likely pathway, given that heavy seeds have an abundance pattern that is close to, and likely in excess of, 10⁻⁴ compared to light seeds. Finally, we examine the current state-of-the-art in numerical calculations and recent observations and plot a path forward for near-future advances in both domains.
Microbial interaction
Microorganisms interact with each other and can be physically associated with other organisms in a variety of ways.
One organism can be located on the surface of another organism as an ectobiont, or within another organism as an endobiont.
Microbial interactions may be positive, such as mutualism, proto-cooperation, and commensalism, or negative, such as parasitism, predation, or competition.
Types of microbial interaction
Positive interaction: mutualism, proto-cooperation, commensalism
Negative interaction: amensalism (antagonism), parasitism, predation, competition
I. Mutualism:
It is defined as a relationship in which each organism in the interaction benefits from the association. It is an obligatory relationship in which the mutualist and the host are metabolically dependent on each other.
The mutualistic relationship is very specific: one member of the association cannot be replaced by another species.
Mutualism requires close physical contact between the interacting organisms.
A mutualistic relationship allows organisms to exist in habitats that could not be occupied by either species alone.
A mutualistic relationship also allows the partners to act as a single organism.
Examples of mutualism:
i. Lichens:
Lichens are an excellent example of mutualism.
They are an association of specific fungi with certain genera of algae. In a lichen, the fungal partner is called the mycobiont and the algal partner is called the phycobiont.
II. Syntrophism:
It is an association in which the growth of one organism either depends on, or is improved by, a substrate provided by another organism.
In syntrophism, both organisms in the association benefit.
Compound A → utilized by population 1 → Compound B → utilized by population 2 → Compound C → utilized by both populations 1 + 2 → products
In this theoretical example of syntrophism, population 1 is able to utilize and metabolize compound A, forming compound B, but cannot metabolize beyond compound B without the cooperation of population 2. Population 2 is unable to utilize compound A, but it can metabolize compound B, forming compound C. Both populations 1 and 2 are then able to carry out metabolic reactions that lead to the formation of an end product that neither population could produce alone.
Examples of syntrophism:
i. Methanogenic ecosystem in a sludge digester:
Methane production by methanogenic bacteria depends upon interspecies hydrogen transfer from other, fermentative bacteria.
Anaerobic fermentative bacteria generate CO2 and H2 from carbohydrates; these are then utilized by methanogenic bacteria (e.g., Methanobacter) to produce methane.
ii. Lactobacillus arabinosus and Enterococcus faecalis:
In minimal medium, Lactobacillus arabinosus and Enterococcus faecalis are able to grow together but not alone.
The synergistic relationship occurs because E. faecalis requires folic acid, which is produced by L. arabinosus, while L. arabinosus requires phenylalanine, which is produced by E. faecalis.
MICROBIAL INTERACTION PPT/ MICROBIAL INTERACTION AND THEIR TYPES // PLANT MIC...
ASMS QC (Will Thompson, Duke)
1. The GOAL…
"That our differential expression data will be an accurate representation of the biological system measured, and that our colleagues who develop clinical diagnostics will find the data credible."
2. The GOAL… my BELIEF…
"That if I do not pay enough attention to experimental design and quality control, those colleagues will never believe me."
3. Variability in Label-Free Proteomics Workflows
Sample Prep and Data Collection: Minimize and Measure Variability!
4. Sample Prep: Procedure and Pitfalls
Lysis: detergent/chaotrope, physical stress, mL/mg volume ratios
Normalization: Bradford/BCA/A280 assay, constant protein content, constant volume
Digestion: substrate concentration, constant enzyme/substrate ratio, fresh reagents, mixing/temperature/time, enzyme quality?
SPE/Desalting: online/offline, commercial/homemade, avoid where possible
Dry and Reconstitute: Speedvac/lyophilize, solubility?, avoid where possible
5. Sample Prep: Controlling for Variability
Lysis: constant mass/volume ratios
Normalization: constant volume and constant [protein]
Digestion: protein standards (SIL or "surrogate")
SPE/Desalting and Dry/Reconstitute: peptide standards (SIL or "surrogate")
6. How to Assess Variability of a Workflow
Method 1 ('old school'): nested replication across the biological, technical (preparation), and analytical levels.
Method 2 (Neubert et al.): sum of variance components, S²_total = S²_1 + S²_2 + S²_3 + … + S²_n
(Zhang, Fenyo, and Neubert. J. Proteome Res. 2009, 8(3): 1285-1292)
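The additive variance model in Method 2 can be sanity-checked with a short simulation. This is a sketch with invented stage standard deviations, not data from the cited study: each simulated measurement carries independent biological, preparation, and analytical errors, and the observed total variance should approach the sum of the stage variances.

```python
import random
import statistics

random.seed(7)

# Each observed value is the sum of independent stage-level errors,
# so total variance should approach s2_bio + s2_prep + s2_anal.
s_bio, s_prep, s_anal = 0.30, 0.15, 0.05  # assumed stage SDs (invented)

values = [
    random.gauss(0, s_bio) + random.gauss(0, s_prep) + random.gauss(0, s_anal)
    for _ in range(100_000)
]

expected = s_bio**2 + s_prep**2 + s_anal**2   # 0.115
observed = statistics.pvariance(values)
print(f"expected S2_total = {expected:.4f}, observed = {observed:.4f}")
```

Note how the biological component dominates here: shrinking the analytical SD further would barely change the total, which is why effort is best spent on the largest variance component.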
7. Limiting and Measuring Variability in LC-MS/MS Analysis (Exemplar with n=6 Samples)
"Technical-Replicate Heavy": incorrect (underestimation)
"Moderate Technical Replication" ("90/10 approach"): correct
"Singles + QC Pool": correct
Analysis Order:
1. Randomization (if unknowns)
2. "Blocking" + randomization (if known sample grouping)
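The "blocking + randomization" run-order idea from slide 7 can be sketched in a few lines of Python. The sample and group names are hypothetical and the helper assumes equal group sizes; it builds blocks containing one sample per group and shuffles within each block, so no group clusters at one end of the LC-MS/MS queue.

```python
import random

def blocked_run_order(samples, groups, seed=42):
    """Blocking + randomization sketch: form blocks that each contain one
    sample per group, then shuffle within every block, so instrument drift
    over the run affects all groups roughly equally.
    Assumes equal group sizes; names are hypothetical."""
    rng = random.Random(seed)
    by_group = {g: [s for s, sg in zip(samples, groups) if sg == g]
                for g in set(groups)}
    for members in by_group.values():
        rng.shuffle(members)          # randomize order within each group
    order = []
    n_blocks = len(samples) // len(by_group)
    for i in range(n_blocks):
        block = [members[i] for members in by_group.values()]
        rng.shuffle(block)            # randomize group order inside the block
        order.extend(block)
    return order

# n=6 exemplar: three control, three treated
samples = ["C1", "C2", "C3", "T1", "T2", "T3"]
groups = ["ctl", "ctl", "ctl", "trt", "trt", "trt"]
print(blocked_run_order(samples, groups))
```

Interleaving pooled QC injections between blocks (the "Singles + QC Pool" option) then gives a direct empirical readout of analytical variability across the run.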