Prepared by: Shruti Vij (Senior Analyst), Geeta Mathur (Senior Scientist), Khushbu (Analyst)
This slide show gives a detailed explanation of three method-validation characteristics: the limit of detection (LOD), the limit of quantitation (LOQ), and robustness. The limit of detection is the minimum amount of a substance that can be detected but not reliably measured; the limit of quantitation is the minimum amount that can be both detected and measured. The signal-to-noise (S/N) ratio, a common approach to establishing both limits, is also covered. Robustness is the characteristic that indicates a method's reliability when deliberate variations are made to its parameters.
2. Ratio of the height of the analyte peak to the height of the noise measured on a blank, OR: the S/N ratio is a measure that compares the level of a desired signal to the level of the background noise.
7. The detection limit is determined by the analysis of samples with known concentrations of analyte and by establishing the minimum level at which the analyte can be reliably detected.
8. The S/N determination is performed by comparing measured signals from samples with known low concentrations of analyte with those of blank samples, and by establishing the minimum concentration at which the analyte can be reliably detected. An S/N ratio of 3:1 or 2:1 is generally accepted.
9. LOD can be determined as a signal-to-noise ratio ≥ 3, where S = height of signal and N = height of noise.
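The S/N criterion above can be sketched as a small calculation (a minimal sketch; the function names and the peak and noise heights below are illustrative, not taken from the slides):

```python
def signal_to_noise(signal_height: float, noise_height: float) -> float:
    """Ratio of the analyte peak height to the baseline noise height (from a blank)."""
    return signal_height / noise_height

def is_detectable(signal_height: float, noise_height: float) -> bool:
    """LOD criterion: the analyte is considered detectable when S/N >= 3."""
    return signal_to_noise(signal_height, noise_height) >= 3

# Hypothetical chromatogram values (arbitrary units)
print(signal_to_noise(15.0, 4.0))  # 3.75
print(is_detectable(15.0, 4.0))    # True
print(is_detectable(5.0, 4.0))     # False (S/N = 1.25)
```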
11. The minimum amount of analyte in a sample that can be quantified with acceptable precision and accuracy under the stated operational conditions of the method.
13. The quantitation limit is determined by the analysis of samples with known concentrations of analyte and by establishing the minimum level at which the analyte can be reliably quantitated.
14. The S/N determination is performed by comparing measured signals from samples with known low concentrations of analyte with those of blank samples, and by establishing the minimum concentration at which the analyte can be reliably quantitated. An S/N ratio ≥ 10 is generally accepted.
15. LOQ can be determined as a signal-to-noise ratio ≥ 10, where S = height of signal and N = height of noise.
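The quantitation check mirrors the detection check with a stricter threshold; a minimal sketch with hypothetical peak heights:

```python
def is_quantifiable(signal_height: float, noise_height: float) -> bool:
    """LOQ criterion: the analyte is considered quantifiable when S/N >= 10."""
    return signal_height / noise_height >= 10

# Hypothetical peak and noise heights (arbitrary units)
print(is_quantifiable(44.0, 4.0))  # True  (S/N = 11)
print(is_quantifiable(15.0, 4.0))  # False (S/N = 3.75: detectable, but not quantifiable)
```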
16. LOQ can also be estimated from the LOD by the rule of thumb: LOQ ≈ 3.3 × LOD.
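The 3.3 factor follows from the calibration-curve approach: taking LOD = 3 σ/S and LOQ = 10 σ/S (σ = standard deviation of the response, S = slope of the calibration line) gives LOQ/LOD = 10/3 ≈ 3.3. A minimal numeric sketch with hypothetical calibration values:

```python
def lod_from_calibration(sigma: float, slope: float) -> float:
    """LOD = 3 * sigma / slope (sigma: s.d. of the response; slope of the calibration line)."""
    return 3.0 * sigma / slope

def loq_from_calibration(sigma: float, slope: float) -> float:
    """LOQ = 10 * sigma / slope."""
    return 10.0 * sigma / slope

# Hypothetical calibration data: sigma = 0.6 response units, slope = 2.0 per ug/mL
lod = lod_from_calibration(0.6, 2.0)  # ~0.9 ug/mL
loq = loq_from_calibration(0.6, 2.0)  # 3.0 ug/mL
print(loq / lod)  # ~3.33, i.e. LOQ ≈ 3.3 × LOD
```

Note that ICH Q2 itself states LOD = 3.3 σ/S, in which case the ratio works out to roughly 3 rather than 3.3; either way the LOQ is about three times the LOD.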
19. The capacity of a method to remain unaffected by small, deliberate variations in method parameters.
20. The evaluation of robustness should be considered during the development phase and depends on the type of procedure under study.
21. 1) Flow rate: it can be adjusted by as much as 50%.
2) Temperature: it can be adjusted by as much as 10%.
3) pH of mobile phase: the pH of an aqueous buffer used in the preparation of the mobile phase can be adjusted to within ±0.2 units of the value specified.
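The deliberate variations above can be organized as a full-factorial robustness grid; the parameter names and nominal conditions below are hypothetical examples, not values from the slides:

```python
from itertools import product

# Nominal conditions of a hypothetical HPLC method
nominal = {"flow_mL_min": 1.0, "column_temp_C": 30.0, "buffer_pH": 4.5}

# Deliberate variations per the allowances above:
# flow rate +/-50%, temperature +/-10%, buffer pH +/-0.2 units
variations = {
    "flow_mL_min": [0.5, 1.0, 1.5],
    "column_temp_C": [27.0, 30.0, 33.0],
    "buffer_pH": [4.3, 4.5, 4.7],
}

# Every combination of low/nominal/high settings, to run and compare against nominal
grid = [dict(zip(variations, combo)) for combo in product(*variations.values())]
print(len(grid))  # 27 conditions
```

In practice a fractional design (e.g. Plackett-Burman) is often used instead of the full factorial to keep the number of robustness runs manageable.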