Analytical Chemistry & Role in pharmaceutical industry
Different techniques of analysis
Significant Figures
Errors - Types & Minimization
Calibration of glassware - pipette, burette & volumetric flask
Hello friends,
This is Abhilasha. I am going to update my new slides on glassware calibration. For any query, you can drop me a mail.
Regards
Abhilasha Pandey
Analytical chemist
Today's topic: Errors - Introduction, Sources of Errors, Types of Errors, Minimization of Errors, Accuracy, Precision, and Significant Figures, in the Pharmaceutical Analysis subject of B.Pharmacy 1st year, as per the JNTUA syllabus.
Lecture 02: Classifications of Qualitative and Quantitative Analysis - University of Okara
https://www.youtube.com/watch?v=wObwXIt1ZQc&t=123s
Basic Concept of Analytical Chemistry
Meaning: The word "analytical" comes from the Ancient Greek ana- ("up") and lysis ("a loosening"). Collectively it means "a breaking-up" or "an untying".
Definition: The branch of chemistry that deals with the analysis of matter - the identification and quantification of its components. The process of chemical analysis is of two types:
(1) Qualitative Analysis (2) Quantitative Analysis
Classifications of Analytical Techniques
There are two types of techniques
(1) Classical technique (2) Instrumental techniques
The classical techniques are qualitative as well as quantitative. Qualitative analysis is based on identifying the analyte by properties specific to it, such as boiling point, melting point, optical activity or refractive index, solubility, and color. E.g., the boiling point of water is 100 °C, the melting point of sugar is 186 °C, the refractive index of water is 1.333, the flame test color of K is purple, and the color of litmus paper indicates the acidity or basicity of a compound. When sulphuretted hydrogen (H2S) is passed through a solution containing arsenic, a yellowish precipitate is formed, indicating the presence of arsenic. If the precipitate is brown, it indicates tin.
Quantitative analysis is based on the quantity of the analyte, e.g., determining the volume of the analyte (volumetric and gasometric analysis) or the weight of the analyte (gravimetric analysis).
(2) Instrumental methods can be both qualitative and quantitative. Qualitative analysis likewise relies on detecting and determining the analyte based on certain characteristics: elements (C, H, N, S) of organic compounds using a CHNS analyzer, heavy metals using an atomic absorption spectrophotometer, and alkali and alkaline earth metals (K, Na, Ca, Mg) using a flame photometer. At the molecular level, infrared (IR) spectroscopy, nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry, and thin-layer chromatography are used to examine substances. These techniques tell us the nature of a compound. Some of them can also be used for quantitative purposes.
Reference Books:
Skoog, D. A., West, D. M., Holler, F. J., Crouch, S. R., Fundamentals of Analytical Chemistry, 9th ed., Brooks Cole Publishing Company (2013).
Christian, G. D., Analytical Chemistry, 6th ed., John Wiley & Sons, New York (2006).
Harris, D. C., Quantitative Chemical Analysis, 8th ed., W. H. Freeman and Company, New York, USA (2011).
Bender, G. T., Principles of Chemical Instrumentation, W. B. Saunders Co., London (1987).
Reilley, C., Laboratory Manual of Analytical Chemistry, Allyn & Bacon, London (1993).
Hargis, L. G., Analytical Chemistry, Prentice Hall Publishers, London (1988).
Errors - Pharmaceutical Analysis I, B.Pharm 1st semester notes on the topic of errors: full details and answers about errors.
TN Dr. MGR University
By Kumaran, M.Pharm, Professor
2. ERROR
• Error is the difference between the true result (or accepted true result) and the measured result.
• If the error in an analysis is large, serious consequences may result. A patient may undergo expensive and even dangerous medical treatment based on an incorrect laboratory result, or costly and incorrect modifications to a plant or process may be implemented because of an analytical error.
3. There are two principal types of error in analysis
determinate or systematic error
indeterminate or random error
Types of Errors
4. Determinate Error
• Determinate errors are caused by faults in
the analytical procedure or the instruments
used in the analysis.
• The name determinate error implies that the
cause of this type of error may be found out
and then either avoided or corrected.
• Determinate errors are systematic errors; that
is, they are not random.
5. Determinate Error
• A particular determinate error may cause the analytical results produced by the method to be always too high;
• another determinate error may render all results too low.
• Sometimes the error is constant: all results are too high (or too low) by the same amount.
6. Sometimes the determinate error is proportional to
the true result, giving rise to proportional errors.
Other determinate errors may be variable in both sign
and magnitude, such as the change in the volume of a
solution as the temperature changes. Although this
variation can be positive or negative, it can be
identified and accounted for.
Determinate Error
7. Determinate errors can be additive or they can be
multiplicative. It depends on the error and how it
enters into the calculation of the final result.
Such a determinate error could be, for example, the result of an
incorrectly calibrated balance.
Determinate Error
8. If the balance is set so that the zero point is actually 0.5 mg
too high, all masses determined with this balance will be
0.5 mg too high.
If this balance was used to weigh any standard solution
used in the laboratory, the standard concentration will be
erroneously high, and all of the results obtained using this
standard will be erroneously low.
The error is reported as the absolute error, the absolute
value of the difference between the true and measured
values.
Determinate Error
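The 0.5 mg zero-point error above is a constant (additive) error. A minimal Python sketch, using hypothetical masses, shows why such a constant error matters most for small samples, in contrast to a proportional error, whose relative effect is the same at any mass:

```python
# The 0.5 mg zero-point error is constant (additive): every reading is
# shifted by the same absolute amount, so its relative effect shrinks as
# the measured mass grows. All masses below are hypothetical.

CONSTANT_ERR_G = 0.0005   # balance zero point reads 0.5 mg too high

def constant_error_pct(true_mass_g):
    """Relative (%) error a fixed +0.5 mg offset causes at a given mass."""
    return CONSTANT_ERR_G / true_mass_g * 100

for true_mass in (0.0100, 0.1000, 1.0000):   # grams
    print(f"{true_mass:.4f} g weighed -> {constant_error_pct(true_mass):.2f}% error")
```

At 10 mg the fixed offset already amounts to a 5% relative error, while at 1 g it is negligible; this is why constant errors are most dangerous when small quantities are weighed.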
9. Determinate errors arise from some faulty step in the
analytical process.
The faulty step is repeated every time the
determination is performed. Whether a sample is
analyzed 5 times or 50 times, the results may all agree
with each other (good precision) but differ widely
from the true answer (poor accuracy).
Although the replicate results are close to each
other, that tells us nothing about their accuracy.
Determinate Error
10. Systematic error is under the control of the analyst. It is the analyst’s
responsibility to recognize and correct for these systematic errors that cause
results to be biased, that is, offset in the average measured value from the
true value.
How are determinate errors identified and corrected?
Two methods are commonly used to identify the existence of systematic
errors.
One is to analyze the sample by a completely different analytical procedure
that is known to involve no systematic errors. Such methods are often called
“standard methods”; they have been evaluated extensively by many
laboratories and shown to be accurate and precise.
If the results from the two analytical methods agree, it is reasonable to
assume that both analytical procedures are free of determinate errors.
The second method is to run several analyses of a reference material of
known, accepted concentration of analyte. The difference between the
known (true) concentration and that measured by analysis should reveal the
error. If the results of analysis of a known reference standard are
consistently high (or consistently low), then a determinate error is involved
in the method.
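The second check described above (replicate analysis of a reference material of known concentration) can be sketched in a few lines of Python; the replicate data and the acceptance limit here are made up for illustration:

```python
# Detecting a determinate (systematic) error by analyzing a reference
# material of known, accepted concentration: a consistent offset of the
# mean from the accepted value suggests the method is biased.

def bias(results, accepted):
    """Mean measured value minus the accepted (true) value."""
    return sum(results) / len(results) - accepted

accepted = 5.00                                  # certified value, mg/L
replicates = [5.12, 5.09, 5.15, 5.11, 5.13]      # hypothetical analyses

b = bias(replicates, accepted)
limit = 0.05                                     # hypothetical acceptance limit
print(f"bias = {b:+.3f} mg/L ->",
      "systematic error suspected" if abs(b) > limit else "no clear bias")
```

Results that are consistently high by about the same amount, as in these hypothetical replicates, point to a determinate error in the method.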
Determinate Error
11. The cause of the error must be identified and either
eliminated or controlled if the analytical procedure is to
give accurate results.
Many clinical and analytical laboratories participate in
proficiency testing programs, where “unknown”
standard samples are sent to the laboratory on a regular
basis.
The results of these samples are sent to the government
or professional agency running the program. The
unknowns are of course known to the agency that sent
the test samples; the laboratory receives a report on the
accuracy and precision of its performance.
Determinate Error
12. Determinate errors can arise from uncalibrated
balances, improperly calibrated volumetric flasks or
pipettes, malfunctioning instrumentation, impure
chemicals, incorrect analytical procedures or
techniques, and analyst error.
Determinate Error
13. Analyst error : The person performing the analysis causes
these errors.
They may be the result of inexperience, insufficient
training, or being “in a hurry”.
An analyst may use the instrument incorrectly,
perhaps by placing the sample in the instrument incorrectly
each time.
Setting the instrument to the wrong conditions for analysis.
Consistently misreading a meniscus in a volumetric flask as
high (or low)
Improper use of pipettes, such as “blowing out” the liquid
from a volumetric pipette.
Determinate Error
14. Operational and personal errors
These are due to factors for which the individual analyst is responsible
and are not connected with the method or procedure: they form part of
the 'personal equation' of an observer.
The errors are mostly physical in nature and occur when sound analytical
technique is not followed.
Examples are:
mechanical loss of materials in various steps of an analysis
underwashing or overwashing of precipitates
ignition of precipitates at incorrect temperatures
insufficient cooling of crucibles before weighing
allowing hygroscopic materials to absorb moisture before or during
weighing
use of reagents containing harmful impurities.
Some analysts are unable to judge colour changes sharply in visual
titrations, which may result in a slight overstepping of the end point.
Determinate Error
15. Some other analyst-related errors are
Carelessness, which is not as common as is generally
believed
Transcription errors, that is, copying the wrong
information into a lab notebook or onto a label
Calculation errors.
Proper training, experience, and attention to detail on
the part of the analyst can correct these types of
errors.
Determinate Error
16. Reagents and instrumentation:
Contaminated or decomposed reagents can cause
determinate errors.
Impurities in the reagents may interfere with the
determination of the analyte, especially at the ppm
level or below.
Prepared reagents may also be improperly labeled.
The suspect reagent may be tested for purity using a
known procedure or the analysis should be redone
using a different set of reagents and the results
compared.
Determinate Error
17. Numerous errors involving instrumentation are possible, including
faulty construction of balances,
use of uncalibrated or improperly calibrated weights,
incorrect instrument alignment,
incorrect wavelength settings,
incorrect reading of values, and
incorrect settings of the readout (i.e., zero signal should read zero).
Any variation in proper instrument settings can lead to errors.
These problems can be eliminated by a systematic procedure to
check the instrument settings and operation before use. Such
procedures are called standard operating procedures (SOPs) in
many labs.
There should be a written SOP for each instrument and each
analytical method used in the laboratory.
Determinate Error
18. In instrumental analysis, electrical line voltage fluctuations are
a particular problem. This is especially true for automated
instruments running unattended overnight.
Instruments are often calibrated during the day, when
electrical power is in high demand.
At night, when power demand is lower, line voltage may
increase substantially, completely changing the relationship
between concentration of analyte and measured signal.
Regulated power supplies are highly recommended for
analytical instruments. The procedure for unattended analysis
should include sufficient calibration checks during the
analytical run to identify such problems.
Determinate Error
19. Analytical method
The most serious errors are those in the method itself.
Examples of method errors include
incorrect sampling
incomplete reaction for chemical methods,
unexpected interferences from the sample itself or reagents used
having the analyte in the wrong oxidation state for the measurement
loss of analyte during sample preparation by volatilization or precipitation
an error in calculation based on incorrect assumptions in the procedure
(errors can evolve from assignment of an incorrect formula or molecular
weight to the sample).
In titrimetric analysis errors may occur
owing to failure of reactions to proceed to completion,
occurrence of induced and side reactions,
reaction of substances other than the constituent being determined,
difference between the observed end point and the stoichiometric
equivalence point of a reaction.
Determinate Error
20. Contamination
Contamination of samples by external sources can be a
serious source of error and may be extremely variable.
Aluminum levels in the dust in a normal laboratory are so
high that dust prohibits the determination of low ppb levels
of aluminum in samples.
A special dust-free “clean lab” or “clean bench” with a filter
to remove small dust particles may be required, for
determination of traces of aluminum, silicon, and other
common elements such as iron.
When trace (< ppm level) or ultratrace (< ppb level) organic
and inorganic analysis is required, the laboratory
environment can be a significant source of contamination.
Determinate Error
21. Contamination
Another major source of contamination in an analysis can be the
analyst. It depends on what kind of analytes are being measured, but
when trace or ultratrace levels of elements or molecules are being
determined, the analyst can be a part of the analytical problem.
Many personal care items, such as hand
creams, shampoos, powders, and cosmetics, contain significant
amounts of chemicals that may be analytes.
The problem can be severe for volatile organic compounds in
aftershave, perfume, and many other scented products and for
silicone polymers, used in many health and beauty products.
Powdered gloves may contain a variety of trace elements and should
not be used by analysts performing trace element determinations.
Hair, skin, and clothing can shed cells or fibers that can contaminate a
sample.
Determinate Error
22. Indeterminate errors are not constant or biased.
They are random in nature and are the cause of slight
variations in results of replicate samples made by the same
analyst under the same conditions.
Sources of random error include the limitations of reading
balances, scales such as rulers or dials, and electrical “noise” in
instruments. For example, a balance that is capable of
measuring only to 0.001 g cannot distinguish between two
samples with masses of 1.0151 and 1.0149 g. In one case the
measured mass is low, in the other case it is high.
Indeterminate Error
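The balance example above can be checked directly; this short sketch uses the resolution and masses from the slide text:

```python
# A balance reading to 0.001 g cannot distinguish two masses that differ
# by less than its resolution: both produce the same displayed value.

def displayed(mass_g, resolution_g=0.001):
    """Round a true mass to the nearest increment the balance can show."""
    return round(mass_g / resolution_g) * resolution_g

a = displayed(1.0151)   # this sample reads low
b = displayed(1.0149)   # this sample reads high
print(f"{a:.3f} g vs {b:.3f} g")   # same display for both samples
```

In one case the display understates the mass and in the other it overstates it, which is exactly the random high/low behavior described above.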
24. These random errors cause variation in results, some of which may
be too high and some too low.
The average of the replicate determinations is accurate, but each
individual determination may vary slightly from the true value.
Indeterminate errors arise from sources that cannot be
corrected, avoided, or even identified, in some cases.
All analytical procedures are subject to indeterminate error.
However, because indeterminate error is random, the errors will
follow a random distribution.
This distribution can be understood using the laws of probability
and basic statistics. The extent of indeterminate error can be
calculated mathematically.
Indeterminate Error
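Because indeterminate error follows a random distribution, its extent is usually summarized with basic statistics. A minimal sketch with made-up replicate results:

```python
import statistics

# Five hypothetical replicate titration results (mL). Random error causes
# the scatter; the sample standard deviation quantifies its extent.
replicates = [10.08, 10.11, 10.09, 10.10, 10.12]

mean = statistics.mean(replicates)
s = statistics.stdev(replicates)      # sample standard deviation (n - 1)
print(f"mean = {mean:.2f} mL, s = {s:.4f} mL")
```

The mean of the replicates estimates the true value; the standard deviation estimates the typical size of the random error in a single determination.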
25. Gross errors differ from indeterminate and determinate errors.
They usually occur only occasionally, are often large, and may cause
a result to be either high or low.
They are often the product of human errors.
For example,
if part of a precipitate is lost before weighing, analytical results will be
low.
Touching a weighing bottle with your fingers after its empty mass is
determined will cause a high mass reading for a solid weighed in the
contaminated bottle.
Gross errors lead to outliers, results that appear to differ markedly
from all other data in a set of replicate measurements.
Gross Error
26. An absolute error is the numerical difference between a
measured value and a true or accepted value.
A relative error is the absolute error divided by the true or
accepted value.
Absolute & Relative errors
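The two definitions above translate directly into code; the measured and true values below are hypothetical:

```python
# Absolute error: numerical difference between measured and true values.
# Relative error: absolute error divided by the true (accepted) value.
# Sign conventions vary between texts; here the sign is kept to show the
# direction of the error.

def absolute_error(measured, true_value):
    return measured - true_value

def relative_error(measured, true_value):
    return (measured - true_value) / true_value

measured, true_value = 19.78, 20.00   # hypothetical titration volumes, mL
print(f"absolute error = {absolute_error(measured, true_value):+.2f} mL")
print(f"relative error = {relative_error(measured, true_value):+.2%}")
```

Relative error is often quoted in percent or parts per thousand, since it puts the error in proportion to the size of the quantity measured.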
27. Concentration errors
Labeling errors
Calculation errors
– Manual calculation: using the wrong formula
– Computational calculation: using the wrong formula in Excel; using a different location (wrong cell) in the Excel sheet; improper use of the $ and () symbols
Rounding-off errors
Commonly Identified Errors
28. When a set of replicate results is obtained it may be the case that
one of the results appears to be “out of line”; such a result is called
an outlier.
While it is tempting to discard data that does not “look good” in
order to make the analysis seem more precise, it is never a good
practice unless there is justification for discarding the result.
If it is known that an error was made,
such as spillage of the sample,
use of the wrong size pipet,
incorrect dilution, or
allowing the sample to boil to dryness when it should not have done
so, the result should be rejected and not used in any compilation of
results.
In practice, if something of this sort is suspected, a good analyst will
discard the sample and start over if possible.
Rejection of Results
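A statistical criterion is often applied before a suspect result is rejected. Dixon's Q test is one common choice for small replicate sets (it is not named in the slide, so treat this as one illustrative option); it compares the suspect value's gap from its nearest neighbour to the spread of the whole data set:

```python
# Dixon's Q test sketch: Q = gap between the suspect value and its nearest
# neighbour, divided by the range of the data. The suspect result is
# rejected only if Q exceeds a tabulated critical value for the sample
# size. The replicate data below are hypothetical.

def q_statistic(results):
    """Q for whichever extreme value is farthest from its neighbour."""
    data = sorted(results)
    spread = data[-1] - data[0]
    q_low = (data[1] - data[0]) / spread      # lowest value is the suspect
    q_high = (data[-1] - data[-2]) / spread   # highest value is the suspect
    return max(q_low, q_high)

replicates = [10.10, 10.12, 10.11, 10.13, 10.40]   # 10.40 looks out of line
Q_CRIT = 0.710   # commonly tabulated 95% critical value for n = 5

q = q_statistic(replicates)
print(f"Q = {q:.3f} ->", "reject outlier" if q > Q_CRIT else "retain result")
```

Even when Q exceeds the critical value, the advice above still stands: if a known mistake (spillage, wrong pipet, incorrect dilution) caused the suspect result, reject it on that basis and repeat the analysis if possible.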
30. Vogel's Textbook of Quantitative Chemical Analysis, 5th edition.
Pharmaceutical Analysis: A Textbook for Pharmacy Students and Pharmaceutical Chemists - David G. Watson.
Handbook of Instrumental Techniques for Analytical Chemistry - Frank Settle.
Instant Notes in Analytical Chemistry - D. Kealey & P. J. Haines.
Analytical Chemistry for Technicians, 3rd edition (CRC, 2003) - Kenkel.
Pharmaceutical Drug Analysis, 2nd edition - Ashutosh Kar.
Fundamentals of Analytical Chemistry, 8th edition (Thomson, 2004) - Douglas A. Skoog.
Undergraduate Instrumental Analysis, 6th edition - James W. Robinson.
References