The document discusses various types of errors that can occur in quantitative chemical analysis, including random errors, systematic errors, determinate errors, indeterminate errors, and errors due to faulty instrumentation, impure reagents, or improper methodology. It also describes ways to minimize errors, such as calibrating apparatus, running blanks and controls, using multiple analytical techniques, and performing replicate measurements. Accuracy is defined as how close a measurement is to the true value, while precision refers to the reproducibility of measurements.
Today's topic: Errors. Introduction, Sources of Errors, Types of Errors, Minimization of Errors, Accuracy, Precision, and Significant Figures, from the Pharmaceutical Analysis subject in B.Pharmacy 1st year as per the JNTUA syllabus.
Introduction
error, accuracy, precision
Source of Errors
Types of Errors
Methods of minimizing errors
Test for rejection of data
Significant Figures
Rounding of Figures
References
Errors - Pharmaceutical Analysis I, B.Pharm 1st semester notes on the topic of errors
Full details and answers about errors
TN Dr MGR University
by Kumaran, M.Pharm, Professor
The polarographic technique is applied for the qualitative or quantitative analysis of electroreducible or electro-oxidisable elements or groups.
It is an electrochemical technique for analysing solutions: the current flowing between two electrodes in the solution is measured as the applied voltage is gradually increased, and from these the concentration of a solute and its nature, respectively, are determined.
The principle of polarography is that a gradually increasing negative potential (voltage) is applied between a polarisable and a non-polarisable electrode, and the corresponding current is recorded.
Polarisable electrode: dropping mercury electrode
Non-polarisable electrode: saturated calomel electrode
From the current-voltage curve (sigmoid in shape), qualitative and quantitative analysis can be performed. This technique is called polarography, the instrument used is called a polarograph, and the current-voltage curve recorded is called a polarogram.
Conductometry is an electrochemical method of analysis used for the determination or measurement of the electrical conductance of an electrolyte solution by means of a conductometer.
The electrical conductivity of an electrolyte solution depends on:
Type of ions (cations or anions, singly or doubly charged)
Concentration of ions
Temperature
Mobility of ions
The main principle of this method is that the movement of the ions creates the electrical conductivity, and this movement depends mainly on the concentration of the ions.
The electrical conductance follows Ohm's law, which states that the strength of the current (i) passing through a conductor is directly proportional to the potential difference (V) and inversely proportional to the resistance (R):
i = V / R
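The Ohm's law relation above can be checked with a short, self-contained sketch; the voltage and resistance values are illustrative assumptions, not from the source.

```python
# Illustrative check of Ohm's law as used in conductometry.
# The numeric values are assumptions for demonstration only.

def current(voltage_v: float, resistance_ohm: float) -> float:
    """Current in amperes: i = V / R."""
    return voltage_v / resistance_ohm

def conductance(resistance_ohm: float) -> float:
    """Conductance is the reciprocal of resistance: G = 1 / R, in siemens."""
    return 1.0 / resistance_ohm

# 1.5 V applied across a conductance cell of 500 ohm resistance
print(f"current     = {current(1.5, 500.0) * 1000:.1f} mA")   # 3.0 mA
print(f"conductance = {conductance(500.0) * 1000:.1f} mS")    # 2.0 mS
```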
What is gravimetric analysis, steps involved in gravimetry, filtration media in gravimetry, the gravimetric factor, applications, and organic and inorganic precipitating agents.
This content is suitable for medical technologists, technicians, lab assistants and scientists writing the SMLTSA board exam. It is also suitable for biomedical technology students and anyone interested in learning about test methodologies used in medical technology. This chapter describes test quality assurance (QA) and quality control (QC). Please note that these notes are a collection I used to study for my board exam and to train others, who achieved distinctions using them.
Disclaimer: Credit goes to those who wrote the notes and the examiners of each exam question. Please use this only as a reference guide, and consult your prescribed textbook for the latest and most accurate notes and ranges. The material here is not referenced, as it is a collection of pieces of study notes from multiple people, and the author cannot be held liable for any misinterpretations. Please use at your own discretion.
The significant figures in a numerical expression are defined as all the digits whose values are known with certainty, plus one additional digit whose value is uncertain.
Systematic error means that your measurements of the same thing will vary in predictable ways: every measurement will differ from the true measurement in the same direction, and even by the same amount in some cases
Random error is a chance difference between the observed and true values of something (e.g., a researcher misreading a weighing scale records an incorrect measurement).
3. Determinate / constant error
• May go unsuspected, but can be determined
• Can be avoided or corrected
• Examples:
• Errors in calibration
• Faulty operation of a measuring instrument
• Impurities in the reagent
4. Biased personal error
• In reading the meniscus
• In weighing
• In matching colours
• In calculation
5. Determinate error can be combated by
• Use of calibrated apparatus
• Use of blanks and controls
• Use of several analytical procedures
• Eliminating impurities
• Carrying out the experiment under varying conditions
6. Operational or personal error
• The individual analyst is responsible.
• These errors are physical in nature.
• They may occur when sound analytical technique is not followed.
7. Examples of personal error
• Mechanical loss of material in various steps of an analysis.
• Underwashing or overwashing of a precipitate.
• Ignition of precipitates at an incorrect temperature.
• Insufficient cooling of crucibles before weighing.
• Allowing hygroscopic materials to absorb moisture before or during weighing.
• Use of reagents containing impurities.
• Inability to judge colour changes sharply in visual titrations, which may result in slight overstepping of the endpoint.
9. Reagent error
• The attack of reagents upon glassware results in the introduction of foreign materials.
• Volatilisation of platinum at very high temperatures.
10. Errors due to methodology
• Originate from incorrect sampling and incompleteness of a reaction.
• In gravimetric analysis, they arise from solubility of the precipitate, co-precipitation and post-precipitation.
• Volatilisation and decomposition on ignition.
• Precipitation of a substance other than the intended one.
11. Errors due to methodology (continued)
• In titrimetric analysis, errors may be due to the occurrence of induced and side reactions.
• Reaction of substances other than the constituent being determined.
• A difference between the observed and the stoichiometric end point of a reaction.
12. Additive and proportional error
• Proportional errors arise from impurities present in a standard substance.
• They lead to an incorrect value for the normality of the standard solution.
• Loss in weight of the crucible in which a precipitate is ignited, and errors of weights, are additive errors.
13. Indeterminate / accidental error
• Successive measurements made by the same observer under identical conditions produce slight variations: this is indeterminate error.
• Its elimination by the analyst is impossible.
• These errors occur by accident or chance.
• They cannot be corrected for, because of the natural fluctuations that occur in all measurements.
14. Pseudo-accidental or variable determinate errors
• Errors that arise from random fluctuations in temperature or other external factors, yet belong to the determinate class, are often called pseudo-accidental or variable determinate errors.
• These errors may be reduced by controlling conditions through the use of constant-temperature baths and ovens.
• The use of buffers, maintenance of constant humidity and pressure, and reading fractions of units on graduates, balances and apparatus may also reduce pseudo-accidental errors.
16. Calibration of apparatus
• All apparatus such as weights, flasks, burettes and pipettes should be calibrated.
• The appropriate corrections are applied to the original measurements.
• In some cases an error cannot be eliminated; a correction is then applied for its effect.
• An impurity in a weighed precipitate may be determined and its weight deducted.
17. Running a blank determination
• A separate determination is carried out, with the sample omitted, under exactly the same experimental conditions as employed in the actual analysis of the sample.
• The object is to find the effect of the impurities introduced through the reagents and vessels.
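The arithmetic of a blank determination is simply a subtraction; as a minimal sketch (the function name and titre values are hypothetical, not from the source):

```python
# Blank correction: the reagents and vessels consume some titrant even with
# no sample present, so the blank reading is subtracted from the sample
# reading. Values below are illustrative only.

def blank_corrected(sample_reading: float, blank_reading: float) -> float:
    """Net reading after subtracting the blank determination."""
    return sample_reading - blank_reading

sample_titre = 24.35   # mL of titrant consumed by the sample
blank_titre = 0.15     # mL of titrant consumed by the blank (reagents only)
print(f"net titre = {blank_corrected(sample_titre, blank_titre):.2f} mL")
```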
18. Running a control determination
• A determination carried out under experimental conditions as nearly identical as possible, upon a quantity of a standard substance that contains the same weight of the constituent.
19. Use of independent methods of analysis
• Example: determination of the strength of HCl both by titration with a solution of a strong base and by precipitation as AgCl, which is then weighed.
• If the results obtained by the two radically different methods are concordant, it is highly probable that the values are correct within small limits of error.
20. Running parallel determinations
• These serve as a check on the result of a single determination and indicate only the precision of the analysis.
• The values obtained should not differ by more than about three parts per thousand.
• If a larger variation occurs, the determination must be repeated until satisfactory concordance is obtained.
• Duplicate or triplicate determinations usually suffice.
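The three-parts-per-thousand concordance check can be sketched as follows; the replicate values are illustrative assumptions, not from the source.

```python
def spread_ppt(values):
    """Spread of replicate results, as parts per thousand of their mean."""
    mean = sum(values) / len(values)
    return (max(values) - min(values)) / mean * 1000

replicates = [49.10, 49.06, 49.12]   # hypothetical triplicate results
ppt = spread_ppt(replicates)
print(f"spread = {ppt:.2f} ppt; concordant = {ppt <= 3.0}")
```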
21. Standard addition
• A known amount of the constituent being determined is added to the sample, which is then analysed for the total amount of constituent present.
• The difference between the analytical results for samples with and without added constituent gives the recovery of the amount of added constituent.
• If the recovery is satisfactory, our confidence in the accuracy of the procedure is enhanced.
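The recovery calculation described above can be sketched in a few lines; all the amounts are hypothetical and chosen only for illustration.

```python
def recovery_percent(total_found, found_unspiked, amount_added):
    """Recovery of a known added amount of constituent, as a percentage."""
    return (total_found - found_unspiked) / amount_added * 100

# hypothetical amounts in mg: 5.0 mg of constituent spiked into the sample
rec = recovery_percent(15.2, 10.1, 5.0)
print(f"recovery = {rec:.0f}%")   # recoveries near 100% support the method's accuracy
```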
22. • Even under constant experimental conditions (same operator, same tools, same laboratory, short time intervals between the measurements), repeated measurements of a series of identical samples always lead to results that differ among themselves and from the true value of the sample. Therefore, quantitative measurements cannot be reproduced with absolute reliability.
23. Random errors
• Random errors are the components of measurement error that vary in an unpredictable manner in replicated measurements.
• Sources include measuring techniques (e.g. noise), sample properties (e.g. inhomogeneities), and chemical effects (e.g. equilibria).
• Even under carefully controlled conditions random errors cannot, in principle, be avoided; they can only be minimized and evaluated with statistical methods.
24. Systematic errors
• Systematic errors are the components of measurement error that, in replicate measurements, remain constant or vary in a predictable manner.
• They affect the trueness of a result: the closeness of agreement between the expectation of a test result or measurement result and a true value.
• According to their character and magnitude, errors are classified as random or systematic.
26. Definition of accuracy
• The concordance between the data and the true value; an agreement between the data and the true value.
• If the true value is not known, the mean calculated from results obtained by several different analytical methods, which are very precise and in close agreement with one another, may be considered the true value in a practical sense.
27. • The difference between the mean and the true value is known as the absolute error.
• The relative error is found by dividing the absolute error by the true value.
• Relative error is usually reported on a percentage basis, by multiplying the relative error by 100.
• If reported in parts per thousand, the relative error is multiplied by 1000.
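The definitions above translate directly into a short sketch; the mean and true value used here are hypothetical illustrations.

```python
def absolute_error(mean, true_value):
    """Difference between the mean of the results and the true value."""
    return mean - true_value

def relative_error(mean, true_value, basis=100):
    """Relative error: percent when basis=100, parts per thousand when basis=1000."""
    return absolute_error(mean, true_value) / true_value * basis

# hypothetical: mean of replicate results = 20.25, accepted true value = 20.35
print(f"absolute error = {absolute_error(20.25, 20.35):+.2f}")
print(f"relative error = {relative_error(20.25, 20.35):+.2f} %")
print(f"relative error = {relative_error(20.25, 20.35, basis=1000):+.2f} ppt")
```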
29. Absolute method
• The substance analysed must be of known purity.
• The accuracy of the method under consideration is tested by taking varying amounts of the constituent and proceeding according to specified instructions.
• The amount of the constituent must be varied, because the determinate errors in the procedure may be a function of the amount used.
30. Method I
• A measure of accuracy in the absence of foreign substances.
• The difference between the mean of an adequate number of results and the amount of the constituent actually present.
• Usually expressed as parts per thousand.
31. Method II
• Determination of the constituent in the presence of other substances.
• Requires testing the influence of a large number of elements, each in varying amounts.
• Separation is required before the determination.
• The accuracy of the method is largely controlled by the separation.
32. Comparative method
• In some analyses it is impossible to prepare solid synthetic samples of the desired composition.
• It is then necessary to resort to a standard sample of the material in question.
• Its composition is determined by one or more accurate methods of analysis.
• This involves a secondary standard, which is not satisfactory from a theoretical standpoint.
33. • The comparative method is useful in applied analysis.
• If fundamentally different methods of analysis for a given constituent are available (gravimetric, titrimetric and spectrometric), the agreement between at least two methods of essentially different character can usually be accepted as indicating the absence of an appreciable determinate error.
35. Precision
• The concordance of a series of measurements of the same quantity.
• The mean deviation or the relative mean deviation is a measure of precision.
• It is a measure of the reproducibility of data within a series of results.
• Results within a series which agree closely with one another are said to be precise.
36. • Precise results are not necessarily accurate, for a determinate error may be responsible for the inaccuracy of each result in a series of measurements.
• Precision is usually reported as the average deviation, standard deviation or range.
• Precision is a measure of the agreement among the values in a group of data.
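The three common precision measures named above can be computed for a small series of replicates; the result values are hypothetical illustrations.

```python
import statistics

def average_deviation(values):
    """Mean absolute deviation of the results from their mean."""
    m = statistics.mean(values)
    return sum(abs(v - m) for v in values) / len(values)

results = [49.01, 49.03, 48.99, 49.02]   # hypothetical replicate results
print(f"mean               = {statistics.mean(results):.4f}")
print(f"average deviation  = {average_deviation(results):.4f}")
print(f"standard deviation = {statistics.stdev(results):.4f}")
print(f"range              = {max(results) - min(results):.2f}")
```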
37. • Accuracy is the agreement between the data and the true value.
• In quantitative analysis the precision of measurements rarely exceeds 1 to 2 parts per thousand.
• Accuracy expresses the correctness of a measurement; precision expresses the reproducibility of a measurement.
38. • Precision always accompanies accuracy, but a high degree of precision does not imply accuracy.
• Example: a substance contains 49.06 ± 0.02 of a constituent.
• Analyst I: 49.01
• Analyst II: 49.42
39. • Analyst I is accurate and precise, but Analyst II, though precise, is less accurate than Analyst I.
• Accuracy is established during the development stage.
• It will not include day-to-day fluctuation, lab-to-lab variation, small modifications in technique, varying skills of analysts, or undetected operational or instrumental factors.
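The worked example above (true value 49.06, Analyst I = 49.01, Analyst II = 49.42) can be quantified directly; the small helper below is an illustration, not a prescribed procedure.

```python
TRUE_VALUE = 49.06   # accepted value from the example above

def abs_error(result, true_value=TRUE_VALUE):
    """Magnitude of the error of a single result against the true value."""
    return abs(result - true_value)

for name, result in [("Analyst I", 49.01), ("Analyst II", 49.42)]:
    print(f"{name}: result = {result}, absolute error = {abs_error(result):.2f}")
# Analyst I's error (0.05) is smaller than Analyst II's (0.36),
# so Analyst I is the more accurate of the two.
```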
48. Quality Assurance vs. Quality Control
• Quality Assurance: an overall management plan to guarantee the integrity of data (the "system").
• Quality Control: a series of analytical measurements used to assess the quality of the analytical data (the "tools").
49. True Value vs. Measured Value
• True value: the known, accepted value of a quantifiable property.
• Measured value: the result of an individual's measurement of a quantifiable property.
50. Accuracy vs. Precision
• Accuracy: how well a measurement agrees with an accepted value.
• Precision: how well a series of measurements agree with each other.
51. Systematic vs. Random Errors
• Systematic error: avoidable error due to controllable variables in a measurement.
• Random errors: unavoidable errors that are always present in any measurement; impossible to eliminate.
52. Internal Standards
• A compound chemically similar to the analyte.
• Not expected to be present in the sample.
• Must not interfere in the analysis.
• Added to the calibration standards and to the samples in identical amounts.
53. Internal Standards (continued)
• Refine the calibration process.
• Analytical signals for calibration standards are compared to those for the internal standards.
• This compensates for differences in random and systematic errors between samples and standards.
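A minimal sketch of internal-standard calibration, assuming a single-point calibration and illustrative signal values (none of these numbers or names come from the source):

```python
def response_ratio(analyte_signal, internal_std_signal):
    """Normalise the analyte signal by the internal-standard signal so that
    fluctuations affecting both signals equally (e.g. injection volume,
    detector drift) cancel out of the ratio."""
    return analyte_signal / internal_std_signal

# single-point calibration with a standard of known concentration
std_conc = 10.0                                  # assumed units, e.g. mcg/mL
std_ratio = response_ratio(1200.0, 600.0)        # standard: ratio = 2.0
sample_ratio = response_ratio(900.0, 600.0)      # sample:   ratio = 1.5
sample_conc = std_conc * sample_ratio / std_ratio
print(f"sample concentration = {sample_conc:.1f}")
```

Because both sample and standard carry the same amount of internal standard, taking the ratio before comparing signals is what makes the calibration robust to run-to-run variation.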