- Modern laboratory balances use magnetic force restoration, where an electromagnet opposes the downward force of the object being weighed. A detector measures the current needed to maintain the object's position.
- Checking a balance's accuracy involves testing for reproducibility, linearity, calibration, and cornerload errors. Reproducibility ensures consistent readings, linearity checks accuracy across the weighing range, calibration compares readings to standards, and cornerload checks for position-dependent errors.
- Proper handling and storage of weights and controlled temperature and airflow are important for accurate balance measurements.
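The balance checks listed above can be evaluated numerically. The sketch below is illustrative only: the readings and the check-weight values are invented, and the two helper functions simply compute the quantities a technician would compare against the balance's tolerance.

```python
import statistics

def repeatability_sd(readings):
    """Sample standard deviation of repeated weighings of one check weight."""
    return statistics.stdev(readings)

def cornerload_error(center_reading, corner_readings):
    """Largest deviation of any off-center reading from the center reading."""
    return max(abs(r - center_reading) for r in corner_readings)

# Illustrative data: ten weighings of a 100 g check weight (grams)
readings = [100.0001, 100.0002, 100.0000, 100.0001, 100.0003,
            100.0001, 100.0002, 100.0000, 100.0001, 100.0002]
print(f"repeatability SD: {repeatability_sd(readings) * 1000:.4f} mg")

# Same weight at the pan center, then at each of the four corners
corners = [100.0003, 99.9999, 100.0002, 100.0000]
print(f"cornerload error: {cornerload_error(100.0001, corners) * 1000:.4f} mg")
```

Linearity checks follow the same pattern: weigh a series of standards across the range and compare each reading against its nominal value.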
Analytical balances are highly sensitive weighing devices used to measure small masses in the sub-milligram range. They have an enclosed weighing pan inside a transparent draft shield to prevent dust and air currents from affecting measurements. Analytical balances must be calibrated regularly and located in areas free of vibration and electromagnetic interference to provide accurate readings. Proper weighing technique requires taring the balance, centering samples on the pan, and allowing readings to stabilize before recording results.
UV Spectrophotometric Method Development and Validation for Quantitative Esti... - Sagar Savale
Aim: UV spectrophotometric method development and validation for the quantitative estimation of diclofenac sodium. Objective: UV spectrophotometric methods have been widely employed for the determination of analytes in mixtures. Our aim is to develop a spectroscopic method for the estimation of diclofenac sodium in a ternary mixture using UV spectrophotometry. Methodology: The method was validated as per ICH guidelines; recovery studies confirmed its accuracy and precision. Conclusion: The method was successfully applied to the analysis of the drug in bulk and could be used effectively for routine analysis.
LC-MS is a hyphenated technique that combines liquid chromatography with mass spectrometry to separate and analyze mixtures of compounds. LC is used to resolve complex mixtures, while MS ionizes and analyzes individual resolved components based on their mass-to-charge ratio. Common interfaces like electrospray ionization are used to transfer samples from LC into the mass spectrometer without degrading thermally labile compounds. LC-MS has various applications including quantitative bioanalysis, clinical drug monitoring, pharmacokinetic studies, and impurity profiling.
The document discusses hyphenated techniques, specifically liquid chromatography-nuclear magnetic resonance (LC-NMR). It begins by introducing hyphenated techniques as the combination of two analytical methods, usually a separation technique coupled with a spectroscopic technique. It then describes LC-NMR in more detail, explaining how it works to separate components with liquid chromatography and then uses NMR for identification. Key aspects covered include instrumentation, modes of data acquisition (continuous flow, stopped flow, time sliced), and advantages such as automation, reproducibility, and simultaneous separation and quantification.
Derivative spectroscopy and applications of UV-Vis spectroscopy - NayeemaKhowser
The main objectives of derivative spectroscopy
Derivative spectra and its measurements
Orders of derivative spectra
Signal-to-noise ratio
Instrumentation of derivative spectroscopy
Advantages and disadvantages of derivative spectroscopy
Applications of Derivative and UV-Vis spectroscopy
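As a rough numerical illustration of the topics outlined above (not taken from the slides), a first-derivative spectrum can be approximated from absorbance data by central differences; the zero crossing of dA/dλ marks the absorbance maximum. The Gaussian band below is synthetic.

```python
import math

def first_derivative(wavelengths, absorbances):
    """Approximate dA/d(lambda) at interior points by central differences."""
    deriv = []
    for i in range(1, len(wavelengths) - 1):
        dA = absorbances[i + 1] - absorbances[i - 1]
        dl = wavelengths[i + 1] - wavelengths[i - 1]
        deriv.append((wavelengths[i], dA / dl))
    return deriv

# Synthetic Gaussian absorption band centred at 276 nm (not real data)
wl = list(range(250, 301, 2))
absorb = [math.exp(-((x - 276) / 10.0) ** 2) for x in wl]
for lam, d in first_derivative(wl, absorb):
    print(lam, round(d, 4))
```

Note that differencing amplifies noise, which is why the signal-to-noise ratio is a standing concern in higher-order derivative spectra.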
QA and QC are related but distinct concepts in quality management. QA refers to the overall system that aims to prevent defects through processes, while QC tests products to identify defects. QA is a preventative system involving all employees to ensure quality standards are met throughout development. In contrast, QC is reactive and conducted by a specialized team to detect defects in finished products before release. Both work together to continually meet customer requirements, with QA focusing on building quality in from the start and QC checking for quality along the way.
This document provides information on calibrating and qualifying various analytical instruments. It discusses the importance of calibration and qualification to ensure proper functioning and accurate results. It describes the different types of qualification including design, installation, operational and performance qualification. It then provides details on specific calibration procedures for various instruments like electronic balances, pH meters, UV-Vis and IR spectrophotometers, and HPLC. The calibration procedures ensure the instruments meet parameters for accuracy, resolution, wavelength verification and flow rate consistency.
This document provides an overview of Good Laboratory Practice (GLP) standards. It defines GLP as a set of principles that provide a framework for conducting laboratory studies. GLP was established by the FDA in the 1970s after discovering fraudulent activities and poor practices in laboratories. GLP applies to nonclinical safety studies on chemicals, pharmaceuticals, and other products submitted to regulatory authorities. The document outlines the key components of GLP, including requirements for organization, personnel, facilities, equipment, testing operations, record keeping, and quality assurance. It also summarizes the various subparts and responsibilities defined in the 21 CFR part 58 GLP regulations.
This document discusses differential thermal analysis (DTA), which measures the difference in temperature between a sample and a reference material as both are heated. It describes phenomena like physical changes (melting, vaporization) and chemical reactions that cause temperature changes detectable by DTA. Instrumentation for DTA is also outlined, including furnaces, temperature programmers, and amplifiers. Factors that can affect DTA curves like heating rate, atmosphere, sample mass, and particle size are examined. Differential scanning calorimetry (DSC) is also introduced as a related technique.
This document discusses process validation for solid oral dosage forms such as tablets and capsules. It begins by defining process validation and outlining the types, including prospective, concurrent, and retrospective validation. For tablets, it covers validation of raw materials, analytical methods, equipment, and process parameters. For capsules, it discusses validation of the capsule composition, encapsulation process, and quality control tests. The conclusion emphasizes that validation ensures reproducibility and compliance.
The document provides guidelines on validation of analytical procedures from the International Conference on Harmonisation (ICH) and the World Health Organization (WHO). It discusses validation characteristics like accuracy, precision, specificity, linearity, range, detection limit and quantitation limit that should be considered when validating identification tests, assays, and tests for impurities. It provides definitions for key terms and recommendations on how validation of these characteristics should be performed.
PHARMACEUTICAL PRODUCTION MANAGEMENT & INVENTORY CONTROL - Arunpandiyan59
1. The document outlines requirements for modern pharmaceutical production facilities including location, building design, waste disposal, storage, production, and quality control. Key requirements include preventing contamination, ensuring hygienic conditions, and separating different categories of drugs.
2. Specific sections cover water treatment systems, warehousing, production area layout, quality control laboratory independence, personnel training, and inventory/raw material management. Equipment must be properly located, maintained and defective units removed.
3. Quality assurance systems must ensure pharmaceutical products meet GMP standards and are developed to requirements for design, development, manufacturing and quality management.
This document discusses analytical method validation. It provides guidance on validation from various organizations. It describes the types of analytical procedures that require validation including chromatographic, spectroscopic, and dissolution methods. Key validation characteristics that should be considered include specificity, linearity, range, accuracy, precision, detection limit, quantitation limit, robustness, and system suitability. The document provides details on how these characteristics should be determined during the validation process. It also discusses circumstances under which revalidation may be necessary such as changes to the drug synthesis or analytical method.
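Of the characteristics listed, linearity is typically assessed by least-squares regression of response against concentration and its correlation coefficient. A minimal sketch with invented calibration data (the concentrations and absorbances below are hypothetical):

```python
import math

def linear_fit(x, y):
    """Ordinary least-squares slope, intercept, and correlation coefficient r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / math.sqrt(sxx * syy)

# Hypothetical calibration data: concentration (ug/mL) vs. absorbance
conc = [5.0, 10.0, 15.0, 20.0, 25.0]
resp = [0.112, 0.221, 0.335, 0.442, 0.553]
slope, intercept, r = linear_fit(conc, resp)
print(f"slope={slope:.5f} intercept={intercept:.4f} r={r:.5f}")
```

The same slope and residual standard deviation also feed the common detection-limit and quantitation-limit estimates (3.3σ/S and 10σ/S).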
A seminar on applications of various analytical techniques - Patel Parth
This document provides information on preformulation studies for new drug development. It discusses:
1. The purpose of preformulation studies is to understand the characteristics of drug components and optimize dosage form manufacturing.
2. Preformulation studies establish the identity, properties, and compatibility of new drug substances to support formulation development and regulatory filings.
3. A variety of analytical techniques are used in preformulation studies including spectroscopy, chromatography, and thermal analysis to characterize drug substances and excipients.
General considerations and method development in CE - ChowdaryPavani
This document provides an overview of capillary electrophoresis (CE). It defines CE, describes its principle and instrumentation. CE involves separating components of a sample based on their differential rate of migration in an applied electric field. Key points covered include electrophoretic mobility, electroosmotic flow, sample introduction techniques, and common applications such as protein, DNA and pharmaceutical analysis. CE provides high resolution separations due to its small capillary diameter and long separation length.
Drug transporters play an important role in drug absorption, distribution, metabolism, and excretion. They are located in organs like the intestine, liver, and kidney and can influence the pharmacokinetics and pharmacodynamics of drugs through drug-drug interactions. Transporters can be either uptake transporters that move drugs into cells or efflux transporters that move drugs out of cells. Inhibition or induction of these transporters by coadministered drugs can increase or decrease the intracellular concentrations of victim drugs, altering their effects. This is an important consideration for drug interactions.
The document discusses developing a validation master plan (VMP) for a new pharmaceutical facility. Key points:
- A VMP comprehensively describes validation requirements and plans for meeting them. It covers production, storage, utilities, and staff areas.
- The VMP sets goals and limits for validation projects. It defines the scope and systems included.
- Developing the VMP involves determining standards, qualifications for design, installation, operation, and performance, documentation requirements, and change control procedures.
- User requirement specifications (URS) are critical documents that validation is dependent on. Developing clear, testable URS in multiple levels is important, especially for software.
This document discusses scale up considerations for semi-solid manufacturing. It begins by defining semi-solid dosage forms as ointments, pastes, creams, emulsions, gels and foams that cling to the skin. The raw materials can include hydrocarbons, silicones, oils and water-based ingredients. Key unit operations include addition of components, heating, mixing, homogenization and filling. Process parameters like temperature, mixing rates and particle size must be controlled during scale up. Proper plant layout and equipment selection is also important for effective semi-solid manufacturing.
This document discusses types of errors that can occur in chemical analysis and sampling. There are two main categories of error: determinate and indeterminate. Determinate errors can be further broken down into constant and proportional determinate errors. Indeterminate errors are random errors that occur due to unpredictable fluctuations. Various statistical analyses can be used to evaluate precision and accuracy in chemical analysis, including calculating the mean, median, standard deviation, and using control charts. Proper statistical analysis is important for understanding the reliability of results from chemical experiments.
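Python's standard library covers these basic precision statistics directly; a short sketch using invented replicate results:

```python
import statistics

# Five replicate titration results (mL): invented data for illustration
results = [24.31, 24.28, 24.35, 24.30, 24.29]

mean = statistics.mean(results)
median = statistics.median(results)
sd = statistics.stdev(results)      # sample standard deviation
rsd = 100 * sd / mean               # relative SD (%), a common precision measure
print(f"mean={mean:.3f} median={median:.2f} SD={sd:.4f} RSD={rsd:.3f}%")
```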
Inventory management involves planning production and ordering materials to maintain appropriate stock levels. It aims to balance holding costs and stockouts. Key aspects include sales forecasting, production planning, ABC analysis to classify items, and techniques like economic order quantity to determine optimal order sizes. Inventory control then monitors stock levels and fulfills orders. The objectives are to meet demand while minimizing costs and capital tied up in inventory.
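The economic order quantity mentioned above has a standard closed form, EOQ = sqrt(2DS/H), where D is annual demand, S the cost per order, and H the annual holding cost per unit. A sketch with made-up figures:

```python
import math

def eoq(annual_demand, cost_per_order, holding_cost_per_unit):
    """Classic Wilson EOQ: the order size that balances ordering and holding cost."""
    return math.sqrt(2 * annual_demand * cost_per_order / holding_cost_per_unit)

# Hypothetical item: 12,000 units/year, $60 per order, $4 per unit per year to hold
q = eoq(12000, 60, 4)
print(f"EOQ = {q:.0f} units, about {12000 / q:.1f} orders per year")
```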
The document outlines common problems encountered during HPLC troubleshooting and provides potential causes and solutions for each. It discusses five major categories of symptoms: pressure abnormalities, leaks, chromatogram issues, injector problems, and other issues detected by smell, sight or sound. Correcting any problems should be recorded in the system record book to aid in resolving future issues.
Analytical balances are precise weighing devices used to measure small masses in clinical and laboratory sciences. There are two main types - mechanical balances which use lever systems to balance unknown masses against calibrated weights, and electronic balances which use electromagnetic forces controlled by a null detector to restore the balance beam to equilibrium. Both must be accurate, sensitive, and calibrated regularly to ensure proper performance.
1. Balances are instruments that measure mass through the action of gravity on an object. There are two main types: mechanical balances, which use springs or sliding weights, and electronic balances, which use load cells or electromagnetic force.
2. Electronic balances operate using either a load cell that detects strain to measure weight, or an electromagnetic force restoration method that precisely measures current needed to balance a lever.
3. Routine maintenance of balances includes verifying level and zero settings, cleaning weighing surfaces, and annual calibration. Faults like unstable readings may result from the balance not being leveled or calibrated correctly.
This document discusses good weighing practices in quality control laboratories. It emphasizes the importance of accurate weighing and describes the types of balances needed, including their minimum weights and calibration requirements. Factors that can influence weighing accuracy, such as vibration, temperature, sample properties, and location are examined. Calibration tests including repeatability, linearity, eccentricity and sensitivity are defined.
This document discusses analytical balances and their classification, accuracy, and performance testing. It provides:
1) A classification system for balances based on their number of decimal places and accuracy class. The smallest and most precise are ultra microbalances while technical balances have the lowest precision.
2) Requirements for accuracy and repeatability testing according to the USP General Chapters. Accuracy must be within 0.1% of the test weight and repeatability must have a standard deviation below 0.1% of the smallest intended weight.
3) Guidelines for other performance tests including sensitivity, linearity, eccentricity, and criteria for acceptance. Tests should ensure the balance meets tolerance requirements for intended uses.
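The acceptance tests summarized above can be sketched as simple pass/fail checks. The thresholds follow the 0.1% criteria stated in the summary, and the readings and check-weight values are invented:

```python
import statistics

def passes_repeatability(readings, smallest_net_weight):
    """Per the criterion above: the SD of replicate weighings must stay
    below 0.1% of the smallest weight the balance is intended to measure."""
    return statistics.stdev(readings) < 0.001 * smallest_net_weight

def passes_accuracy(reading, test_weight_value):
    """Reading must agree with the test weight within 0.1% of its value."""
    return abs(reading - test_weight_value) <= 0.001 * test_weight_value

# Invented data: ten weighings of a 10 g check weight; smallest net weight 0.5 g
readings = [10.00002, 10.00005, 9.99998, 10.00001, 10.00003,
            9.99999, 10.00004, 10.00000, 10.00002, 10.00001]
print(passes_repeatability(readings, 0.5), passes_accuracy(10.00005, 10.0))
```

Consult the current USP General Chapters directly before adopting numerical limits; the exact form of the criterion (e.g. whether 2 SD or SD is compared) matters in practice.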
The document provides information on calibrating analytical balances. It discusses parameters to check such as accuracy, linearity, precision, and corner load tests. The calibration procedure involves placing standard weights on the balance and recording readings to ensure they are within acceptance criteria. Types of balances and common issues like drift due to temperature, static electricity, and air currents are also outlined. Regular evaluation of balance performance through tests of repeatability, cornerload, and linearity is recommended to identify any issues.
This document discusses calibrating measuring chains. It defines a measuring chain as the series of elements from a sensor to an output transducer that measures quantities. Calibration is comparing measurements to known standards and adjusting for errors. The document outlines sources of error, categories and procedures for calibration, including applying input signals at increments and adjusting readings. Performing calibration ensures accurate measurements by accounting for instrument drift over time.
This document discusses key principles of calibration, including defining calibration as comparing a measuring instrument to a higher accuracy standard. It describes characteristics like calibration tolerance, accuracy ratio, traceability, and uncertainty. The goal is for readers to understand calibration requirements and best practices for technicians.
The document discusses calibration procedures for an analytical balance, including the drift check, performance check, and measurement uncertainty check. Key steps include using weights of 1 mg, 2 mg, 5 mg, 10 mg, and 20 mg to confirm that measurements are within 0.1% of the actual mass value; calculating measurement uncertainty as the standard deviation times 3, divided by the actual mass value; and performing calibration daily and after maintenance or relocation. Environmental factors such as temperature, humidity, and static electricity are also discussed as important controls against drift.
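The uncertainty rule described above (standard deviation times 3, divided by the actual mass value) can be written out directly; the readings below are invented for illustration:

```python
import statistics

def relative_uncertainty(readings, actual_mass):
    """Per the rule above: standard deviation times 3, divided by the actual mass."""
    return 3 * statistics.stdev(readings) / actual_mass

# Invented data: ten readings of a 20 mg check weight, in mg
readings = [20.001, 19.999, 20.000, 20.001, 20.000,
            19.999, 20.000, 20.001, 20.000, 20.000]
u = relative_uncertainty(readings, 20.0)
print(f"relative uncertainty = {u:.6f} ({u * 100:.4f}%)")
```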
Theory and Design for Mechanical Measurements, solutions manual, Figliola 4th ed. - Diego Fung
Figliola and Beasley’s 6th edition of Theory and Design for Mechanical Measurements provides a time-tested and respected approach to the theory of engineering measurements. An emphasis on the role of statistics and uncertainty analysis in the measuring process makes this text unique. While the measurements discipline is very broad, careful selection of topical coverage establishes the physical principles and practical techniques for quantifying many engineering variables that have multiple engineering applications.
In the sixth edition, Theory and Design for Mechanical Measurements continues to emphasize the conceptual design framework for selecting and specifying equipment, test procedures and interpreting test results. Coverage of topics, applications and devices has been updated—including information on data acquisition hardware and communication protocols, infrared imaging, and microphones. New examples that illustrate either case studies or interesting vignettes related to the application of measurements in current practice are introduced.
Qualification of analytical instrumentsFaris ameen
This document provides guidelines for qualifying analytical instruments including electronic balances, pH meters, and UV-Visible spectrophotometers. It discusses the various levels of qualification including: Level I which involves selecting instruments and suppliers; Level II which involves installation and releasing instruments for use; Level III which involves periodic checks; and Level IV which involves in-use checks. Specific guidelines are provided for qualifying balances, pH meters, and UV-Visible spectrophotometers, including recommended tolerance limits for various parameters, calibration procedures, and qualification frequencies.
This document discusses key principles of calibration, including defining calibration as comparing a measuring instrument to a higher accuracy standard. It describes maintaining traceability through an unbroken chain of comparisons to national standards and evaluating uncertainty through factors that affect accuracy. The key characteristics of a calibration discussed are specifying a tolerance, using an accuracy ratio of 4:1 between the instrument and standard, and documenting traceability.
The document provides information about calibrating various analytical instruments. It discusses:
1. Calibrating a UV-VIS spectrophotometer involves calibrating for wavelength accuracy, absorbance measurement, gratings performance/stray light test, and resolution power. Specific procedures are outlined for each.
2. Calibrating an HPLC involves parameters like pump calibration, injector calibration, system precision, and detector calibration. Details are given for calibrating the pump flow rate, injector linearity and volume accuracy, and detector linearity.
3. Calibrating a balance involves checking accuracy with standard weights, linearity, precision, and performing a corner load test. Acceptance criteria for each parameter are
This document discusses instrumentation and measurement. It defines measurement as a quantitative comparison between a standard and unknown quantity. There are direct and indirect methods of measurement. Instruments must be calibrated by comparing them to primary or secondary standards.
The key components of an instrument are the primary sensing element, variable conversion element, variable manipulation element, data transmission element, and data presentation element. Instrument performance is characterized by static and dynamic characteristics. Static characteristics include linearity, sensitivity, repeatability, hysteresis, threshold, resolution, and readability.
Methods of force measurement include direct methods using balances and indirect methods measuring acceleration. Proving rings can also measure force by relating deformation to applied force using transducers like strain gauges and LV
This presentation discusses the importance of testing meters and the best practices and test equipment to do so. This presentation was given at MEUA Meter School. 03.03.20
This document discusses the calibration process for tariff meters. Tariff meters are calibrated every six months using a solid state reference standard meter and phantom load kit. The phantom load kit supplies different voltages, currents, and loads to test the meter at no load, starting load, low load, rated load, and maximum load. The calibration process involves load tests where various power factors are applied and the percentage error is calculated. It also involves dial tests to check for error in the meter dial reading. For both tests, the percentage error should be within ±0.5%. Load and dial test results are recorded in tabular format showing readings and errors before and after adjustment.
A Comparative Analysis of Cell Balancing Techniques For Battery Management Sy...IRJET Journal
This document compares and analyzes different cell balancing techniques for battery management systems. It discusses both passive and active cell balancing methods. Passive techniques balance cells by discharging excess energy through resistors, wasting energy as heat. Active balancing transfers energy between cells using capacitors or inductors, from higher to lower voltage cells, improving efficiency. The document simulates these techniques using MATLAB and examines how circuit parameters affect system performance. It concludes that passive balancing is best for low-power applications while active balancing is more suitable for high-power applications like electric vehicles. Proper cell balancing extends battery life and ensures safe operation.
This document discusses the principles and proper use of balances. It explains that a balance measures the force of gravity on an object, or its weight. Weight is dependent on location, while mass remains constant. Electronic balances use electromagnetic forces rather than beams to counterbalance a sample and convert the electrical signal to a weight measurement. The document provides guidelines for operating balances accurately, including leveling, taring, and avoiding sources of error like vibration. It stresses the importance of calibration and quality control programs to ensure balance measurements are valid.
Similar to Laboratory Balance: How They Work, Checking Their Accuracy (20)
A study of correlation of field data and Simulation of cement raw mixes and estimation of potential kiln feed as well as formed clinker made using a computer program through fitting of results obtained with field data. This enables production control by controlling quality of clinker within required specifications
This document discusses optimizing cement grinding circuits through pre-crushing of clinker using a Barmac crusher prior to grinding in a two-compartment ball mill. Bond calculations and population balance modeling are used to analyze the potential benefits. Modeling suggests the total energy consumption for grinding can be reduced up to 10% by pre-crushing clinker to a finer size before ball milling. A case study of a cement plant found pre-crushing could lower the required ball mill power by 9-15% and increase grinding circuit capacity with relatively low capital investment compared to alternatives like high-pressure grinding rolls.
The pyroprocessing stage of cement manufacturing involves heating raw materials in a kiln to produce clinker. This is done using various kiln systems that transfer heat from hot exhaust gases to preheat the raw materials. Early systems included wet and long dry kilns, while improved systems like the Lepol and cyclone preheater kilns transfer heat more efficiently using mechanisms like traveling grate preheaters and cyclone separators to further reduce fuel consumption and increase production rates. The pyroprocessing stage is critical as it determines the clinker composition and involves the most operating costs.
RCA is a process analysis method used to identify the underlying causes of adverse events. It aims to determine what happened, why it occurred, and how to prevent recurrence. RCA investigations focus on systems and processes rather than individual performance. They should be initiated promptly for high risk incidents and completed within 2 months. The major steps include defining the problem, mapping timelines, identifying causes, developing recommendations, and writing a report with conclusions and risk reduction plans. The goal is to convert causal findings into actions to mitigate risks and ensure safety.
The document provides an example to illustrate how design of experiments (DOE) can be used to determine the sensitivity of an amplifier design to process variations. In the example, DOE is used to study the effects of three design elements - width (W), resistor (R), and capacitor (C) - on amplifier gain. Eight experiments are run by assigning values of -1 and +1 to represent variations in each element. Main effects and interaction plots are generated to determine that the resistor has the largest impact on gain variability compared to the other elements. DOE helps identify sensitive parts of the design that can then be adjusted to improve robustness and yield.
The document discusses process capability analysis using Cp and Cpk metrics. Cp measures how well a process fits within specification limits, while Cpk also considers centering. A Cpk over 1 indicates a capable and centered process. Values over 1.33 represent 4-sigma capability. Cp over 1 but Cpk below 1 means the process is capable but not centered. Histograms and Cp, Cpk values are used to compare baseline and improved processes.
The document discusses various statistical process control charts for detecting small shifts in processes, including CUSUM (Cumulative Sum) charts and EWMA (Exponentially Weighted Moving Average) charts. CUSUM charts cumulatively sum process deviations and are more effective at detecting small shifts than Shewhart charts. EWMA charts create a weighted average of recent process data where older data is downweighted using an exponential factor. Both CUSUM and EWMA charts can be designed using maximum likelihood estimation and control limits to balance type I and type II error rates for detecting shifts of a given size.
This document provides instructions for calibrating volumetric glassware used in a chemistry lab. Students will calibrate a 50 mL buret and volumetric pipet by weighing the amount of water delivered at a measured temperature. They will determine the "true volume" delivered and record the values to account for any errors in the glassware markings. Proper technique and care of glassware is emphasized to obtain precise quantitative measurements.
1) Bomb calorimetry is used to determine the heat of combustion and enthalpy of formation of substances by completely combusting samples in a sealed bomb surrounded by water and measuring the temperature change.
2) Standards like benzoic acid are combusted to determine the calorimeter constant, then samples like sucrose are combusted to calculate their enthalpies of formation.
3) Food products are also combusted to determine their energy content in kJ/gram, which can be compared to labeled calorie contents.
This document describes a procedure to use bomb calorimetry to measure the energy of combustion of stearic acid, as a model for camel fat. It aims to determine the molar enthalpy of combustion of stearic acid and estimate the amount of metabolic water produced from oxidizing the fat stored in a camel's hump. The experiment involves calibrating the bomb calorimeter with benzoic acid, then combusting samples of stearic acid to determine its energy of combustion. Calculations are made to find thermodynamic properties and estimate the energy stored and water produced by a camel oxidizing the fat in its hump.
Bomb calorimetry involves measuring the heat of combustion (calorific value) of fuels and other combustible materials. A bomb calorimeter consists of (1) a bomb vessel where combustion occurs, (2) a bucket containing water for absorbing heat, (3) an insulating jacket, and (4) a thermometer. The heat released by combusting a sample in oxygen is absorbed by water in the bucket, causing its temperature to rise. By comparing this rise to that of a standard material, the sample's calorific value can be calculated after applying corrections. Safety precautions must be followed when using high-pressure oxygen.
Practice guide stopwatches and timer calibrationsIngrid McKenzie
This document provides guidance on calibrating stopwatches and timers. It describes common timing devices that require calibration, such as stopwatches and timers. It discusses specifications and tolerances, outlining how to interpret manufacturer specifications and determine required tolerances for legal metrology. The document introduces various calibration methods, including direct comparison, totalize, and time base methods. It provides procedures for each method and examines associated measurement uncertainties. The guidance aims to assist metrologists in properly calibrating timing devices.
This document provides technical information about Pyrex glassware. It describes the chemical composition and physical properties of Pyrex glass. It also discusses standards that Pyrex glass meets and its resistance to heat, acids, bases, and hydrolysis. Quickfit glassware components like conical joints and screwthreads are also outlined.
This document describes using a bomb calorimeter to measure the energy of combustion of stearic acid, as a model for camel fat. The experiment aims to determine the molar enthalpy of combustion of stearic acid in order to estimate the amount of metabolic water produced from oxidizing fat stored in a camel's hump. The procedure involves calibrating the calorimeter with benzoic acid and then performing combustion runs with stearic acid samples. Calculations are done to determine energy and enthalpy of combustion values, which can provide insight into the role of a camel's hump fat in energy storage and water production.
Travis Hills' Endeavors in Minnesota: Fostering Environmental and Economic Pr...Travis Hills MN
Travis Hills of Minnesota developed a method to convert waste into high-value dry fertilizer, significantly enriching soil quality. By providing farmers with a valuable resource derived from waste, Travis Hills helps enhance farm profitability while promoting environmental stewardship. Travis Hills' sustainable practices lead to cost savings and increased revenue for farmers by improving resource efficiency and reducing waste.
ESR spectroscopy in liquid food and beverages.pptxPRIYANKA PATEL
With increasing population, people need to rely on packaged food stuffs. Packaging of food materials requires the preservation of food. There are various methods for the treatment of food to preserve them and irradiation treatment of food is one of them. It is the most common and the most harmless method for the food preservation as it does not alter the necessary micronutrients of food materials. Although irradiated food doesn’t cause any harm to the human health but still the quality assessment of food is required to provide consumers with necessary information about the food. ESR spectroscopy is the most sophisticated way to investigate the quality of the food and the free radicals induced during the processing of the food. ESR spin trapping technique is useful for the detection of highly unstable radicals in the food. The antioxidant capability of liquid food and beverages in mainly performed by spin trapping technique.
Remote Sensing and Computational, Evolutionary, Supercomputing, and Intellige...University of Maribor
Slides from talk:
Aleš Zamuda: Remote Sensing and Computational, Evolutionary, Supercomputing, and Intelligent Systems.
11th International Conference on Electrical, Electronics and Computer Engineering (IcETRAN), Niš, 3-6 June 2024
Inter-Society Networking Panel GRSS/MTT-S/CIS Panel Session: Promoting Connection and Cooperation
https://www.etran.rs/2024/en/home-english/
Or: Beyond linear.
Abstract: Equivariant neural networks are neural networks that incorporate symmetries. The nonlinear activation functions in these networks result in interesting nonlinear equivariant maps between simple representations, and motivate the key player of this talk: piecewise linear representation theory.
Disclaimer: No one is perfect, so please mind that there might be mistakes and typos.
dtubbenhauer@gmail.com
Corrected slides: dtubbenhauer.com/talks.html
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...Sérgio Sacani
Context. With a mass exceeding several 104 M⊙ and a rich and dense population of massive stars, supermassive young star clusters
represent the most massive star-forming environment that is dominated by the feedback from massive stars and gravitational interactions
among stars.
Aims. In this paper we present the Extended Westerlund 1 and 2 Open Clusters Survey (EWOCS) project, which aims to investigate
the influence of the starburst environment on the formation of stars and planets, and on the evolution of both low and high mass stars.
The primary targets of this project are Westerlund 1 and 2, the closest supermassive star clusters to the Sun.
Methods. The project is based primarily on recent observations conducted with the Chandra and JWST observatories. Specifically,
the Chandra survey of Westerlund 1 consists of 36 new ACIS-I observations, nearly co-pointed, for a total exposure time of 1 Msec.
Additionally, we included 8 archival Chandra/ACIS-S observations. This paper presents the resulting catalog of X-ray sources within
and around Westerlund 1. Sources were detected by combining various existing methods, and photon extraction and source validation
were carried out using the ACIS-Extract software.
Results. The EWOCS X-ray catalog comprises 5963 validated sources out of the 9420 initially provided to ACIS-Extract, reaching a
photon flux threshold of approximately 2 × 10−8 photons cm−2
s
−1
. The X-ray sources exhibit a highly concentrated spatial distribution,
with 1075 sources located within the central 1 arcmin. We have successfully detected X-ray emissions from 126 out of the 166 known
massive stars of the cluster, and we have collected over 71 000 photons from the magnetar CXO J164710.20-455217.
When I was asked to give a companion lecture in support of ‘The Philosophy of Science’ (https://shorturl.at/4pUXz) I decided not to walk through the detail of the many methodologies in order of use. Instead, I chose to employ a long standing, and ongoing, scientific development as an exemplar. And so, I chose the ever evolving story of Thermodynamics as a scientific investigation at its best.
Conducted over a period of >200 years, Thermodynamics R&D, and application, benefitted from the highest levels of professionalism, collaboration, and technical thoroughness. New layers of application, methodology, and practice were made possible by the progressive advance of technology. In turn, this has seen measurement and modelling accuracy continually improved at a micro and macro level.
Perhaps most importantly, Thermodynamics rapidly became a primary tool in the advance of applied science/engineering/technology, spanning micro-tech, to aerospace and cosmology. I can think of no better a story to illustrate the breadth of scientific methodologies and applications at their best.
Phenomics assisted breeding in crop improvementIshaGoswami9
As the population is increasing and will reach about 9 billion upto 2050. Also due to climate change, it is difficult to meet the food requirement of such a large population. Facing the challenges presented by resource shortages, climate
change, and increasing global population, crop yield and quality need to be improved in a sustainable way over the coming decades. Genetic improvement by breeding is the best way to increase crop productivity. With the rapid progression of functional
genomics, an increasing number of crop genomes have been sequenced and dozens of genes influencing key agronomic traits have been identified. However, current genome sequence information has not been adequately exploited for understanding
the complex characteristics of multiple gene, owing to a lack of crop phenotypic data. Efficient, automatic, and accurate technologies and platforms that can capture phenotypic data that can
be linked to genomics information for crop improvement at all growth stages have become as important as genotyping. Thus,
high-throughput phenotyping has become the major bottleneck restricting crop breeding. Plant phenomics has been defined as the high-throughput, accurate acquisition and analysis of multi-dimensional phenotypes
during crop growing stages at the organism level, including the cell, tissue, organ, individual plant, plot, and field levels. With the rapid development of novel sensors, imaging technology,
and analysis methods, numerous infrastructure platforms have been developed for phenotyping.
Describing and Interpreting an Immersive Learning Case with the Immersion Cub...Leonel Morgado
Current descriptions of immersive learning cases are often difficult or impossible to compare. This is due to a myriad of different options on what details to include, which aspects are relevant, and on the descriptive approaches employed. Also, these aspects often combine very specific details with more general guidelines or indicate intents and rationales without clarifying their implementation. In this paper we provide a method to describe immersive learning cases that is structured to enable comparisons, yet flexible enough to allow researchers and practitioners to decide which aspects to include. This method leverages a taxonomy that classifies educational aspects at three levels (uses, practices, and strategies) and then utilizes two frameworks, the Immersive Learning Brain and the Immersion Cube, to enable a structured description and interpretation of immersive learning cases. The method is then demonstrated on a published immersive learning case on training for wind turbine maintenance using virtual reality. Applying the method results in a structured artifact, the Immersive Learning Case Sheet, that tags the case with its proximal uses, practices, and strategies, and refines the free text case description to ensure that matching details are included. This contribution is thus a case description method in support of future comparative research of immersive learning cases. We then discuss how the resulting description and interpretation can be leveraged to change immersion learning cases, by enriching them (considering low-effort changes or additions) or innovating (exploring more challenging avenues of transformation). The method holds significant promise to support better-grounded research in immersive learning.
Unlocking the mysteries of reproduction: Exploring fecundity and gonadosomati...AbdullaAlAsif1
The pygmy halfbeak Dermogenys colletei, is known for its viviparous nature, this presents an intriguing case of relatively low fecundity, raising questions about potential compensatory reproductive strategies employed by this species. Our study delves into the examination of fecundity and the Gonadosomatic Index (GSI) in the Pygmy Halfbeak, D. colletei (Meisner, 2001), an intriguing viviparous fish indigenous to Sarawak, Borneo. We hypothesize that the Pygmy halfbeak, D. colletei, may exhibit unique reproductive adaptations to offset its low fecundity, thus enhancing its survival and fitness. To address this, we conducted a comprehensive study utilizing 28 mature female specimens of D. colletei, carefully measuring fecundity and GSI to shed light on the reproductive adaptations of this species. Our findings reveal that D. colletei indeed exhibits low fecundity, with a mean of 16.76 ± 2.01, and a mean GSI of 12.83 ± 1.27, providing crucial insights into the reproductive mechanisms at play in this species. These results underscore the existence of unique reproductive strategies in D. colletei, enabling its adaptation and persistence in Borneo's diverse aquatic ecosystems, and call for further ecological research to elucidate these mechanisms. This study lends to a better understanding of viviparous fish in Borneo and contributes to the broader field of aquatic ecology, enhancing our knowledge of species adaptations to unique ecological challenges.
Unlocking the mysteries of reproduction: Exploring fecundity and gonadosomati...
Laboratory Balance: How They Work, Checking Their Accuracy
Precision weighing is a necessity in laboratories of all types. Accuracy exceeding one part per million is now commonplace for masses in the range of 1 gram to 1 kilogram. Although the requirement for highly accurate weighing for the preparation of reagents, standards, and calibrators has diminished, many laboratories use precision balances for periodic checking of the accuracy of mechanical pipettors.
Principle of Operation
The principle of operation of a modern laboratory balance bears some resemblance to that of its predecessor, the equal-arm balance. The older instrument opposed the torque exerted by an unknown mass on one side of a pivot to that of an adjustable known weight on the other side. When the pointer returned to the center position, the torques were equal, and the weight was determined by the position of the moving weights.
The modern equivalent is called magnetic force restoration. In this system, the force exerted by the object being weighed is lifted by an electromagnet. The typical mechanism consists of a coil of wire suspended in a magnetic field. Because the magnetic field is radially oriented relative to the coil, the direction of current flow and the direction of the magnetic field are perpendicular at all points, so the force exerted is in the direction of the axis of the coil. The coil is supported by precision springs, which allow it to move in the direction of its axis. An optical sensor detects the position of the coil and provides a feedback signal to an electronic amplifier. The amplifier automatically adjusts the current to maintain the position of the coil at a reference position (the "null" point). The force exerted by the coil is directly proportional to the current flowing in it, so by measuring that current, the force can be calculated.
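Because the restoring force is proportional to coil current, the balance in effect measures mass by measuring current. A minimal numerical sketch of that proportionality follows; the field strength, coil wire length, and 100 g example are invented for illustration, and real instruments also apply calibration corrections rather than relying on nominal constants:

```python
# Sketch of the current-to-force relationship in magnetic force restoration.
# For a coil of total wire length L in a radial field of flux density B,
# the axial force is F = B * I * L, so the servo current I that holds the
# coil at the null point is proportional to the weight on the pan.
# All numeric values here are assumptions for illustration, not instrument data.

G = 9.80665          # standard gravity, m/s^2
B_FIELD = 0.5        # radial flux density in the magnet gap, tesla (assumed)
WIRE_LENGTH = 10.0   # total length of wire in the coil, meters (assumed)

def current_for_mass(mass_kg):
    """Servo current (A) needed to lift a given mass at the null point."""
    force = mass_kg * G                 # downward force to oppose, newtons
    return force / (B_FIELD * WIRE_LENGTH)

def mass_from_current(current_a):
    """What the balance actually does: infer mass from the measured current."""
    return current_a * B_FIELD * WIRE_LENGTH / G

i = current_for_mass(0.100)  # holding a 100 g object
print(f"current: {i:.4f} A, inferred mass: {mass_from_current(i) * 1000:.3f} g")
```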
The equal-arm balance opposed 2 counteracting torques and returned a needle deflection to a reference position; the electromagnetic force restoration system instead opposes the linear force exerted by the unknown with the adjustable, known linear force exerted by the coil at a specific spatial position.
Checking Accuracy
The College of American Pathologists (CAP) Laboratory Inspection Program requires that "the verification of accuracy of the analytical balance must be performed each time it is used for the creation of analytical calibrators and/or weighed-in controls from standard materials," as well as when gravimetrically checking the accuracy of pipets.2
Checking a balance is very similar in concept to the checks we do in the laboratory for other analyses. There are 4 components to the testing of a precision laboratory balance: reproducibility, linearity, calibration, and cornerload.

Reproducibility refers to the instrument's ability to repeatedly deliver the same weight reading for a given object. It is expressed as a standard deviation. Standard deviation, or reproducibility, is often an advertised performance specification for a laboratory balance.
Linearity is the characteristic that quantifies the accuracy of the instrument at intermediate readings throughout its weighing range. The weighing range of the instrument is similar to the analytical measurement range (AMR) of a laboratory test. Since a laboratory balance will often be used to weigh items much smaller than the capacity of the instrument, this is a critical aspect.
Calibration refers to a comparison between the weight reading of a given mass standard and the actual value of that standard. This measurement is often done at full capacity.
Cornerload errors are errors that depend on where on the weighing pan the object is placed. A given object should produce the same reading regardless of its position on the pan. This type of error is unlike those usually seen in the clinical laboratory and is unique to the analytical balance.
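A cornerload check is conceptually simple: the same weight is read at the pan center and near each corner, and every reading should agree within a tolerance. A minimal sketch of that bookkeeping follows; the readings and the acceptance limit are fabricated for illustration, not values from this article:

```python
# Cornerload (eccentric-load) check: one weight, five pan positions.
# The readings (grams) and the acceptance limit below are fabricated examples.

readings_g = {
    "center":      100.0000,
    "front-left":  100.0001,
    "front-right":  99.9999,
    "back-left":   100.0000,
    "back-right":  100.0002,
}

center = readings_g["center"]
# Largest deviation of any off-center reading from the center reading.
max_error = max(abs(v - center) for pos, v in readings_g.items() if pos != "center")

CORNERLOAD_TOLERANCE = 0.0003  # grams -- assumed acceptance limit
print(f"max cornerload error: {max_error * 1000:.2f} mg "
      f"({'PASS' if max_error <= CORNERLOAD_TOLERANCE else 'FAIL'})")
```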
Douglas Morse, BS,1 Daniel M. Baer, MD2
1IES Corporation, 2VA Medical Center, Oregon Health and Sciences University, Portland, OR
DOI: 10.1309/QYR5UV73FRY2YBMJ

- Modern electronic laboratory balances work on the principle of magnetic force restoration. In this system, the force exerted by the object being weighed is lifted by an electromagnet. A detector measures the current required to oppose the downward motion of the weight in the magnetic field.
- Accuracy checks of a balance involve testing for reproducibility, linearity, calibration, and cornerload.

After reading this article, the reader should understand the principle of operation of the modern electronic balance and how to check the balance's accuracy.

Generalist exam 90401 questions and the corresponding answer form are located after the "Your Lab Focus" section on p. 59.

How to Check Balance Accuracy

The tests for balance accuracy are similar to those done in the clinical laboratory for checking the accuracy and precision of analytical tests. However, because balances are special mechanical devices, there are some differences in the details of the reliability tests. Complete instructions with forms for collecting the test data1 can be found at www.labbalancerepair.com/test.doc.
Reproducibility Testing
Reproducibility testing entails repeatedly weighing a given object, recording the results, and analyzing those results. A test weight equal, or nearly equal, to the weighing capacity of the instrument should be selected. Twenty pairs of readings should be taken for 2 data sets: "full-scale reading" and "zero reading." A detailed procedure is as follows.
1. Tare the instrument to read all zeros. Do not record the initial zero reading.
2. Place the test weight on the pan. Record the reading in the column labeled "FULL SCALE READING."
3. Remove the weight (DO NOT REZERO), and record the reading under "ZERO READING."
4. Repeat steps 2 and 3 until lines 1 through 20 are all filled in.
5. Transcribe the 2 columns of numbers into a spreadsheet or QM statistical program.
6. Use the program to calculate the standard deviation and coefficient of variation (CV) of both columns of numbers.
7. Calculated standard deviations larger than allowed in the instrument specifications indicate that the instrument is either operating in an unstable environment (static, air draft, warm-up, vibration, etc.) or is in need of repair.
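The arithmetic in steps 5 through 7 can be done in any spreadsheet; as a sketch, the same calculation in Python looks like this. The 20 readings are fabricated example data, and the 0.1 mg specification is an assumed value, not one from this article; the zero-reading column would be processed the same way:

```python
import statistics

# 20 full-scale readings (grams) from repeatedly weighing the same test weight.
# These numbers are fabricated for illustration only; in practice, transcribe
# the "FULL SCALE READING" column recorded in steps 2-4.
full_scale = [100.0001, 99.9999, 100.0000, 100.0002, 99.9998,
              100.0001, 100.0000, 99.9999, 100.0000, 100.0001,
              99.9998, 100.0002, 100.0000, 100.0001, 99.9999,
              100.0000, 100.0000, 100.0001, 99.9999, 100.0000]

sd = statistics.stdev(full_scale)              # sample standard deviation, g
cv = 100 * sd / statistics.mean(full_scale)    # coefficient of variation, %

SPEC_SD = 0.0001  # assumed manufacturer reproducibility spec: 0.1 mg
print(f"SD = {sd:.6f} g, CV = {cv:.6f} %")
if sd > SPEC_SD:
    print("Out of spec: check the environment (static, draft, warm-up, "
          "vibration) or have the balance serviced.")
```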
Linearity Testing
Linearity testing verifies the accuracy of the instrument at intermediate values of weight. Manufacturers often use the term "accuracy" in advertised specifications. This test is quite different from linearity testing in the clinical laboratory, where we make a series of measurements over the AMR of the test. The balance linearity test measures the ability of the balance to accurately measure an added weight before and after a non-measured weight load has been placed on the balance. The procedure is as follows.
1. Use 2 weights, each of approximately one-half the weighing capacity of the instrument. It is imperative that these 2 weights not be interchanged within this procedure. Refer to the individual weights as "weight A" and "weight B."
2. Rezero the display. Place "A" on the pan (at the center), and record the reading on the "Linearity Chart" in a column marked "0%–50%."
3. Remove "A" and place "B" on the pan near its center. Rezero the display with "B" still on the pan.
4. Again place "A" on the pan with "B" still on the pan. Record the reading under the column marked "50%–100%."
5. Calculate the difference between the 2 readings (0%–50% and 50%–100%).
6. The difference should be less than the advertised tolerance for linearity or accuracy.
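As a sketch, the bookkeeping in steps 2 through 6 reduces to a single subtraction. The two readings and the 0.2 mg tolerance below are invented for illustration; use your instrument's advertised linearity specification:

```python
# Linearity check: weight A is read once alone (0%-50% of capacity) and once
# on top of preload B after re-zeroing (50%-100%). A perfectly linear balance
# gives the same reading for A both times. Numbers are fabricated examples.

reading_0_to_50 = 100.0001   # grams: A alone, after taring the empty pan
reading_50_to_100 = 100.0000 # grams: A again, after taring with B as preload

difference = abs(reading_0_to_50 - reading_50_to_100)

LINEARITY_TOLERANCE = 0.0002  # grams (0.2 mg) -- assumed spec, not from the article
print(f"linearity difference: {difference * 1000:.2f} mg "
      f"({'PASS' if difference <= LINEARITY_TOLERANCE else 'FAIL'})")
```

Note that the true values of weights A and B never enter the calculation; only the repeatability of reading A with and without the preload matters.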
A common error in linearity (accu-
racy) testing is to simply place test
weights on the weighing pan and observe
the difference between the indicated
weight and the nominal value of the
test weight. This process fails to
account for the fact that test weights
are imperfect and that the difference
between the nominal value and the ac-
tual weight might be significant. This is
especially true with analytical balances,
where the balance may be more accu-
rate than any standard test weight. The
above procedure nullifies this problem
by comparing the weight readings of
the same object, both with and without
a preload. The accuracy of the test
weight is thus immaterial.
Calibration
Calibration of instruments is differ-
ent from model to model and manufac-
turer to manufacturer. Generally, it is a
simple procedure described in the user’s
manual which comes with the
instrument.
Many instruments now include in-
ternal calibration weights, so calibration
is as easy as pushing a single button. The
user may ask, “How do I know the inter-
nal calibration weight is correct?” The
answer is that the only way to know is to
have an external standard for compari-
son, as required by CAP LAP Checklist
question AGC.27540. The advantage of
an external standard, whether it is used
to calibrate the instrument or just to
check the internal weight, is that it can
be checked by a calibration laboratory
and traced to national standards.
CAP LAP Checklist question
AGC.27540 (a Phase II deficiency) re-
quires that laboratories have the appro-
priate ASTM class weights for accuracy
testing (the same as checking calibra-
tion as described in this paper). Stan-
dardized test weights are made to
various levels of accuracy. CAP’s re-
quirements for the use of the appropri-
ate ASTM class of weights are: “ASTM
Class 1 weights are appropriate for cali-
brating high precision analytical bal-
ances (0.01 to 0.1 mg). ASTM Class 2
weights are appropriate for calibrating
high precision top-loading balances
with readabilities from 0.001 to 0.01 g.
ASTM Class 3 weights are appropriate
for calibrating moderate precision bal-
ances, from 0.01 to 0.1 g.”2
ASTM class 1 is the most accurate
weight class commonly available. Most
weights in ASTM class 1 are accurate
to 1 part in 400,000. Since analytical
and microbalances are considerably
more accurate (resolution finer than
1 part per million of capacity), one
might wonder how a
standard weight can be used to test or
calibrate analytical balances. The an-
swer would be that the standard weight
can itself be calibrated by a laboratory
which specializes in that service. The
characterization of the weight by such a
laboratory will determine the actual
value of the weight to a much higher
degree of precision than required by the
ASTM standard. Such a weight can be
used to verify the accuracy of an inter-
nal calibration weight.
After placing a test weight on an
operating instrument and finding that
the displayed weight value does not
exactly match the value of the test
weight, many users have concluded that
the instrument is miscalibrated. How-
ever, that conclusion is by no means
certain unless the test weight has been
calibrated and its correction from nomi-
nal value is known. The calibration
weights internal to high quality labora-
tory balances are more accurate than
commonly available test weights. In the
absence of actual calibration data for
specific weights, users should presume
that internal weights are more accurate
than external test weights. We therefore
disagree with the CAP LAP policy of
using ASTM class 1 weights for cali-
brating analytical (0.01 to 0.1 mg reso-
lution) balances. ASTM class 1 weights
are less accurate than most analytical
balances. Many top-loading balances
would be poorly calibrated using class
2 weights for the same reason.
If the balance’s internal calibration is
more accurate than the marked value of
the test weights, the user should check
the calibration of the balance as
described below. Why is it required to
check the calibration each time analytical
standards or calibrators are prepared or
pipettors are checked for accuracy? This
is a valid question. However, the require-
ment does exist, and laboratories must
have documentation that the checking has
occurred. One suggestion is to perform
the following “calibration verification”
test, which is referred to as “Accuracy
Checking” in the CAP Checklist.
1. Select 5 or 6 weights over the AMR
of the balance.
T1. Typical Cornerload Tolerances

Laboratory Environment
                       Resolution
Capacity   0.1 g   0.01 g   0.001 g   0.0001 g   0.00001 g
30 g         -       -        -          2           5
100 g        -       -        2          4          10
300 g        -       2        4         10           -
1000 g       2       4       10          -           -
3000 g       3       6        -          -           -
10 kg        4       -        -          -           -
30 kg       12       -        -          -           -

Industrial Environment
                       Resolution
Capacity   0.1 g   0.01 g   0.001 g   0.0001 g   0.00001 g
30 g         -       -        -          4          10
100 g        -       -        4          8          20
300 g        -       4        8         20           -
1000 g       4       8       20          -           -
3000 g       6      12        -          -           -
10 kg        8       -        -          -           -
30 kg       24       -        -          -           -

The value for each combination of balance capacity and resolution is the greatest allowable difference between the
highest and lowest readings of the cornerload test weights, expressed in the rightmost digits of the balance readout.
2. Tare the balance so that its reading is
zero.
3. Weigh each of the standard weights
and record the observed weight in a
log. Use the same weights from the
same weight set each time this test is
performed.
4. Compare the observed weights with
previous weighings.
5. The results are judged to be satisfac-
tory if there has been no significant
shift of the observed weights of the
weights. If a single weight does not
agree, reweigh it and confirm that
there has not been a substitution of
weights from a previous calibration
verification.
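Steps 4 and 5 amount to a drift check against the previous log entries for the same weight set. The sketch below uses hypothetical observed weights and an assumed shift tolerance; in practice the acceptable shift would come from the laboratory's own quality criteria.

```python
# Calibration verification, steps 4-5: compare today's observed weights
# against the previous log for the same weight set.
# Nominal values, readings, and tolerance are illustrative (hypothetical).

previous = {1.0: 1.0001, 10.0: 10.0002, 50.0: 50.0001, 100.0: 100.0003}  # nominal -> observed (g)
today    = {1.0: 1.0001, 10.0: 10.0003, 50.0: 50.0000, 100.0: 100.0004}

shift_tolerance = 0.0003  # maximum acceptable shift between verifications (g, assumed)

for nominal, prev_obs in previous.items():
    shift = abs(today[nominal] - prev_obs)
    status = "OK" if shift <= shift_tolerance else "RECHECK"
    print(f"{nominal:7.1f} g weight: shift = {shift:.4f} g  [{status}]")
```

A single disagreeing weight would be reweighed, per step 5, to rule out a substitution of weights.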
If one has serious concerns about
the calibration of the balance, a quali-
fied balance repair service should be
consulted.
Cornerload Testing
Cornerload testing verifies that
the instrument delivers the same
weight reading, regardless of where
on the weighing pan the object being
weighed is placed. Cornerload per-
formance specifications are often not
advertised. Typical tolerances are
shown in T1.
1. Select a test weight close to the
weighing capacity of the instrument.
2. Place the test weight in the center of
the weighing pan. Then re-zero the
display.
3. Move the weight halfway from
the center to the front edge of the
pan. Record the reading on the “Cor-
nerload Chart” under the heading
“FRONT.”
4. Repeat step 3 at the halfway loca-
tions for the right, rear, and left
edges, recording the readings in the
appropriate spaces in the chart.
5. Cornerload tolerances are often not a
component of advertised specifica-
tions. T1 shows typical tolerances for
instruments operating in both labora-
tory and industrial conditions. The
laboratory environment presumes a
specially built, leveled, rigid table
and uninterrupted temperature con-
trol within 1°F. The industrial envi-
ronment includes a sturdy table and
uninterrupted temperature control
within 4°F.
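The pass/fail judgment against T1 can be sketched as below. The 4 position readings are hypothetical; the tolerance shown is the T1 laboratory-environment value for a 300 g capacity, 0.001 g resolution balance (4 digits, i.e., 4 × 0.001 g).

```python
# Cornerload test, step 5: the spread of the four position readings must not
# exceed the T1 tolerance, expressed in rightmost digits of the readout.
# Position readings are illustrative values in grams (hypothetical).

resolution = 0.001    # g per displayed digit
tolerance_digits = 4  # from T1: laboratory environment, 300 g / 0.001 g

readings = {"FRONT": 0.001, "RIGHT": -0.001, "REAR": 0.002, "LEFT": 0.000}  # after zeroing at center

spread = max(readings.values()) - min(readings.values())
passes = spread <= tolerance_digits * resolution
print(f"spread = {spread:.3f} g, passes cornerload test: {passes}")
```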
Handling Test Weights
The weights used to test laboratory
balances are precision devices and need
to be handled accordingly. When han-
dling weights, avoid direct hand contact
with weights by using clean gloves or
special lifting tools. Hand contact with
the weights can cause corrosion. Also
avoid sliding weights across any surface,
especially across the stainless steel
weighing pan of the balance under test.
Weights should be kept in a covered, pro-
tected box. CAP LAP Checklist question
AGC.27560 (Phase II deficiency) says
“Weights must be well-maintained (cov-
ered when not in use, not corroded).”
Environmental Conditions for
Best Weighing Accuracy
In order to pass any test of repro-
ducibility an instrument must be operat-
ing in an acceptable environment. A
poor environment will degrade the re-
sults of a standard deviation (SD) test
and falsely suggest that the performance
is substandard. There are several aspects
of the environment which impact the
performance of a laboratory balance.
Temperature
The accuracy and overall perform-
ance of any laboratory balance is af-
fected by the room temperature. For
best stability and performance the room
temperature should be regulated to
within 1°F without interruption. The
instrument should remain with power
ON continuously.
Air Drafts
For measurements at resolutions of
0.001 g and finer, the force exerted
by moving air is readily detectable.
A shroud or enclosure around
the weighing pan will shield the pan
from these effects. Avoid plastic materi-
als for draft shields because of potential
static electricity interference problems.
Static Electricity
Static electricity exerts a mechani-
cal force which is readily detectable by
analytical and microbalances. An ex-
ample of static electricity exerting a
mechanical force would be lint sticking
to clothing. Static will be a problem
when it exists on the object being
weighed, on the person using the bal-
ance, on draft shields, or on weighing
vessels. Sources of static are carpets,
Vibram shoe soles, plastic draft shields,
plastic weighing vessels, and melamine
(Formica) table tops. Low ambient hu-
midity exacerbates static problems.
You can test for a static problem
easily. On an analytical balance place a
metal enclosure (a coffee can works
well) over the weighing pan, so that the
pan is enclosed by the can but NOT
touched by it. If the weight readings
stabilize with the can in place, then
static may be the cause of the instabil-
ity. Notice that the coffee can provides
an effective draft shield too.
Floor Vibration/Table
Instability
Many laboratory balances are ex-
tremely sensitive to vibration or move-
ment. If the weight readings change as
you walk around the instrument, or if
the readings change as you lean on the
table or move objects on the table, then
the table and floor are affecting weight
readings. You can minimize these ef-
fects by using an especially sturdy
table and minimizing movement. Users
of microbalances often need specially
built marble tables on concrete floors.
Conclusions
Although the high precision labo-
ratory balance is used less now than in
the past when many analytical
reagents, standards, and calibrators
were made rather than purchased, the
balance is still an important laboratory
instrument. Periodic checking of its
accuracy and precision is an important
part of the laboratory’s quality manage-
ment program. The procedures in this
paper are straightforward ways to
check on the balance’s performance.
References
1. Testing Your Laboratory Balance. IES Corporation. Available at: www.labbalancerepair.com/test.doc. Accessed November 13, 2003.
2. CAP LAP Checklist question AGC.27540. College of American Pathologists. Available at: http://www.cap.org/lap/accstandards.html. Accessed November 13, 2003.