The peer-reviewed International Journal of Engineering Inventions (IJEI) was launched with a mission to encourage contributions to research in science and technology and to motivate researchers working in challenging areas of the sciences.
UV Spectrophotometric Method Development and Validation for Quantitative Esti... — Sagar Savale
UV spectrophotometric method development and validation for the quantitative estimation of miconazole nitrate (MIC). UV spectrophotometric methods have been widely employed for the determination of individual components in a mixture or fixed-dose combination. Our aim was to develop a spectroscopic method for the estimation of miconazole nitrate (MIC) in a ternary mixture using UV spectrophotometry. The method was validated per ICH guidelines, and recovery studies confirmed its accuracy and precision. It was successfully applied to the analysis of the drug in bulk and could be used effectively for routine analysis.
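As a sketch of the calibration step such a UV method relies on, the following snippet fits a Beer-Lambert calibration line and back-calculates an unknown concentration. The absorbance values are made up for illustration, not data from the study.

```python
import numpy as np

# Hypothetical absorbance readings for MIC standards in the Beer-Lambert
# (linear) region; concentrations in ug/ml, absorbances are illustrative only.
conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
absorbance = np.array([0.11, 0.22, 0.33, 0.44, 0.55])

# Linear least-squares fit: A = slope * C + intercept
slope, intercept = np.polyfit(conc, absorbance, 1)

# Correlation coefficient to check linearity (ICH expects r close to 1)
r = np.corrcoef(conc, absorbance)[0, 1]

# Back-calculate the concentration of an unknown sample from its absorbance
unknown_abs = 0.275
unknown_conc = (unknown_abs - intercept) / slope
print(round(unknown_conc, 2))  # 5.0
```

Recovery studies of the kind mentioned above amount to running known spiked concentrations through this back-calculation and comparing against the nominal amount.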
UV Spectrophotometric Method Development and Validation for Quantitative Esti... — Sagar Savale
UV spectrophotometric methods have been widely employed for the determination of curcumin in a mixture or fixed-dose combination. No spectrophotometric method has yet been reported for evaluating the ternary mixture containing curcumin; our aim was therefore to develop a method for curcumin estimation in a ternary mixture using UV spectrophotometry.
This document describes two spectrophotometric methods (Methods A and B) developed for the quantification of etoricoxib (ETX) in tablets. Method A is an area-under-curve method using integrated absorbance values between 215 and 244 nm. Method B is a first-derivative spectroscopy method measuring the amplitude at 236 nm. Both methods showed linearity between 2 and 16 μg/ml of ETX. The proposed methods were validated and found to quantify ETX in tablets accurately, without interference from excipients, as determined through assay and recovery experiments. The methods were concluded to be simple, sensitive, accurate, and useful for the routine analysis of ETX in pharmaceutical formulations.
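The two measurement modes can be illustrated numerically. The snippet below uses a synthetic Gaussian absorption band, not real ETX spectra: it computes an area under the curve between 215 and 244 nm (Method A) and reads a first-derivative amplitude at 236 nm (Method B).

```python
import numpy as np

# Illustrative absorption spectrum on a 1-nm grid: a Gaussian band centred
# near 230 nm (made-up shape standing in for an ETX spectrum).
wavelengths = np.arange(200, 260)  # nm
spectrum = np.exp(-((wavelengths - 230) ** 2) / (2 * 8.0 ** 2))

# Method A: area under the curve between 215 and 244 nm (trapezoidal rule)
mask = (wavelengths >= 215) & (wavelengths <= 244)
x = wavelengths[mask].astype(float)
y = spectrum[mask]
auc = float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

# Method B: first-derivative spectrum dA/dlambda, amplitude read at 236 nm
first_derivative = np.gradient(spectrum, wavelengths)
amplitude_236 = first_derivative[wavelengths == 236][0]
```

In a real method, both `auc` and `amplitude_236` would be regressed against standard concentrations to build the two calibration curves.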
UV Spectrophotometric Method Development and Validation for Quantitative Esti... — Sagar Savale
Aim: UV spectrophotometric method development and validation for the quantitative estimation of diclofenac sodium. Objective: UV spectrophotometric methods have been widely employed for the determination of an analyte in a mixture; our aim was to develop a spectroscopic method for the estimation of diclofenac sodium in a ternary mixture using UV spectrophotometry. Methodology: The method was validated per ICH guidelines, and recovery studies confirmed its accuracy and precision. Conclusion: The method was successfully applied to the analysis of the drug in bulk and could be used effectively for routine analysis.
UV Spectrophotometric Method Development And Validation For Quantitative Esti... — Sagar Savale
UV spectrophotometric methods have been widely employed for the determination of halcinonide in a mixture or fixed-dose combination. No spectrophotometric method has yet been reported for evaluating the ternary mixture containing halcinonide; our aim was therefore to develop a method for halcinonide estimation in a ternary mixture using UV spectrophotometry.
Standard practice for characterization of particles (ASTM) — Inkalloys Perú
This standard is issued under the fixed designation F1877; the number immediately following the designation indicates the year of original adoption or, in the case of revision, the year of last revision. A number in parentheses indicates the year of last reapproval. A superscript epsilon (ε) indicates an editorial change since the last revision or reapproval.
This document discusses various NMR techniques for ligand screening in drug discovery. It begins by providing background on the increasing role of NMR in drug research due to its ability to sensitively detect molecular interactions and provide structural information. The document then reviews both ligand-observed and target-observed NMR screening techniques, describing methods based on changes in molecular diffusion, relaxation, and intramolecular or intermolecular magnetization transfer upon ligand binding. Specific techniques discussed include saturation transfer difference (STD) NMR, transferred nuclear Overhauser effect (trNOE), and NOE pumping. The review concludes by noting the dual importance of NMR for drug screening and structure-based drug design.
Investigation of mass flow properties of particles in Silo Dryers — AM Publications
The most relevant question about the mixing systems used in silo dryers is the mixing efficiency of the screw augers. The design goal is for the construction to stir the granular assembly at an optimal level, i.e. mixing should be uniform and the mixed amount as large as possible. Although the mixing process appears unsophisticated, it is a very complex phenomenon. Engineers and researchers working in this field rely mostly on experimental data for design and development, because little is known about what happens around the rotating mixing screw. In our prior work, the mixing efficiency and effective radius were determined [3]. In the present article, we investigate the mixing process using a mass flow rate determined with cylindrical volumes along the vertical axis. To model this phenomenon, we used the EDEM Academic 2.7 discrete element software.
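The cylindrical-volume bookkeeping described above can be sketched as follows. The particle positions here are random stand-ins, not EDEM output, and the cylinder radius, slice height, and particle mass are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in particle snapshot (x, y, z) in metres; in practice these would
# come from a DEM export. Silo height assumed to be 2 m.
positions = rng.uniform(-0.5, 0.5, size=(1000, 3))
positions[:, 2] = rng.uniform(0.0, 2.0, size=1000)

def mass_in_cylinder(pos, radius, z_lo, z_hi, particle_mass=1e-3):
    """Total particle mass inside a cylinder of given radius between two heights."""
    r = np.hypot(pos[:, 0], pos[:, 1])
    inside = (r <= radius) & (pos[:, 2] >= z_lo) & (pos[:, 2] < z_hi)
    return inside.sum() * particle_mass

# Per-slice mass profile along the vertical axis; a mass flow rate would be
# the change of each slice's mass between two snapshots divided by the
# time step between them.
slices = [(z, z + 0.25) for z in np.arange(0.0, 2.0, 0.25)]
profile = [mass_in_cylinder(positions, 0.3, lo, hi) for lo, hi in slices]
```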
UV Spectrophotometric Method Development and Validation for Quantitative Esti... — Sagar Savale
This document describes the development and validation of a UV spectrophotometric method for the quantitative estimation of paracetamol. Paracetamol was found to exhibit maximum absorption at 244 nm in methanol. The method was validated according to ICH guidelines and showed good linearity (R² = 0.9999), recovery (99.78-100.54%), precision (<0.06% RSD), ruggedness (<0.02% RSD), and sensitivity (LOD = 0.37 μg/ml, LOQ = 0.98 μg/ml). The developed method is simple, rapid, economical, and suitable for the analysis of paracetamol in bulk drug samples.
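LOD and LOQ figures in such validations typically follow the ICH Q2 formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of the response and S is the calibration slope. The σ and slope below are assumed values for illustration, not those of the paracetamol study.

```python
# ICH Q2 sensitivity formulas with illustrative inputs:
sigma = 0.004   # SD of blank absorbance (assumed)
slope = 0.05    # calibration slope in absorbance per (ug/ml) (assumed)

lod = 3.3 * sigma / slope   # limit of detection, ug/ml
loq = 10.0 * sigma / slope  # limit of quantitation, ug/ml
print(round(lod, 2), round(loq, 2))  # 0.26 0.8
```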
The document discusses particle characterization techniques, focusing on the Coulter Principle, which is well suited to measuring ink toners in the 400 nm to 1.7 mm size range. It provides details on how the Coulter Principle works by measuring the electrical impedance of individual particles, and its advantages over other techniques for applications in the toner industry, where tight control of particle size distribution is important for product quality and performance.
This document provides an overview of sample preparation techniques for X-ray fluorescence (XRF) analysis. It discusses preparation of metal, powder, and liquid samples. For metals, the sample surface must be ground flat to remove impurities and obtain consistent roughness. Powder samples are pressed into pellets after pulverization to reduce heterogeneity effects. Liquid samples can be analyzed directly in sample cells or by drying microdroplets on filter paper to measure lighter elements. Proper sample preparation is crucial for obtaining accurate and reproducible XRF analysis results.
spectrophotometric estimation of metformin in bulk and in its dosage forms — saikiranyuvi
The document describes the development and validation of a UV spectrophotometric method for the estimation of metformin in bulk and tablet dosage forms. Key aspects include determining the absorption maximum of 646 nm for metformin and developing a linear calibration curve within the concentration range of 8-16 μg/ml. The method was validated for parameters such as accuracy, precision, LOD, LOQ, and recovery, demonstrating that it is simple, accurate, precise, and suitable for analyzing metformin in pharmaceutical formulations.
UV spectrophotometric method development and validation for quantitative esti... — Sagar Savale
UV spectrophotometric method development and validation for the quantitative estimation of ondansetron hydrochloride (HCl). UV spectrophotometric methods have been widely employed for the determination of individual components in a mixture or fixed-dose combination. Our aim was to develop a spectroscopic method for the estimation of ondansetron HCl in a ternary mixture using UV spectrophotometry. The method was validated per ICH guidelines, and recovery studies confirmed its accuracy and precision. It was successfully applied to the analysis of the drug in bulk and could be used effectively for routine analysis.
A Systematic Approach to Overcome the Matrix Effect during LC-ESI-MS/MS Analysis — Bhaswat Chakraborty
This document discusses matrix effects (MEs) that can occur during LC-MS/MS bioanalytical methods and presents a systematic approach to overcome MEs through different sample extraction techniques. It finds that solid phase extraction produces the cleanest samples with the lowest MEs, while protein precipitation using methanol produces the dirtiest samples with the highest MEs. Different phospholipids are identified as contributing to MEs, with longer-retained phospholipids playing a more significant role. Among extraction methods, solid phase extraction is most effective at removing phospholipids and minimizing MEs, while protein precipitation is least effective.
This document summarizes an article that examines how different ionization source designs in LC-ESI-MS/MS systems can influence matrix effects during analysis. The article analyzes acamprosate (ACM) using two different LC-ESI-MS/MS instruments with different ionization source designs (a Z-spray source and an orthogonal spray source) coupled to UPLC/HPLC systems under the same chromatographic conditions. It finds that ACM showed almost complete ion suppression in the Z-spray source coupled to UPLC/HPLC, but only minor ion enhancement in the orthogonal spray source coupled to HPLC. Different phospholipids were responsible for the matrix effects in each case. The study demonstrates how
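Matrix effect, recovery, and process efficiency in such studies are commonly quantified by comparing peak areas of a neat standard, a blank extract spiked after extraction, and a blank spiked before extraction (in the style of Matuszewski's post-extraction comparison). The peak areas below are illustrative only.

```python
# A = neat standard, B = blank extract spiked after extraction,
# C = blank spiked before extraction. All areas are made-up numbers.
area_neat = 1.00e6         # A
area_post_spiked = 0.80e6  # B
area_pre_spiked = 0.72e6   # C

matrix_effect = area_post_spiked / area_neat * 100      # ME%  (100 = none)
recovery = area_pre_spiked / area_post_spiked * 100     # RE%  (extraction)
process_efficiency = area_pre_spiked / area_neat * 100  # PE%

print(round(matrix_effect, 1), round(recovery, 1), round(process_efficiency, 1))
```

An ME% below 100 indicates ion suppression, above 100 ion enhancement; the near-complete suppression described above would correspond to an ME% close to zero.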
IRJET - Different Curing Modes and its Effect on Colour Stability of Univers... — IRJET Journal
This study evaluated the effect of different curing modes of LED light on color stability of composite resin. Composite resin discs were cured using either continuous or intermittent LED curing modes. The discs were then immersed in methylene blue dye and alcohol, and the amount of dye absorption was measured using a spectrophotometer. The results showed that composite resin discs cured with the intermittent mode absorbed less dye compared to those cured continuously, indicating greater color stability with intermittent curing. The authors concluded that curing mode affects the degree of monomer conversion and properties of the composite resin such as color stability, with intermittent curing demonstrating better color stability.
The Effect of Milling Times and Annealing on Synthesis of Strontium Titanate ... — AM Publications
Analysis of the microstructure of the strontium titanate (SrTiO3) phase obtained by milling and annealing of SrCO3 and TiO2 precursors. The material properties of strontium titanate require careful control of crystallite structure as well as microstructure design to meet a specific application. A mixture of strontium carbonate (SrCO3) and titanium oxide (TiO2) powders was used to obtain the SrTiO3 phase using a vibratory ball mill with a ball-to-powder ratio of 10:1, followed by heat treatment. The size of the powder particles was determined with a laser particle size analyzer (PSA). X-ray diffraction methods were used for qualitative and quantitative phase analyses and for determining crystallite size and lattice distortion. Milling the strontium carbonate and titanium oxide mixture decreases the mean particle size and crystallite size of the involved phases. X-ray diffraction investigation of the SrCO3 and TiO2 mixture milled for 60 hours and annealed at 900 °C with a 24 h holding time enabled identification of the SrTiO3 phase. Annealing the particles at 900 °C resulted in a dense compact and promoted the formation of particles containing nanocrystallites. Crystallite growth in the SrTiO3 samples depended on the temperature and duration of annealing.
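Crystallite sizes from XRD line broadening, as referred to above, are commonly estimated with the Scherrer equation D = Kλ/(β cos θ). The peak parameters in this sketch are illustrative, not the study's data.

```python
import math

# Scherrer estimate of crystallite size from XRD peak broadening.
K = 0.9                   # shape factor (dimensionless, assumed)
wavelength_nm = 0.15406   # Cu K-alpha wavelength in nm
beta_deg = 0.40           # FWHM of the peak, degrees 2-theta (illustrative)
two_theta_deg = 32.4      # peak position, degrees 2-theta (illustrative)

beta_rad = math.radians(beta_deg)
theta_rad = math.radians(two_theta_deg / 2.0)
crystallite_size_nm = K * wavelength_nm / (beta_rad * math.cos(theta_rad))
print(round(crystallite_size_nm, 1))
```

Narrower peaks (smaller β) yield larger crystallite estimates, which is why the annealed, crystallite-grown samples show sharper reflections.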
IOSRPHR (www.iosrphr.org) IOSR Journal of Pharmacy — iosrphr_editor
A simple reverse-phase liquid chromatographic method was developed and validated for the simultaneous estimation of lornoxicam and paracetamol in their pharmaceutical dosage forms. The method used a C18 column with a mobile phase of potassium dihydrogen phosphate (pH 7.3) and acetonitrile (70:30), with detection at 257 nm. The method was linear over 20-60 μg/ml for paracetamol and 0.2-1.8 μg/ml for lornoxicam. Retention times were 2.33 minutes for paracetamol and 7.61 minutes for lornoxicam. The method was validated per ICH guidelines and demonstrated good precision, accuracy, and reproducibility.
This document discusses various pre-formulation studies, including analytical methods used to characterize drug substances and formulations. It describes techniques such as microscopy, differential scanning calorimetry (DSC), powder X-ray diffraction (PXRD), and thermogravimetric analysis (TGA) that are used to investigate the physical and chemical properties of drugs and excipients, alone and in combination. Specific applications and procedures for each technique are provided, with examples.
Design of computerized monitoring and processing system for magnetic field c... — IJECEIAES
Black powder is the main difficulty faced by oil flow in pipelines. Its negative effect can go as far as stopping the oil flow by clogging the pipelines, in addition to damaging the crude oil pumps. Many solutions based on chemical or physical processes have been proposed in the literature. Separately, fixed magnetic fields have been applied in the separation and extraction of metal impurities in water pipeline applications. Building on these facts, this paper proposes an alternative solution (idea, design, and methodology for future implementation) for removing black powder from oil pipelines. The proposed system first senses the resistivity of the crude oil as an indication of its status with respect to the quantity of black powder particles, then monitors and controls the level, location, and polarity of the magnetic field required to crack the particles and thereby facilitate the motion of the crude oil in the pipelines. In addition, the proposed solution presents a new design for an electrical resistivity sensor as an important indicator for evaluating the performance of the proposed system.
Part A: To develop an analytical (UV) method for the determination of TOPIRAMATE in bulk and in oral solid dosage form.
Part B: To validate the developed method per ICH guidelines for the following parameters:
COLORIMETRY
Colorimetry is the science and technology used to quantify and physically describe human color perception.
SELECTION OF DRUG
1. Drugs which do not have strong UV absorbance.
2. Drugs for which colorimetric methods are not available.
3. Drugs for which methods are available but are time-consuming and complex.
E.g. dicloxacillin, topiramate, etc.
CONCLUSION
A colorimetric method was developed and validated per ICH guidelines for the estimation of topiramate in tablets.
Color develops through the reaction of the amino group of the drug with ninhydrin reagent in the presence of pyridine.
The method was found to be simple, accurate, precise, and specific.
The proposed method can therefore be used for routine quality control analysis of the bulk drug as well as oral dosage forms.
This document outlines the development and validation of a derivative spectroscopic method for estimating serratiopeptidase and diclofenac sodium in bulk and tablet form. The method involves using UV-visible spectrophotometry and derivative spectroscopy to qualitatively and quantitatively analyze the two drugs. The method was developed using the Shimadzu UV-1700 instrument. Standard solutions of the drugs were used to generate calibration curves and determine linearity, accuracy, and precision of the method as per ICH guidelines. The developed and validated method was found to be simple, rapid, precise, and accurate for routine analysis of serratiopeptidase and diclofenac sodium in tablet dosage forms.
Method development and validation for the estimation of metronidazole in tabl... — pharmaindexing
This document describes the development and validation of two spectrophotometric methods for the estimation of metronidazole in tablet dosage forms. The methods use UV spectroscopy and first-derivative spectroscopy. Metronidazole showed maximum absorbance at 313 nm in methanol:water for UV spectroscopy and a minimum at 298 nm for derivative spectroscopy. Both methods were linear between 4 and 12 μg/ml and were validated according to ICH guidelines. The methods were found to be accurate, precise, and reproducible for the analysis of metronidazole in pure form and in pharmaceutical formulations.
In vitro tests of adhesive and composite dental materials — Silas Toka
The document summarizes a review article on the relevance of in vitro tests of adhesive and composite dental materials. It discusses how laboratory tests are standardized according to ISO protocols to evaluate properties like depth of cure, flexural strength, water sorption and solubility. While laboratory tests provide useful data on material properties, they do not replace clinical studies. Some laboratory recommendations did not prove superior to simpler techniques in clinical trials. Additionally, unexpected clinical problems may arise that were not anticipated by laboratory testing alone, emphasizing the need to augment laboratory studies with long-term clinical evaluations.
The introduction of a research report should include the purpose, scope, and background. The purpose section explains what the research aimed to discover, the type of problem studied, and why. The scope details how the researcher approached the problem and any other methods considered. The background provides necessary context for readers by discussing previous work and assumptions.
The document defines a hypothesis as a conjectural statement or tentative explanation about the relationship between two or more variables that can be tested. Several authors contribute definitions stating that a hypothesis makes a specific, testable prediction and must be falsifiable. Key aspects of a hypothesis include identifying variables, having explanatory power, and being testable, quantifiable, and generalizable. The document also distinguishes between statistical hypotheses about population parameters, null hypotheses being tested, and critical regions for rejecting null hypotheses based on sample data.
The document provides an overview of hypothesis testing, including defining the null and alternative hypotheses, types of errors, significance levels, critical values, test statistics, and conducting hypothesis tests using both the traditional and p-value methods. Examples are provided for z-tests, t-tests, and tests of proportions to demonstrate applications of hypothesis testing methodology.
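A worked one-sample t-test using the traditional critical-value method described above might look like the following. The sample data are made up, and the critical value is taken from a standard t table.

```python
import math

# H0: mu = 50, H1: mu != 50, alpha = 0.05 (two-tailed), made-up sample.
sample = [51.2, 49.8, 50.6, 52.1, 50.9, 51.5, 49.9, 51.0]
n = len(sample)
mean = sum(sample) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))  # sample SD

# Test statistic: t = (xbar - mu0) / (s / sqrt(n))
t_stat = (mean - 50.0) / (sd / math.sqrt(n))

# Two-tailed critical value for df = 7 at alpha = 0.05 (from a t table)
t_crit = 2.365
reject_h0 = abs(t_stat) > t_crit
print(round(t_stat, 2), reject_h0)  # 3.2 True
```

The p-value method reaches the same decision by comparing P(|T| > t_stat) against alpha instead of comparing t_stat against the critical value.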
UV Spectrophotometric Method Development and Validation for Quantitative Esti...Sagar Savale
This document describes the development and validation of a UV spectrophotometric method for the quantitative estimation of paracetamol. Paracetamol was found to exhibit maximum absorption at 244 nm in methanol. The method was validated according to ICH guidelines and showed good linearity (R2 = 0.9999), recovery (99.78-100.54%), precision (<0.06% RSD), ruggedness (<0.02% RSD), and sensitivity (LOD = 0.37 μg/ml, LOQ = 0.98 μg/ml). The developed method is simple, rapid, economical and suitable for the analysis of paracetamol in bulk drug samples.
The document discusses particle characterization techniques, focusing on the Coulter Principle which is well-suited for measuring ink toners in the 400nm to 1.7mm size range. It provides details on how the Coulter Principle works by measuring electrical impedance of individual particles, and its advantages over other techniques for applications in the toner industry where tight control of particle size distribution is important for product quality and performance.
This document provides an overview of sample preparation techniques for X-ray fluorescence (XRF) analysis. It discusses preparation of metal, powder, and liquid samples. For metals, the sample surface must be ground flat to remove impurities and obtain consistent roughness. Powder samples are pressed into pellets after pulverization to reduce heterogeneity effects. Liquid samples can be analyzed directly in sample cells or by drying microdroplets on filter paper to measure lighter elements. Proper sample preparation is crucial for obtaining accurate and reproducible XRF analysis results.
spectrophotometric estimation of metformin in bulk and in its dosage formsaikiranyuvi
The document describes the development and validation of a UV spectrophotometric method for the estimation of metformin in bulk and tablet dosage forms. Key aspects include determining the absorption maximum of 646 nm for metformin and developing a linear calibration curve within the concentration range of 8-16 μg/ml. The method was validated based on parameters such as accuracy, precision, LOD, LOQ and recovery, demonstrating the method is simple, accurate, precise and can be used to analyze metformin in pharmaceutical formulations.
UV spectrophotometric method development and validation for quantitative esti...Sagar Savale
UV Spectrophotometric Method Development and Validation for quantitative estimation of Ondansetron
Hydrochloride (HCL). U.V Spectrophotometric method have been widely employed in determination of
individual components in a mixture or fixed dose combination. Our aim is to develop spectroscopic method for
estimation of the Ondansetron HCL in ternary mixture by using U.V spectrophotometry. The method was
validated as per ICH guidelines. The recovery studies confirmed the accuracy and precision of the method. It was
successfully applied for the analysis of the drug in bulk and could be effectively used for the routine analysis.
A Systematic Approach to Overcome the Matrix Effect during LC-ESI-MS/MS AnalysisBhaswat Chakraborty
This document discusses matrix effects (MEs) that can occur during LC-MS/MS bioanalytical methods and presents a systematic approach to overcome MEs through different sample extraction techniques. It finds that solid phase extraction produces the cleanest samples with the lowest MEs, while protein precipitation using methanol produces the dirtiest samples with the highest MEs. Different phospholipids are identified as contributing to MEs, with longer-retained phospholipids playing a more significant role. Among extraction methods, solid phase extraction is most effective at removing phospholipids and minimizing MEs, while protein precipitation is least effective.
This document summarizes an article that examines how different ionization source designs in LC-ESI-MS/MS systems can influence matrix effects during analysis. The article analyzes acamprosate (ACM) using two different LC-ESI-MS/MS instruments with different ionization source designs (a Z-spray source and an orthogonal spray source) coupled to UPLC/HPLC systems under the same chromatographic conditions. It finds that ACM showed almost complete ion suppression in the Z-spray source coupled to UPLC/HPLC, but only minor ion enhancement in the orthogonal spray source coupled to HPLC. Different phospholipids were responsible for the matrix effects in each case. The study demonstrates how
IRJET - Different Curing Modes and its Effect on Colour Stability of Univers...IRJET Journal
This study evaluated the effect of different curing modes of LED light on color stability of composite resin. Composite resin discs were cured using either continuous or intermittent LED curing modes. The discs were then immersed in methylene blue dye and alcohol, and the amount of dye absorption was measured using a spectrophotometer. The results showed that composite resin discs cured with the intermittent mode absorbed less dye compared to those cured continuously, indicating greater color stability with intermittent curing. The authors concluded that curing mode affects the degree of monomer conversion and properties of the composite resin such as color stability, with intermittent curing demonstrating better color stability.
The Effect of Milling Times and Annealing on Synthesis of Strontium Titanate ...AM Publications
Analysis of microstructure of Strontium titanate (SrTiO3) phase obtained by milling and annealing of
SrCO3 and TiO2 precursors. However, the material properties for strontium titanate require a careful control of
crystallite structure as well as microstructure design to meet a specific application. The mixture of strontium
carbonate (SrCO3) and tintanium oxide (TiO2) powders was used to obtain SrTiO3 phase by using vibrator ball mill
with ball to powder ratio 10:1 and heat treatment processes. The size of powder particles was determined by a laser
particle analyzer (PSA). The X-ray diffraction methods were used for qualitative, quantitative phase analyses and for
crystallite size and lattice distortion determination. The milling process of strontium carbonate and tintanium oxide
mixture causes decrease of the mean particle size and crystallite size of involved phases. The X-ray diffraction
investigations of SrCO3 and TiO2 mixture milled for 60 hours and annealed at 900°C with 24 h of holding time
enabled the identification of SrTiO3 phase. Annealing the sample of the particles at 900 0C has resulted in a dense
compact and promoted the formation of particles containing nanocrystallites. The crystallite-growth samples of
SrTiO3 phase were dependent on temperature and time of their annealing
IOSRPHR(www.iosrphr.org) IOSR Journal of Pharmacyiosrphr_editor
A simple reverse phase liquid chromatographic method was developed and validated for the simultaneous estimation of lornoxicam and paracetamol from their pharmaceutical dosage forms. The method utilized a C18 column with a mobile phase of potassium dihydrogen phosphate (pH 7.3) and acetonitrile (70:30) and detected compounds at 257nm. The method was linear over 20-60μg/ml for paracetamol and 0.2-1.8μg/ml for lornoxicam. Retention times were 2.33 minutes for paracetamol and 7.61 minutes for lornoxicam. The method was validated per ICH guidelines and demonstrated good precision, accuracy, reproducibility
This document discusses various pre-formulation studies including analytical methods used to characterize drug substances and formulations. It describes techniques such as microscopy, differential scanning calorimetry (DSC), powder X-ray diffraction (PXRD), and thermogravimetric analysis (TGA) that are used to investigate the physical and chemical properties of drugs and excipients alone and in combination. Specific application and procedures for each technique are provided with examples.
Design of computerized monitoring and processing system for magnetic field c... - IJECEIAES
Black powder is the main difficulty faced by oil flow in pipelines. At its worst, this powder stops the oil flow by clogging the pipelines, in addition to damaging the crude oil pumps. Many solutions based on chemical or physical processes have been proposed in the literature. On the other side, applying a fixed magnetic field has been presented for the separation and extraction of metal impurities in water-pipeline applications. From these facts, this paper proposes an alternative solution (idea, design, and methodology for future implementation) for removing black powder from oil pipelines. The proposed system first senses the resistivity of the crude oil as an indication of the oil status with respect to the quantity of black powder particles, then monitors and controls the level, location, and polarity of the required magnetic field so as to crack the particles and facilitate the crude oil motion in the pipelines. In addition, the proposed solution presents a new design of electrical resistivity sensor as an important indicator for evaluating the proposed system performance.
Part A: To develop an analytical (UV) method for the determination of TOPIRAMATE in bulk and in oral solid dosage form.
Part B: To validate the developed method as per ICH guidelines for the following parameters:
COLORIMETRY
It is the science and technology used to quantify and physically describe human color perception.
selection of drug
1. Drugs which do not have strong UV absorbance.
2. Drugs for which colorimetric methods are not available.
3. Drugs for which methods are available but are time consuming and complex.
E.g. Dicloxacillin, Topiramate, etc.
conclusion
Colorimetric method was developed and validated as per ICH guidelines for estimation of Topiramate in tablets.
Color is developed by the reaction of the amino group of the drug with ninhydrin reagent in the presence of pyridine.
The method was found to be simple, accurate, precise and specific.
So, the proposed method can be used for the routine quality control analysis of the bulk drug as well as oral dosage forms.
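Quantitation in both the colorimetric and UV methods rests on the Beer-Lambert law: over the linear range, absorbance is proportional to concentration. A minimal sketch of fitting a calibration line to standards and back-calculating an unknown (the concentrations and absorbance values below are hypothetical, not figures from the study):

```python
# least-squares calibration line A = m*c + b, then inverted for an unknown
def fit_line(conc, absorbance):
    n = len(conc)
    mx = sum(conc) / n
    my = sum(absorbance) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(conc, absorbance))
         / sum((x - mx) ** 2 for x in conc))
    b = my - m * mx
    return m, b

# hypothetical standards: concentration (ug/ml) vs measured absorbance
conc = [2, 4, 6, 8, 10]
absorb = [0.11, 0.21, 0.30, 0.41, 0.50]
m, b = fit_line(conc, absorb)

unknown_abs = 0.35
estimate = (unknown_abs - b) / m  # estimated concentration, ug/ml
print(round(estimate, 2))
```

In practice the slope and intercept come with the validation statistics (correlation coefficient, LOD, LOQ) that ICH guidelines require.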
This document outlines the development and validation of a derivative spectroscopic method for estimating serratiopeptidase and diclofenac sodium in bulk and tablet form. The method involves using UV-visible spectrophotometry and derivative spectroscopy to qualitatively and quantitatively analyze the two drugs. The method was developed using the Shimadzu UV-1700 instrument. Standard solutions of the drugs were used to generate calibration curves and determine linearity, accuracy, and precision of the method as per ICH guidelines. The developed and validated method was found to be simple, rapid, precise, and accurate for routine analysis of serratiopeptidase and diclofenac sodium in tablet dosage forms.
Method development and validation for the estimation of metronidazole in tabl... - pharmaindexing
This document describes the development and validation of two spectrophotometric methods for the estimation of metronidazole in tablet dosage forms. The methods utilize UV spectroscopy and first derivative spectroscopy. Metronidazole showed maximum absorbance at 313nm in methanol:water for UV spectroscopy and a minimum at 298nm for derivative spectroscopy. Both methods were linear between 4-12μg/ml and were validated according to ICH guidelines. The methods were found to be accurate, precise and reproducible for the analysis of metronidazole in pure form and pharmaceutical formulations.
In vitro tests of adhesive and composite dental materials - Silas Toka
The document summarizes a review article on the relevance of in vitro tests of adhesive and composite dental materials. It discusses how laboratory tests are standardized according to ISO protocols to evaluate properties like depth of cure, flexural strength, water sorption and solubility. While laboratory tests provide useful data on material properties, they do not replace clinical studies. Some laboratory recommendations did not prove superior to simpler techniques in clinical trials. Additionally, unexpected clinical problems may arise that were not anticipated by laboratory testing alone, emphasizing the need to augment laboratory studies with long-term clinical evaluations.
The introduction of a research report should include the purpose, scope, and background. The purpose section explains what the research aimed to discover, the type of problem studied, and why. The scope details how the researcher approached the problem and any other methods considered. The background provides necessary context for readers by discussing previous work and assumptions.
The document defines a hypothesis as a conjectural statement or tentative explanation about the relationship between two or more variables that can be tested. Several authors contribute definitions stating that a hypothesis makes a specific, testable prediction and must be falsifiable. Key aspects of a hypothesis include identifying variables, having explanatory power, and being testable, quantifiable, and generalizable. The document also distinguishes between statistical hypotheses about population parameters, null hypotheses being tested, and critical regions for rejecting null hypotheses based on sample data.
The document provides an overview of hypothesis testing, including defining the null and alternative hypotheses, types of errors, significance levels, critical values, test statistics, and conducting hypothesis tests using both the traditional and p-value methods. Examples are provided for z-tests, t-tests, and tests of proportions to demonstrate applications of hypothesis testing methodology.
This document provides an overview of a presentation on statistical hypothesis testing using the t-test. It discusses what a t-test is, how to perform a t-test, and provides an example of a t-test comparing spelling test scores of two groups that received different teaching strategies. The document outlines the six steps for conducting statistical hypothesis testing using a t-test: 1) stating the hypotheses, 2) choosing the significance level, 3) determining the critical values, 4) calculating the test statistic, 5) comparing the test statistic to the critical values, and 6) writing a conclusion.
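The six steps above can be made concrete with a pooled two-sample t statistic. The sketch below uses only the standard library; the scores, the equal-variance assumption, and the tabulated critical value (df = 10, two-tailed α = 0.05) are illustrative assumptions, not figures from the presentation:

```python
import math
import statistics

def two_sample_t(a, b):
    """Pooled-variance two-sample t statistic (equal variances assumed)."""
    na, nb = len(a), len(b)
    ma, mb = statistics.mean(a), statistics.mean(b)
    # pooled variance combines the spread of both samples
    sp2 = ((na - 1) * statistics.variance(a)
           + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2  # statistic and degrees of freedom

# hypothetical spelling-test scores for the two teaching strategies
group1 = [78, 85, 82, 90, 88, 84]
group2 = [72, 75, 70, 78, 74, 73]

t, df = two_sample_t(group1, group2)
critical = 2.228  # two-tailed critical value for df=10, alpha=0.05, from a t table
print(f"t = {t:.2f}, df = {df}, reject H0: {abs(t) > critical}")
```

Comparing |t| to the critical value is the traditional method; the p-value method instead compares the tail probability of t to α, and both lead to the same decision.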
The patch clamp technique allows researchers to study single or multiple ion channels in cells. It involves forming a high resistance seal between a glass micropipette and a cell membrane to record electric currents. There are several variations of the technique, including cell-attached, inside-out, whole-cell, and outside-out patches, that provide different ways to manipulate and study ion channels. The patch clamp technique was developed in the 1970s-1980s and was a major breakthrough that enabled recording electric currents from single ion channels for the first time.
The document discusses the similarities between statistical hypothesis testing and judicial decision making. Both involve making dichotomous decisions (e.g. guilty/not guilty, different/not different) where there are four possible outcomes. The default position for both is "not guilty" or failing to reject the null hypothesis. Both processes aim to minimize Type 1 errors (false positives) by establishing standards of evidence required to reject the default.
Analyze Gear Failures and Identify Defects in Gear System for Vehicles Using ... - IOSR Journals
This document summarizes a research paper that analyzes gear failures and identifies defects in gear systems for vehicles using digital image processing. The paper proposes a gear defect recognition system that uses computer vision and local thresholding techniques to identify possible defects in gears. The recognizer processes digital images of gears, applies restoration and thresholding techniques to generate binary images, counts the number of teeth to determine if it matches expected specifications, and can identify defective areas. Experimental results on plastic gear images demonstrate the system's ability to detect defects like differences in tooth counts and surface blemishes. The paper concludes that future work could apply machine learning to make defect detection more robust and accurate over time.
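After binarization, the tooth-counting step reduces to counting foreground runs along the unwrapped gear rim. A simplified sketch of that counting logic (the toy profile and expected tooth count are hypothetical; the paper's full pipeline also includes image restoration and local thresholding):

```python
def count_teeth(rim_profile):
    """Count runs of foreground pixels (1s) along an unwrapped rim profile.
    Each contiguous run corresponds to one tooth in the binary gear image."""
    teeth = 0
    prev = 0
    for px in rim_profile:
        if px == 1 and prev == 0:  # a rising edge starts a new tooth
            teeth += 1
        prev = px
    # the profile is circular: a tooth wrapping across the ends was counted twice
    if rim_profile and rim_profile[0] == 1 and rim_profile[-1] == 1:
        teeth -= 1
    return teeth

profile = [1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0]  # toy 3-tooth profile
expected = 3  # hypothetical specification for this gear
print("defective" if count_teeth(profile) != expected else "ok")
```

A mismatch between the counted and specified number of teeth is one of the defect indications the recognizer reports.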
IRJET- Review on Design and Fabrication of Coating Powder Filtration Machine - IRJET Journal
This document summarizes a research paper on the design and fabrication of a coating powder filtration machine. It discusses how coating powder is currently filtered manually, which is time-consuming and involves labor costs. The proposed filtration machine would automate the sieving process using a cyclone and sieving mechanism to efficiently filter coating powder and allow it to be reused, reducing costs. The literature review covers past research on cyclone efficiency, sieving kinetics, simulation of sieving behavior, and effects of nanoparticles on powder flow properties. There is a need for an automated process to address issues with the current manual filtration method. The conclusion discusses how cyclone efficiency and pressure drop are affected by design parameters based on analysis of previous studies.
A Review on Fabric Defect Detection Techniques - IRJET Journal
The document discusses techniques for automated fabric defect detection. It begins with an introduction to the importance of automated defect detection systems for quality control in the textile industry. It then categorizes fabric defect detection techniques into three groups: statistical, spectral, and model-based approaches. The majority of the document describes various statistical approaches that have been used for defect detection, including methods based on morphological operations, thresholding, fractal dimension, edge detection, co-occurrence matrices, autocorrelation functions, eigenfilters, local linear transforms, histograms, and local binary patterns. Spectral and model-based approaches are also briefly mentioned. The goal of the review is to evaluate and compare different computer vision-based defect detection algorithms.
A practical approach to eliminate defects in gravity die cast al alloy castin... - eSAT Journals
Abstract
This paper deals with the elimination of defects in aluminium alloy castings produced by the gravity die casting process. The main intention of the work is to investigate the defects and improve the quality of a gravity die cast component using computer-aided casting simulation software. In this study an industrial gravity casting die that was producing defective components is used. The die and the components it produces are studied to eliminate the defects using virtual simulations. The defects in the components are identified as solidification shrinkage, cracks, unfilled riser, and incomplete mould cavity. The reasons for the defects are analyzed as either improper selection of process parameters or improper design of the gating and risering system. SOLIDCast simulation software is used to simulate the solidification of the casting and to visualize outputs showing possible problem areas or defects that may occur in the cast product. The work is carried out in two stages. In the first stage, a few test castings are produced by modifying the process parameters (pouring temperature, pouring time, preheat, and alloy type) and the results are compared with simulation results produced using the same parameters. The pouring and simulation results are observed to be in good accordance with each other. In the second stage, a number of virtual casting iterations are performed by changing the riser dimensions. The simulation results show that a riser of 35 mm diameter is required to produce a casting with zero defects. The die is modified accordingly and metal is poured. The castings produced are observed to be sound and free of defects, and it is verified that solidification simulation helps in locating defects, eliminating them, and ultimately improving casting quality without any shop-floor trials.
Keywords: Aluminum-Alloys, Casting Defects, Gravity Die Casting, Material Density and SOLIDCast Simulation
1) The study uses Failure Mode and Effects Analysis (FMEA) to analyze and prioritize causes of broken filament defects in a direct spin drawing yarn production process.
2) FMEA identified the 10 highest risk causes, which were mostly related to detection methods and machine parameter settings.
3) Improvements to detection methods and frequencies reduced the defective rate from 3.35% to 1.76%. It was suggested that the machine parameters be further optimized using design of experiments.
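FMEA prioritizes causes by Risk Priority Number, RPN = severity × occurrence × detection, each scored on a 1-10 scale. A minimal sketch of the ranking step (the cause names and scores below are hypothetical, not the study's actual data):

```python
# hypothetical failure causes with (severity, occurrence, detection) scores, 1-10
causes = {
    "worn godet roller": (7, 5, 6),
    "spin-finish nozzle clog": (8, 4, 7),
    "draw-ratio drift": (6, 6, 3),
}

def rpn(scores):
    """Risk Priority Number: severity * occurrence * detection."""
    s, o, d = scores
    return s * o * d

ranked = sorted(causes.items(), key=lambda kv: rpn(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: RPN = {rpn(scores)}")
```

The highest-RPN causes are then targeted first, as the study did with its ten highest-risk causes.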
Analysis of Wear Rate of Internal Combustion Engine using Ferrography Technique - IRJET Journal
This document analyzes wear rate in an internal combustion engine using ferrography technique. Lubricating oil samples were collected from the oil sump of a single cylinder, four-stroke petrol engine at different mileage intervals. Ferrography analysis was performed to identify wear particle concentration, shape, and size in each sample. Results showed wear particle concentration was highest for the new engine and decreased with mileage, indicating higher wear rate initially that lowered over time. Ferrography allows predictive maintenance by detecting potential component failures from abnormal wear particles before serious damage occurs.
Grease Sampling and Analysis of Offshore Wind Installations in Europe to Impr... - Rich Wurzbach
This document summarizes a study on sampling and analyzing grease from offshore wind turbine installations in Europe. The study aimed to develop reliable grease sampling and analysis methods to assess bearing condition and improve reliability. Researchers tested active grease sampling devices and analyzed samples for properties like ferrous debris, moisture, and consistency. Spatial sampling of two turbine bearings showed heterogeneity in grease properties. Analysis methods provided accurate wear and contamination data for condition monitoring and optimized maintenance. The study demonstrated that grease analysis is an important tool for monitoring wind turbine bearing health.
Leather Quality Estimation Using an Automated Machine Vision System - IOSR Journals
This document describes a proposed machine vision system to automate the inspection and quality estimation of leather materials. Key steps of the proposed methodology include image acquisition, preprocessing, segmentation of defects, computation of defect features like location, area, perimeter, and a histogram analysis to estimate surface smoothness. These quantitative defect features would be compiled into a feature vector to objectively determine the quality of the leather in a standardized way. Related works on automated leather inspection and applications of machine vision in leather manufacturing processes are also reviewed. The proposed system aims to provide repeatable, consistent and time-efficient leather quality assessment compared to manual inspection.
Abstract: In the presented work, it is proposed to design a feature vector of leather material parameters in order to completely define the quality of the leather material. The proposed parameters are holes, cracks, spots, cuts, roughness, etc. The defects are localized according to their position on the leather surface, their size, and their shape. A histogram analysis method is proposed for use with very low-level image features, such as color and luminance, and serves as an image descriptor for color-matching requirements. In the present scenario, leather quality is observed to be highly sensitive to the surface finish of the leather material. Manual inspection of every area of the leather surface under test is not always possible because of the heavy lot of material, and it is time consuming too. The main problem during inspection is how to achieve repeatable quality at regular intervals. Therefore, in order to assess leather quality authentically, a set of features is required that can be given numerical values so that the quality can be judged in a quantified manner. A machine vision system offers a fair solution to this problem. In the proposed work, we deduce some mathematical parameters besides size and location and arrange them in a vector so as to determine the leather quality.
Keywords: GLCM (Gray Level Co-occurrence Matrix)
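The gray-level co-occurrence matrix named in the keywords counts how often each pair of gray levels occurs at a fixed pixel offset; texture features such as contrast and energy are then derived from it. A minimal sketch over a toy 4-level patch (the patch itself is hypothetical):

```python
def glcm(img, dx, dy, levels):
    """Gray-level co-occurrence matrix for the pixel offset (dx, dy)."""
    m = [[0] * levels for _ in range(levels)]
    h, w = len(img), len(img[0])
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                m[img[y][x]][img[ny][nx]] += 1  # count the gray-level pair
    return m

# toy 4-level image patch (hypothetical)
patch = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [2, 2, 3, 3],
         [2, 2, 3, 3]]
g = glcm(patch, 1, 0, 4)  # horizontal right-neighbor offset
print(g)
```

Libraries such as scikit-image provide the same computation (with normalization and symmetry options) ready-made.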
IRJET- Real Time Vision System for Thread Counting in Woven Fabric - IRJET Journal
This document presents a real-time vision system for automatically counting threads in woven fabrics. It begins with an introduction to woven fabrics and the traditional manual method of counting threads, which is time-consuming and prone to errors. It then describes a proposed automated system using image processing techniques like blob detection and feature matching to track fabric motion and recognize warp and weft counts in real-time with high accuracy. The system is tested on denim fabric and is able to accurately count the number of warp and weft threads in the sample image. The automated approach provides an improvement over manual counting by reducing labor costs and eliminating human errors.
IRJET- Development and Analysis of Frequency Response Setup for Pole Shoe Ferr... - IRJET Journal
This document discusses the development and analysis of a frequency response setup for non-destructively testing pole shoe components made of carbon steel. The setup uses vibration analysis to detect internal defects in pole shoes. Vibrations are induced using a DC motor and spring assembly, and responses are measured using a vibrometer to obtain frequency, displacement, and acceleration readings. The results are aimed to help identify defective pole shoes early to prevent wasted time and materials from being used in motors. Non-destructive vibration-based testing provides an economical and reliable alternative to other NDT methods for these components.
REAL-TIME MOUTH DEFECTS DETECTION ON MILITARY CARTRIDGE CASES - csandit
A military cartridge includes four elements: case, capsule, ammunition, and powder. During manufacturing, defects may occur in the case; these defects should be detected and the defective cases separated. Defects can occur in the mouth, surface, and primer parts of the case. This paper proposes a methodology for real-time inspection of defects in the mouth part of the cases using image processing techniques. The algorithms of the proposed methodology were implemented on real images, and the results showed that common defects such as split and dent defects occurring on the mouth part of the case can be detected with high accuracy.
A Novel System to Monitor Illegal Sand Mining using Contour Mapping and Color... - CSCJournals
Developing nations face the issue of illegal and excessive sand mining, which has adverse effects on the environment. A robust and cost-effective system is presented in this paper to monitor the mining process. The system includes a novel vehicle detection approach for detecting vehicles in static images and calculating the amount of sand being carried, to prevent the malpractice of sand smuggling. Unlike traditional methods, which use machine learning to detect vehicles, this method introduces a new contour mapping model to find important "vehicle edges" for identifying vehicles. The sand detection algorithm uses color-based segmentation, since sand can have various colors under different weather and lighting conditions. The proposed color segmentation model has excellent capability to identify sand pixels against the background, even when the pixels are lit under varying illumination. The detected amount of sand is checked against the maximum threshold value specific to the recognized vehicle. Experimental results show that the integration of Hough features and color-based image segmentation is powerful. The average accuracy rate of the system is 94.9%.
This document presents an experimental study on the parameters affecting the tribological performance of a nano lubricant containing multi-walled carbon nanotubes (MWCNT), using design of experiments (DOE). Four factors were studied: MWCNT quantity, surfactant quantity, load, and speed. Experiments were conducted using a block-on-disk test setup to measure wear. The results showed that speed and MWCNT quantity had the greatest effect on wear, followed by surfactant quantity and load. The interactions between load and surfactant quantity and between load and MWCNT quantity were also significant. The study concluded that adding 0.05% MWCNT to the lubricant significantly reduced wear under different load and speed conditions.
Survey on Different Methods for Defect Detection - IRJET Journal
This document discusses various methods for defect detection in images and products. It begins with an introduction to digital image processing and its applications such as enhancement, restoration, and segmentation. Defect detection is important for quality control in manufacturing. Traditionally, human inspection was used but it has disadvantages. The document then surveys statistical approaches like autocorrelation functions and co-occurrence matrices. Spectral approaches including Fourier transforms, wavelet transforms, and Gabor filters are also covered. Model-based approaches using autoregressive models are summarized as well. The advantages and disadvantages of each method are compared. In conclusion, the document states that combining approaches may provide better results than individual methods for defect detection.
Analysis of Image Fusion Techniques for fingerprint Palmprint Multimodal Biom... - IJERA Editor
Multimodal biometric systems using multiple sources of information have been widely recognized. However, computational models for multimodal biometric recognition have only recently received attention. In this paper, fingerprint and palmprint images are chosen and fused using image fusion methods. The biometric features are subjected to modality extraction. Different fusion methods, such as average fusion, minimum fusion, maximum fusion, discrete wavelet transform fusion, and stationary wavelet transform fusion, are implemented for fusing the extracted modalities. The best fused template is identified by applying various fusion metrics. Here the DWT-fused image provided the better results.
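The simplest of the fusion rules compared above operate pixel-wise on the two registered images. A minimal sketch of average, minimum, and maximum fusion (the two 2x2 "patches" are hypothetical toy data; the wavelet-based rules additionally fuse per subband after a DWT/SWT decomposition):

```python
def fuse(img_a, img_b, mode="average"):
    """Pixel-wise fusion of two equally sized grayscale images."""
    ops = {
        "average": lambda a, b: (a + b) // 2,
        "minimum": min,
        "maximum": max,
    }
    op = ops[mode]
    return [[op(a, b) for a, b in zip(ra, rb)] for ra, rb in zip(img_a, img_b)]

finger = [[100, 150], [200, 50]]   # toy fingerprint patch (hypothetical)
palm   = [[120, 130], [180, 90]]   # toy palmprint patch (hypothetical)

avg = fuse(finger, palm, "average")
print(avg)
```

Fusion metrics (entropy, standard deviation, PSNR and the like) are then computed on each fused template to pick the best rule, as the paper does.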
Recognition of Surgically Altered Face Images - IRJET Journal
This document discusses a multiple granular algorithm to match face images before and after plastic surgery. It extracts 40 face granules from images at different levels of granularity. Features are extracted from granules using SIFT and EUCLBP descriptors. A genetic algorithm is used to select features and optimize weights for each granule. The algorithm combines information from multiple granules to address nonlinear variations caused by plastic surgery. Experiments show it achieves higher accuracy than existing algorithms in recognizing surgically altered faces.
Analysis and optimization of sand casting defects with the help of artificial... - eSAT Journals
Abstract: Casting defects are the chief limitation of any casting process, and the sand casting process suffers from the same problem. Finding the optimum conditions for minimizing casting defects is critical. The usual approach in most companies is trial and error, but its limitations (error-prone results, expense, and time consumption) make it too costly. In this paper, an attempt is made to minimize casting defects by optimizing the process parameters of sand casting using an Artificial Neural Network (ANN). The MATLAB toolbox is used to run the different parameter values. The parameters were selected on the basis of a survey of different industries and a rigorous review of previous papers on this topic. A MATLAB program was first prepared to generate values of the different parameters from their highest and lowest values, which were collected from a local casting industry. In the first attempt to optimize the sand casting process parameters, it was found that considering both inputs and outputs randomly, only within their limits, gives satisfactory results only up to a point, and the results change as the parameters are changed. For specific conditions the casting defect result was found to be 3.175. Later, a new program was generated based on the relation between sand casting parameters and sand casting defects. Three specific casting defects were considered: expansion defect, gas defect, and weak sand defect. Their results were found to be expansion defect 6.23%, gas defect 7.28%, and weak sand defect 5.74%.
Keywords: Sand casting, Artificial Neural Network (ANN), MATLAB, Casting Defect
This document summarizes the development of an automated drapability tester that quantifies the draping behavior of reinforcement fabrics. The tester combines force measurement with optical analysis to detect defects like gaps, loops, and wrinkles during forming. It uses cameras and laser scanning to capture these defects, allowing drapability effects to be quantified. Test results on non-crimp fabrics and woven fabrics show how the tester can measure forces, gap widths, fiber misalignment, and sample deformation at different forming levels. The automated tester provides detailed drapability data to support composite part and process design.
Similar to: Statistical Hypothesis Testing of the Increase in Wear Debris Size Parameters and the Deterioration of Oil
This document discusses the impact of data mining on business intelligence. It begins by defining business intelligence as using new technologies to quickly respond to changes in the business environment. Data mining is an important part of the business intelligence lifecycle, which includes determining requirements, collecting and analyzing data, generating reports, and measuring performance. Data mining allows businesses to access real-time, accurate data from multiple sources to improve decision making. Using business intelligence and data mining techniques can help businesses become more efficient and make better decisions to increase profits and customer satisfaction. The expected results of applying business intelligence include improved decision making through accurate, timely information to support organizational goals and strategic plans.
This document presents a novel technique for solving the transcendental equations of selective harmonics elimination pulse width modulation (SHEPWM) inverters based on the secant method. The proposed algorithm uses the secant method to simplify the numerical solution of the nonlinear equations and solve them faster compared to other methods. Simulation results validate that the proposed method accurately estimates the switching angles to eliminate specific harmonics from the output voltage waveform and achieves near sinusoidal output current for various modulation indices and numbers of harmonics eliminated.
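The secant method at the heart of that algorithm iterates x-intercepts of secant lines through the last two estimates, needing no derivative. A minimal sketch on a toy single-angle switching condition (the condition cos(a) = 0.8 is a hypothetical stand-in; real SHEPWM systems are coupled sets of such trigonometric equations, one per harmonic to eliminate):

```python
import math

def secant(f, x0, x1, tol=1e-10, max_iter=50):
    """Secant-method root finder: successive secant-line x-intercepts."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if abs(f1 - f0) < 1e-15:
            break  # secant line is flat; stop to avoid division by zero
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# toy switching-angle condition (hypothetical): cos(a) = 0.8
root = secant(lambda a: math.cos(a) - 0.8, 0.1, 1.0)
print(round(math.degrees(root), 2))  # switching angle in degrees
```

Superlinear convergence without Jacobian evaluations is what makes the method attractive for solving the transcendental SHEPWM equations quickly.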
This document summarizes a research paper that designed and implemented a dual tone multi-frequency (DTMF) based GSM-controlled car security system. The system uses a DTMF decoder and GSM module to allow a car to be remotely controlled and secured from a mobile phone. It works by sending DTMF tones from the phone through calls to the GSM module in the car. The decoder interprets the tones and a microcontroller executes commands to disable the ignition or control other devices. The system was created to improve car security and accessibility through remote monitoring and control with DTMF and GSM technology.
This document presents an algorithm for imperceptibly embedding a DNA-encoded watermark into a color image for authentication purposes. It applies a multi-resolution discrete wavelet transform to decompose the image. The watermark, encoded into DNA nucleotides, is then embedded into the third-level wavelet coefficients through a quantization process. Specifically, the watermark nucleotides are complemented and used to quantize coefficients in the middle frequency band, modifying the coefficients. The watermarked image is reconstructed through inverse wavelet transform. Extraction reverses these steps to recover the watermark without the original image. The algorithm aims to balance imperceptibility and robustness through this wavelet-based, blind watermarking scheme.
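The DNA encoding step maps the watermark bitstream onto nucleotide symbols two bits at a time, and the embedding uses the complemented strand. A minimal sketch (the specific 2-bit-to-nucleotide table below is an assumed convention; the paper's exact coding rule is not given here, and the wavelet quantization step is omitted):

```python
# assumed 2-bit -> nucleotide mapping (one common convention, not the paper's)
NUC = {"00": "A", "01": "C", "10": "G", "11": "T"}
# Watson-Crick base pairing
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def dna_encode(byte_seq):
    """Encode bytes as a nucleotide string, two bits per symbol."""
    bits = "".join(f"{b:08b}" for b in byte_seq)
    return "".join(NUC[bits[i:i + 2]] for i in range(0, len(bits), 2))

watermark = dna_encode(b"OK")                              # toy watermark payload
complemented = "".join(COMPLEMENT[n] for n in watermark)   # strand used for embedding
print(watermark, complemented)
```

In the full scheme, the complemented symbols drive the quantization of mid-band third-level wavelet coefficients, and extraction inverts the mapping to recover the payload blindly.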
1) The document analyzes the dynamic saturation point of a deep-water channel in Shanghai port based on actual traffic data and a ship domain model.
2) A dynamic channel transit capacity model is established that considers factors like channel width, ship density, speed, and reductions due to traffic conditions.
3) Based on AIS data from the channel, the average traffic flow is calculated to be 15.7 ships per hour, resulting in a dynamic saturation of 32.5%, or 43.3% accounting for uneven day/night traffic volumes.
The document summarizes research on the use of earth air tunnels and wind towers as passive solar techniques. Key findings include:
- Earth air tunnels circulate air through underground pipes to take advantage of the stable temperature 4 meters below ground for cooling in summer and heating in winter. Testing showed the technique can reduce ambient temperatures by up to 14 degrees Celsius.
- Wind towers circulate air through tall shafts to cool air entering buildings at night and provide downward airflow of cooled air during the day.
- Experimental testing of an earth air tunnel system over multiple months found maximum temperature reductions of 33% in spring and minimum reductions of 15% in summer.
The document compares the mechanical and physical properties of low density polyethylene (LDPE) thin films and sheets reinforced with graphene nanoparticles. LDPE/graphene thin films were produced via solution casting, while sheets were made by compression molding. Testing showed that the thin films had enhanced tensile strength, lower melt flow index, and higher thermal stability compared to sheets. The tensile strength of thin films increased by up to 160% with 1% graphene, while sheets increased by 70%. Melt flow index decreased more for thin films, indicating higher viscosity. Thin films also showed greater improvement in glass transition temperature. These results demonstrate that processing technique affects the properties of LDPE/graphene nanocomposites.
The document describes improvements made to a friction testing machine. A stepper motor and PLC control system were added to automatically vary the load on friction pairs, replacing the manual method. Tests using the improved machine found that the friction coefficient decreases as the load increases, and that abrasive and adhesive wear increased with higher loads. The improved machine allows more accurate and convenient testing of friction pairs under varying load conditions.
This document summarizes a research article that investigates the steady, two-dimensional Falkner-Skan boundary layer flow over a stationary wedge with momentum and thermal slip boundary conditions. The flow considers a temperature-dependent thermal conductivity in the presence of a porous medium and viscous dissipation. Governing partial differential equations are non-dimensionalized and transformed into ordinary differential equations using similarity transformations. The equations are highly nonlinear and cannot be solved analytically, so a numerical solver is used. Numerical results are presented for the skin friction coefficient, local Nusselt number, velocity and temperature profiles for varying parameters like the Falkner-Skan parameter and Eckert number.
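For reference, the classical Falkner-Skan similarity equation in its standard textbook form (without the paper's slip, porous-medium, and variable-conductivity extensions, which modify the equation and boundary conditions) is:

```latex
f''' + f\, f'' + \beta\left(1 - (f')^{2}\right) = 0,
\qquad f(0) = f'(0) = 0, \quad f'(\eta) \to 1 \ \text{as}\ \eta \to \infty
```

Here f is the dimensionless stream function, the primes denote differentiation with respect to the similarity variable, and beta is the wedge (pressure-gradient) parameter; the momentum slip condition studied in the paper replaces the no-slip condition f'(0) = 0.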
An improvised white board compass was designed and developed to enhance the teaching of geometrical construction concepts in basic technology courses. The compass allows teachers to visually demonstrate geometric concepts and constructions on a white board in an engaging, hands-on manner. It supports constructivist learning principles by enabling students to observe and emulate the teacher. The design process utilized design and development research methodology to test educational theories and validate the practical application of the compass. The improvised compass was found to effectively engage students and improve their performance in learning geometric constructions.
The document describes the design of an energy meter that calculates energy using a one second logic for improved accuracy. The meter samples voltage and current values using an ADC synchronized to the line frequency via PLL. It calculates active and reactive power by averaging the sampled values over each second. The accumulated active power for each second is multiplied by one second to calculate energy, which is accumulated and converted to kWh. Test results showed the meter achieved an error of 0.3%, within the acceptable limit for class 1 meters. Considering energy over longer durations like one second helps reduce percentage error in the calculation.
This document presents a two-stage method for solving fuzzy transportation problems where the costs, supplies, and demands are represented by symmetric trapezoidal fuzzy numbers. In the first stage, the problem is solved to satisfy minimum demand requirements. Remaining supplies are then distributed in the second stage to further minimize costs. A numerical example demonstrates using robust ranking techniques to convert the fuzzy problem into a crisp one, which is then solved using a zero suffix method. The total optimal costs from both stages provide the solution to the original fuzzy transportation problem.
1) The document proposes using an Adaptive Neuro-Fuzzy Inference System (ANFIS) controller for a Distributed Power Flow Controller (DPFC) to improve voltage regulation and power quality in a transmission system.
2) A DPFC is placed at a load bus in an IEEE 4 bus system and its performance is compared using a PI controller and ANFIS controller.
3) Simulation results show the ANFIS controller provides faster convergence and better voltage profile maintenance during voltage sags and swells compared to the PI controller.
The document describes an improved particle swarm optimization algorithm to solve vehicle routing problems. It introduces concepts of leptons and hadrons to particles in the algorithm. Leptons interact weakly based on individual and neighborhood best positions, while hadrons (local best particles) undergo strong interactions by colliding with the global best particle. When stagnation occurs, particle decay is used to increase diversity. Simulations show the improved algorithm avoids premature convergence and finds better solutions compared to the basic particle swarm optimization.
This document presents a method for analyzing photoplethysmographic (PPG) signals using correlative analysis. The method involves calculating the autocorrelation function of the PPG signal, extracting the envelope of the autocorrelation function using a low pass filter, and approximating the envelope by determining attenuation coefficients. Ten PPG signals were collected from volunteers and analyzed using this method. The attenuation coefficients were found to have similar values around 0.46, providing a potentially useful parameter for medical diagnosis.
This document describes the simulation and design of a process to recover monoethylene glycol (MEG) from effluent waste streams of a petrochemical company in Iran. Aspen Plus simulation software was used to model the process, which involves separating water, salts, and various glycols (MEG, DEG, TEG, TTEG) using a series of distillation columns. Sensitivity analyses were performed to optimize column parameters such as pressure, reflux ratio, and boilup ratio. The results showed that MEG, DEG, TEG, and TTEG could be recovered at rates of 5.01, 2.039, 0.062, and 0.089 kg/hr, respectively.
This document presents a numerical analysis of fluid flow and heat transfer characteristics of ventilated disc brake rotors using computational fluid dynamics (CFD). Two types of rotor configurations are considered: circular pillared (CP) and diamond pillared radial vane (DP). A 20° sector of each rotor is modeled and meshed. Governing equations for mass, momentum, and energy are solved using ANSYS CFX. Boundary conditions include 900K and 1500K isothermal rotor walls for different speeds. Results show the DP rotor has 70% higher mass flow and 24% higher heat dissipation than the CP rotor. Velocity and pressure distributions are more uniform for the DP rotor at higher speeds, ensuring more uniform cooling. The
This document describes the design and testing of an automated cocoa drying house prototype in Trinidad and Tobago. The prototype included automated features like a retractable roof, automatic heaters, and remote control. It aims to address issues with the traditional manual sun drying process, which is time-consuming and relies on human monitoring of changing weather conditions. Initial testing with farmers showed interest in the automated system as a potential solution.
This document presents the design of a telemedical system for remote monitoring of cardiac insufficiency. The system includes an electrocardiography (ECG) device that collects and digitizes ECG signals. The ECG signals undergo digital signal processing including autocorrelation analysis. Graphical interfaces allow patients and doctors to view ECG data and attenuation coefficients derived from autocorrelation analysis. Data is transmitted between parties using TCP/IP protocol. The system aims to facilitate remote monitoring of cardiac patients to reduce hospitalizations through early detection of health changes.
The document summarizes a polygon oscillating piston engine invention. The engine uses multiple pistons arranged around the sides of a polygon within cylinders. As the pistons oscillate, they compress and combust air-fuel mixtures to produce power. This design achieves a very high power-to-weight ratio of up to 2 hp per pound. Engineering analysis and design of a prototype 6-sided engine is presented, showing it can produce 168 hp from a 353 cubic feet per minute air flow at 12,960 rpm. The invention overcomes issues with prior oscillating piston designs by keeping the pistons moving in straight lines within cylinders using conventional piston rings.
International Journal of Engineering Inventions
e-ISSN: 2278-7461, p-ISSN: 2319-6491
Volume 2, Issue 8 (May 2013) PP: 01-08
www.ijeijournal.com Page | 1
Statistical Hypothesis Testing of the Increase in Wear Debris Size Parameters and the Deterioration of Oil
Manoj Kumar¹, P. S. Mukherjee², N. M. Misra³
¹Mechanical Engineering Department, B.I.T. Sindri, Sindri Institute, Dhanbad-828123, Jharkhand, India
²Department of Mechanical Engineering and Mining Machinery, Indian School of Mines, Dhanbad-826004, Jharkhand, India
³Department of Applied Chemistry, Indian School of Mines, Dhanbad-826004, Jharkhand, India
Abstract: The effectiveness of a lubricant diminishes with use, and this in turn affects the condition of the surface it lubricates. Hence the characteristics of the wear particles generated from that surface may change with the condition of the lubricant. This work investigates the morphological changes of wear particles with oil degradation and may help establish a correlation between the two, i.e. between the age of the oil and the morphology of the wear debris.
Wear particles from two gear oil samples taken at a substantial operating time interval were filtered, and their images were captured using a Scanning Electron Microscope (SEM). These images were binarized, and size parameters of the binary images were extracted by blob analysis using image analysis software. The increase in these size parameters with oil ageing was investigated by statistical hypothesis testing at a 5% significance level.
A significant increase in a few of the size parameters of the wear debris was observed with ageing of the oil.
Keywords: electron microscopy, ferrography, gear oil, mining, significance level
I. Introduction
Early and reliable diagnosis, prior to machinery failure, is one of the key requirements of any maintenance system. Methodologies like vibration and acoustic monitoring, thermal and visual inspection, and wear debris analysis are currently used by maintenance personnel for this purpose [1]. Wear debris analysis is a component of oil analysis in which the wear debris carried by the lube oil is trapped and analyzed for chemical composition, colour, concentration, size distribution and morphology. The deterioration of machine components and their unexpected failure can be monitored and avoided by morphological analysis of wear particles, as their morphological features are directly related to the mode and mechanism of wear existing in the component [2]. Visual examination of wear debris has been used as a cost-effective machinery diagnostic method [3]. Fig.1 shows optical debris monitoring in the hierarchy of machinery failure prevention technology.
[Figure 1 is a hierarchy chart:]
Machinery Failure Prevention Technology
  - Run-to-Failure Maintenance
  - Preventive Maintenance
  - Condition-Based Maintenance (CBM)
      - Vibration Analysis
      - Performance Monitoring
      - Oil Analysis
          - Lubricant Condition Tests
          - Wear Debris Analysis
              - Spectrometric Oil Analysis (SOA)
              - Optical Debris Monitoring (Ferrography, Filter Analysis)
              - Chip Detectors
Figure1: Optical debris monitoring in machinery failure prevention technology [5]
The dependency on human expertise for analysis and interpretation is the biggest hurdle preventing wear debris analysis from being exploited by industry to its full potential and becoming one of the most powerful machine condition monitoring strategies. It makes the interpretation and results subjective, costly and time consuming. The remedy is to develop an automatic and reliable wear particle classification standard [4]. To this end, imaging techniques have been used to quantify the morphology of wear debris with numerical parameters. Two-dimensional binary images of wear particles can indicate the specific wear condition under which they were generated [5]. This study uses binary images of wear debris separated from gear oil to extract some of the size parameters and performs statistical hypothesis testing to investigate their variation with the ageing of the oil.
1.1. Previous Work
Since the advent of ferrography in the 1970s, attempts have been made to use computer image analysis to extract the morphological features of wear debris, both to develop reliable, automatic wear debris classification systems and to study the distribution of these morphological parameters. Roylance and Pocock [6] applied the Weibull distribution function to the size distribution of wear particles for the study of wear condition. Kirk et al [7] discussed different numerical parameters to describe the morphology of individual wear particles; the computer images of the particles were analyzed using software developed for that study. Ahn et al [8] discussed statistical analysis based on the Weibull distribution function of the skewness and mean particle size distribution of wear debris: skewness gives the trend in wear debris generation, while mean size represents the severity of wear. Peng and Kirk [9, 10] and Peng [11] used computer image analysis to extract different morphological parameters of wear debris and then applied artificial intelligence tools to obtain an objective, reliable and automatic wear debris classification system. Cho and Tichy [12] performed a more comprehensive quantitative analysis of wear debris: morphology is quantified with numerical parameters, and quantitative correlation is then performed using multivariate statistical techniques to demonstrate how specific statistical data analysis can be used to identify morphological groups of wear debris. Cho and Tichy [5] studied the feasibility of observing two-dimensional binary images of wear debris for detecting changes in wear conditions; analysis of variance was applied to determine which morphological parameters are significantly affected by differences in wear conditions. Laghari et al [13] describe a knowledge-based system to classify wear particles according to their morphological attributes of size, shape, edge details, thickness ratio, colour and texture. Khan et al [14] describe an online debris shape analysis technique that uses imaging technology and rule-based algorithms to perform near real-time debris analysis diagnostics.
1.2. Problem Definition
Using analysis of variance, Cho and Tichy [5] statistically investigated the influence of different wear conditions on two-dimensional debris morphology. Wear conditions were varied by changing loading conditions, material combinations, contact geometry, surface roughness and the oils used. They found that among the size, shape and curvature parameters, size parameters were significantly affected, shape parameters were moderately affected and curvature parameters were least affected by differences in wear conditions.
During use, a lubricant degrades and many of its physical and chemical properties change. These changes must affect the wear conditions, and hence a variation in wear debris morphology is expected. The available literature on wear debris analysis focuses on determining the phase, mode and mechanism of wear to predict the condition of machines; no work has been found that studies the change in morphological parameters of wear particles with the ageing of the lubricating oil. Since, among the various two-dimensional morphological parameters, the size parameters are most affected by changing wear conditions, this paper investigates the effect of oil ageing on some of the size parameters of wear particles.
II. Methodology
Wear particles were filtered from the sample oil using a vacuum arrangement, and their images were captured using electron microscopy. Image analysis software was used to process and analyze the images, and different size parameters were extracted using blob analysis. When working with bright objects, a blob is a group of touching nonzero pixels; any pixel with zero value is considered part of the background. The size parameters used in this study were:
- Area: calculated by counting the number of pixels in the given blob, in µm².
- Perimeter: the total length of the edges of the blob, in µm, with an allowance made for the staircase effect.
- Major length and minor length: determined using Feret's diameter, as described later in section 3.3.
- Convex Perimeter: an approximation of the perimeter of the convex hull of the blob, derived from several Feret's diameters.
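As a rough illustration of how such size parameters can be computed from a binary image: the paper used Matrox Inspector's blob analysis, so the NumPy function below is an independent approximation, not the software's actual routine. It sketches the Feret-diameter search (the major length is the largest projected extent over all orientations, the minor length is the extent perpendicular to it); the perimeter with its staircase-effect allowance and the convex perimeter are omitted.

```python
import numpy as np

def blob_size_parameters(mask, um_per_px=1.0):
    """Estimate size parameters of a single binary blob.

    mask: 2-D boolean array, True for particle (nonzero) pixels.
    Returns (area, major_length, minor_length), with lengths found by a
    Feret-diameter search, mirroring the rotate-then-bounding-box
    procedure described in the text.
    """
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    area = mask.sum() * um_per_px ** 2  # pixel count scaled to µm²

    # Scan orientations; the largest projected extent is the maximum
    # Feret diameter, taken here as the major length.
    best_major, best_theta = -1.0, 0.0
    for theta in np.linspace(0.0, np.pi, 180, endpoint=False):
        direction = np.array([np.cos(theta), np.sin(theta)])
        proj = pts @ direction
        extent = proj.max() - proj.min() + 1.0  # +1 for pixel width
        if extent > best_major:
            best_major, best_theta = extent, theta

    # Minor length: extent perpendicular to the major-length direction,
    # i.e. the height of the bounding box once the blob is rotated
    # so that its maximum axis is horizontal.
    perp = np.array([np.cos(best_theta + np.pi / 2),
                     np.sin(best_theta + np.pi / 2)])
    proj = pts @ perp
    minor = proj.max() - proj.min() + 1.0

    return area, best_major * um_per_px, minor * um_per_px
```

For a horizontal streak of five pixels, for instance, this returns an area of 5, a major length of 5 and a minor length of 1 (in pixel units).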
Hypothesis testing was used to verify our assumption about a population parameter. Hypothesis testing makes inferences about a population from only a small sample. First an assumption is made about the population parameter, called the null hypothesis, H0. This hypothesis is then tested using the difference between the sample statistic and the hypothesized population parameter. How large a difference is acceptable is the decision maker's choice, based on the risk he is willing to assume of rejecting a null hypothesis when it is true. This is quantified by the significance level, which sets the limit at which the difference between the sample statistic and the hypothesized population parameter becomes large enough to reject the hypothesized value [15]. For our study, a 5% significance level was chosen based on the available literature on wear debris analysis [5].
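The procedure described here can be written out as a large-sample one-tailed z-test for an increase in the population mean. The function below is an illustrative reconstruction, not the paper's own computation:

```python
import math

def one_tailed_z_test(mean1, s1, n1, mean2, s2, n2, alpha=0.05):
    """Large-sample one-tailed test of H0: mu2 <= mu1 against H1: mu2 > mu1.

    With roughly 30+ observations per sample, the central limit theorem
    justifies the normal approximation to the sampling distribution.
    Returns the z statistic and whether H0 is rejected at the 5% level.
    """
    if alpha != 0.05:
        raise ValueError("critical value below is tabulated for alpha = 0.05")
    se = math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)  # std. error of mean difference
    z = (mean2 - mean1) / se
    z_crit = 1.645  # upper 5% point of the standard normal distribution
    return z, z > z_crit
```

Plugging in the area statistics from Table1 (Sample1: mean 672.862, s = 1129.791, n = 33; Sample2: mean 1240.647, s = 1585.635, n = 34) gives z ≈ 1.69, marginally above the 1.645 cutoff, which is consistent with the significant increase in a few size parameters reported in the abstract; readers should confirm against the paper's own calculation.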
III. Experimental Procedure
3.1. Sample Collection and Debris Separation
Gear oil samples were collected from the differential assembly of a dumper used for open cast coal mining. The first sample was taken at 200 hours of running after drain-off and recharge (called Sample1), and the second was the drained-off oil at 2000 hours of running (called Sample2). The dumper selected was of 100 ton capacity, Caterpillar make, and the oil used in it was of HTF C4 SAE60 type, MAK make. To ensure that the sample was drawn from the mid layer of the reservoir, a vacuum pump with a disposable plastic tube was used, and the samples were kept in plastic bottles with proper labels to identify them. The vacuum pump and storage bottles were rinsed with solvent and flushed with fresh oil to avoid contamination. The oil was filtered following the method described by Hunt [16]: 15 ml of sample was filtered without dilution through an Axiva nylon filter of 0.2 µm pore size on a vacuum arrangement. The solvent was gently allowed to pass through the filter after switching off the vacuum pump. The vacuum pump was then run for around 20 minutes to pass air through the filter paper to dry it, followed by drying in an oven at 120 °C for approximately 24 hours.
3.2. Image Acquisition
A portion of around 12 mm × 12 mm was cut from this filter and placed on a stub with double-sided adhesive carbon tape. The sample was gold sputtered at 5-10 Pa pressure and 10-15 mA current in a Hitachi E1010 Ion Sputter. This sample was placed in an SEM (Hitachi 3400N) with chamber pressure less than 1 Pa to capture the images of the wear debris. An image at lower magnification, X40-X60 (Fig. 2), gives an overall idea of the particle distribution in the oil. Our aim was to obtain random images of individual particles while also ensuring that no particle was repeated. We therefore started imaging from a particle in one corner, say the top left. After many trials, the magnification was fixed at X600 for image acquisition, as at this magnification images of most of the individual particles of significant size could be obtained. After capturing the initial image at X600, we moved frame by frame with the direction keys, only in the horizontal direction with the vertical coordinate fixed, until a new particle appeared in the frame. Its image was captured, and we then moved further right, repeating the process until the other end of the sample was reached. We next moved vertically downwards with the direction keys until all of the area of the previous frame had disappeared from the new frame, and then moved left horizontally, capturing the images appearing in the frame. The process was repeated until images of around thirty particles were captured. Thirty was chosen to ensure that the sample size was sufficient to apply the central limit theorem and use the normal distribution as an approximation to the sampling distribution without any knowledge of the actual distribution of the population [15]. The process was repeated for Sample2. Fig.3 and Fig.4 are two such images from Sample1 and Sample2 respectively.
Figure2: SEM image at X40 magnification of debris filtered from gear oil Sample2
Figure3: SEM image of individual particle at X600 magnification filtered from Sample1
3.3. Image analysis
The image analysis was carried out using Matrox Inspector, Version 8.0. The main process steps performed on
the image to extract different size parameters are shown in Fig. 5.
After loading an image, it was preprocessed with the brightness control, contrast control, background flattening and edge sharpening tools to improve its quality. The image was then calibrated to convert the units from the pixel world to the real world. It was cropped by selecting a rectangular region of interest around the particle and removing the unnecessary portion of the image; as cropping may change the image size, the image was recalibrated. The image was binarized by thresholding to obtain a white object on a dark background. Dark spots left inside the image of the object and bright noise in the background were rectified by Blob Reconstruct operations. Major length and minor length were determined using Feret's diameter, which is the maximum distance between two parallel lines that just touch the shape in the position it takes [16]. The angle of the maximum axis of the debris was found in the first blob analysis step, and the image was rotated by that angle so that the maximum axis became horizontal. The major and minor lengths are then the width and the height of the rectangular box which just touches the debris [5]. Figure 6 shows some of the rotated binary images of particles from Sample 1 and Sample 2. The size parameters (area, perimeter, convex perimeter, major length and minor length) were derived in tabular form in the second blob analysis step. By setting the minimum and maximum area options, the measurements of other bright noise blobs present were discarded. The data were then transferred to an Excel sheet for further calculations and analysis.

Figure 4: SEM image of an individual particle at X600 magnification, filtered from Sample 2

Figure 5: image processing steps performed: Loading the image → Preprocessing → Image calibration → Cropping the image → Recalibrating the cropped image → Thresholding the image to binarize it → First blob analysis → Rotating the image → Second blob analysis → Transfer of data to an Excel sheet

5. Statistical Hypothesis Testing Of The Increase In Wear Debris Size Parameters And The
www.ijeijournal.com Page | 5
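The measurement steps above can be sketched in plain Python. This is an illustrative stand-in for the Matrox Inspector operations, not the original implementation; the `measure_blob` helper, its pixel-list input, and the area limits are assumptions made for the sketch.

```python
# Illustrative stand-in for the blob-analysis steps: discard noise blobs
# by area, find the angle of the maximum axis from the second central
# moments, rotate so that axis is horizontal, and take major/minor
# length as the width/height of the box just touching the blob.
from math import atan2, cos, sin

def measure_blob(pixels, min_area=2, max_area=10_000):
    """pixels: (x, y) coordinates of one thresholded (white) blob."""
    area = len(pixels)
    if not (min_area <= area <= max_area):
        return None                          # treated as noise, discarded
    mx = sum(x for x, _ in pixels) / area
    my = sum(y for _, y in pixels) / area
    pts = [(x - mx, y - my) for x, y in pixels]
    # Angle of the maximum axis from the second central moments.
    mxx = sum(x * x for x, _ in pts)
    myy = sum(y * y for _, y in pts)
    mxy = sum(x * y for x, y in pts)
    angle = 0.5 * atan2(2 * mxy, mxx - myy)
    # Rotate by -angle so the maximum axis becomes horizontal.
    c, s = cos(-angle), sin(-angle)
    rot = [(x * c - y * s, x * s + y * c) for x, y in pts]
    major = max(p[0] for p in rot) - min(p[0] for p in rot)
    minor = max(p[1] for p in rot) - min(p[1] for p in rot)
    return area, major, minor

streak = [(i, i) for i in range(10)]         # a thin 45-degree streak
print(measure_blob(streak))                  # major ~ 9*sqrt(2) ~ 12.73, minor ~ 0
print(measure_blob([(3, 3)]))                # single noise pixel -> None
```

The same moment-based orientation underlies the Feret-diameter approach used here: once the maximum axis is horizontal, the touching bounding box reads off the major and minor lengths directly.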
IV. Results And Discussion
Table 1 lists the range of values, mean value and standard deviation of the different size parameters of the images of particles in Sample 1 and Sample 2. Images of 33 particles were captured from Sample 1 and 34 particles from Sample 2. The mean value of every size parameter in Sample 2 was found to be greater than in Sample 1. As results from one sample might not extend to the complete population, which contains innumerable particles, hypothesis testing was used to draw inferences about the population.
Table 1: Size parameters of images of particles in Sample 1 and Sample 2

Sample 1 (33 particles)
Parameter        Area (µm²)          Perimeter (µm)     Convex Perimeter (µm)   Major Length (µm)   Minor Length (µm)
Range            29.083 – 4717.519   20.997 – 507.957   19.987 – 282.290        6.500 – 99.995      6.000 – 81.496
Mean             672.862             123.639            84.480                  30.778              22.803
Std. Deviation   1129.791            115.237            65.932                  23.684              19.051

Sample 2 (34 particles)
Parameter        Area (µm²)          Perimeter (µm)     Convex Perimeter (µm)   Major Length (µm)   Minor Length (µm)
Range            33.111 – 5764.222   26.280 – 491.886   23.675 – 331.523        8.333 – 137.167     7.167 – 79.000
Mean             1240.647            155.821            122.579                 46.216              30.353
Std. Deviation   1585.635            116.815            82.158                  31.687              20.924
Figure 6: binary and rotated images of some of the particles from Sample 1 and Sample 2
4.1 Hypothesis testing
The symbols used in the testing are:
μ1 – mean value for population 1 (all wear particles in the gear oil after 200 hrs of running)
μ2 – mean value for population 2 (all wear particles in the gear oil after 2000 hrs of running)
x̄1 – mean value for Sample 1
x̄2 – mean value for Sample 2
α – significance level
σ̂1 – estimated standard deviation of population 1, taken as σ̂1 = s1
σ̂2 – estimated standard deviation of population 2, taken as σ̂2 = s2
s1 – standard deviation of Sample 1
s2 – standard deviation of Sample 2
n1 – number of observations in Sample 1
n2 – number of observations in Sample 2
4.1.1. Hypothesis Testing for Area
H0: μ1 = μ2; Null hypothesis: there is no difference in the mean area of particles in population 1 and population 2.
H1: μ2 > μ1; Alternative hypothesis: population 2 has particles with mean area greater than that of population 1.
α = 0.05; 5% significance level
x̄1 = 672.862 µm²; x̄2 = 1240.647 µm²
s1 = 1129.791 µm²; s2 = 1585.635 µm²
n1 = 33; n2 = 34
The standard deviations of the populations were not known, hence the estimated standard error of the difference between the two means was
σ̂(x̄1 − x̄2) = √(σ̂1²/n1 + σ̂2²/n2) = √(s1²/n1 + s2²/n2) = 335.601
(as σ̂1 = s1 and σ̂2 = s2). Standardizing the difference of the sample means, x̄2 − x̄1:
Z = (x̄2 − x̄1) / σ̂(x̄1 − x̄2) = (1240.647 − 672.862) / 335.601 = 1.692
Both samples were large enough to allow the use of the Normal distribution. From the Normal distribution table, the nearest critical value of Z corresponding to the 5% significance level was 1.65. The statistical analysis gave Z = 1.692, which is greater than Zcritical = 1.65; hence the null hypothesis was not accepted. The alternative hypothesis was accepted: the particles in the oil after 2000 hours of running have a mean area greater than those in the oil after 200 hours of running. A graphical representation of the result is shown in Fig. 7.
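The calculation above can be checked numerically. A minimal Python sketch of the large-sample Z test, using the Sample 1 and Sample 2 statistics from Table 1 (variable names are illustrative):

```python
# Large-sample one-tailed Z test for the increase in mean particle area,
# using the sample statistics reported in Table 1.
from math import sqrt

x1, s1, n1 = 672.862, 1129.791, 33    # Sample 1 (oil after 200 hrs)
x2, s2, n2 = 1240.647, 1585.635, 34   # Sample 2 (oil after 2000 hrs)

# Estimated standard error of the difference between the two means,
# with the sample standard deviations standing in for the unknown
# population standard deviations.
se = sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
z = (x2 - x1) / se
z_critical = 1.65                     # one-tailed, alpha = 0.05

print(f"se = {se:.3f}, Z = {z:.3f}")  # se = 335.601, Z = 1.692
print("reject H0" if z > z_critical else "accept H0")  # reject H0
```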
During the study it was found that the differential of the dumper was running without any trouble, and it continued to perform well for a considerably long time. Hence, it may be concluded that the increase in mean area was due to oil deterioration.
4.1.2. Hypothesis Testing for Other Size Parameters
Similar analysis was carried out for the other size parameters; the results are shown graphically in Fig. 8 to Fig. 11. For perimeter, Z = 1.135 < Zcritical = 1.65, so the null hypothesis was accepted. It can be inferred that particles in the oil after 2000 hours of running do not show a significant increase in mean perimeter over those in the oil after 200 hours of running (Fig. 8).
Figure 7: hypothesis test for increase of Area at 0.05 level of significance (Z = 1.692, Zcritical = 1.65)
Figure 8: hypothesis test for increase of Perimeter at 0.05 level of significance (Z = 1.135, Zcritical = 1.65)
Figure 9: hypothesis test for increase of Convex Perimeter at 0.05 level of significance (Z = 2.096, Zcritical = 1.65)
Figure 10: hypothesis test for increase of Major Length at 0.05 level of significance (Z = 2.26, Zcritical = 1.65)
Figure 11: hypothesis test for increase of Minor Length at 0.05 level of significance (Z = 1.545, Zcritical = 1.65)
The results for Convex Perimeter, Major Length and Minor Length are shown in Figure 9, Figure 10 and Figure 11, respectively. For Convex Perimeter, Z = 2.096 > Zcritical = 1.65, so the alternative hypothesis was accepted: the convex perimeter of particles in the oil after 2000 hours of running is greater than that of particles in the oil after 200 hours of running. Similarly, for Major Length, Z = 2.26 > Zcritical = 1.65, so a significant increase with ageing of the oil was concluded. Minor Length had Z = 1.545 < Zcritical = 1.65, so the null hypothesis of equality was accepted: minor lengths do not show a significant increase.
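For reference, the same Z computation applied to all five size parameters reproduces the reported decisions. A short Python sketch using the Table 1 statistics (the dictionary layout and names are illustrative):

```python
# Z statistics for all five size parameters (Sample 1 vs Sample 2),
# one-tailed test at the 0.05 significance level; inputs from Table 1.
from math import sqrt

n1, n2, z_crit = 33, 34, 1.65
# parameter: (mean1, sd1, mean2, sd2)
table1 = {
    "Area":             (672.862, 1129.791, 1240.647, 1585.635),
    "Perimeter":        (123.639,  115.237,  155.821,  116.815),
    "Convex Perimeter": ( 84.480,   65.932,  122.579,   82.158),
    "Major Length":     ( 30.778,   23.684,   46.216,   31.687),
    "Minor Length":     ( 22.803,   19.051,   30.353,   20.924),
}
zs = {}
for name, (x1, s1, x2, s2) in table1.items():
    zs[name] = (x2 - x1) / sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)
    verdict = "significant increase" if zs[name] > z_crit else "no significant increase"
    print(f"{name}: Z = {zs[name]:.3f} -> {verdict}")
```

Area, Convex Perimeter and Major Length exceed the critical value of 1.65, while Perimeter and Minor Length do not, matching the decisions above.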
V. Conclusion
This paper investigated the increase in size parameters, derived from two-dimensional binary images of wear particles, with ageing of gear oil. Five parameters were measured: area, perimeter, convex perimeter, major length and minor length. Among these, area, convex perimeter and major length showed a statistically significant increase, whereas perimeter and minor length did not increase significantly. This indicates that some of the size parameters are significantly correlated with the oil condition, and this correlation needs to be investigated further.
References
[1] Rao, B. Handbook of Condition Monitoring, 1996, Elsevier Advanced Technology, Oxford.
[2] Mukherjee, P.S., et al. Investigating the engine condition of a mining equipment by wear debris analysis using SEM. In: Proc. of the 24th International Congress on Condition Monitoring and Diagnostic Engineering Management (COMADEM 2011), 30th May–1st June 2011, Stavanger, Norway, pp. 519-524.
[3] Seifert, W.W.; Westcott, V.C. A method for the study of wear particles in lubricating oil. Wear, 1972, 21, pp. 27-42.
[4] Kumar, M., et al. Advancement and current status of wear debris analysis for machine condition monitoring – A review. Industrial Lubrication and Tribology, 2013, 65(1), pp. 3-11.
[5] Cho, U.; Tichy, J.A. A study of two-dimensional binary images of wear debris as an indicator of distinct wear conditions. Tribology Transactions, 2001, 44(1), pp. 132-136.
[6] Roylance, B.J.; Pocock, G. Wear studies through particle size distribution: Application of the Weibull distribution to ferrography. Wear, 1983, 90, pp. 113-136.
[7] Kirk, T.B., et al. Computer image analysis of wear debris for machine condition monitoring and fault diagnosis. Wear, 1995, 181-183, pp. 717-722.
[8] Ahn, H.S., et al. Practical contaminant analysis of lubricating oil in a steam turbine-generator. Tribology International, 1996, 29(2), pp. 161-168.
[9] Peng, Z.; Kirk, T.B. Automatic wear-particle classification using neural networks. Tribology Letters, 1998, 5, pp. 249-257.
[10] Peng, Z.; Kirk, T.B. Wear particle classification in a fuzzy grey system. Wear, 1999, 225-229, pp. 1238-1247.
[11] Peng, Z. An integrated intelligence system for wear debris analysis. Wear, 2002, 252, pp. 730-743.
[12] Cho, U.; Tichy, J.A. Quantitative correlation of wear debris morphology: grouping and classification. Tribology International, 2000, 33, pp. 461-467.
[13] Laghari, M.S., et al. Knowledge based wear particle analysis. International Journal of Information Technology, 2004, 1(3), pp. 91-95.
[14] Khan, M.A., et al. A methodology for online wear debris morphology and composition analysis. Proc. Inst. Mech. Engineers, 2008, 222(J), pp. 785-796.
[15] Levin, I.R.; Rubin, D.S. Statistics for Management, 2006, Prentice-Hall of India, New Delhi.
[16] Hunt, T.M. Handbook of Wear Debris Analysis and Particle Detection in Liquids, 1993, Elsevier Applied Science, London and New York.