Initial-pressed juice is an important intermediate product in the cane sugar industry, and sugar brix is a key indicator for evaluating sugar quality. Real-time evaluation of sugar quality requires determining the sugar brix content in all steps of the cane sugar process. Near-infrared (NIR) spectroscopy is a simple, rapid, and non-destructive technology for the analysis of material contents. In this study, a chemometric algorithm of parameter-combined tuning of the Savitzky-Golay (SG) smoother and Partial Least Squares (PLS) regression was utilized for NIR analysis of sugar brix content in sugarcane initial-pressed juice. The combined optimization of the SG smoother and PLS regression was implemented, and the calibration models were optimally established by screening 540 expanded SG smoothing modes and 1-30 latent variables (LVs). The optimized models have high predictive accuracy. These results confirm that the combined optimization of SG smoothing modes and PLS LVs is effective for the quantitative determination of sugar brix content in sugarcane initial-pressed juice, and that NIR spectroscopic technology with its chemometric algorithms has potential for the analysis of cane sugar intermediates.
Chemometrical Optimization for Fourier Transform Near Infrared Analysis of Su... (IJERD Editor)
Real-time evaluation of sugar quality requires determining the sugar brix content in the steps of the cane sugar process. Sugar brix is a key indicator for evaluating sugar quality. Fourier transform near infrared (FTNIR) spectroscopy is a simple, rapid, and non-destructive technology for the analysis of material contents. In this study, a chemometric algorithm of parameter-combined tuning of the Savitzky-Golay (SG) smoother and Partial Least Squares (PLS) regression was utilized for FTNIR analysis of sugar brix content in sugarcane clarified juice, an important intermediate product in the cane sugar industry. The combined optimization of the SG smoother and PLS regression was implemented, and the calibration models were optimally established by screening 540 expanded SG smoothing modes and 1-30 latent variables (LVs). The optimized models have high predictive accuracy. These results confirm that the combined optimization of SG smoothing modes and PLS LVs is effective for the quantitative determination of sugar brix content in sugarcane clarified juice, and that FTNIR spectroscopic technology with its chemometric algorithms has potential for the analysis of cane sugar intermediates.
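As a rough illustration of the screening idea, the sketch below is not the authors' code: the spectra, concentration values, parameter grids, and train/validation split are all synthetic assumptions. It smooths hypothetical spectra with a Savitzky-Golay filter, fits a minimal PLS1 model, and keeps the (window, order, LV) combination with the lowest validation RMSE:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "spectra": 80 samples x 200 wavelength points; the analyte
# signal is a pair of Gaussian bands scaled by concentration, plus noise.
wl = np.linspace(0.0, 1.0, 200)
conc = rng.uniform(5.0, 25.0, 80)                      # stand-in "brix" values
bands = np.exp(-((wl - 0.3) / 0.05) ** 2) + 0.6 * np.exp(-((wl - 0.7) / 0.08) ** 2)
X = conc[:, None] * bands[None, :] + 0.5 * rng.standard_normal((80, 200))
y = conc

def savgol_smooth(X, window, order):
    """Savitzky-Golay smoothing: local least-squares polynomial fit,
    applied as a convolution with the centre row of the pseudo-inverse."""
    half = window // 2
    A = np.vander(np.arange(-half, half + 1), order + 1, increasing=True)
    c = np.linalg.pinv(A)[0]                           # weights for window centre
    Xp = np.pad(X, ((0, 0), (half, half)), mode="edge")
    return np.array([np.convolve(row, c[::-1], mode="valid") for row in Xp])

def pls1_fit(Xc, yc, k):
    """Minimal PLS1 (NIPALS) on centred data; returns regression coefficients."""
    Xr, yr = Xc.copy(), yc.copy()
    W, P, Q = [], [], []
    for _ in range(k):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)
        t = Xr @ w
        tt = t @ t
        p = Xr.T @ t / tt
        q = (yr @ t) / tt
        Xr -= np.outer(t, p)                           # deflate X and y
        yr -= q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    return W @ np.linalg.solve(P.T @ W, Q)

# Combined screening: every SG smoothing mode x every number of latent variables
tr, va = np.arange(60), np.arange(60, 80)
best_rmse, best_mode = np.inf, None
for window in (5, 9, 13):
    for order in (2, 3):
        Xs = savgol_smooth(X, window, order)
        mx, my = Xs[tr].mean(axis=0), y[tr].mean()
        for k in range(1, 6):
            B = pls1_fit(Xs[tr] - mx, y[tr] - my, k)
            pred = (Xs[va] - mx) @ B + my
            rmse = np.sqrt(np.mean((pred - y[va]) ** 2))
            if rmse < best_rmse:
                best_rmse, best_mode = rmse, (window, order, k)
print("best validation RMSE %.3f at (window, order, LVs) =" % best_rmse, best_mode)
```

A full screening in the spirit of the abstract would simply enlarge the two loops to cover all 540 smoothing modes and all 30 LVs.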
This document summarizes a study that modeled the dose distribution of a radioactive 32P-coated coronary stent using Monte Carlo simulations. The researchers calculated dosimetry parameters such as the geometry function, anisotropy function, and radial dose function according to AAPM TG-60 protocols. The Monte Carlo calculated dose rates were found to be in acceptable agreement with measured values from previous studies, with differences of less than 5% near the stent surface increasing to around 30% at farther distances, which can be explained by measurement uncertainties. The high dose region was confined to near the stent surface, decreasing rapidly with depth in tissue.
THE USE OF ADAPTIVE STATISTICAL ITERATIVE RECONSTRUCTION (ASIR) ON IMAGE QUAL... (AM Publications)
The use of Adaptive Statistical Iterative Reconstruction (ASIR) on CT scans of the thorax has been investigated. This study aims to determine the effect of ASIR on radiation dose and image quality. The study was conducted on an anthropomorphic phantom using a GE Optima 580 CT scanner with ASIR dose-reduction settings of 0%-50%, under two parameters: fixed tube current and Tube Current Modulation (TCM). Radiation dose was analyzed by calculating CTDIvol, while image quality was assessed through the Signal-to-Noise Ratio (SNR) and Contrast-to-Noise Ratio (CNR). There is a difference in the CTDIvol value between the fixed tube current and TCM settings. At fixed tube current, the measured CTDIvol values from 0% to 50% ASIR were, respectively: 48.60 mGy; 43.74 mGy; 38.88 mGy; 34.02 mGy; 29.16 mGy; and 24.30 mGy. The CTDIvol values measured with TCM from 0% to 50% ASIR were: 21.92 mGy; 20.09 mGy; 18.33 mGy; 16.51 mGy; 14.59 mGy; and 12.75 mGy. Using ASIR with TCM therefore produces smaller CTDIvol values than ASIR with fixed tube current; the average CTDIvol differs by 51.80% between the two settings. The greater the ASIR percentage applied, the greater the decrease in CTDIvol dose. There is no significant difference in the SNR and CNR values produced by 0% to 50% ASIR with fixed tube current, nor with TCM, although the average CNR with TCM is higher than with fixed tube current. The use of ASIR thus decreases radiation dose without changing image quality.
An approach to assess the quality of honey using partial least square method (IJECEIAES)
The objective of the present study is to quantify honey components such as moisture, glucose, fructose and sucrose in order to assess the quality of honey. The tested honey samples are authenticated against the characteristics of pure honey. The average ratio is 56% fructose to 44% glucose, but the ratios in individual honeys ranged from a high of 64% fructose and 36% glucose to a low of 50% fructose and 50% glucose. Contents such as fructose and sucrose in honey are due to the presence of invertase enzymes. The organic acids present in honey are responsible for its flavor and its stability against microbial contamination. Natural food items are adulterated intentionally to increase quantity, and thereby the quality is affected. The main adulterants added to honey are sucrose, corn syrup, sugar syrup and jaggery syrup. The quantification involves finding the amounts of the basic constituents present in pure and adulterated honey using a Fourier transform infrared (FTIR) spectrometer with multivariate analysis, and validating the results by chemical analysis. A partial least squares model is used to predict the constituents of the samples.
UV Spectrophotometric Method Development and Validation for Quantitative Esti... (Sagar Savale)
Aim: UV spectrophotometric method development and validation for the quantitative estimation of diclofenac sodium. Objective: UV spectrophotometric methods have been widely employed for the determination of an analyte in a mixture. Our aim is to develop a spectroscopic method for the estimation of diclofenac sodium in a ternary mixture using UV spectrophotometry. Methodology: The method was validated as per ICH guidelines. The recovery studies confirmed the accuracy and precision of the method. Conclusion: The method was successfully applied for the analysis of the drug in bulk and could be effectively used for routine analysis.
Road crack detection using adaptive multi resolution thresholding techniques (TELKOMNIKA JOURNAL)
Machine vision is very important for ensuring the success of intelligent transportation systems, particularly in the area of road maintenance. For this reason, many studies have focused on automatic image-based crack detection as a replacement for manual inspection, which depends on the specialist's knowledge and expertise. In image processing, the pre-processing and edge detection stages are important for filtering out noise and enhancing the quality of the edges in the image. Since thresholding is one of the most powerful methods used in edge detection, we propose a modified Otsu-Canny edge detection algorithm for selecting the two threshold values, together with a multi-resolution fixed partitioning method for analyzing the global and local threshold values of the image. This is followed by a statistical measure for selecting the edge image with the best global threshold. This study utilized the road crack image dataset obtained from CrackForest. The results revealed that the proposed method not only performs better than the conventional Canny edge detection method, but also that the maximum value derived from the local thresholds of the 5x5 partitioned image outperforms the other partition scales.
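The Otsu-Canny pairing can be sketched as below. This is a generic illustration, not the paper's modified algorithm: Otsu's method supplies Canny's high threshold, and the common low = high/2 rule is assumed. The synthetic "road" image is likewise an assumption.

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: pick the grey level maximising between-class variance."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # class-0 probability
    mu = np.cumsum(p * np.arange(256))      # class-0 cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    return int(np.nanargmax(sigma_b2))

# Synthetic two-class image: dark "cracks" (mean 40) on light asphalt (mean 180)
rng = np.random.default_rng(1)
img = np.clip(rng.normal(180, 15, (64, 64)), 0, 255).astype(np.uint8)
img[30:34, :] = np.clip(rng.normal(40, 10, (4, 64)), 0, 255).astype(np.uint8)

t_high = otsu_threshold(img)
t_low = t_high // 2        # common Otsu-Canny pairing: low = high / 2
print(t_low, t_high)
```

The two values would then be passed to a Canny edge detector as its hysteresis thresholds; the paper's multi-resolution variant repeats this selection per image partition.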
A Simple and an Innovative Gas Chromatography Method to Quantify Isopentane i... (IOSR Journals)
This document presents a simple and innovative gas chromatography method to quantify isopentane in cosmetic products. The method uses a capillary column and flame ionization detection. Isopentane is used as an ingredient in cosmetics such as shaving gel. The method was developed to specifically quantify isopentane, which is difficult due to its volatile nature. The method was validated according to ICH guidelines and found to be simple, specific, precise, linear, accurate, robust and rugged for isopentane analysis. Standard and sample solutions were prepared and analyzed using gas chromatography. The method was optimized and validated for specificity, linearity, accuracy, precision, robustness and ruggedness. The results met the validation acceptance criteria.
This document describes a GPU-based omni-directional integration method for reconstructing 3D pressure fields from measured acceleration data. It validates the method using a direct numerical simulation database and applies it to experimental measurements of turbulent channel flow over a compliant surface. The key findings are that the 3D omni-directional integration method achieves high accuracy (<1% error) and performance (~1 minute runtime), and the reconstructed pressure fields from this method correlate well with the flow behavior.
My last vacations - Leonardo Castro Malaver (leonik2510)
Leonardo Castro is a 25-year-old man from Zipaquira, Colombia. He visited his wife's parents in Yacopi where he had a fun trip, staying with his in-laws and having breakfast with them. Later, he went for a two-hour walk by the river before resting at home, then shared beers with his family and had a late lunch together before going for a ride in the country and returning home the next day.
This document discusses using genetic algorithms to forecast a country's GDP. It begins with an introduction to genetic algorithms and how they can be applied to problems like GDP forecasting. It then provides details on GDP calculation methods and key factors considered. The proposed system would use a genetic algorithm approach trained on historical economic data to predict a country's GDP over a year in advance. This would analyze trends, policies, investments and other changing economic conditions to generate adaptive forecasts. Developing the algorithm and testing the approach on real data is considered future work.
Data mining, or knowledge discovery, is the process of analyzing data from different perspectives and summarizing it into useful information - information that can be used to increase revenue, cut costs, or both. Data mining software is one of a number of analytical tools for analyzing data. It allows users to analyze data from many different dimensions or angles, categorize it, and summarize the relationships identified. Technically, data mining is the process of finding correlations or patterns among dozens of fields in large relational databases. The goal of clustering is to determine the intrinsic grouping in a set of unlabeled data. But how do we decide what constitutes a good clustering? It can be shown that there is no absolute “best” criterion that is independent of the final aim of the clustering. Consequently, it is the user who must supply this criterion, in such a way that the result of the clustering suits their needs.
For instance, we could be interested in finding representatives for homogeneous groups (data reduction), in finding “natural clusters” and describing their unknown properties (“natural” data types), in finding useful and suitable groupings (“useful” data classes), or in finding unusual data objects (outlier detection). Of late, clustering techniques have been applied in areas that involve browsing gathered data or categorizing the results returned by search engines in reply to user queries. In this paper, we provide a comprehensive survey of document clustering.
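As a concrete instance of grouping unlabeled data, here is a minimal k-means sketch on synthetic 2-D points; the farthest-point initialisation is chosen only for determinism and is not taken from the survey (document clustering would first map texts to vectors, e.g. via TF-IDF):

```python
import numpy as np

def kmeans(X, k, iters=100):
    """Plain k-means with deterministic farthest-point initialisation."""
    centers = [X[0]]
    for _ in range(k - 1):                 # pick points far from chosen centres
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        # assign each point to its nearest centre, then recompute the centres
        labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Three well-separated synthetic groups of 30 points each
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(c, 0.3, (30, 2)) for c in ((0, 0), (5, 5), (0, 5))])
labels, centers = kmeans(X, 3)
print(sorted(np.bincount(labels).tolist()))    # → [30, 30, 30]
```

The "good clustering" question from the text surfaces here as the choice of k and of the distance function, which the algorithm itself cannot decide.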
Background: Septoplasty is one of the commonest nasal surgeries performed by otolaryngologists. Silicone is the most common material used for nasal splints. The trans-septal suturing technique has been described to approximate the mucosal flaps after septal procedures and reduce the complication rate; however, few studies have proven its efficacy.
Objective: This study aims to elucidate the efficacy of the trans-septal suture method in preventing complications, discomfort and pain, in comparison with intranasal splinting using silicone plates after septoplasty.
Patients and methods: This is a prospective study of 59 adult patients who underwent septoplasty between August 2013 and January 2014 in Rizgary Teaching Hospital, Erbil city. Patients were divided into two groups, trans-septal suture and silicone, with 29 and 30 patients respectively. A visual analogue scale was used to evaluate postoperative pain, bleeding, post-nasal drip, dysphagia and sleep disturbance for three days. Epiphora and septal hematoma were also evaluated. Septal perforation, crusting and adhesion were evaluated at the 4th postoperative week.
Results: The severity of pain and post-nasal drip were significantly lower in the trans-septal suture group than in the silicone group (P < 0.05). Septal hematoma and septal perforation were not seen in the study. No significant difference was found concerning epiphora, crusting and adhesion.
Conclusion: We conclude that suturing can be used safely in septoplasty.
The document summarizes the key findings and recommendations from the Legal Education Review in the UK, which examined legal education and training. It identified three key areas for improvement: quality, access and mobility, and flexibility. For quality, it recommended strengthening requirements around legal ethics, values, management and communication skills. For access and mobility, it proposed standards for internships, increasing career progression opportunities, and providing better information on legal careers. For flexibility, it suggested cooperating to set baseline standards, clarifying accreditation of prior learning, and allowing more innovative pathways. The SRA agreed to reforms focused on client confidence, skills for legal services, and access to education. It then outlined its "Training for Tomorrow" proposals and competency
Batch adsorption experiments were carried out for the adsorption of a cationic dye from aqueous solution onto composite activated carbon. The composite activated carbon was prepared from brewer’s spent grain and sea bean shell at a ratio of 1:1. The equilibrium studies were done at different concentrations and temperatures. The equilibrium data were fitted to the Langmuir, Freundlich, Dubinin-Radushkevich, and Temkin isotherm models. The results showed that both the Langmuir and Freundlich isotherm models fitted the data reasonably well, but the Freundlich isotherm fitted better over the temperature range studied. This confirmed that the adsorption is heterogeneous, non-specific and non-uniform in nature. Kinetic studies were also undertaken in terms of first order, second order, pseudo-first order, pseudo-second order, Elovich, Boyd, and intra-particle diffusion models. The results indicated that the data followed the pseudo-second order model, with surface adsorption and intra-particle diffusion operating concurrently during the adsorbate-adsorbent interaction. The values of the thermodynamic parameters computed from the Van’t Hoff plot confirmed the process to be endothermic and spontaneous in nature.
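The isotherm fitting described above can be sketched with the standard linearised forms. The data below are synthetic and the parameter values hypothetical, not the study's measurements; they only show the mechanics of the fit:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic equilibrium data generated from a Freundlich isotherm
# q = Kf * C**(1/n), with hypothetical Kf and n and mild multiplicative noise
Kf, n = 2.0, 2.5
Ce = np.linspace(5.0, 100.0, 12)                   # equilibrium concentrations
qe = Kf * Ce ** (1 / n) * np.exp(0.02 * rng.standard_normal(12))

# Freundlich, linearised: ln q = ln Kf + (1/n) ln C
slope_f, icept_f = np.polyfit(np.log(Ce), np.log(qe), 1)
Kf_fit, n_fit = np.exp(icept_f), 1 / slope_f

# Langmuir, linearised: C/q = 1/(qm*KL) + C/qm
slope_l, icept_l = np.polyfit(Ce, Ce / qe, 1)
qm_fit, KL_fit = 1 / slope_l, slope_l / icept_l

print("Freundlich: Kf=%.2f, n=%.2f" % (Kf_fit, n_fit))
print("Langmuir:   qm=%.2f, KL=%.4f" % (qm_fit, KL_fit))
```

Comparing the coefficients of determination of the two linear fits (as the study does) indicates which model describes the data better at each temperature.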
This document provides a self-evaluation by Chloe Read of various tasks involved in a preliminary radio station project. For most tasks, Chloe indicates what went well, what problems occurred, and what she would do differently in the future. The tasks included planning, identifying roles, script writing, creating a jingle, recording, editing, teamwork, blogging, individual contributions, classwork, and independent work. For many tasks, Chloe found the work went smoothly with no issues, such as teamwork, recording with a dictaphone, and blogging. Some tasks presented initial challenges, like learning to use editing software, but improved with experience.
Aristotle wrote about 200 treatises on a wide range of topics, including logic, metaphysics, philosophy of science, ethics, political philosophy, aesthetics, rhetoric, physics, astronomy and biology. He transformed many areas of knowledge. The document also discusses the Popocatepetl volcano in Mexico, describing its name meaning "smoking mountain" and classification as a stratovolcano formed from layers of ash and fragments. It notes the municipalities that border the volcano across the states of Mexico, Morelos and Puebla.
This document provides an introduction to finance concepts for a business principles course. It discusses key topics like the importance of cash over profit, debt versus equity sources of funding, and the working capital cycle. The learning outcomes are to explain the difference between cash and profit, compare debt and equity, evaluate financing options, and understand the importance of cash flow to a business. The document also outlines different sources of funding like loans, leasing, overdrafts, and shares, and how their costs vary based on risk level.
This document provides an overview of the contents of a BUS 370 Entire Course. It lists the titles of discussion questions, assignments, and a final paper that make up the coursework. The coursework covers topics like organizational change, goal setting, communication processes, and diagnostic models. It also provides descriptions of some of the discussion prompts, which generally involve analyzing case studies or applying organizational development principles.
Java interview questions tutorialspoint (محمود الحسين)
This document contains interview questions about the Java programming language. It begins with introductory questions such as what Java is and its key features. It then covers more technical questions about Java concepts like object-oriented programming, classes, inheritance, polymorphism, exceptions and more. The questions range from basic to more advanced.
Scotland's parks provide important benefits: places for physical activity, relaxation, family time, and experiencing nature. A report finds concerns about the declining quality of UK parks, with Scots perceiving a widening gap between expectations of parks and reality. However, many Scots value parks and visit them regularly, and by working together, communities and parks organizations can create a sustainable future for parks.
Background and objectives: Iron, one of the major components of hemoglobin, is required for the transport of oxygen; ferritin is a protein for iron storage; and transferrin is a protein for iron transport. All of these may change in breast cancer.
The aim of the present study was to measure serum ferritin, iron, total iron binding capacity, transferrin, and C-reactive protein concentrations in breast tumors.
Material and method: A prospective study was carried out from April 2013 to August 2014 by the clinical biochemistry department in the College of Pharmacy, University of Sulaimani, on 45 healthy female individuals (group 1) and 50 females with breast tumors (group 2).
Results: The mean values of serum iron, transferrin, and total iron binding capacity were significantly lower in females with breast tumors (group 2) than in healthy female individuals (group 1), while serum ferritin was significantly higher in females with breast tumors (group 2) than in healthy individuals (group 1).
Conclusion: Based on the findings of the present study, it can be concluded that breast tumors can cause a deficiency across the iron profile, with the exception of ferritin, which increases in female breast cancer.
Key words: serum iron, serum ferritin, serum total iron binding capacity, serum transferrin, serum C-reactive protein, breast tumors.
The World Wide Web is a large repository of interlinked hypertext documents accessed via the Internet. The Web may contain text, images, video, and other multimedia data, and the user navigates through it using hyperlinks. A search engine returns millions of results and applies Web mining techniques to order them. The sorted order of search results is obtained by applying special algorithms called page ranking algorithms, which measure the importance of pages by analyzing the number of inlinked and outlinked pages. Our proposed system is built on the idea that ranking the relevant pages higher in the retrieved document set requires an analysis of both a page's text content and its link information. The proposed approach is based on the assumption that the effective weight of a term in a page is computed by adding the weight of the term in the current page and an additional weight of the term in the linked pages. In this chapter, we first study the nature of web pages and the various link analysis ranking algorithms and their limitations, and then show a comparative analysis of the ranking scores obtained through these approaches against our newly suggested ranking approach.
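The effective-weight assumption can be illustrated with a toy function. The plain term-frequency weighting and the damping factor `alpha` below are simplifying assumptions for illustration, not the chapter's exact formula:

```python
def effective_weight(term, page, linked_pages, alpha=0.5):
    """Effective term weight: the weight of the term in the current page
    plus a damped average of its weight in the pages it links to."""
    def tf(term, text):
        words = text.lower().split()
        return words.count(term) / len(words) if words else 0.0

    w = tf(term, page)
    if linked_pages:
        w += alpha * sum(tf(term, p) for p in linked_pages) / len(linked_pages)
    return w

page = "sugar cane sugar mill"
links = ["sugar brix analysis", "cane harvest report"]
print(round(effective_weight("sugar", page, links), 3))   # → 0.583
```

A page whose neighbours also mention the query term thus scores above one with the same local term frequency, which is the intuition the proposed ranking builds on.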
Attitudes toward suicide may influence treatment content and outcomes. Hence, this study aimed to investigate how public attitudes toward suicide were influenced by (1) the degree of idealism and (2) the degree of relativism. A questionnaire survey comprising the Suicide Perception Scale and the Ethics Position Questionnaire was carried out on 50 male and 50 female participants (aged 21 and above) from Klang Valley, Malaysia. The findings supported both hypotheses, indicating that (1) higher idealism is associated with a lower level of acceptance of suicide, and (2) higher relativism is associated with a higher level of acceptance of suicide. In sum, variations in public attitudes toward suicide were related to individual differences in personal ethical ideologies and moral philosophies.
Data Driven Choice of Threshold in Cepstrum Based Spectrum Estimates (ipij)
The technique of cepstrum thresholding is shown to be an effective yet simple way of obtaining a smoothed nonparametric spectrum estimate of a stationary signal. The major problem of this method is the choice of the threshold value for variance reduction of the spectrum estimates. This paper proposes a new threshold selection method based on cross-validation schemes such as Leave-One-Out, Leave-Two-Out and Leave-Half-Out. These new methods are easy to describe, simple to implement, and do not impose severe conditions on the unknown spectrum. Numerical results suggest that these new methods are in agreement with those obtained when the spectrum is fully known.
Consistent Nonparametric Spectrum Estimation Via Cepstrum Thresholding (CSCJournals)
For stationary signals, there are a number of power spectral density estimation techniques. The main problem of power spectral density (PSD) estimation methods is high variance. Consistent estimates may be obtained by suitable processing of the empirical spectrum estimate (periodogram). This may be done using window functions. These methods all require the choice of a certain resolution parameter called the bandwidth. Various techniques produce estimates that have a good overall bias vs. variance tradeoff. In contrast, smooth spectral components require a wide bandwidth in order to achieve a significant noise reduction. In this paper, we explore the concept of the cepstrum for non-parametric spectral estimation. The method developed here is based on cepstrum thresholding for smoothed non-parametric spectral estimation. An algorithm for a consistent minimum variance unbiased spectral estimator is developed and implemented, which produces good results for broadband and narrowband signals.
Photoacoustic tomography based on the application of virtual detectors (IAEME Publication)
This document discusses using virtual detectors to improve photoacoustic tomography (PAT) image reconstruction when full scanning data is unavailable. It proposes interpolation and compressed sensing methods to generate virtual detector data and increase the number of measurements. Simulation results show applying these methods to preprocessed photoacoustic data significantly improves the peak signal-to-noise ratio of reconstructed images compared to direct reconstruction with limited detectors. Dictionary-based compressed sensing provides the best performance by learning an over-complete dictionary to sparsify signals. The methods allow better quality PAT imaging when hardware and spatial constraints limit actual detector positions and sampling angles.
My last vacations - Leonardo Castro Malaver (leonik2510)
Leonardo Castro is a 25-year-old man from Zipaquira, Colombia. He visited his wife's parents in Yacopi where he had a fun trip, staying with his in-laws and having breakfast with them. Later, he went for a two-hour walk by the river before resting at home, then shared beers with his family and had a late lunch together before going for a ride in the country and returning home the next day.
This document discusses using genetic algorithms to forecast a country's GDP. It begins with an introduction to genetic algorithms and how they can be applied to problems like GDP forecasting. It then provides details on GDP calculation methods and key factors considered. The proposed system would use a genetic algorithm approach trained on historical economic data to predict a country's GDP over a year in advance. This would analyze trends, policies, investments and other changing economic conditions to generate adaptive forecasts. Developing the algorithm and testing the approach on real data is considered future work.
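As a hedged illustration of the genetic-algorithm machinery this summary describes (selection, crossover, mutation over candidate solutions), here is a minimal real-coded GA on a toy fitness function; the operators and parameters are illustrative, not those of the proposed GDP system.

```python
import numpy as np

def genetic_algorithm(fitness, dim, pop_size=40, gens=60, seed=0):
    """Minimal real-coded GA: tournament selection, blend crossover, mutation."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-1, 1, (pop_size, dim))
    for _ in range(gens):
        fit = np.array([fitness(ind) for ind in pop])
        new = []
        for _ in range(pop_size):
            # Tournament selection: best of two random individuals, twice.
            a, b = rng.choice(pop_size, 2), rng.choice(pop_size, 2)
            p1 = pop[a[np.argmax(fit[a])]]
            p2 = pop[b[np.argmax(fit[b])]]
            w = rng.uniform(0, 1, dim)          # blend crossover
            child = w * p1 + (1 - w) * p2
            child += rng.normal(0, 0.05, dim)   # gaussian mutation
            new.append(child)
        pop = np.array(new)
    fit = np.array([fitness(ind) for ind in pop])
    return pop[np.argmax(fit)]

# Toy calibration task: recover a known parameter vector by maximizing
# the negative squared error (stand-in for a forecasting fitness).
target = np.array([0.3, -0.7, 0.5])
best = genetic_algorithm(lambda w: -np.sum((w - target) ** 2), dim=3)
```

In a forecasting setting the fitness would instead score a candidate model's error on historical economic data.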
Data mining, or knowledge discovery, is the process
of analyzing data from different perspectives and summarizing it
into useful information - information that can be used to increase
revenue, cut costs, or both. Data mining software is one of a
number of analytical tools for analyzing data. It allows users to
analyze data from many different dimensions or angles, categorize
it, and summarize the relationships identified. Technically, data
mining is the process of finding correlations or patterns among
dozens of fields in large relational databases. The goal of
clustering is to determine the intrinsic grouping in a set of
unlabeled data. But how to decide what constitutes a good
clustering? It can be shown that there is no absolute “best”
criterion which would be independent of the final aim of the
clustering. Consequently, it is the user who must supply this
criterion, in such a way that the result of the clustering will suit
their needs.
For instance, we could be interested in finding
representatives for homogeneous groups (data reduction), in
finding “natural clusters” and describing their unknown properties
(“natural” data types), in finding useful and suitable groupings
(“useful” data classes) or in finding unusual data objects (outlier
detection). Of late, clustering techniques have been applied in the
areas which involve browsing the gathered data or in categorizing
the outcome provided by the search engines for the reply to the
query raised by the users. In this paper, we are providing a
comprehensive survey of document clustering.
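Many of the clustering ideas surveyed here trace back to simple partitional algorithms such as k-means; a minimal numpy sketch of Lloyd's algorithm on toy data (not from the survey) illustrates the mechanics of grouping unlabeled points.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's algorithm: assign points to nearest centroid, recompute."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Distance of every point to every centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Two well-separated toy groups of 2-D points.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
labels, cents = kmeans(X, 2)
```

For document clustering the points would be TF-IDF vectors rather than 2-D coordinates, but the assign-and-update loop is the same.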
Background: Septoplasty is one of the commonest nasal surgeries
performed by otolaryngologist. Silicone is the most common
material used for nasal splints. Trans-septal suturing technique
has been described to approximate the mucosal flaps after septal
procedures to reduce the complication rate; however, there are
few studies proving its efficacy.
Objective: This study is to elucidate the efficacy of trans-septal
suture method in preventing complications, discomfort and pain
in comparison with intranasal splinting using silicone plates after
septoplasty.
Patients and methods: This is a prospective study of 59 adult
patients who underwent septoplasty between August 2013 and
January 2014 at Rizgary Teaching Hospital, Erbil. Patients were
divided into 2 groups; trans-septal suture and silicone, 29 and 30
patients respectively. Visual analogue scale was used to evaluate
postoperative pain, bleeding, post-nasal drip, dysphagia and
sleep disturbance for three days. Epiphora and septal hematoma
were also evaluated. Septal perforation, crusting, and adhesion
were evaluated at the 4th postoperative week.
Results: The severity of pain and post-nasal drip were
significantly lower in the trans-septal suture group than in the
silicone group (P < 0.05). Septal hematoma and septal perforation
were not seen in the study. No significant difference was found
concerning epiphora, crusting and adhesion.
Conclusion: We conclude that suturing can be used safely in
septoplasty.
The document summarizes the key findings and recommendations from the Legal Education Review in the UK, which examined legal education and training. It identified three key areas for improvement: quality, access and mobility, and flexibility. For quality, it recommended strengthening requirements around legal ethics, values, management and communication skills. For access and mobility, it proposed standards for internships, increasing career progression opportunities, and providing better information on legal careers. For flexibility, it suggested cooperating to set baseline standards, clarifying accreditation of prior learning, and allowing more innovative pathways. The SRA agreed to reforms focused on client confidence, skills for legal services, and access to education. It then outlined its "Training for Tomorrow" proposals and competency
Batch adsorption experiments were carried out for
the adsorption of cationic dye from aqueous solution onto
composite activated carbon. The composite activated carbon was
prepared from brewer’s spent grain and sea bean shell at a ratio
of 1:1. The equilibrium studies were done at different
concentrations and temperatures. The equilibrium data were
fitted to Langmuir, Freundlich, Dubinin-Radushkevich, and
Temkin isotherm models. The results showed that both the Langmuir
and Freundlich isotherm models fitted the data reasonably well,
but the Freundlich isotherm fitted better in the temperature range
studied. This confirmed that the adsorption is heterogeneous,
non-specific and non-uniform in nature. Kinetic studies were also
undertaken in terms of first order, second order, pseudo first
order, pseudo second order, Elovich, Boyd, and intra-particle
diffusion models. The results indicated that the data followed
pseudo second order model with surface adsorption and intra-particle
diffusion concurrently operating during the adsorbate-adsorbent
interaction. The values of the thermodynamic
parameters computed from Van’t Hoff plot confirmed the
process to be endothermic and spontaneous in nature.
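The Langmuir and Freundlich fits mentioned above are commonly obtained from their linearized forms; the following numpy sketch fits both on synthetic equilibrium data (all values illustrative, not the paper's measurements).

```python
import numpy as np

# Synthetic equilibrium data (illustrative, not the paper's measurements):
# Ce = equilibrium concentration (mg/L), qe = amount adsorbed (mg/g).
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
qe_max, KL = 50.0, 0.05                     # "true" Langmuir parameters
qe = qe_max * KL * Ce / (1 + KL * Ce)       # data generated from a Langmuir model

# Langmuir linearization: Ce/qe = Ce/qmax + 1/(KL*qmax).
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qmax_fit = 1.0 / slope
KL_fit = slope / intercept

# Freundlich linearization: ln qe = ln KF + (1/n) ln Ce.
s_f, i_f = np.polyfit(np.log(Ce), np.log(qe), 1)
KF_fit, n_fit = np.exp(i_f), 1.0 / s_f
```

Because the toy data are exactly Langmuir, the Langmuir fit recovers the generating parameters, while the Freundlich fit gives 1/n below 1, the usual signature of favourable, saturating adsorption.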
This document provides a self-evaluation by Chloe Read of various tasks involved in a preliminary radio station project. For most tasks, Chloe indicates what went well, what problems occurred, and what she would do differently in the future. The tasks included planning, identifying roles, script writing, creating a jingle, recording, editing, teamwork, blogging, individual contributions, classwork, and independent work. For many tasks, Chloe found the work went smoothly with no issues, such as teamwork, recording with a dictaphone, and blogging. Some tasks presented initial challenges, like learning to use editing software, but improved with experience.
Aristotle wrote about 200 treatises on a wide range of topics, including logic, metaphysics, philosophy of science, ethics, political philosophy, aesthetics, rhetoric, physics, astronomy and biology. He transformed many areas of knowledge. The document also discusses the Popocatepetl volcano in Mexico, describing its name meaning "smoking mountain" and classification as a stratovolcano formed from layers of ash and fragments. It notes the municipalities that border the volcano across the states of Mexico, Morelos and Puebla.
This document provides an introduction to finance concepts for a business principles course. It discusses key topics like the importance of cash over profit, debt versus equity sources of funding, and the working capital cycle. The learning outcomes are to explain the difference between cash and profit, compare debt and equity, evaluate financing options, and understand the importance of cash flow to a business. The document also outlines different sources of funding like loans, leasing, overdrafts, and shares, and how their costs vary based on risk level.
This document provides an overview of the contents of a BUS 370 Entire Course. It lists the titles of discussion questions, assignments, and a final paper that make up the coursework. The coursework covers topics like organizational change, goal setting, communication processes, and diagnostic models. It also provides descriptions of some of the discussion prompts, which generally involve analyzing case studies or applying organizational development principles.
Java interview questions - tutorialspoint (محمود الحسين)
This document contains interview questions about the Java programming language. It begins with introductory questions like what Java is and its key features. It then covers more technical questions about Java concepts like object-oriented programming, classes, inheritance, polymorphism, exceptions and more. The questions range from basic to more advanced.
Scotland's parks provide important benefits like places for physical activity, relaxation, family time, and experiencing nature, but a report finds concerns about declining quality of UK parks, with Scots finding a widening gap between expectations of parks and reality. However, many Scots value parks and visit them regularly, and working together communities and parks organizations can create a sustainable future for parks.
Background and objectives: Iron, one of the major components
of hemoglobin, is required for the transport of oxygen; ferritin is a
protein for iron storage, and transferrin is a protein for iron
transport, and all of these may change in breast cancer.
The aim of the present study was to measure serum ferritin, iron,
total iron binding capacity, transferrin, and C-reactive protein
concentrations in breast tumors.
Material and method: A prospective study was carried out from
April 2013 to August 2014 by the clinical biochemistry department
of the College of Pharmacy, University of Sulaimani, on 45 healthy
female individuals (group 1) and 50 females with breast tumors
(group 2).
Results: The mean values of serum iron, transferrin, and total iron
binding capacity were significantly lower in females with breast
tumors (group 2) than in healthy female individuals (group 1),
while serum ferritin was significantly higher in females with
breast tumors (group 2) than in healthy individuals (group 1).
Conclusion: Based on the findings of the present study, it can be
concluded that breast tumors can cause a deficiency in the whole
iron profile except ferritin, which increases in female breast cancers.
Key words: Serum iron, S. Ferritin, S. Total iron binding, S.
Transferrin, S. C-Reactive protein, breast tumors.
The World Wide Web is a large repository of
interlinked hypertext documents accessed via the Internet. The Web
may contain text, images, video, and other multimedia data. The
user navigates through it using hyperlinks. A search engine returns
millions of results and applies Web mining techniques to order them.
The sorted order of search results is obtained by applying
special algorithms called page ranking algorithms. The
algorithm measures the importance of the pages by analyzing the
number of inlinked and outlinked pages. Our proposed system is
built on the idea that, to rank the relevant pages higher in the
retrieved document set, an analysis of both a page’s text substance
and link information is required. The proposed approach is
based on the assumption that the effective weight of a term in a
page is computed by adding the weight of a term in the current
page and additional weight of the term in the linked pages. In
this chapter, we first study the nature of web pages, the various
link analysis ranking algorithms and their limitations and then
show the comparative analysis of the ranking scores obtained
through these approaches with our new suggested ranking
approach.
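For context, the link-analysis baseline that such approaches extend can be sketched as classic PageRank power iteration; this is the generic algorithm, not the paper's weighted text-plus-link variant.

```python
import numpy as np

def pagerank(adj, d=0.85, iters=100):
    """Power iteration on the damped link matrix of a small web graph."""
    n = adj.shape[0]
    # Column-stochastic transition matrix: each page splits its rank
    # evenly among its out-links (dangling pages link everywhere).
    out = adj.sum(axis=1)
    M = np.where(out[:, None] > 0, adj / np.maximum(out[:, None], 1), 1.0 / n).T
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1 - d) / n + d * M @ r
    return r

# Tiny 4-page graph: every page links to page 0; page 0 links to page 1.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 0, 0],
                [1, 0, 0, 0],
                [1, 0, 0, 0]], dtype=float)
r = pagerank(adj)
```

The heavily in-linked page ends up with the largest score, which is exactly the "importance from inlinks" notion the abstract describes.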
Attitudes toward suicide may influence the
treatment content and outcomes. Hence, this study aimed to
investigate how public attitudes toward suicide were influenced
by (1) their degree of idealism; and (2) their degree of
relativism. A questionnaire survey using the Suicide Perception
Scale and the Ethics Position Questionnaire was carried out on 50
male and 50 female participants (aged 21 and above) from
Klang Valley, Malaysia to obtain answers. The findings
supported both hypotheses, indicating that (1) higher idealism is
associated with a lower level of acceptance toward suicide; and
(2) higher relativism is associated with a higher level of
acceptance toward suicide. In sum, variations in the public’s
attitudes toward suicide were related to individual differences in
personal ethical ideologies and moral philosophies.
Biosensors And Bioelectronics Presentation by Sijung Hu (ConferenceMind)
Excellent presentation by Sijung Hu, Loughborough University, United Kingdom. He talks about - "Opto-physiological modeling to drive an effective physiological monitoring: from contact to noncontact, from point to imaging" at the 2nd International Webinar on Biosensors And Bioelectronics
Date: July 12-13, 2021
Visit here for more details:
https://conferencemind.com/conference/biosensorsandbioelectronics
This document presents a study comparing artificial neural network (ANN) and adaptive neuro-fuzzy inference system (ANFIS) models for estimating pH and mixed liquor suspended solids (MLSS) in a domestic wastewater treatment plant. Data from the Bunus regional sewage treatment plant was used to develop and test ANN and ANFIS models. Simulation results showed that the ANFIS model predictions were more accurate than the ANN model, making ANFIS a useful tool for predicting important wastewater treatment plant parameters.
Prediction of mango firmness by near infrared spectroscopy tandem with machin... (CSITiaesprime)
The firmness of the mango fruit is one of the internal physical properties that can show its quality. Unfortunately, non-destructive methods to measure this are not yet available. In the current study, we develop a calibration model using near infrared spectroscopy to predict the physical properties (firmness) of the mango cultivar Arumanis (Mangifera indica cv. Arumanis) via machine learning. Spectral data were acquired using a Fourier transform near-infrared (FTNIR) benchtop instrument with a wavelength range of 1000 to 2500 nm. Multivariate spectra analysis based on machine learning, including principal component regression (PCR), partial least squares regression (PLSR), and support vector machine regression (SVMR), was utilized and compared to estimate the firmness of fresh mangos. The results obtained show that the prediction by PLSR is better than that by SVMR and PCR for mango firmness. The correlation coefficient of calibration (rc), root mean square error of calibration (RMSE-C), correlation coefficient of validation (rcv), root mean square error of cross-validation (RMSE-CV), and ratio of prediction to deviation (RPD) were 0.941, 0.382 kgf, 0.920, 0.472 kgf, and 2.556, respectively. The general results satisfactorily indicate that near infrared spectroscopy technology integrated with an appropriate machine learning algorithm has promising results in determining the firmness of mango non-destructively.
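Of the three regression methods compared, principal component regression is the simplest to sketch; the following numpy example runs on synthetic "spectra" with one latent factor, not the mango data, and every dimension and value is illustrative.

```python
import numpy as np

def pcr_fit_predict(X, y, X_new, k):
    """Principal component regression: regress y on the first k PC scores of X."""
    mu, ybar = X.mean(axis=0), y.mean()
    Xc = X - mu
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:k].T                              # loadings of the first k components
    T = Xc @ P                                # component scores
    b = np.linalg.lstsq(T, y - ybar, rcond=None)[0]
    return ybar + (X_new - mu) @ P @ b

# Synthetic "spectra": 40 samples x 100 wavelengths, one latent factor.
rng = np.random.default_rng(0)
w = rng.standard_normal(100)                  # latent spectral signature
t = rng.standard_normal(40)                   # latent factor per sample
X = np.outer(t, w) + 0.01 * rng.standard_normal((40, 100))
y = 2.0 * t + 0.01 * rng.standard_normal(40)  # "firmness" driven by the factor
y_hat = pcr_fit_predict(X[:30], y[:30], X[30:], k=3)
rmse = np.sqrt(np.mean((y_hat - y[30:]) ** 2))
```

PLSR differs in that its components are chosen to covary with y rather than to maximize variance of X alone, which is why it often wins on spectra with many uninformative bands.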
Characterization of Liquid Waste in Isotope production and Research Facilities (iosrjce)
IOSR Journal of Applied Physics (IOSR-JAP) is a double blind peer reviewed International Journal that provides rapid publication (within a month) of articles in all areas of physics and its applications. The journal welcomes publications of high quality papers on theoretical developments and practical applications in applied physics. Original research papers, state-of-the-art reviews, and high quality technical notes are invited for publications.
FT-NIR as a real-time QC tool for polymer manufacturing (Galaxy Scientific)
Near infrared spectroscopy has been used widely in the polymer industry. Compared to traditional methods such as wet chemistry and chromatographic methods, NIR spectroscopy provides considerable advantages in process and quality control applications through fundamental benefits such as low to no cost of consumables such as solvents, columns, reagents; real time analysis - generally less than 10 seconds measurement time; multiple components per analysis; elimination of sample preparation time; and elimination of many sources of systematic error.
This presentation will present three FT-NIR polymer applications: 1) at line polyether polyols’ hydroxyl value analysis; 2) real time isocyanate number monitoring during a polyurethane reaction; and 3) off-line quality control of percentage styrene in styrene copolymers.
Study of Parametric Kernels for LSSVM Models in NIR Determination of Fishmeal... (researchinventy)
High quality fishmeal provides a suitable protein ratio in the modernizing balance of animal breeding. Near-infrared (NIR) spectroscopy technology was used for the determination of protein concentration in fishmeal samples. In this paper, the least squares support vector machine (LSSVM) method was applied to the NIR analytical procedure to search for the optimal calibration model. We focus on the study of three common LSSVM kernels and the tuning of their parameter values. The parameters of the linear, polynomial and Gaussian radial basis (GRB) kernels were respectively tuned to examine whether they output an optimal or acceptable predictive result, either in the validation or the testing part. These trainings were combined with the LSSVM regularization factor changing at a regular step pace. The results showed that, for the polynomial kernel (including the linear kernel), the identified optimal parameters were able to yield an acceptable root mean square error, but the correlation was not as good as expected. For the GRB kernel, we selected a relatively smaller regularization factor for each fixed kernel width to simplify the complexity in practical applications. In conclusion, the linear, polynomial and GRB kernels are feasible for modeling the NIR features of protein with the LSSVM model in NIR analysis of fishmeal samples.
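Unlike a standard SVM, LSSVM training reduces to solving one linear system; a minimal numpy sketch with a Gaussian RBF kernel on toy 1-D data (the hyperparameters are illustrative, not the paper's tuned values):

```python
import numpy as np

def rbf(A, B, width):
    """Gaussian radial basis kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

def lssvm_fit(X, y, gamma=100.0, width=1.0):
    """LSSVM regression: training is one linear solve, no quadratic program."""
    K = rbf(X, X, width)
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma   # gamma is the regularization factor
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]              # bias b, dual weights alpha

# Toy 1-D regression target: y = sin(x) on [0, 6].
X = np.linspace(0, 6, 40)[:, None]
y = np.sin(X).ravel()
b, alpha = lssvm_fit(X, y)
y_hat = rbf(X, X, 1.0) @ alpha + b
rmse = np.sqrt(np.mean((y_hat - y) ** 2))
```

The kernel width and regularization factor gamma are exactly the two knobs the abstract describes tuning jointly.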
This research introduces an instrument for performing quality control on aromatic rice by utilizing feature extraction with the Principal Component Analysis (PCA) method. Our proposed system (DNose v0.2) uses the principle of the electronic nose, or enose. An enose is a detector instrument that works based on classification of smell, like the function of the human nose. It has to be trained to recognize a smell before working in the classification process. The aim of this research is to build an enose system as a quality control instrument, especially for aromatic rice. The advantages of this system are that it is easy to operate and does not damage the object of research. In this experiment, an ATMega 328 and 6 gas sensors are used in the electronic module, and the PCA method is used for the classification process.
EXPERIMENTAL IMPLEMENTATION OF EMBARRASSINGLY PARALLEL PROCESS IN ANALYSIS OF ... (ijesajournal)
This document describes an experimental implementation of an embarrassingly parallel process to analyze blood glucose concentration using ATmega32 microcontrollers. The system was designed to handle multiple blood samples simultaneously using 4 sensor nodes connected to a master node via I2C bus. The sensor nodes operate in parallel to measure glucose levels, with the master node coordinating distribution of samples and collection of results. Evaluation showed the system achieved linear speedup in processing blood samples compared to serial methods.
Hybrid Multilevel Thresholding and Improved Harmony Search Algorithm for Segm... (IJECEIAES)
This paper proposes a new method for image segmentation: hybrid multilevel thresholding with an improved harmony search algorithm. The improved harmony search algorithm is a method for finding solution vectors with increased accuracy. The proposed method generates a random candidate solution, whose quality is then evaluated through the Otsu objective function. The operator continues to evolve the candidate solution until the optimal solution is found. The datasets used in this study are the retina, tongue, lenna, baboon, and cameraman images. The experimental results show that this method produces high performance as seen from peak signal-to-noise ratio (PSNR) analysis. The PSNR for the retina image averaged 40.342 dB, while the tongue image averaged 35.340 dB. Lenna, baboon and cameraman produced averages of 33.781 dB, 33.499 dB, and 34.869 dB. Furthermore, the process of object recognition and identification is expected to use this method to produce a high degree of accuracy.
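The harmony search in this paper optimizes the Otsu objective; the single-level Otsu criterion itself (pick the threshold maximizing between-class variance) can be shown in a short numpy function on a synthetic bimodal image.

```python
import numpy as np

def otsu_threshold(img):
    """Single-level Otsu: pick the gray level maximizing between-class variance."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
        var_b = w0 * w1 * (mu0 - mu1) ** 2   # between-class variance
        if var_b > best_var:
            best_t, best_var = t, var_b
    return best_t

# Bimodal toy "image": dark background around 50, bright object around 200.
rng = np.random.default_rng(0)
img = np.clip(np.concatenate([rng.normal(50, 10, 5000),
                              rng.normal(200, 10, 5000)]), 0, 255).astype(np.uint8)
t = otsu_threshold(img)
```

Multilevel thresholding generalizes this to several thresholds at once, which makes exhaustive search expensive and motivates metaheuristics such as harmony search.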
Computer model simulations are widely used in the investigation of complex hydrological systems. In particular, hydrological models are tools that help both to better understand hydrological processes and to predict extreme events such as floods and droughts. Usually, model parameters need to be estimated through calibration, in order to constrain model outputs to observed variables.
Relevant model parameters used for calibration are usually selected based on the expert knowledge of the modeller or by using a local one-at-a-time (OAT) sensitivity analysis (SA). However, in the case of complex models those approaches may not result in proper identification of the most sensitive parameters for model calibration. In particular, local OAT SA methods are only effective for assessing the relative importance of input factors when the model is linear, monotonic, and additive, which is rarely the case for complex environmental models. In contrast, Global Sensitivity Analysis (GSA)
is a formal method for statistical evaluation of relevant parameters that contribute significantly to model performance. GSA techniques explore the entire feasible space of each model parameter, and they do not require any assumptions on the model nature (such as linearity or additivity).
In this work we apply the GSA to LISFLOOD, a fully-distributed hydrological model used for flood forecasting at Pan-European scale within the European Flood Awareness System (EFAS). Two case studies are considered, snowmelt- and evapotranspiration-driven catchments, to identify sensitive parameters for both types of hydrological regimes. Results of the GSA will then be used for selecting parameters that need to be estimated during model calibration. Considering the large
number of parameters of a fully-distributed model, a two-step GSA framework is applied. First, we implement the computationally efficient screening method of Morris. This method requires a limited number of simulations and produces a qualitative ranking and selection of important factors. As a second step, we apply the variance-based method of Sobol, only to the subset of factors determined as important during the previous screening. The method of Sobol provides quantitative estimates for first order and total order sensitivity indexes of input factors.
The calibration results after the GSA will be described for both case studies and compared against those obtained by using only prior expert knowledge.
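The screening step can be illustrated with a simplified Morris-style computation of elementary effects; this one-at-a-time radial sketch conveys the idea of the mu* ranking but is not the full trajectory design used in the LISFLOOD calibration.

```python
import numpy as np

def morris_mu_star(f, dim, r=20, delta=0.5, seed=0):
    """Morris screening (simplified): mean absolute elementary effect (mu*)."""
    rng = np.random.default_rng(seed)
    ee = np.zeros((r, dim))
    for t in range(r):
        x = rng.uniform(0, 1 - delta, dim)   # base point leaving room for +delta
        fx = f(x)
        for i in range(dim):                  # perturb one factor at a time
            xp = x.copy()
            xp[i] += delta
            ee[t, i] = (f(xp) - fx) / delta
    return np.abs(ee).mean(axis=0)

# Toy model: factor 0 dominates, factor 1 is weakly nonlinear, factor 2 inert.
def model(x):
    return 10 * x[0] + 2 * x[1] ** 2 + 0 * x[2]

mu_star = morris_mu_star(model, dim=3)
```

Factors with large mu* survive the screening and are passed to the more expensive variance-based Sobol analysis.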
The document summarizes a study that used artificial neural networks (ANN) to predict chemical oxygen demand (COD) levels in an anaerobic wastewater treatment system. Four ANN backpropagation training algorithms - Levenberg-Marquardt, gradient descent with adaptive learning, gradient descent with momentum, and resilient backpropagation - were tested on a model using COD input data. The Levenberg-Marquardt algorithm produced the best results with the lowest mean squared error of 0.533 and highest regression value of 0.991, accurately predicting COD levels. The study demonstrates ANNs can effectively model and predict values in nonlinear wastewater treatment processes.
Real-time and Non-Invasive Detection of Haemoglobin level using CNN (IRJET Journal)
This document describes a study that aims to detect haemoglobin levels in a non-invasive and real-time manner using convolutional neural networks (CNNs). The researchers collected a dataset of finger images with varying haemoglobin levels from blood donation camps. They trained a CNN model on this dataset to classify haemoglobin levels based on image features. The CNN was tested on additional finger images and able to detect haemoglobin levels in real-time without drawing blood, providing advantages over traditional invasive methods like faster results and no patient discomfort or biohazards. The proposed non-invasive method using deep learning could help diagnose blood-related conditions earlier.
Development and Validation of RP HPLC Method for Estimation of Vortioxetine i... (ijtsrd)
A rapid and precise reverse phase high performance liquid chromatographic method has been developed and validated for Vortioxetine in its pure form as well as in tablet dosage form. Chromatography was carried out on an ODS C18 4.6 x 250 mm, 5 µm column using acetonitrile and methanol (70:30) as the mobile phase at a flow rate of 1.0 mL/min; detection was carried out at 274 nm. The retention time of Vortioxetine was 2.922 ± 0.02 min. The method produced linear responses at the concentration range of 20 µg/mL of Vortioxetine. The method precision for the determination of assay was below 2.0% RSD. The method is useful in the quality control of bulk and pharmaceutical formulations. Rathod K. G | Bargaje G. S | Rathod G. R | Deshpande O. V "Development and Validation of RP-HPLC Method for Estimation of Vortioxetine in Bulk and Pharmaceutical Dosage Form" Published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3 | Issue-6, October 2019, URL: https://www.ijtsrd.com/papers/ijtsrd28040.pdf Paper URL: https://www.ijtsrd.com/pharmacy/analytical-chemistry/28040/development-and-validation-of-rp-hplc-method-for-estimation-of-vortioxetine-in-bulk-and-pharmaceutical-dosage-form/rathod-k-g
Rapid Prediction of Extractives and Polyphenolic Contents in Pinus caribaea B... (Waqas Tariq)
The potential of near infrared reflectance spectroscopy (NIRS) for rapidly and accurately determining the extractives and polyphenol contents in Pinus caribaea bark extracts was assessed. Pinus caribaea bark samples were obtained from 110 trees in plantation stands at different locations in Ghana and were then scanned by NIRS. Their extractives and polyphenol content reference values were obtained by the TAPPI T204 om-88 and Folin-Ciocalteu methods respectively. These reference values were regressed against different spectral transformations using partial least squares (PLS) regression. A first derivative transformation of the raw spectral data resulted in coefficients of determination (r2) in the external validation of 0.91 and 0.97 for extractives content and polyphenol content respectively. The calibration samples covered a wide range of extractives content from 34 – 45% and polyphenolic content from 16 – 23.5%. The standard deviation to root mean square error of cross validation ratio (SD/RMSECV), root mean square error of calibration to standard deviation ratio (RMSEC/SD), RMSECV/RMSEC and r2 for both extractives and polyphenol models were indicative of good prediction equations. The predicted values were thus highly correlated with the time-consuming wet chemical measured values of extractives content and polyphenol content. The use of NIRS for the determination of the extractives and polyphenol contents in Pinus caribaea bark thus provides the advantages of saving time and cost of analysis.
The document describes the development and implementation of an automatic track counting system for CR-39 solid state nuclear track detectors. The system uses a MATLAB software program to automatically count tracks from microscope images of etched CR-39 detectors. CR-39 detectors were exposed to different radiation levels and counted manually and automatically. Results found the automatic counts had up to 30% difference from manual counts, with the new digital microscope providing more accurate automatic counts than the traditional microscope. The system takes less than one minute to count tracks per detector. Future work aims to improve counting of overlapping tracks and develop an automatic thresholding algorithm.
techniques for detecting nanoparticles in wastewaterAlekhya Golla
techniques for seperating and analysing the metal containing nanoparticles in wastewater. these techniques are helpful in recycling process of industrial effluents containing nanowastes.
Similar to NEAR-INFRARED SPECTROSCOPIC MODELING OPTIMIZATION FOR QUANTITATIVE DETERMINATION OF SUGAR BRIX IN SUGARCANE INITIAL-PRESSURE JUICE (20)
This document summarizes research on metal-on-metal hip implants. It discusses how metal-on-metal hip implants were developed over 30 years ago and have been refined through advances in manufacturing and materials science. It reviews various modeling techniques used to analyze the performance of metal-on-metal hip implants, including finite element analysis, numerical solutions of Reynolds' equation, and multi-grid methods. The document finds that while metal-on-metal implants offer strengths like strength and low wear rates, more research is still needed to fully understand factors like lubrication, materials used, geometry, and input parameters in order to minimize failures and revision surgeries.
Background Hospital contributes significantly tangible and intangible resources on a concurred plan by the scheduling of surgery on the OT list. Postponement decreases efficiency by declining throughput leads to wastage of resources hence burden to the nation. Patients and their family face economic and emotional implication due to the postponement. Postponement rate being a quality indicator controls check mechanism could be developed from the results. Postponement of elective scheduled operations results in inefficient use of the operating room (OR) time on the day of surgery. Inconvenience to patients and families are also caused by postponements. Moreover, the day of surgery (DOS) postponement creates logistic and financial burden associated with extended hospital stay and repetitions of pre-operative preparations to an extent of repetition of investigations in some cases causing escalated costs, wastage of time and reduced income. Methodology A cross-sectional study was done in the operation theaters of a tertiary care hospital in which total ten operation theaters of General Surgery Data of scheduled, performed and postponed surgeries was collected from all the operation theater with effect from March 1st to September 30th, 2018. A questionnaire was developed to find out the reasons for the postponement for all hospital’s stakeholders (surgeons, Anesthetist, Nursing Officer) and they were further evaluated time series analysis of scheduling of Operation Theater for moving average technique. Results Total 958 surgeries were scheduled and 772 surgeries performed were and 186 surgeries were postponed with a postponement rate of 19.42% in the cardiac surgery department during the study period. Month-wise postponement Rate exponential smoothing of time series data shows the dynamic of operating suits. To test throughput Postponement rate was plotted the postponed surgeries and on regression analysis is in a perfect linear relationship.
Introduction: Postponement of elective scheduled operations results in inefficient use of operating room (OR) time on the day of surgery. Inconvenience to patients and families also caused by postponements. Moreover, day of surgery (DOS) postponement creates logistic and financial burden associated with extended hospital stay and repetitions of pre-operative preparations to an extend of repetition of investigations in some cases causing escalated costs, wastage of time and reduced income. Methodology: A cross sectional study was done in the operation theaters of a tertiary care hospital in which total ten operation theaters of General Surgery Data of scheduled, performed and postponed surgeries was collected from all the operation theater with effect from march 1st to September 30th 2018. A questionnaire was developed to find out the reasons for the postponement for all hospital’s stakeholders (Surgeons, Anesthetist, Nursing officer) and they were further evaluated Time series analysis of scheduling of Operation Theater for Moving average Technique. Results: total 2,466 surgeries were scheduled and 1,980 surgeries were performed and 486 surgeries were postponed in the general surgery department during the study period. Month wise postponement forecast was in accordance with the performed surgeries and on regression analysis postponed surgeries were in perfect linear relationship with the postponement Rate.
In the present paper the experimental study of
Nanotechnology involves high cost for Lab set-up and the
experimentation processes were also slow. Attempt has also
been made to discuss the contributions towards the societal
change in the present convergence of Nano-systems and
information technologies. one cannot rely on experimental
nanotechnology alone. As such, the Computer- simulations and
modeling are one of the foundations of computational
nanotechnology. The computer modeling and simulations
were also referred as computational experimentations. The
accuracy of such Computational nano-technology based
experiment generally depends on the accuracy of the following
things: Intermolecular interaction, Numerical models and
Simulation schemes used. The essence of nanotechnology is
therefore size and control because of the diversity of
applications the plural term nanotechnology is preferred by
some nevertheless they all share the common feature of control
at the nanometer scale the latter focusing on the observation
and study of phenomena at the nanometer scale. In this paper,
a brief study of Computer-Simulation techniques as well as
some Experimental result
Solar cell absorber Kesterite- type Cu2ZnSnS4 (CZTS) thin films have been prepared by Chemical Bath Deposition (CBD). UV–vis absorption spectra measurement indicated that the band gap of as-synthesized CZTS was about1.68 eV, which was near the optimum value for photovoltaic solar conversion in a single-band-gap device. The polycrystalline CZTS thin films with kieserite crystal structure have been obtained by XRD. The average of crystalline size of CZTS is 27 nm
Multilevel inverters play a crucial part in the
areas of high and medium voltage applications. Among the three
main multilevel inverters used, the capacitor clamped multilevel
inverter(CCMLI) has advantage with respect to voltage
redundancies. This work proposes a switching pattern to improve
the performance of chosen H-bridge type CCMLI over
conventional CCMLI. The PWM technique used in this work is
Phase Opposition Disposition PWM(PODPWM). The
performance of proposed H-bridge type CCMLI is verified
through MATLAB-Simulink based simulation. It has been
observed that the THD is low in chosen CCMLI compared to
conventional CCMLI.
This document summarizes a research paper that designed a digital matched filter (DMF) to compress a binary phase code modulation (BPCM) signal encoded with a 13-chip Barker code. The DMF was implemented on an FPGA using a digital convolution algorithm in the time domain. The DMF design included a direct digital frequency synthesizer to generate the BPCM signal, a digital noise generator to add noise, and digital delay lines and shift registers to perform the convolution. Test results showed the input and output signals on an oscilloscope for different signal-to-noise ratio levels. The DMF achieved a processing gain of 11.14 dB.
Flooding is one of the most devastating natural
disasters in Nigeria. The impact of flooding on human activities
cannot be overemphasized. It can threaten human lives, their
property, environment and the economy. Different techniques
exist to manage and analyze the impact of flooding. Some of these
techniques have not been effective in management of flood
disaster. Remote sensing technique presents itself as an effective
and efficient means of managing flood disaster. In this study,
SPOT-10 image was used to perform land cover/ land use
classification of the study area. Advanced Space borne Thermal
Emission and Reflection Radiometer (ASTER) image of 2010 was
used to generate the Digital Elevation Model (DEM). The image
focal statistics were generated using the Spatial Analyst/
Neighborhood/Focal Statistics Tool in ArcMap. The contour map
was produced using the Spatial Analyst/ Surface/ Contour Tools.
The DEM generated from the focal statistics was reclassified into
different risk levels based on variation of elevation values. The
depression in the DEM was filled and used to create the flow
direction map. The flow accumulation map was produced using
the flow direction data as input image. The stream network and
watershed were equally generated and the stream vectorized. The
reclassified DEM, stream network and vectorized land cover
classes were integrated and used to analyze the impact of flood on
the classes. The result shows that 27.86% of the area studied will
be affected at very high risk flood level, 35.63% at high risk,
17.90% at moderate risk, 10.72% at low risk, and 7.89% at no
risk flood level. Built up area class will be mostly affected at very
high risk flood level while farmland will be affected at high risk
flood level. Oshoro, Imhekpeme, and Weppa communities will be
affected at very high risk flood inundation while Ivighe, Uneme,
Igoide and Iviari communities will be at risk at high risk flood
inundation level. It is recommended among others that buildings
that fall within the “Very High Risk” area should be identified
and occupants possibly relocated to other areas such as the “No
Risk” area.
Without water, humans cannot live. Since time began,
we have lived by the water and vast tracts of waterless land have
been abandoned as it is too difficult to inhabit. At any given
moment, the earth’s atmosphere contains 4,000 cubic miles of
water, which is just 0.000012% of the 344 million cubic miles of
water on earth. Nature maintains this ratio via evaporation and
condensation, irrespective of the activities of man.
There is a certain need for an alternative to solve the water
scarcity. Obtaining water from the atmosphere is nothing new -
since the beginning of time, nature’s continuous hydrologic cycle
of evaporation and condensation in the form of rain or snow has
been the sole source and means of regenerating wholesome water
for all forms of life on earth.
An effective method to generate water is by the separation of
moisture present in air by condensation. In this study, the water
present in air is condensed on the surface of a container and then
collected in an external jacket provided on the container.
Insulations are provided to optimize the inner temperature of the
container.
The method is although uncommon but has certain advantages
which make it a success. The process is economical and does not
require a lot of utilities. It also helps in further reducing the
carbon footprint.
In every moment of functioning the Li-Ion
battery must provide the power required by the user, to have a
long operating life and to and to provide high reliability in
operation. The methods for analysis and testing batteries are
ensuring that all these conditions imposed to the batteries are
met by being tested depending on their intended use.
The success rate of real estate project is
decreasing as there is large scale of project and participation of
entities. It is necessary to study the risk factors involved in the
project. This paper focused on types of risks involved in the
project, risk factors, risk management tools & techniques.
Identification of risk of the project in terms of the total cost of the
project has been divided under Technical, Financial, Sociopolitical
and Statutory cost centers. Large real estate projects
have to tackle the following issues: land acquisition, skilledlabour
shortage, non-availability of skilled project managers, and
mechanization of the construction process to cater to the growing
demands. Non- availability of supporting infrastructure, political
issues like instability of the government leading to regulatory
issues, social issues, marketing forms an important part in these
projects as this is a onetime investment and the purchase cycle is
long , long development period makes the same project be at
different points in the real estate value cycle.
This document reviews the use of copper slag as a partial replacement for fine aggregate in concrete. It summarizes 12 previous studies that investigated replacing sand with copper slag at various percentages. The key findings of the studies are:
- Compressive, tensile, and flexural strength of concrete generally increase when replacing up to 40% of fine aggregate with copper slag.
- Workability of concrete mixes also tends to increase with copper slag replacement due to its physical properties.
- Copper slag concrete shows improved resistance to sulfate attack compared to traditional concrete.
- Maximum strength gains are observed around 20-40% fine aggregate replacement with copper slag. Beyond 50% replacement, strengths start to decrease.
- Copper slag
- Security is a concept similar to being cautious
or alert against any danger. Network security is the condition of
being protected against any danger or loss. Thus safety plays a
important role in bank transactions where disclosure of any data
results in big loss. We can define networking as the combination
of two or more computers for the purpose of resource sharing.
Resources here include files, database, emails etc. It is the
protection of these resources from unauthorized users that
brought the development of network security. It is a measure
incorporated to protect data during their transmission and also
to ensure the transmitted is protected and authentic.
Security of online bank transactions here has been
improved by increasing the number of bits while establishing the
SSL connection as well as in RSA asymmetric key encryption
along with SHA1 used for digital signature to authenticate the
user
Background: Septoplasty is a common surgical
procedure performed by otolaryngologists for the correction of
deviated nasal septum. This surgery may be associated with
numerous complications. To minimize these complications,
otolaryngologists frequently pack both nasal cavities with
different types of nasal packing. Despite all its advantages,
nasal packing is also associated with some disadvantages. To
avoid these issues, many surgeons use suturing techniques to
obviate the need for packing after surgery.
Objective: To determine the efficacy and safety of trans-septal
suture technique in preventing complications and decreasing
morbidity after septoplasty in comparison with nasal packing.
Patients and methods: Prospective comparative study. This
study was conducted in the department of Otolaryngology -
Head and Neck Surgery, Rizgary Teaching Hospital - Erbil,
from the 6th of May 2014 to the 30th of November 2014.
A total of 60 patients aged 18-45 years, undergoing septoplasty,
were included in the study. Before surgery, patients were
randomly divided into two equal groups. Group (A) with transseptal
suture technique was compared with group (B) in which
nasal packing with Merocel was done. Postoperative morbidity
in terms of pain, bleeding, postnasal drip, sleep disturbance,
dysphagia, headache and epiphora along with postoperative
complications including septal hematoma, septal perforation,
crustation and synechiae formation were assessed over a follow
up period of four weeks.
Results: Out of 60 patients, 37 patients were males (61.7%)
and 23 patients were females (38.3%). Patients with nasal
packing had significantly more postoperative pain (P<0.05)><0.05). There was no significant difference between
the two groups with respect to nasal bleeding, septal
hematoma, septal perforation, crustation and synechiae
formation.
Conclusion: Septoplasty can be safely performed using transseptal
suturing technique without nasal packing.
This document summarizes a study that evaluated the quality of drainage water in Al-Shamiya al-sharqi drain in Diwaniya city, Iraq for use in irrigation. 10 water samples were collected from locations along the drain and analyzed for various chemical parameters. An Irrigation Water Quality Index (IWQI) was used to assess the water quality, taking into account parameters like EC, sodium, chloride, bicarbonate and SAR that most affect water quality for irrigation. The IWQI was then integrated with a GIS system to map the water quality. The results found that 52% of the drainage water fell in the "Low restriction" category, 47% was "Moderate restriction" and 1% was
The cable-hoisting method and rail cable-lifting
method are widely used in the construction of suspension bridge.
This paper takes a suspension bridge in Hunan as an example,
and expounds the two construction methods, and analyzes their
respective merits and disadvantages.
Baylis-Hillman reaction has been achieved on
different organic motifs but with completion times of three to
six days. Micellar medium of CTAB in water along with the
organic base DABCO has been used to effect the BaylisHillman
reaction on a steroidal nucleus of Withaferin-A for the
first time with different aromatic aldehydes within a day to
synthesize a library of BH adducts (W1a –W14a) and (W1bW14b)
as a mixture of two isomers and W15 as a single
compound. The isomers were separated on column and the
major components were chosen for bio-evaluation. Cytotoxic
activity of the synthesized compounds was screened against a
panel of four cancer cell lines Lung A-549, Breast MCF-7,
Colon HCT-116 and Leukemia THP-1 along with 5-florouracil
and Mitomycin-C as references. All the compounds exhibited
promising activity against screened cell lines and were found to
possess enhaunced activity than parent compound. BH adducts
with aromatic systems having methoxy and nitro groups were
found to be more active.
This paper presents the details on the
experimental investigation carried out to get the desired fresh
properties of the SCC. Tests were performed on various mixtures
to obtain the required SCC. In the present research work we
have replaced 15% of cement with class F fly ash. By varying the
quantity of water and sand the mortar mix was prepared. Later
varying percentage of coarse aggregate was added to the mortar
to obtain the desired SCC.
The batteries used in electric and hybrid vehicles
consists of several cells with voltages between 3.6V battery and
4.2 V in series or parallel combinations of configurations for
obtaining the necessary available voltages in the operation of a
hybrid electric vehicle. How malfunction of a single cell affects
the behavior of the entire battery pack, BMS main function is to
protect individual cells against over-discharge, overload or
overheating. This is done by correct balancing of the cells. In
addition BMS estimates the battery charge status
This project aims at using (PD-MCPWM) Phase
disposition multi carrier pulse width modulation technique to
reduce leakage current in a transformerless cascaded multilevel
inverter for PV systems. Advantages of transformerless PV
inverter topology is as follows, simple structure, low weight and
provides higher efficiency , but however this topology provides a
path for the leakage current to flow through the parasitic
capacitance formed between the PV module and the ground.
Modulation technique reduces leakage current with an added
advantage without adding any extra components.
More from International Journal of Technical Research & Application (20)
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer’s life and facilitate a rapid transition from concept to production-ready applications.He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
How to Get CNIC Information System with Paksim Ga.pptxdanishmna97
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc
- Practical examples and best practices to implement right away
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor...Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slackshyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Infrastructure Challenges in Scaling RAG with Custom AI modelsZilliz
Building Retrieval-Augmented Generation (RAG) systems with open-source and custom AI models is a complex task. This talk explores the challenges in productionizing RAG systems, including retrieval performance, response synthesis, and evaluation. We’ll discuss how to leverage open-source models like text embeddings, language models, and custom fine-tuned models to enhance RAG performance. Additionally, we’ll cover how BentoML can help orchestrate and scale these AI components efficiently, ensuring seamless deployment and management of RAG systems in the cloud.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
Climate impact / sustainability of software testing discussed on the talk. ICT and testing must carry their part of global responsibility to help with the climat warming. We can minimize the carbon footprint but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be added with sustainability, and then measured continuously. Test environments can be used less, and in smaller scale and on demand. Test techniques can be used in optimizing or minimizing number of tests. Test automation can be used to speed up testing.
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceIndexBug
Imagine a world where machines not only perform tasks but also learn, adapt, and make decisions. This is the promise of Artificial Intelligence (AI), a technology that's not just enhancing our lives but revolutionizing entire industries.
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
UiPath Test Automation using UiPath Test Suite series, part 5DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 5. In this session, we will cover CI/CD with devops.
Topics covered:
CI/CD with in UiPath
End-to-end overview of CI/CD pipeline with Azure devops
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
TrustArc Webinar - 2024 Global Privacy SurveyTrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program
20240605 QFM017 Machine Intelligence Reading List May 2024
NEAR-INFRARED SPECTROSCOPIC MODELING OPTIMIZATION FOR QUANTITATIVE DETERMINATION OF SUGAR BRIX IN SUGARCANE INITIAL-PRESSURE JUICE
International Journal of Technical Research and Applications e-ISSN: 2320-8163,
www.ijtra.com, Volume 2, Issue 6 (Nov-Dec 2014), PP. 69-72
NEAR-INFRARED SPECTROSCOPIC MODELING
OPTIMIZATION FOR QUANTITATIVE
DETERMINATION OF SUGAR BRIX IN
SUGARCANE INITIAL-PRESSURE JUICE
Hua-Zhou Chen*, Jiang-Bei Wen, Jie-Chao Chen, Ling-Hui Li, Ya-Juan Huo
College of Science, Guilin University of Technology, Guilin, 541004, China
*huazhouchen@163.com
Abstract— Initial-pressure juice is an important intermediate
product in the cane sugar industry, and sugar brix is a key
indicator for evaluating sugar quality. Real-time evaluation of
sugar quality requires determining the sugar brix content at
every step of the cane sugar process. Near-infrared (NIR)
spectroscopy is a simple, rapid and non-destructive technology
for the analysis of material contents. In this study, a
chemometric algorithm with parameter-combined tuning of the
Savitzky-Golay (SG) smoother and Partial Least Squares (PLS)
regression was applied to the NIR analysis of sugar brix
content in sugarcane initial-pressure juice. The combined
optimization of the SG smoother and PLS regression was
implemented, and the calibration models were optimally
established by screening the expanded set of 540 SG smoothing
modes and 1-30 latent variables (LVs). The optimized models
achieved high predictive accuracy. These results confirm that
the combined optimization of SG smoothing modes and PLS LVs
is effective for the quantitative determination of sugar brix
content in sugarcane initial-pressure juice, and that NIR
spectroscopic technology, with its chemometric algorithms, has
potential for the analysis of cane sugar intermediates.
Key Words— Initial-Pressure Juice, Sugar Brix, Near-Infrared
Spectroscopy, Partial Least Squares Regression, Savitzky-Golay
Smoother.
I. INTRODUCTION
Near-infrared (NIR) spectroscopy is a well-established
technology for analyzing both the structure of matter and
material contents. It offers simple, rapid, non-destructive and
reagent-free measurement, simultaneous multi-component
determination, etc. With the development of modern science
and computation, NIR spectroscopic analytical technology is
widely applied in many fields, such as agriculture, food, the
environment and biomedicine [1-5]. In recent years, there have
been preliminary studies on the application of NIR to the cane
sugar industry [6-8]. In the cane sugar process, initial-pressed
juice is an intermediate product of much importance, and sugar
brix is an important indicator for evaluating sugar quality.
Real-time evaluation of sugar quality requires determining the
sugar brix content in all steps of the cane sugar process.
Conventional laboratory chemical analytical methods cannot
achieve fast (or online) determination, and this has been a
long-standing problem in the cane sugar industry.
Online fast detection of cane sugar intermediates is expected
to be achievable by using NIR spectroscopic technology. NIR
spectroscopy with its chemometric methods can output
prospective detection results in just a few minutes [9-10].
Common chemometric methods include Classical Linear
Regression, Multiple Linear Regression, Principal Component
Analysis, Partial Least Squares, etc. Partial Least Squares
(PLS) is a common chemometric analytical algorithm
integrating principal component analysis and multivariate
linear regression. It can effectively eliminate spectral
collinearity by creating comprehensive latent variables. The
number of latent variables (LVs) is the key parameter of PLS
for reducing spectral noise and extracting responsive
information [11-14]. Model predictive performance degrades if
the LV is outside the suitable range, either too small or too
large. Thus a reasonable LV should be selected by treating it as
a tunable parameter.
NIR spectra reflect the comprehensive information of all
chemical components in the analytes and, inevitably, also
include noise generated in the detection process. Therefore it is
necessary to study chemometric data-pretreatment methods to
reduce spectral noise [15-16]. The Savitzky-Golay (SG)
smoother is a well-known and widely used method for spectral
data pretreatment [17-19]. Its major steps are smoothing and
differentiation, in which the smoothing mode is quite important
for model improvement. There are many smoothing modes,
determined by three parameters: the Order of Differential (OD),
the Degree of Polynomial (DP) and the Number of Points (NP);
a specific smoothing mode corresponds to a calculation
equation with its own specific coefficients. Thus it is necessary
to select a suitable SG smoothing mode, and the best way to
find it is to screen the three parameters in combination with the
optimization of the PLS latent variables.
The aim of this research is to quantitatively determine the
sugar brix content in sugarcane initial-pressure juice by using
NIR spectral responses. The Savitzky-Golay smoother is
employed as the data-pretreatment method and PLS is utilized
as the algorithm for establishing calibration models. Model
improvement is achieved by the combined optimization of the
PLS LV and the parameters of the SG smoother. For a
wide-range optimization, we expand the tuning range of the
three SG smoothing parameters, and then establish NIR
calibration models with SG smoothing pretreatment and PLS
regression. The reasonable smoothing modes and the optimal
PLS LV are determined simultaneously according to the model
predictive results, in a combined computational algorithm
platform. The selected pretreatment and modeling methods are
examined against the prediction sample set, to assess their
potential for NIR modeling enhancement.
II. EXPERIMENT AND METHODS
A. Materials and Instruments
Eighty-six samples of sugarcane initial-pressure juice were
collected. The sugar brix content of each sample was measured
using the traditional chemical methods and the measured
values were used as the modeling reference values for NIR
quantitative analysis. The sugar brix content ranges from 19.0
to 22.2 (%Bx).
We recorded the NIR absorption spectra of the initial-pressure
juice samples using a Foss Rapid Liquid grating spectrometer
with a 1-mm pathlength quartz cuvette. The scanning range was
set to 800-2500 nm. Because rotating the cuvette cell can
effectively reduce sample unevenness, and multiple scans can
effectively reduce the influence of background noise, the
spectra were measured while the cuvette cell was rotating. The
temperature was controlled at 25±1 °C and the relative
humidity at 46±1% RH throughout the spectral scanning
process.
B. Partitioning of Calibration and Prediction Samples
NIR spectroscopic analysis requires partitioning the samples
into a calibration set and a prediction set. Calibration samples
are used for model establishment and prediction samples for
model evaluation. A suitable partition leads to good modeling
results. The method of sample partitioning based on joint x-y
distances (SPXY) is commonly used in the spectroscopic field
[20-21]. The SPXY process applies the Kennard-Stone (KS)
algorithm to both the spectral data and the chemical data. The
classic KS algorithm aims to select a representative subset from
the sample pool (N samples). In order to ensure a uniform
distribution of such a subset along the x matrix (spectral
responses), KS follows a stepwise procedure in which new
selections are taken in regions of the space far from the samples
already selected.
For this purpose, the algorithm employs the Euclidean
distances dx(j, k) between the x-vectors of each pair (j, k) of
samples calculated as
d_x(j,k) = \sqrt{\sum_{p=1}^{P} [x_j(p) - x_k(p)]^2}, \quad j,k \in [1,N]. (1)
For spectral data, xj(p) and xk(p) are the instrumental
responses at the p-th wavelength for samples j and k,
respectively. P denotes the number of wavelengths in the
spectra.
The algorithm selects the sample that exhibits the largest
minimum distance with respect to the samples already selected.
SPXY augments the distance defined in Eq. (1) with a distance
in the dependent variable y (the content of the target
component in each sample). Such a distance dy(j, k) can be
calculated for each pair of samples j and k as
d_y(j,k) = \sqrt{(y_j - y_k)^2} = |y_j - y_k|, \quad j,k \in [1,N]. (2)
In order to assign equal importance to the distribution of the
samples in the x matrix and y vector, distances dx(j, k) and dy(j,
k) are divided by their maximum values in the data set. In this
manner, a normalized xy distance is calculated as
d_{xy}(j,k) = \frac{d_x(j,k)}{\max_{j,k \in [1,N]} d_x(j,k)} + \frac{d_y(j,k)}{\max_{j,k \in [1,N]} d_y(j,k)}, \quad j,k \in [1,N]. (3)
With a stepwise selection procedure similar to the KS
algorithm, SPXY can be applied with dxy(j, k) instead of dx(j, k)
alone.
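The stepwise SPXY selection above can be sketched in a few lines of NumPy. This is an illustrative implementation, not the authors' code; the sample matrix, brix vector and calibration-set size below are hypothetical stand-ins for the juice data.

```python
import numpy as np

def spxy_split(X, y, n_cal):
    """Split samples by the SPXY criterion: normalized x- and y-distances
    (Eqs. 1-3) summed, then Kennard-Stone max-min stepwise selection."""
    N = X.shape[0]
    dx = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2))  # Eq. (1)
    dy = np.abs(y[:, None] - y[None, :])                              # Eq. (2)
    d = dx / dx.max() + dy / dy.max()                                 # Eq. (3)
    i, j = np.unravel_index(np.argmax(d), d.shape)  # start: most distant pair
    selected = [int(i), int(j)]
    while len(selected) < n_cal:
        remaining = [k for k in range(N) if k not in selected]
        # add the sample farthest (in minimum distance) from the selected set
        dmin = [d[k, selected].min() for k in remaining]
        selected.append(remaining[int(np.argmax(dmin))])
    calibration = sorted(selected)
    prediction = [k for k in range(N) if k not in selected]
    return calibration, prediction

# Hypothetical stand-ins for the 86 juice spectra and their brix values
rng = np.random.default_rng(0)
X = rng.normal(size=(86, 50))
y = rng.uniform(19.0, 22.2, size=86)
cal, pred = spxy_split(X, y, 56)
print(len(cal), len(pred))  # 56 30
```

The two samples with the largest normalized x-y distance seed the calibration set; the remaining samples then go to the prediction set.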
C. The Extended Savitzky-Golay Smoother
SG smoother is a famous and widely-used pretreatment
method to eliminate spectral noise. SG smoothing parameters
include Order of Differential (OD), Degree of Polynomial (DP)
and Number of Points (NP). For convenience, we denoted that
the original spectral smoothing is 0th order differential. And
NP is usually an odd number, denoted as NP=2m+1. It means
that 2m+1 consecutive spectral data as a window, the spectral
data in the window were fitted by using polynomial function
whose independent variable was the serial number i of the
spectral data, (i=0, ±1, ±2… ±m), and the polynomial
coefficients were determined. Then the smoothing value and
each order derivative value at the center point (i=0) of the
window were calculated by using the determined polynomial
coefficients. By moving the window in the whole spectral
collecting region, the SG smoothed spectra and SG derivative
spectra were obtained.
According to the above method, the smoothing value and
each order of derivative value at the center point of the window
can be expressed as a linear combination of the measured
spectral data in the window. The coefficients of this linear
combination (i.e. the smoothing coefficients) are uniquely
determined by the number of smoothing points (i.e. the number
of points in the window), the degree of the polynomial, and the
order of the derivative. In Savitzky and Golay's paper [17], the
settings were OD = 0, 1, 2, 3, 4, 5, DP = 2, 3, 4, 5, and
NP = 5, 7, ..., 25 (odd numbers). Different combinations of
parameters correspond to different smoothing modes, and
further to different sets of smoothing coefficients. There were a
total of 117 smoothing modes (i.e. 117 sets of smoothing
coefficients). The appropriate smoothing mode can be selected
according to the analyte.
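The linear-combination view above can be checked numerically. A minimal sketch, assuming the standard least-squares construction; the mode OD=0, DP=2, NP=5 is chosen only because its coefficients are tabulated in Savitzky and Golay's paper:

```python
import numpy as np

# Centre-point smoothing coefficients for OD=0, DP=2, NP=5 (m=2):
# fit a quadratic to the five window points and read off its value at i=0.
m, dp = 2, 2
i = np.arange(-m, m + 1)                   # i = -2, -1, 0, 1, 2
A = np.vander(i, dp + 1, increasing=True)  # columns: 1, i, i^2
coeffs = np.linalg.pinv(A)[0]              # row 0 = fitted value at the centre
# These are the classic SG smoothing coefficients (-3, 12, 17, 12, -3)/35
print(np.round(coeffs * 35, 6))
```

For derivative modes, row OD of the pseudoinverse, multiplied by OD factorial, gives the OD-th derivative coefficients at the window centre.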
However, for the spectroscopic analysis of cane sugar
intermediates (e.g. initial-pressed juice and clarified juice), if
the interval between spectral points is very small and the
number of points is small, then the window is narrow and the
information in the window is insufficient for smoothing. In this
case, it is difficult to obtain a satisfactory smoothing effect.
Hence, it is necessary to expand the range of NP. In this paper,
NP was expanded to 5, 7, ..., 81 (odd) and DP was expanded to
2, 3, 4, 5, 6, and the corresponding sets of new smoothing
coefficients were calculated, so that a total of 540 smoothing
modes were obtained, including the original 117 modes, giving
an SG smoothing group with a wider application scope.
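A sketch of applying one expanded smoothing mode, and of enumerating a mode grid, with SciPy's `savgol_filter`. The synthetic spectrum and the validity constraints used in the enumeration are assumptions; the paper does not state its exact rules, so this naive grid need not reproduce its count of 540 modes.

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic stand-in for a measured NIR spectrum (absorbance vs wavelength)
wavelength = np.linspace(800, 2500, 850)
rng = np.random.default_rng(1)
spectrum = (np.exp(-((wavelength - 1460) / 80) ** 2)
            + 0.02 * rng.normal(size=wavelength.size))

# One SG smoothing mode: NP=55 window, DP=4 polynomial, OD=2 derivative
smoothed = savgol_filter(spectrum, window_length=55, polyorder=4, deriv=2)

# Enumerate a grid NP = 5, 7, ..., 81, DP = 2..6, OD = 0..5, keeping only
# combinations where the derivative order does not exceed the polynomial
# degree and the degree is below the window length.
modes = [(od, dp, w) for w in range(5, 82, 2)
         for dp in range(2, 7) if dp < w
         for od in range(6) if od <= dp]
print(smoothed.shape, len(modes))
```

Each tuple `(od, dp, w)` selects one set of smoothing coefficients, so looping over `modes` screens the whole pretreatment family.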
D. Modeling Indicators
The model evaluation indicators mainly include the correlation
coefficient of prediction (Rp), the root mean squared error of
prediction (RMSEP) and the relative RMSEP (RRMSEP),
which are calculated by Eqs. (4-6):
R_p = \frac{\sum_{i=1}^{M} (C_{ip} - C_{mp})(C'_{ip} - C'_{mp})}{\sqrt{\sum_{i=1}^{M} (C_{ip} - C_{mp})^2 \sum_{i=1}^{M} (C'_{ip} - C'_{mp})^2}}, (4)

RMSEP = \sqrt{\frac{\sum_{i=1}^{M} (C'_{ip} - C_{ip})^2}{M - 1}}, (5)

RRMSEP = \frac{RMSEP}{C_{mp}} \times 100\%, (6)
where C'ip and Cip are the predicted value and the chemical
value of sample i in the prediction set, C'mp and Cmp are the
mean predicted value and the mean chemical value of all
samples in the prediction set, and M is the number of samples
in the prediction set.
Rp is coherent with RMSEP: usually a higher Rp corresponds
to a lower RMSEP. RRMSEP is always proportional to RMSEP.
Thus we take Rp and RMSEP as the main indicators for model
optimization.
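Eqs. (4)-(6) translate directly into code. The two short prediction vectors below are illustrative values in the paper's %Bx range, not measured data.

```python
import numpy as np

def prediction_metrics(c_pred, c_chem):
    """Rp, RMSEP and RRMSEP as defined in Eqs. (4)-(6)."""
    M = len(c_chem)
    rp = (np.sum((c_chem - c_chem.mean()) * (c_pred - c_pred.mean()))
          / np.sqrt(np.sum((c_chem - c_chem.mean()) ** 2)
                    * np.sum((c_pred - c_pred.mean()) ** 2)))   # Eq. (4)
    rmsep = np.sqrt(np.sum((c_pred - c_chem) ** 2) / (M - 1))   # Eq. (5)
    rrmsep = rmsep / c_chem.mean() * 100.0                      # Eq. (6)
    return rp, rmsep, rrmsep

# Illustrative prediction-set values in the paper's %Bx range
c_chem = np.array([19.5, 20.1, 20.8, 21.3, 22.0])
c_pred = np.array([19.7, 20.0, 20.9, 21.1, 22.2])
rp, rmsep, rrmsep = prediction_metrics(c_pred, c_chem)
print(round(rp, 3), round(rmsep, 3), round(rrmsep, 2))
```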
III. RESULTS AND DISCUSSIONS
The NIR spectra of the 86 sugarcane initial-pressure juice
samples are shown in Figure 1. The spectral responses contain
the absorption information of many hydrogen-containing
groups in the initial-pressure juice samples, from components
such as sucrose, organic acids and amino acids. As shown in
Figure 1, the NIR spectra of initial-pressure juice exhibit severe
spectral overlap, and the absorption is weak. The absorbance is
quite strong around 1460 nm and around 1940 nm because of
water molecules. In order to reduce the interference of the
water molecules, it is necessary to apply the SG smoother for
data pretreatment in spectral modeling.
The partitioning of calibration and prediction samples has to
be completed before model establishment. Using the SPXY
method, the 86 initial-pressure juice samples were divided into
56 calibration samples and 30 prediction samples. The mean
value and standard deviation of the measured sugar brix
content for the calibration/prediction samples are shown in
Table 1.
The computational algorithm platform for establishing the
NIR quantitative analytical models was built by combining the
540 SG smoothing modes with PLS LV tuning and
optimization, where the PLS LV was varied from 1 to 30. The
optimized model parameters were selected according to the
model predictive results.
The optimal models corresponding to each order of
differential, with their parameters and predictive results, are
shown in Table 2. The non-smoothed full-range PLS modeling
results are also listed in Table 2 for comparison. According to
the maximum Rp (or minimum RMSEP), the optimal NIR
model achieved Rp = 0.954, RMSEP = 0.762 %Bx and
RRMSEP = 3.7%, with the best smoothing mode of OD = 2,
DP = 4 (or 5) and NP = 55, and with the optimal LV = 12.
It can be concluded from Table 2 that the model predictions
are highly accurate at each smoothing order of differential for
the NIR data of initial-pressure juice when the optimization of
the PLS models is integrated with the SG smoother. The
modeling results of the PLS models with SG smoothing are
significantly superior to those without SG smoothing. The
globally optimal SG smoothing mode had a 2nd order of
differential and a 4th (or 5th) degree of polynomial, and the
optimal number of points was notably larger than 25. These
results indicate that (1) the SPXY sample-partitioning method
leads to well-performing calibration models; (2) the DP and NP
of the SG smoother and the LV of PLS vary with the OD;
(3) it is necessary to expand the NP of the SG smoother beyond
25; and (4) the chemometric parameters for spectroscopic
analysis are somewhat similar across the sugarcane
initial-pressure juice models.
To gain insight into the pretreatment effect of the SG
smoother, we plot the RMSEP corresponding to each order of
differential and each number of points, optimized over the
different DPs of the SG smoother and the different LVs of PLS
(see Figure 2). Figure 2 demonstrates that an NP below 25
cannot reach the minimum RMSEP, and that expanding NP
yields much better results. The influence of the PLS LV on the
modeling effect was also investigated. Figure 3 shows the
RMSEP values corresponding to the varied PLS LV, optimized
over the SG smoothing modes with a 2nd order of differential
and a 4th or 5th degree of polynomial. The figure confirms that
the optimized LVs were larger than 10.
Table 2. The optimal models corresponding to each order of
differential, with their parameters and predictive results

OD           | DP   | NP | LV | Rp    | RMSEP (%Bx) | RRMSEP
Non-smoothed | -    | -  | 11 | 0.915 | 1.193       | 5.8%
0th order    | 4, 5 | 53 | 9  | 0.945 | 0.932       | 4.6%
1st order    | 3, 4 | 77 | 10 | 0.949 | 0.832       | 4.1%
2nd order    | 4, 5 | 55 | 12 | 0.954 | 0.762       | 3.7%
3rd order    | 3, 4 | 77 | 11 | 0.953 | 0.776       | 3.8%
4th order    | 2, 3 | 71 | 9  | 0.944 | 0.917       | 4.5%
5th order    | 5, 6 | 39 | 11 | 0.900 | 1.212       | 5.9%
Table 1. Mean value and standard deviation of the measured
sugar brix content for calibration/prediction samples

(unit: %Bx) | Mean value | Standard deviation
Calibration | 20.80      | 0.63
Prediction  | 20.39      | 0.57
Figure 2. RMSEP corresponding to each order of differential
and each number of points for initial-pressure juice (optimized
from different DP of SG smoother and different LV of PLS)
Figure 1. NIR spectra of 86 initial-pressure juice samples
Figure 3. RMSEP values corresponding to the varied LV of PLS
for initial-pressure juice (optimized by SG smoothing mode, with
2nd order of differential and 4th or 5th degree of polynomial)
Figure 4. The comparative relationship between the NIR predicted
values and the chemical measured values of prediction samples
IV. CONCLUSIONS
The chemometric algorithm with combined parameter tuning
of the SG smoother and PLS regression was utilized for NIR
spectroscopic analysis of the sugar brix content in sugarcane
initial-pressure juice, to establish and screen for the optimized
calibration model. The SPXY method proved effective for
partitioning the calibration and prediction samples. The
combined optimization of the SG smoother and PLS regression
was implemented, and the calibration models were optimally
established by screening the expanded set of 540 SG smoothing
modes and 1-30 LVs. The combined-optimized calibration
models have high predictive accuracy. These results confirm
that the expansion of the SG parameters is necessary, and that
the combined optimization of SG smoothing modes and PLS
LVs is an important method for the quantitative determination
of the sugar brix content in sugarcane initial-pressure juice. Our
conclusions demonstrate that NIR spectroscopic technology
with its chemometric algorithms has potential in the analysis of
cane sugar intermediates. This rapid, non-destructive and
reagent-free technology has practical value and is promising
for online detection in the cane sugar industry.
REFERENCES
[1] D.A. Burns and E.W. Ciurczak, Handbook of Near-Infrared
Analysis, 3rd ed., CRC Press, Boca Raton, 2006.
[2] W.Z. Lu, Modern near infrared spectroscopy analytical
technology, 2nd ed., China petrochemical press, Beijing, 2007.
[3] D.I. Givens and E.R. Deaville, “The current and future role of
near infrared reflectance spectroscopy in animal nutrition: a
review,” Aust. J. Agric. Res., 1999, vol. 50, pp. 1131-1145.
[4] A. Alishahi, H. Farahmand, N. Prieto and D. Cozzolino,
“Identification of transgenic foods using NIR spectroscopy: a
review,” Spectrochim Acta A., 2010, vol. 75, pp. 1-7.
[5] A. Saleem, C. Canal, D.A. Hutchins and et al., “Techniques for
quantifying chemicals concealed behind clothing using near
infrared spectroscopy,” Anal. Methods. 2011, vol. 3, pp. 2298-
2306.
[6] N. Berding and G.A. Brotherton, "Near Infrared Reflectance
Spectroscopy for Analysis of Sugarcane from Clonal Evaluation
Trials: I. Fibrated Cane," Crop Science, 1991, vol. 31, pp. 1017-
1023.
[7] N. Berding and G.A. Brotherton, "Near Infrared Reflectance
Spectroscopy for Analysis of Sugarcane from Clonal Evaluation
Trials: II. Expressed Juice," Crop Science, 1991, vol. 31, pp.
1024-1028.
[8] J.H. Meyer, “Near infrared spectroscopy (NIRS) research in the
South African sugar industry,” International Sugar Journal (Cane
Sugar Ed.), 1998, vol. 100, pp. 279-286.
[9] P. Williams and K. Norris, Near-infrared Technology in the
Agricultural and Food Industries (2nd ed.). the American
Association of Cereal Chemists, Inc. St. Paul, Minnesota, 2001.
[10] X.L. Chu, Y.P. Xu and W.Z. Lu, “Research and Application
Progress of Chemometrics Methods in Near Infrared
Spectroscopic Analysis,” Chinese Journal of Analytical
Chemistry, 2008, vol. 36, pp. 702-709.
[11] M. Daszykowski, M.S. Wrobel, H. Czarnik-Matusewicz and B.
Walczak, “Near-infrared reflectance spectroscopy and
multivariate calibration techniques applied to modelling the
crude protein, fibre and fat content in rapeseed meal,” Analyst,
2008, vol. 113, pp. 1523-1531.
[12] B. Igne, J.B. Reeves, G. McCarty and et al., “Evaluation of
spectral pretreatments, partial least squares, least squares support
vector machines and locally weighted regression for quantitative
spectroscopic analysis of soils,” J. Near Infrared Spec. 2010, vol.
18, pp. 167-176.
[13] S. Kasemsumran, Y.P. Du, K. Maruo and et al., “Improvement of
partial least squares models for in vitro and in vivo glucose
quantifications by using near-infrared spectroscopy and searching
combination moving window partial least squares,” Chemometr.
Intell. Lab., 2006, vol. 82, pp. 97-103.
[14] H.Z. Chen, G.Q. Tang, Q.Q. Song and W. Ai, “Combination of
Modified Optical Path Length Estimation and Correction and
Moving Window Partial Least Squares to Waveband Selection
for the Fourier Transform Near-Infrared (FT-NIR) Determination
of Pectin in Shaddock Peel,” Anal. Lett. 2013, vol. 46, pp. 2060-
2074.
[15] X.L. Chu, H.F. Yuan and W.Z. Lu, “Progress and Application of
Spectral Data Pretreatment and Wavelength Selection Methods in
NIR Analytical Technique,” Progress in Chemistry, 2004, vol. 16,
pp. 528-542.
[16] K. Wang, G.Y. Chi, R. Lau and T. Chen, “Multivariate
Calibration of Near Infrared Spectroscopy in the Presence of
Light Scattering Effect: A Comparative Study,” Anal. Lett. 2011,
vol. 4, pp. 824-836.
[17] A. Savitzky and M.J.E. Golay, “Smoothing and Differentiation of
Data by Simplified Least Squares Procedures,” Analytical
Chemistry, 1964, vol. 36, pp. 1627-1637.
[18] Å. Rinnan, F. van den Berg and S.B. Engelsen, "Review of the
most common pre-processing techniques for near-infrared
spectra," TrAC Trends in Analytical Chemistry, 2009, vol. 28, pp.
1201-1222.
[19] H.Z. Chen, Q.Q. Song, G.Q. Tang, and et al., “The Combined
Optimization of Savitzky-Golay Smoothing and Multiplicative
Scatter Correction for FT-NIR PLS Models,” ISRN Spectroscopy,
Volume 2013, Article ID 642190, 2013.
[20] R.W. Kennard and L.A. Stone, “Computer Aided Design of
Experiments,” Technometrics, 1969, vol. 11, pp. 137-148.
[21] R.K.H. Galvao, M.C.U. Araujo, G.E. Jose and et al., “A method
for calibration and validation subset partitioning,” Talanta, 2005,
vol. 67, pp. 736-740.