This document surveys techniques for characterizing metamaterials, grouping them into analytical methods, field-averaging methods, and experimental methods. Analytical methods model the unit cell and the periodic structure to determine effective parameters. Field-averaging methods obtain local averages of the fields to derive constitutive parameters from transmission and reflection data. The experimental techniques discussed are the Nicolson-Ross-Weir method, which extracts the parameters from measured scattering data, and resonator and free-space methods for laboratory measurements.
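The Nicolson-Ross-Weir extraction mentioned above can be sketched in a few lines. The routine below is an illustrative Python rendering of the textbook NRW equations, not code from any of the listed papers; the root choice (|Γ| ≤ 1) and the principal log branch (valid only for samples thinner than half a guided wavelength) are the usual caveats.

```python
import cmath
import math

C = 299_792_458.0  # speed of light, m/s

def nrw_extract(s11, s21, freq_hz, length_m, fc_hz=0.0):
    """Nicolson-Ross-Weir extraction of complex (eps_r, mu_r) from the
    measured S-parameters of a sample slab of thickness length_m.
    fc_hz is the waveguide cutoff frequency (0 for free space / TEM)."""
    # Interface reflection coefficient: pick the physical root, |Gamma| <= 1.
    x = (s11**2 - s21**2 + 1) / (2 * s11)
    gamma = x + cmath.sqrt(x**2 - 1)
    if abs(gamma) > 1:
        gamma = x - cmath.sqrt(x**2 - 1)
    # Transmission factor through the slab.
    t = (s11 + s21 - gamma) / (1 - (s11 + s21) * gamma)
    # 1/Lambda^2 from the propagation factor (principal log branch).
    inv_lambda_sq = -(cmath.log(1 / t) / (2 * math.pi * length_m))**2
    lam0 = C / freq_hz                 # free-space wavelength
    inv_lamc_sq = (fc_hz / C)**2       # 1/lambda_c^2 (0 when no cutoff)
    root = cmath.sqrt(inv_lambda_sq)   # 1/Lambda
    mu_r = (1 + gamma) * root / ((1 - gamma) * cmath.sqrt(1 / lam0**2 - inv_lamc_sq))
    eps_r = lam0**2 / mu_r * (inv_lamc_sq + inv_lambda_sq)
    return eps_r, mu_r
```

Feeding in the analytically computed S-parameters of a thin lossless slab (eps_r = 4, mu_r = 1) recovers the material parameters, which is a quick sanity check on the branch handling.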
Accelerating materials property predictions using machine learning – Ghanshyam Pilania
The materials discovery process can be significantly expedited and simplified if we can learn effectively from available knowledge and data. In the present contribution, we show that efficient and accurate prediction of a diverse set of properties of material systems is possible by employing machine (or statistical) learning methods trained on quantum mechanical computations in combination with the notions of chemical similarity. Using a family of one-dimensional chain systems, we present a general formalism that allows us to discover decision rules that establish a mapping between easily accessible attributes of a system and its properties. It is shown that fingerprints based on either chemo-structural (compositional and configurational information) or the electronic charge density distribution can be used to make ultra-fast, yet accurate, property predictions. Harnessing such learning paradigms extends recent efforts to systematically explore and mine vast chemical spaces, and can significantly accelerate the discovery of new application-specific materials.
CADD UNIT V - Molecular Modeling: Introduction to molecular mechanics and quantum mechanics. Energy minimization methods and conformational analysis, global conformational minima determination.
ROBUST TEXT DETECTION AND EXTRACTION IN NATURAL SCENE IMAGES USING CONDITIONA... – ijiert bestjournal
In natural scene images, text detection is an important task underlying many kinds of content-based image analysis. A maximally stable extremal region (MSER) based method is used for scene text detection; it comprises the stages of character candidate extraction, text candidate construction, text candidate elimination, and text candidate classification. The main limitation of this method is detecting highly blurred text in low-resolution natural scene images, and the current technology does not address text extraction. In the proposed system, a Conditional Random Field (CRF) model is used to assign each candidate component to one of two classes (text and non-text) by considering both unary component properties and binary contextual component relationships, using connected component analysis. The proposed system also performs text extraction using OCR.
This paper presents an experimental study of nanotechnology. Laboratory set-up for nanotechnology involves high cost, and the experimentation processes are slow; as a result, one cannot rely on experimental nanotechnology alone. Computer simulation and modeling are therefore one of the foundations of computational nanotechnology, and such modeling and simulation are also referred to as computational experimentation. The accuracy of a computational nanotechnology experiment generally depends on the accuracy of the intermolecular interaction models, the numerical models, and the simulation schemes used. An attempt has also been made to discuss the contributions to societal change arising from the present convergence of nano-systems and information technologies. The essence of nanotechnology is size and control; because of the diversity of applications, some prefer the plural term "nanotechnologies", but all of them share the common feature of control at the nanometer scale, with the scientific side focusing on the observation and study of phenomena at that scale. The paper closes with a brief study of computer simulation techniques as well as some experimental results.
Texture classification based on overlapped texton co-occurrence matrix (otcom... – eSAT Journals
Abstract: Pattern identification problems such as stone and rock categorization and wood recognition rely on texture classification because of its value in these tasks. Generally, texture analysis can be done in one of two ways: statistical or structural approaches. Several problems arise when working with statistical approaches to texture categorization. One of the most popular statistical approaches is the Gray Level Co-occurrence Matrix (GLCM), which is used to discriminate between different textures in images; it gives good accuracy but at a high computational cost. A texture analysis method usually depends on how the texture features are extracted from the image to characterize it, and whenever a new texture feature is derived it is tested for whether it classifies textures precisely. Texture features, and the way they are extracted and applied, are thus central to precise and accurate texture classification. The present paper derives a new co-occurrence matrix based on overlapped texton patterns: it generates overlapped texton patterns and from them builds co-occurrence matrices, called Overlapped Texton Co-occurrence Matrices (OTCoM), for stone texture classification. The approach integrates the advantages of the co-occurrence matrix and the texton image by representing the attributes of co-occurrence, and the co-occurrence features extracted from the OTCoM provide complete texture information about a texture image. The proposed method is evaluated on Vistex, Brodatz, CUReT, Mayang, Paul Brooke, and Google color texture images, and the experimental results indicate that its classification performance is superior to that of many existing methods.
Keywords: co-occurrence matrix, texton, Texture Classification
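The GLCM that the abstract builds on is straightforward to compute. Below is a minimal pure-Python sketch for a single pixel offset, together with three classic Haralick-style features; real systems (and the OTCoM variant) aggregate several offsets and quantization levels, which this illustration omits.

```python
def glcm(image, levels, dr=0, dc=1):
    """Gray Level Co-occurrence Matrix for one offset (dr, dc),
    counted symmetrically and normalised to sum to 1.
    `image` is a list of rows of ints in [0, levels)."""
    m = [[0.0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                a, b = image[r][c], image[r2][c2]
                m[a][b] += 1
                m[b][a] += 1  # symmetric counting
    total = sum(sum(row) for row in m)
    return [[v / total for v in row] for row in m]

def glcm_features(p):
    """Contrast, energy and homogeneity of a normalised GLCM."""
    n = len(p)
    contrast = sum(p[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))
    energy = sum(v * v for row in p for v in row)
    homogeneity = sum(p[i][j] / (1 + abs(i - j)) for i in range(n) for j in range(n))
    return {"contrast": contrast, "energy": energy, "homogeneity": homogeneity}
```

A perfectly flat texture yields zero contrast and energy 1, which makes the feature definitions easy to sanity-check by hand.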
A Combined Approach for Feature Subset Selection and Size Reduction for High ... – IJERA Editor
Selection of relevant features from a given feature set is one of the important issues in data mining and classification. In general, a dataset may contain many features, but not all of them are necessarily important for a particular analysis or decision-making task: features may share common information or be completely irrelevant to the processing at hand. This generally happens because of improper selection of features during dataset formation or incomplete information about the observed system. In both cases, the data contain features that only increase the processing burden and may ultimately distort the outcome of the analysis. Methods are therefore required to detect and remove such features. This paper presents an efficient approach that not only removes unimportant features but also reduces the size of the complete dataset. The proposed algorithm uses information theory to compute the information gain of each feature and a minimum spanning tree to group similar features; fuzzy c-means clustering is then used to remove similar entries from the dataset. Finally, the algorithm is tested with an SVM classifier on 35 publicly available real-world high-dimensional datasets, and the results show that it not only reduces the feature set and the data length but also improves classifier performance.
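The information-gain step that this abstract relies on is standard and worth making concrete. The sketch below scores one discrete feature against the class labels; it is a generic illustration of IG(feature) = H(labels) − H(labels | feature), not the paper's exact pipeline (which adds the minimum spanning tree and fuzzy c-means stages).

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a label sequence."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG = H(labels) - H(labels | feature), for one discrete feature."""
    n = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [l for f, l in zip(feature_values, labels) if f == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond
```

A feature that perfectly predicts the label scores the full label entropy, while a feature independent of the label scores zero, which is exactly the ranking behaviour a filter-style selector needs.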
Study of the Class and Structural Changes Caused By Incorporating the Target ... – ijceronline
When high-dimensional data are processed with machine learning and pattern recognition techniques, they undergo several changes. Dimensionality reduction is a widely used pre-processing technique for analyzing and representing high-dimensional data, and it causes several structural changes in the data along the way. When high-dimensional data are used to extract just the target class from among several spatially scattered classes, the philosophy of dimensionality reduction is to find an optimal subset of features, either from the original space or from a transformed space, using the control set of the target class, and then project the input space onto this optimal feature subspace. This paper is an exploratory analysis of the class properties and structural properties affected by such target-class-guided feature subsetting. K-nearest neighbors and the minimum spanning tree are employed to study the structural properties, and cluster analysis is applied to understand the properties of the target class and the other classes. The experiments are conducted on target-class-derived features of selected benchmark data sets, namely IRIS, AVIRIS Indian Pines, and ROSIS Pavia University. Experimentation is also extended to data represented in the optimal principal components obtained by transforming the feature subset, and the results are compared.
Application of Fuzzy Min-Max Composition in Polymer Composite – ijcoa
This paper presents a new technique that uses the principle of fuzzy min-max composition for polymer composites; it also gives more information about electrically resistant polymers.
This presentation discusses the computational chemistry aspects of molecular mechanics and dynamics. It is useful for undergraduate and postgraduate students of pharmacy, drug design, and computational chemistry.
Improved wolf algorithm on document images detection using optimum mean techn... – journalBEEI
Detecting text in handwritten historical documents provides high-level features for the challenging problem of handwriting recognition. Such handwriting often contains noise, faint or incomplete strokes, strokes with gaps, and competing lines when embedded in a table or form, making it unsuitable for local line-following algorithms or the associated binarization schemes. This paper presents a method based on an optimum threshold value, named the Optimum Mean method. The Wolf method fails to detect thin text in non-uniform input images; the proposed method overcomes this problem by deriving a maximum threshold value from the optimum mean. In the reported experiments, the proposed method obtained a higher F-measure (74.53) and PSNR (14.77) and a lower NRM (0.11) than the Wolf method. In conclusion, the proposed method effectively solves the Wolf method's problem and produces a high-quality output image.
Quantum Mechanics in Molecular modeling – Akshay Kank
These slides give information on computer-aided drug design and its application in drug discovery. They also cover quantum mechanics in relation to molecular mechanics, the theory behind molecular modeling, and how molecular modeling helps in drug discovery.
ANOVA and Fisher Criterion based Feature Selection for Lower Dimensional Univ... – CSCJournals
Unethical uses of data-hiding methods have made image steganalysis a very important area of research in digital investigations. The effectiveness of any image steganalysis algorithm depends on feature selection and feature reduction. The goal of this paper is to develop a reduced-dimensional merged feature set for universal image steganalysis using the Fisher criterion and ANOVA techniques. Statistical features extracted from wavelet subbands and binary similarity patterns extracted from the DCT of an image are merged into a combined feature set. The Fisher criterion and the ANOVA test are applied to score the combined feature vector, and only those features found sensitive under both selection methods are retained. The resulting reduced 15-D feature vector is used to train an SVM classifier with an RBF kernel. The proposed algorithm is tested against steganography methods such as F5, OutGuess, and an LSB-based method. Stego images are generated using widely available stego tools for two standard image databases, CorelDraw and BSDS500, and the results are further validated using 10-fold cross-validation. The proposed algorithm achieves 97% overall detection accuracy against the various steganography methods.
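The Fisher criterion used for ranking features in this abstract has a simple closed form: the between-class scatter of the class means divided by the summed within-class variance. The function below is a generic single-feature version of that score, not the paper's exact implementation.

```python
def fisher_score(values, labels):
    """Fisher criterion for one feature: between-class scatter of the
    class means over the pooled within-class scatter. Higher scores
    mean the feature separates the classes better."""
    n = len(values)
    mean = sum(values) / n
    between = within = 0.0
    for c in set(labels):
        xs = [v for v, l in zip(values, labels) if l == c]
        mu = sum(xs) / len(xs)
        between += len(xs) * (mu - mean) ** 2          # between-class scatter
        within += sum((x - mu) ** 2 for x in xs)       # within-class scatter
    return between / within if within else float("inf")
```

Selecting only the features that score highly under both this criterion and an ANOVA F-test, as the paper does, keeps features whose class means differ by more than their in-class spread.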
Determining the Complex Permittivity of Building Dielectric Materials using a... – IJECEIAES
This paper presents a technique to determine the dielectric constant and dielectric loss of building dielectric materials using propagation constant measurements. The material sample is loaded into an X-band (8.5 GHz to 12.5 GHz) rectangular waveguide, and its two-port S-parameters are measured as a function of frequency using a vector network analyzer without TRL calibration. The results obtained from samples of dielectric materials (air, cellular concrete, and wood) over the X-band frequencies show the validity of the proposed technique for determining the complex permittivity of building dielectric materials in this band.
A Threshold Fuzzy Entropy Based Feature Selection: Comparative Study – IJMER
Feature selection is one of the most common and critical tasks in database classification. It reduces computational cost by removing insignificant and unwanted features, and consequently makes the diagnosis process more accurate and comprehensible. This paper presents a measurement of feature relevance based on fuzzy entropy, tested with a Radial Basis Function (RBF) network, Bagging (bootstrap aggregating), Boosting, and stacking on datasets from various fields. Twenty benchmark datasets available in the UCI Machine Learning Repository and KDD are used for this work. The accuracy obtained from these classification processes shows that the proposed method is capable of producing good, accurate results with fewer features than the original datasets.
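Threshold-based fuzzy entropy selection can be illustrated with the classic De Luca-Termini entropy. The sketch below assumes each feature has already been mapped to class-membership grades in [0, 1]; the `select_features` helper and its threshold semantics are illustrative, not the paper's exact procedure.

```python
import math

def fuzzy_entropy(memberships):
    """De Luca-Termini fuzzy entropy of membership grades in [0, 1]:
    0 for crisp grades (0 or 1), maximal when every grade is 0.5."""
    h = 0.0
    for mu in memberships:
        for p in (mu, 1.0 - mu):
            if p > 0.0:
                h -= p * math.log2(p)
    return h / len(memberships)

def select_features(feature_memberships, threshold):
    """Keep features whose fuzzy entropy falls below the threshold:
    low entropy means the feature assigns classes crisply."""
    return [name for name, mu in feature_memberships.items()
            if fuzzy_entropy(mu) < threshold]
```

A crisp feature (grades at 0 or 1) scores entropy 0 and survives any positive threshold, while a maximally ambiguous one (grades at 0.5) scores 1 and is dropped.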
Molecular modelling for M.Pharm according to PCI syllabus – Shikha Popali
Molecular modelling is one of the most important topics for chemistry students, and its theory is covered in this presentation. We hope the material satisfies all readers, as we have tried to cover all of its topics.
Text documents clustering using modified multi-verse optimizer – IJECEIAES
In this study, a multi-verse optimizer (MVO) is utilised for the text document clustering (TDC) problem. TDC is treated as a discrete optimization problem, and an objective function based on the Euclidean distance is applied as similarity measure. TDC is tackled by the division of the documents into clusters; documents belonging to the same cluster are similar, whereas those belonging to different clusters are dissimilar. MVO, which is a recent metaheuristic optimization algorithm established for continuous optimization problems, can intelligently navigate different areas in the search space and search deeply in each area using a particular learning mechanism. The proposed algorithm is called MVOTDC, and it adopts the convergence behaviour of MVO operators to deal with discrete, rather than continuous, optimization problems. For evaluating MVOTDC, a comprehensive comparative study is conducted on six text document datasets with various numbers of documents and clusters. The quality of the final results is assessed using precision, recall, F-measure, entropy, accuracy, and purity measures. Experimental results reveal that the proposed method performs competitively in comparison with state-of-the-art algorithms. Statistical analysis is also conducted and shows that MVOTDC can produce significant results in comparison with three well-established methods.
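The Euclidean-distance objective that MVOTDC minimises can be written down directly. The function below evaluates one candidate solution: assign each document vector to its nearest centroid and sum the distances. It sketches only the objective, not the MVO search itself.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def tdc_objective(docs, centroids):
    """Sum of distances from each document vector to its nearest
    centroid: the quantity a TDC metaheuristic tries to minimise.
    Returns (total_distance, per-document cluster assignment)."""
    total, assign = 0.0, []
    for d in docs:
        dists = [euclidean(d, c) for c in centroids]
        k = dists.index(min(dists))
        assign.append(k)
        total += dists[k]
    return total, assign
```

An optimizer such as MVO then searches over centroid positions (or assignments) for the configuration with the smallest total distance.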
Overview combining ab initio with continuum theory – Dierk Raabe
Multi-methodological approaches combining quantum-mechanical and/or atomistic simulations with continuum methods have become increasingly important when addressing multi-scale phenomena in computational materials science. A crucial aspect when applying these strategies is to carefully check, and if possible to control, a variety of intrinsic errors and their propagation through a particular multi-methodological scheme. The first part of our paper critically reviews a few selected sources of errors frequently occurring in quantum-mechanical approaches to materials science and their multi-scale propagation when describing properties of multi-component and multi-phase polycrystalline metallic alloys. Our analysis is illustrated in particular on the determination of i) thermodynamic materials properties at finite temperatures and ii) integral elastic responses. The second part addresses methodological challenges emerging at interfaces between electronic-structure and/or atomistic modeling on the one side and selected continuum methods, such as the crystal elasticity and crystal plasticity finite element methods (CEFEM and CPFEM), the new fast Fourier transform (FFT) approach, and phase-field modeling, on the other side.
A Density Control Based Adaptive Hexahedral Mesh Generation Algorithm – ijeei-iaes
A density-control-based adaptive hexahedral mesh generation algorithm for three-dimensional models is presented in this paper. The first step of the algorithm is to identify the characteristic boundary of the solid model to be meshed. Second, refinement fields are constructed and modified according to conformal refinement templates, and used as a metric to generate an initial grid structure. Third, a jagged core mesh is generated by removing all elements exterior to the solid model. Fourth, all surface nodes of the jagged core mesh are matched to the surfaces of the model through a node projection process. Finally, mesh quality, in both topology and shape, is improved using corresponding optimization techniques.
In agriculture, the most important factors are soil fertility, the nutrients available in the soil, water availability in the area, and atmospheric conditions; all of these parameters play a major role in crop productivity. This paper surveys techniques that show how to improve productivity with minimal use of natural resources such as water, and how to avoid soil leaching by applying fertilizers through drip irrigation. The approach can be used in greenhouse or open environments to efficiently monitor soil moisture and temperature, ambient temperature, and humidity. Wired communications, sensor networks, and other complementary technologies provide the tools needed to compile and process physical variables including temperature, humidity, soil moisture, soil pH, and fertilizer concentrations. Greenhouse and precision agriculture in general demand real-time, precise measurement of these parameters in order to avoid unnecessary exposure to unhealthy ambient conditions, assure maximum productivity, and provide value-added quality. This paper implements a basic application that automates an irrigation field by programming the components and building the necessary hardware around an ARM7 processor, which is used to determine the exact field condition and maintain the corresponding levels in the soil.
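The decision logic at the heart of such an automated irrigation system reduces to comparing sensed values against thresholds. The sketch below is a hypothetical host-side illustration of that logic; the threshold values and the action names are placeholders, not the paper's ARM7 firmware.

```python
def irrigation_action(soil_moisture, moisture_low=30.0, moisture_high=60.0):
    """Decide a drip-valve action from the sensed soil moisture (percent).
    Thresholds are illustrative placeholders for a given crop and soil."""
    if soil_moisture < moisture_low:
        return "OPEN_VALVE"    # soil too dry: start drip irrigation
    if soil_moisture > moisture_high:
        return "CLOSE_VALVE"   # soil wet enough: stop watering
    return "HOLD"              # within the target band: no change
```

On the embedded side this comparison would run in the main sensing loop, with the valve driven through a GPIO pin; the hysteresis band between the two thresholds avoids rapid valve chatter around a single set-point.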
This study was designed to evaluate the effect of a 70% ethanolic crude extract of Portulaca oleracea L. on mouse organs in vivo. The acute toxicity of the 70% ethanolic extract of the plant was studied in normal mice; no toxic effect was noted even at 9500 mg/kg body weight given by subcutaneous injection. Histopathological changes due to the ethanolic extract in healthy mice consisted of hyperplasia of the white pulp with amyloid deposition, proliferation of megakaryocytes, and mononuclear cell infiltration in the liver and kidney parenchyma. No significant lesions were detected in the brain, heart, or ovary in any treated group.
A Combined Approach for Feature Subset Selection and Size Reduction for High ...IJERA Editor
selection of relevant feature from a given set of feature is one of the important issues in the field of
data mining as well as classification. In general the dataset may contain a number of features however it is not
necessary that the whole set features are important for particular analysis of decision making because the
features may share the common information‟s and can also be completely irrelevant to the undergoing
processing. This generally happen because of improper selection of features during the dataset formation or
because of improper information availability about the observed system. However in both cases the data will
contain the features that will just increase the processing burden which may ultimately cause the improper
outcome when used for analysis. Because of these reasons some kind of methods are required to detect and
remove these features hence in this paper we are presenting an efficient approach for not just removing the
unimportant features but also the size of complete dataset size. The proposed algorithm utilizes the information
theory to detect the information gain from each feature and minimum span tree to group the similar features
with that the fuzzy c-means clustering is used to remove the similar entries from the dataset. Finally the
algorithm is tested with SVM classifier using 35 publicly available real-world high-dimensional dataset and the
results shows that the presented algorithm not only reduces the feature set and data lengths but also improves the
performances of the classifier.
Study of the Class and Structural Changes Caused By Incorporating the Target ...ijceronline
High dimensional data when processed by using various machine learning and pattern recognition techniques, it undergoes several changes. Dimensionality reduction is one such successfully used pre-processing technique to analyze and represent the high dimensional data that causes several structural changes to occur in the data through the process. The high-dimensional data when used to extract just the target class from among several classes that are spatially scattered then the philosophy of the dimensionality reduction is to find an optimal subset of features either from the original space or from the transformed space using the control set of the target class and then project the input space onto this optimal feature subspace. This paper is an exploratory analysis carried out to study the class properties and the structural properties that are affected due to the target class guided feature subsetting in specific. K-nearest neighbors and minimum spanning tree are employed to study the structural properties, and cluster analysis is applied to understand the target class and other class properties. The experimentation is conducted on the target class derived features on the selected bench mark data sets namely IRIS, AVIRIS Indiana Pine and ROSIS Pavia University data set. Experimentation is also extended to data represented in the optimal principal components obtained by transforming the subset of features and results are also compared
Application of Fuzzy Min-Max Composition in Polymer Compositeijcoa
This paper presents a new technique which uses the principle of Fuzzy Min-Max composition for the polymer composites and also it gives more information about electrically resistant polymer.
Computational Chemistry aspects of Molecular Mechanics and Dynamics have been discussed in this presentation. Useful for the Undergraduate and Postgraduate students of Pharmacy, Drug Design and Computational Chemistry
Improved wolf algorithm on document images detection using optimum mean techn...journalBEEI
Detection text from handwriting in historical documents provides high-level features for the challenging problem of handwriting recognition. Such handwriting often contains noise, faint or incomplete strokes, strokes with gaps, and competing lines when embedded in a table or form, making it unsuitable for local line following algorithms or associated binarization schemes. In this paper, a proposed method based on the optimum threshold value and namely as the Optimum Mean method was presented. Besides, Wolf method unsuccessful in order to detect the thin text in the non-uniform input image. However, the proposed method was suggested to overcome the Wolf method problem by suggesting a maximum threshold value using optimum mean. Based on the calculation, the proposed method obtained a higher F-measure (74.53), PSNR (14.77) and lowest NRM (0.11) compared to the Wolf method. In conclusion, the proposed method successful and effective to solve the wolf problem by producing a high-quality output image.
Quantum Mechanics in Molecular modelingAkshay Kank
This slides gives you the information related to computer aided drug design and its application in drug discovery. Also you learn the Quantum mechanics related to the molecular mechanics. Theory related to molecular modeling and how the molecular modeling helps in drug discovery.
ANOVA and Fisher Criterion based Feature Selection for Lower Dimensional Univ...CSCJournals
Unethical uses of data hiding methods have made Image Steganalysis a very important area of
research work in the field of Digital Investigations. Effectiveness of any Image Steganalysis
algorithm depends on feature selection and feature reduction. The goal of this paper is to develop
a reduced dimensional merged feature set for universal image steganalysis using Fisher Criterion
and ANOVA techniques. Statistical features extracted from wavelet subbands and binary
similarity patterns extracted from DCT of an image are merged to make combined feature set.
Fisher criterion and ANOVA test are applied to evaluate the combined feature vector score and
then only those features are selected which are found sensitive in both feature selection methods.
These reduced dimensional 15-D feature vector is used to train SVM classifier with RBF kernel.
The proposed algorithm is tested against steganography methods like F5, Outguess and LSB
based method. Stego images are generated using widely available stego tools for two standard
image databases: CorelDraw and BSDS500. Results are further validated using 10 fold cross
validation process. The proposed algorithm achieves overall 97% detection accuracy against
various steganography methods
Determining the Complex Permittivity of Building Dielectric Materials using a...IJECEIAES
This paper presents a technique to determine the dielectric constant and dielectric loss of building dielectric materials using propagation-constant measurements. The material sample is loaded into an X-band (8.5 GHz–12.5 GHz) rectangular waveguide and its two-port S-parameters are measured as a function of frequency using a vector network analyzer without TRL calibration. Results obtained from samples of dielectric materials (air, cellular concrete, and wood) across the X-band show the validity of the proposed technique for determining the complex permittivity of building dielectric materials.
A Threshold Fuzzy Entropy Based Feature Selection: Comparative Study — IJMER
Feature selection is one of the most common and critical tasks in database classification. It reduces computational cost by removing insignificant and unwanted features and consequently makes the diagnosis process more accurate and comprehensible. This paper presents a measurement of feature relevance based on fuzzy entropy, tested with a radial basis function (RBF) network, bagging (bootstrap aggregating), boosting, and stacking on datasets from various fields. Twenty benchmark datasets available in the UCI Machine Learning Repository and KDD have been used for this work. The accuracies obtained from these classification processes show that the proposed method is capable of producing good and accurate results with fewer features than the original datasets.
Molecular modelling for M.Pharm according to PCI syllabus — Shikha Popali
Molecular modelling is an important topic for chemistry students, and this presentation covers its theory. We have tried to address all of its subtopics.
Text documents clustering using modified multi-verse optimizer — IJECEIAES
In this study, a multi-verse optimizer (MVO) is utilised for the text document clustering (TDC) problem. TDC is treated as a discrete optimization problem, and an objective function based on the Euclidean distance is applied as the similarity measure. TDC is tackled by the division of the documents into clusters; documents belonging to the same cluster are similar, whereas those belonging to different clusters are dissimilar. MVO, a recent metaheuristic optimization algorithm established for continuous optimization problems, can intelligently navigate different areas of the search space and search deeply in each area using a particular learning mechanism. The proposed algorithm, called MVOTDC, adapts the convergence behaviour of the MVO operators to deal with discrete, rather than continuous, optimization problems. For evaluating MVOTDC, a comprehensive comparative study is conducted on six text document datasets with various numbers of documents and clusters. The quality of the final results is assessed using precision, recall, F-measure, entropy, accuracy, and purity measures. Experimental results reveal that the proposed method performs competitively in comparison with state-of-the-art algorithms. Statistical analysis also shows that MVOTDC produces significant results in comparison with three well-established methods.
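The Euclidean-distance objective that a TDC metaheuristic such as MVOTDC minimizes can be written as the total distance from each document vector to its assigned cluster centroid; a small sketch with toy vectors (not the paper's datasets):

```python
import numpy as np

def tdc_objective(docs, centroids, assign):
    """Sum of Euclidean distances from each document vector to its cluster
    centroid -- the similarity objective a TDC optimizer seeks to minimize."""
    return sum(np.linalg.norm(docs[i] - centroids[assign[i]])
               for i in range(len(docs)))

docs = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
centroids = np.array([[0.95, 0.05], [0.05, 0.95]])
good = tdc_objective(docs, centroids, [0, 0, 1, 1])   # coherent clustering
bad = tdc_objective(docs, centroids, [1, 1, 0, 0])    # documents swapped
```

A candidate solution (an assignment vector) with a lower objective groups similar documents together, which is exactly what the MVO search operators iterate toward.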
Overview combining ab initio with continuum theory — Dierk Raabe
Multi-methodological approaches combining quantum-mechanical and/or atomistic simulations
with continuum methods have become increasingly important when addressing multi-scale phenomena in
computational materials science. A crucial aspect when applying these strategies is to carefully check,
and if possible to control, a variety of intrinsic errors and their propagation through a particular multi-methodological scheme. The first part of our paper critically reviews a few selected sources of errors
frequently occurring in quantum-mechanical approaches to materials science and their multi-scale propagation
when describing properties of multi-component and multi-phase polycrystalline metallic alloys.
Our analysis is illustrated in particular by the determination of (i) thermodynamic materials properties at finite temperatures and (ii) integral elastic responses. The second part addresses methodological challenges emerging at interfaces between electronic-structure and/or atomistic modeling on the one side and selected continuum methods, such as the crystal elasticity and crystal plasticity finite element methods (CEFEM and CPFEM), the fast Fourier transform (FFT) approach, and phase-field modeling, on the other side.
A Density Control Based Adaptive Hexahedral Mesh Generation Algorithm — ijeei-iaes
A density-control-based adaptive hexahedral mesh generation algorithm for three-dimensional models is presented in this paper. The first step of this algorithm is to identify the characteristic boundary of the solid model to be meshed. Secondly, the refinement fields are constructed and modified according to the conformal refinement templates and used as a metric to generate an initial grid structure. Thirdly, a jagged core mesh is generated by removing all elements exterior to the solid model. Fourthly, all surface nodes of the jagged core mesh are matched to the surfaces of the model through a node projection process. Finally, mesh quality attributes such as topology and shape are improved using corresponding optimization techniques.
In the field of agriculture, the most important factors are soil fertility, the nutrients available in the soil, water availability in the area, and atmospheric conditions. All of these parameters play a major role in crop productivity. In this paper we review techniques for improving productivity with minimal use of natural resources such as water, and for avoiding soil leaching by delivering fertilizers through drip irrigation. These techniques can be used in greenhouse or open environments to efficiently monitor soil moisture and temperature, ambient temperature, and humidity. Wired communications, sensor networks, and other complementary technologies provide the necessary tools to compile and process physical variables, including temperature, humidity, soil moisture, soil pH, and fertilizer concentrations. Greenhouse and precision agriculture in general demand real-time, precise measurement of these parameters in order to avoid unnecessary exposure to unhealthy ambient conditions, assure maximum productivity, and provide value-added quality. This paper aims to implement a basic application that automates the irrigation field by programming the components and building the necessary hardware around an ARM7 processor, which is used to determine the exact field condition and maintain the required levels in the soil.
This study was designed to evaluate the effect of a 70% ethanolic crude extract of Portulaca oleracea L. on mouse organs in vivo. The acute toxicity of the 70% ethanolic extract of the plant was studied in normal mice; no toxic effect was noted even at 9500 mg/kg body weight by subcutaneous injection. Histopathological changes due to the ethanolic extract in healthy mice were summarized as hyperplasia of the white pulp with amyloid deposition, proliferation of megakaryocytes, and mononuclear cell infiltration in the liver and kidney parenchyma. There were no significant lesions detected in the brain, heart, or ovary in any of the treated groups.
RFID-based public transport ticketing systems rely on widespread networks of RFID readers that locate the user within the transport network in real time, in order to verify whether the user can travel at that time with the ticket he holds. This paper presents a system that uses the same RFID-based location information to give the user navigation indications based on his current location, provided that the user has indicated beforehand the places he intends to visit. The system was designed to be cost-effectively deployable in the short term but open for easy extension. This paper focuses on ticketing and identification of passengers in public transport. In metropolitan cities like Mumbai and Kolkata, public transport suffers from severe malfunction and various security problems. First, there is considerable confusion among passengers regarding fares, which leads to corruption; second, mismanagement of public transport leaves passengers facing traffic jams; third, public transport today has severe security problems due to anti-social elements. The entire network comprises three modules: a Base Station Module, In-Bus Modules, and a Bus Stop Module. The base station module consists of a monitoring system that includes GSM and a PC. The In-Bus Modules consist of two microcontrollers, a GSM modem, GPS, Zigbee, RFID, an LCD, and an infrared sensor: RFID serves the ticketing purpose, while GSM and GPS are used for mobile data transmission and location tracking. The Zigbee module, also interfaced with the microcontroller, sends bus information to the bus stop and receives information from the bus stop. The Bus Stop Module, fixed at every bus stop, consists of a Zigbee node interfaced with a microcontroller.
Bio-char can be produced by thermal conversion of biomass. Palm shells were obtained from palm (palmyra) fruits and air-dried to remove moisture. The dried palm shells were ground to a powder and heated at 600 °C, 800 °C, and 1000 °C for 2 h, respectively; after heating, bio-char was obtained. Structural properties of the palm shell powder and bio-char were examined by X-ray diffraction (XRD), and scanning electron microscopy (SEM) was used to observe the microstructure of the bio-char. Properties such as hydration capacity and pH were also evaluated.
Technical studies have been performed for many foreign languages such as Japanese and Chinese, but efforts on ancient Indian scripts are still immature. Because the Modi script is ancient and cursive, OCR for it is still not widely available. To our knowledge, Prof. D. N. Besekar, Dept. of Computer Science, Shri Shivaji College of Science, Akola, has proposed a system for recognition of offline handwritten Modi script vowels. The challenges in recognizing handwritten Modi characters are considerable due to the varying writing style of each individual. Many vital documents with precious information were written in Modi, and these documents are currently stored and preserved in temples and museums; over time they will wither away if not given due attention. In this paper we propose a system for recognition of handwritten Modi script characters; the proposed method uses the image processing techniques and algorithms described below.
General Terms
Preprocessing techniques: gray scaling, thresholding, boundary detection, thinning, cropping, scaling, template generation. Other algorithms used: averaging method, Otsu method, Stentiford method, template-based matching method.
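Among the algorithms listed, Otsu's method chooses the global threshold that maximizes the between-class variance of the grayscale histogram. A self-contained NumPy sketch on a synthetic bimodal "ink on paper" image:

```python
import numpy as np

def otsu_threshold(gray):
    """Otsu's method: return the gray level that maximizes the between-class
    variance of the histogram of a uint8 image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    omega = np.cumsum(p)                 # class-0 probability up to each level
    mu = np.cumsum(p * np.arange(256))   # cumulative mean up to each level
    mu_t = mu[-1]                        # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b2[~np.isfinite(sigma_b2)] = 0
    return int(np.argmax(sigma_b2))

# Bimodal toy image: ink around gray level 40, paper around 210.
rng = np.random.default_rng(1)
img = np.where(rng.random((64, 64)) < 0.2,
               rng.integers(30, 50, (64, 64)),
               rng.integers(200, 220, (64, 64))).astype(np.uint8)
t = otsu_threshold(img)
```

Pixels at or below `t` are then labeled as ink; for degraded manuscripts the local methods named above (Stentiford, template matching) take over where a single global threshold is not enough.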
The combination of steganography and cryptography is considered one of the best security methods for message protection. For this reason, this paper proposes a data-hiding system based on image steganography and cryptography to secure data transfer between source and destination. An animated GIF image is chosen as the carrier file format because of its wide use in web pages, and an LSB (least significant bit) algorithm is employed to hide the message inside the pixel colors of the animated GIF image frames. To increase the security of the hiding, each frame of the GIF image is converted to a 256-color BMP image, its palette is sorted, and each pixel is reassigned to its new index; furthermore, the message is compressed with the LZW (Lempel–Ziv–Welch) algorithm before being hidden in the image frames. The proposed system was evaluated for effectiveness, and the results show that the encryption and decryption methods used in developing the system make it more efficient at securing data from unauthorized users. The system is therefore recommended to Internet users for establishing more secure communication.
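The core LSB step described above can be sketched on a raw byte array: one message bit replaces the least significant bit of each carrier byte. This illustration deliberately ignores the GIF/BMP palette handling and the LZW stage:

```python
import numpy as np

def lsb_embed(pixels, message: bytes):
    """Hide the message bits in the least significant bit of each carrier byte
    (a single-bit-plane illustration of the LSB scheme described above)."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    out = pixels.copy().ravel()
    assert bits.size <= out.size, "carrier too small for the message"
    out[:bits.size] = (out[:bits.size] & 0xFE) | bits   # clear LSB, set message bit
    return out.reshape(pixels.shape)

def lsb_extract(pixels, n_bytes: int):
    """Recover n_bytes of hidden message from the carrier's LSB plane."""
    bits = (pixels.ravel()[:n_bytes * 8] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes()

cover = np.random.default_rng(2).integers(0, 256, (16, 16), dtype=np.uint8)
stego = lsb_embed(cover, b"hi")
recovered = lsb_extract(stego, 2)
```

Because only the lowest bit of each byte changes, no pixel value moves by more than 1, which is why LSB embedding is visually imperceptible in the carrier frames.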
This research studies the degradation behavior of starch blended with different percentages of polypropylene (PP). A twin-screw extruder at 160–190 °C and 50 rpm was used to manufacture the blend sheets. Degradation tests were performed according to ASTM standards (D638 Type IV and D570-98). Degradation properties were studied by a soil burial test, a water absorption test, and a hydrolysis test, and the morphology of the polypropylene/starch blend samples was observed with a Dino-Lite digital microscope. Results of the soil burial test show that the tensile strength and percentage elongation of the polypropylene/starch blends decrease with increasing starch content and burial time. The hydrolysis test shows that weight loss increases with increasing starch content, while a high percentage of polypropylene was found to decrease the blend's water absorption. The physical appearance and morphology of the polypropylene/starch blends after soil burial and hydrolysis in water showed that all blend samples had visibly changed after the 90-day study period, whereas the pure polypropylene samples remained unchanged.
This study aims to employ a low-cost agro-waste biosorbent, tamarind (Tamarindus indica) pod shells, together with activated carbons prepared by complete and partial pyrolysis of tamarind pod shells, for the removal of hexavalent chromium ions from aqueous solution. The effects of parameters, namely initial metal-ion concentration, pH, temperature, and biomass loading, on chromium removal efficiency were studied. More than 96.9% removal of chromium was achieved using crude tamarind pod shells as the biosorbent. The experimental data were fitted with the Langmuir, Freundlich, Temkin, and Redlich–Peterson adsorption isotherm models; the data fit the Langmuir, Freundlich, and Temkin isotherms well, with regression coefficients R2 above 0.9, while the fit to the Redlich–Peterson isotherm was poorer. Crude tamarind had a maximum monolayer adsorption capacity of 40 mg/g and a separation factor of 0.0416, indicating it as the best of the three tested adsorbents. Further, an attempt was made to fit the sorption kinetics with pseudo-first-order and pseudo-second-order reactions; the pseudo-second-order kinetic model fits the experimental data well for all three adsorbents.
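For reference, the Langmuir isotherm and separation factor used above have the standard forms q = q_max·K·C/(1 + K·C) and R_L = 1/(1 + K·C0). A fitting sketch with synthetic equilibrium data; the concentrations, the initial concentration C0, and the parameter values are assumptions, not the paper's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, q_max, K):
    """Langmuir isotherm: q = q_max * K * C / (1 + K * C)."""
    return q_max * K * C / (1 + K * C)

# Synthetic equilibrium data (C in mg/L, q in mg/g) standing in for measurements.
C = np.array([5.0, 10, 25, 50, 100, 200])
q_obs = langmuir(C, 40.0, 0.05)

(q_max, K), _ = curve_fit(langmuir, C, q_obs, p0=(30.0, 0.01))

C0 = 100.0                       # assumed initial concentration, mg/L
R_L = 1.0 / (1.0 + K * C0)       # separation factor: 0 < R_L < 1 means favourable
```

A separation factor strictly between 0 and 1 (as with the reported 0.0416) indicates favourable adsorption; R_L near 0 marks the isotherm as nearly irreversible.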
Stochastic processes have many useful applications and are taught in several university programmes. In this paper we use stochastic processes, with particular attention to Markov chains, in which a transition matrix is used to plot a transition diagram; several examples illustrate various types of transition diagram. The concept behind this topic is simple and easy to understand.
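A transition matrix of the kind used to draw such transition diagrams can be explored directly; a minimal sketch with a three-state chain:

```python
import numpy as np

# A 3-state Markov chain: each row of the transition matrix sums to 1,
# and entry P[i, j] is the probability of moving from state i to state j.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

# n-step transition probabilities are just matrix powers of P.
P5 = np.linalg.matrix_power(P, 5)

# The stationary distribution pi satisfies pi P = pi; iterating the chain
# from any starting distribution converges to it for an ergodic chain.
pi = np.full(3, 1 / 3)
for _ in range(1000):
    pi = pi @ P
```

Each nonzero entry P[i, j] corresponds to one arrow (labeled with its probability) in the transition diagram, so the matrix and the diagram carry exactly the same information.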
When a ductile material with a crack is loaded in
tension, the deformation energy builds up around the crack tip
and it is understood that at a certain critical condition voids are
formed ahead of the crack tip. The crack extension occurs by
coalescence of voids with the crack tip. The “characteristic distance” (Lc) is defined as the distance between the crack tip and the void responsible for eventual coalescence with the crack tip. Nucleation of these voids is generally associated with the presence of second-phase particles or grain boundaries in the vicinity of the crack tip.
Although approximate, Lc assumes a special significance since it
links the fracture toughness to the microscopic mechanism
considered responsible for ductile fracture. The knowledge of the
“characteristic distance” is also crucial for designing the size of
mesh in the finite element simulations of material crack growth
using damage mechanics principles. There is not much work
(experimental as well as numerical) available in the literature
related to the dependency of “characteristic distance” on the
fracture specimen geometry. The present research work is an
attempt to understand numerically, the geometry dependency of
“characteristic distance” using three-dimensional FEM analysis.
The variation of “characteristic distance” parameter due to the
change of temperature across the fracture specimen thickness was
also studied. The work also studied the variation of “characteristic
distance”, due to the change in fracture specimen thickness.
Finally, the ASTM requirement of fracture specimen thickness
criteria is evaluated for the “characteristic distance” fracture
parameter. “Characteristic distance” is found to vary across the
fracture specimen thickness. It is dependent on fracture specimen
thickness and it converges after a specified thickness of fracture
specimen. “Characteristic distance” value is also dependent on the
temperature of ductile material. In Armco iron material, it is
found to decrease with the increase in temperature.
On the surface, a packet is a chunk of information; at a deeper level, it is one unit of binary data capable of being transferred through a network. Delivering data packets reliably and in a timely manner in highly dynamic mobile ad hoc networks is difficult. Driven by this issue, an efficient Position-based Opportunistic Routing (POR) protocol takes advantage of the stateless property of geographic routing. In proactive routing protocols, the route discovery and recovery procedures are time- and energy-consuming; once a path breaks, data packets are lost or delayed for a long time until the route is reconstructed, causing transmission interruption. Geographic routing (GR), by contrast, uses location information to forward data packets in a hop-by-hop fashion. Greedy forwarding is used to select as the next hop the forwarder with the largest positive progress toward the destination, while a void-handling mechanism is triggered to route around communication voids. No end-to-end route needs to be maintained, which gives GR its high efficiency and scalability. In greedy forwarding, the neighbour that is relatively far from the sender is chosen as the next hop; if that node moves out of the sender's coverage area, the transmission will fail. In GPSR (a well-known geographic routing protocol), MAC-layer failure feedback is used to give the packet another chance to reroute.
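The greedy-forwarding rule described above (choose the neighbour with the largest positive progress toward the destination, and fall back to void handling when no neighbour qualifies) can be sketched as follows; the node names and coordinates are illustrative:

```python
import math

def greedy_next_hop(sender, dest, neighbors):
    """Greedy geographic forwarding: return the neighbor making the largest
    positive progress toward dest, or None to trigger void handling."""
    best, best_d = None, math.dist(sender, dest)
    for node, pos in neighbors.items():
        d = math.dist(pos, dest)
        if d < best_d:           # strictly closer to the destination than sender
            best, best_d = node, d
    return best

sender, dest = (0.0, 0.0), (10.0, 0.0)
neighbors = {"a": (3.0, 1.0), "b": (5.0, 4.0), "c": (-2.0, 0.0)}
hop = greedy_next_hop(sender, dest, neighbors)   # "b" is closest to dest
```

Because the decision uses only the positions of the sender's current neighbours, no end-to-end state is kept, which is the stateless property POR exploits.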
Cloud computing solves the problem of real-time demand information and visibility at different locations, allowing information to be delivered with reliability, scalability, and flexibility between supplier and customer. A logistics network requires effective information flow for technical support, so that the logistics infrastructure can be fully utilized and information collection, transmission, and operation can be tracked. The cloud is a fast-growing technology that can effectively reduce the intermediate cost of information flow and improve the links between logistics partners and customers. This paper analyzes the advantages of a cloud-based logistics network and defines how a logistics network manages Information Flow Control (IFC) over the cloud, which allows the logistics network to work effectively.
A supply chain consists of all parties involved, directly or indirectly, in fulfilling a customer request. The supply chain includes not only manufacturers and suppliers, but also transporters, warehouses, retailers, and even customers themselves. Within each organization, such as a manufacturer, the supply chain includes all functions involved in receiving and filling a customer request. These functions include, but are not limited to, new product development, marketing, operations, distribution, finance, and customer service. Supply chain management (SCM) is the management of the interconnected or interlinked network, channel, and node businesses involved in the provision of the product and service packages required by the end customers of a supply chain. Supply chain management spans the movement and storage of raw materials, work-in-process inventory, and finished goods from point of origin to point of consumption. It is also defined as the "design, planning, execution, control, and monitoring of supply chain activities with the objective of creating net value, building a competitive infrastructure, leveraging worldwide logistics, synchronizing supply with demand and measuring performance globally."
In recent years, sustainable and fully renewable energy resources have been used extensively in electrical energy generation. Solar energy conversion systems are mainly applied in stand-alone systems; solar panels convert solar radiation directly into electrical energy and are one of the most promising renewable technologies for powering buildings. In this study, a reliability analysis of a solar system installed in our college's academic block and hostel is investigated. The system includes solar panels, a battery, a generator, a converter, and loads. We calculate the overall load of the academic block (the Electrical Engineering department and the round building) and the boys' hostel; after determining these loads, we simulate the data with the HOMER tool and obtain the best result, which is presented in this paper.
The result obtained from the optimization gives an initial capital cost of $296,000 and an operating cost of $2,882/yr. The total net present cost (NPC) is $332,846 and the cost of energy (COE) is $0.212/kWh.
The main purpose of this paper is to determine how many solar panels and batteries are required so that the maximum energy demand of both the academic block and the hostel can be met by the simulated solar system.
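As background on the cost figures, HOMER-style analyses relate total net present cost to an annualized cost through the capital recovery factor CRF(i, n) = i(1+i)^n / ((1+i)^n - 1). The discount rate and project lifetime below are assumptions for illustration only; the abstract does not state them:

```python
def crf(i, n):
    """Capital recovery factor: converts a present cost into an equivalent
    uniform annual cost over n years at discount rate i."""
    return i * (1 + i) ** n / ((1 + i) ** n - 1)

npc = 332_846.0          # $ total net present cost (from the abstract)
coe = 0.212              # $/kWh cost of energy (from the abstract)

annualized = npc * crf(0.06, 25)   # assumed 6% rate and 25-year lifetime
implied_energy = annualized / coe  # kWh/yr served, under those assumptions
```

Since COE is the annualized cost divided by the energy served, this back-calculation only recovers the load HOMER actually used if the assumed rate and lifetime match the study's inputs.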
To help corporations survive amid worldwide quality competition, the authors have focused on the strategic development of a higher-cycled product design CAE model employing a highly reliable CAE analysis technology component model. Their efforts are part of principle-based research aimed at evolving product design and CAE development processes to ensure better quality assurance. To satisfy the requirements of developing and producing high-quality products while also reducing costs and shortening development times, the effectiveness of this model was verified by successfully applying it to the technological problem of loosening bolts and to other product design bottlenecks at auto manufacturers.
A young astronomer's results, by now ten years old, are re-told and put in perspective. The implications are far-reaching. Angular momentum shows its clout not only in quantum mechanics, where this is well known, but is also a major player in the space-time theory of the equivalence principle and its ramifications. In general relativity, its fundamental role was largely neglected for the better part of a century. A children's device – a friction-free rotating bicycle wheel suspended from its hub that can be lowered and pulled up reversibly – serves as an eye-opener. The consequences are embarrassingly far-reaching in reviving Einstein's original dream.
Space-time adaptive processing (STAP) is a signal processing technique most commonly used in radar systems where interference is a problem. The radar signal processor removes unintentional clutter caused by ground reflections and echoes from sea, desert, forest, etc., as well as intentional jamming, to make the received signal useful. In this paper a new approach to STAP based on subspace projection is described in detail. According to linear algebra and three-dimensional geometry, if we project data onto a subspace spanned by linearly independent vectors, we suppress the components perpendicular to that subspace. In the subspace-based technique, the received data are projected onto a subspace orthogonal to the clutter subspace in order to remove the clutter. The probability of target detection can be found in order to analyse the performance of the proposed algorithm; two existing algorithms, SMI and DPCA, are chosen for comparison. When the detection probability is plotted against SINR, the results obtained are better for the subspace technique than for DPCA and SMI, and the SINR is improved for the subspace-based technique at the same detection probability. The effect of the subspace rank on SINR was also analysed to understand the computational load of the technique, and the convergence of the algorithm was analysed through plots of SINR against range snapshots.
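The projection step described above can be written as P = I - C(CᴴC)⁻¹Cᴴ, where the columns of C span the clutter subspace; applying P to a snapshot removes its component lying in span(C). A NumPy sketch with random data (the dimensions and the random clutter basis are illustrative):

```python
import numpy as np

def clutter_projector(C):
    """Orthogonal projector onto the complement of the clutter subspace:
    P = I - C (C^H C)^-1 C^H, so P @ x removes the part of x in span(C)."""
    G = C.conj().T @ C
    return np.eye(C.shape[0]) - C @ np.linalg.solve(G, C.conj().T)

rng = np.random.default_rng(3)
N = 8                                   # space-time snapshot length
C = rng.standard_normal((N, 3)) + 1j * rng.standard_normal((N, 3))  # clutter basis
target = rng.standard_normal(N) + 1j * rng.standard_normal(N)

Pc = clutter_projector(C)
clutter_only = C @ np.array([1.0, -2.0, 0.5])  # a signal inside the clutter subspace
suppressed = Pc @ clutter_only                 # driven to ~0 by the projection
filtered = Pc @ (target + clutter_only)        # clutter removed, target residue kept
```

The projector is idempotent (Pc @ Pc == Pc), so repeated application changes nothing; detection statistics are then computed on the clutter-free snapshots.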
Suspended nanoparticles in conventional fluids,
called nanofluids, have been the subject of intensive study
worldwide since pioneering researchers recently discovered the
anomalous thermal behavior of these fluids. The heat transfer from
smaller area is achieved through microchannels. The heat transfer
principle states that maximum heat transfer is achieved in
microchannels with maximum pressure drop across them. In this research work, experimental and numerical investigations of the improved heat transfer characteristics of a serpentine-shaped microchannel heat sink using an Al2O3/water nanofluid are carried out. The fluid flow characteristics of the serpentine-shaped microchannel are also analyzed, and the experimental heat transfer results using the Al2O3 nanofluid are compared with the numerical values. The calculations in this work suggest that the best heat transfer enhancement can be obtained by using an Al2O3–water nanofluid-cooled microchannel with serpentine-shaped fluid flow.
An Application Programming Interface restricts the types of queries that a Web service can answer. For instance, a Web service might provide a method that quickly returns the books of a given author, but not a method that returns the authors of a given book. If the user asks for the author of some specific book, the Web service cannot be called, even though the underlying database might contain the desired piece of information; this scenario is called asymmetry. The asymmetry is particularly problematic if the service is used in a Web service orchestration system. In this survey, we propose to use on-the-fly information extraction (IE) to collect values that can then be used as parameter bindings for the Web service, and we show how information extraction can be integrated into a Web service orchestration system. The proposed approach is fully implemented in a prototype called Search Using Services and Information Extraction (SUSIE). Real-life data and services are used to demonstrate the practical viability and good performance of our approach.
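The asymmetry workaround can be illustrated with mock functions: the "wrong direction" query is answered by extracting candidate bindings and validating them through the method the service does expose. All names and the tiny "extraction" step here are hypothetical stand-ins, not SUSIE's actual interfaces:

```python
# A mock service database; only the forward-direction lookup is exposed.
SERVICE_DB = {"Jane Doe": ["A Book", "Second Book"], "John Roe": ["Third Book"]}

def books_of_author(author):
    """The only method the mock Web service exposes."""
    return SERVICE_DB.get(author, [])

def extract_candidate_authors(book):
    """Stand-in for on-the-fly IE over web text: yields noisy candidates."""
    return ["Jane Doe", "John Roe"]

def author_of_book(book):
    """Answer the 'wrong direction' query by validating IE candidates
    through the available forward method."""
    for cand in extract_candidate_authors(book):
        if book in books_of_author(cand):
            return cand
    return None
```

The validation call filters out the noise the extractor introduces, so the orchestration system only ever returns bindings the service itself confirms.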
Due to the increasing demand for energy, the rising price of petroleum fuels, the depletion of petroleum reserves, and the environmental pollution caused by these fuels' emissions, it is necessary to find alternative fuels. This work focuses on hybrid blends of karanja and cottonseed oil biodiesels. Blends of 20% and 25% were used, and performance and emission tests were conducted on a single-cylinder, four-stroke, water-cooled CI engine running at 1500 rpm, a compression ratio of 16.5:1, and an injection pressure of 205 bar. Performance parameters such as BP, BSFC, and BTE and emissions such as CO, HC, and NOx were compared. The blends were found to give comparatively good results in terms of both performance and emissions.
Performance Enhancement of Wideband Reflectarray Antennas Embedded on Paper S... — TELKOMNIKA JOURNAL
This research presents an innovative solution to address the bandwidth limitation of microstrip
reflectarray antennas. Organic substrate materials with controlled compositions have been characterized
to be employed as substrate materials for microstrip reflectarrays. The three proposed materials show low
dielectric permittivity values of 1.81, 1.64 and 1.84 along with loss tangents of 0.053, 0.047 and 0.057
respectively. The proposed substrate materials have been verified by modelling reflectarray unit elements
in CST MWS and measuring them with a waveguide simulator technique. The comparison between measured and simulated results shows good agreement, with promising broadband performance of 312, 340 and 207 MHz for the S1, S2 and S3 substrate materials respectively.
Luigi Giubbolini | Microwave Nondestructive Testing of Composite Materials
Microwave Nondestructive Testing (MNDT) techniques have advantages over other NDT methods (such as radiography, ultrasonics, and eddy current) regarding low cost, good penetration in nonmetallic materials, good resolution and contactless feature of the microwave sensor (antenna).
State of the Art in the Characterization of Nano- and Atomic-Scale Catalysts — Devika Laishram
Nanometer and subnanometer particles and films are becoming an essential and
integral part of new technologies and inventions in different areas. Some of the
most common areas include the microelectronic industry, magnetic recordings,
photovoltaic applications, and optical coatings. Because of the ultrasmall size at
atomic levels, the effect of quantum size becomes prominent, and the sensitivity
of size is defined even by a difference of a single atom. Additionally, the effect
is of utmost importance as the single-atom catalysts are far more advantageous
than conventional catalysts as they tend to anchor easily because of their low
coordination. Also, the presence of a single-atom catalyst in reactions creates
efficient charge transfer as it forms a strong interaction with the support.
Furthermore, catalysts in the subnanometer regime exhibit different electronic
states and adsorption capabilities compared to traditional catalysts. Therefore, to
fully appreciate the subnanometer catalysis reactions, it is essential to study the
means of characterizing the prepared subnanometer catalysts.
Planar Microwave Sensors for Accurate Measurement of Material Characterizatio... — TELKOMNIKA JOURNAL
Microwave sensors are used in various industrial applications and require highly accurate measurements of material properties. Conventionally, cavity waveguide perturbation, free-space transmission, open-ended coaxial probe, and planar transmission line techniques have been used for characterizing materials, but the non-planar techniques are often large and expensive to build, which restricts their use in many important applications. The planar transmission line technique, by contrast, is cost-effective and easy to manufacture, and its compact size gives it the potential to achieve good sensitivity and a high Q-factor for various materials. This paper reviews the common characteristics of planar transmission lines and discusses numerous studies of microstrip resonator designs that improve sensor performance in terms of sensitivity and accuracy. The technique lends itself to several industrial applications such as agriculture and quality control. It is believed that these previous studies will lead to a promising solution for characterizing materials with high sensitivity, particularly in realizing a high-Q-factor resonator sensor.
Improved technique for radar absorbing coatings characterization with rectan... — IJECEIAES
For materials characterization, several methods have been developed. Most of them need a sample to be machined prior to the testing process; hence, they are destructive and cannot be used for in-situ testing of radar absorbing coatings, which requires a suitable measurement technique that extracts the electromagnetic properties quickly and accurately. In this paper, a swept-frequency probe reflection technique is proposed for broadband nondestructive characterization of radar absorbing coatings using a finite-flange open-ended rectangular waveguide. The technique is based on the fact that the measurement frequency is an independent variable of the probe's reflection coefficient, so a data set of selected frequency points can be directly measured in one step by varying the frequency. The finite-difference time-domain (FDTD) method was adopted to calculate probe reflection coefficients under different test conditions, and a simple interpolation approximation was employed since these are frequency-dependent parameters. Error analysis was performed numerically to evaluate the influence of both flange size and coated material thickness on measurement accuracy, and measurements were carried out on several samples of radar absorbing coatings at X-band to verify the proposed technique. Compared with existing methods, the proposed technique simplifies and speeds up the measurement process and improves its repeatability and accuracy.
Numerical Modeling of Nano Bundle and Nano Rope-Reinforced Polymer Mechanical Properties by Seyed Hossein Mamanpush, Zahra Matin Ghahfarokhi and Hossein Golestanian in Evolutions in Mechanical Engineering
Microwave Planar Sensor for Determination of the Permittivity of Dielectric M...journalBEEI
This paper proposes a single-port rectangular microwave resonator sensor operating at a resonance frequency of 4 GHz. The sensor consists of a microstrip transmission line and applies an enhancement method that improves the sensor's return loss. The proposed sensor is designed and fabricated on a Rogers 5880 substrate. Based on the results, the percentage error of the proposed rectangular sensor is 0.2% to 8%, and its Q-factor is 174.
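The quoted Q-factor follows directly from the resonance frequency and the -3 dB bandwidth. A minimal sketch; the bandwidth below is back-calculated from the abstract's Q = 174 at 4 GHz for illustration, not a measured value:

```python
# Q = f0 / BW, where BW is the -3 dB bandwidth around resonance f0.

def q_factor(f0_hz: float, bw_hz: float) -> float:
    """Quality factor from resonance frequency and -3 dB bandwidth."""
    return f0_hz / bw_hz

f0 = 4.0e9            # resonance frequency, 4 GHz
bw = f0 / 174         # bandwidth implied by Q = 174 (about 23 MHz)
print(round(q_factor(f0, bw)))   # 174
```

A narrower bandwidth at the same resonance frequency means a higher Q, which is why sharper resonances give more sensitive permittivity measurements.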
Complimentary split ring resonator sensor with high sensitivity based on mate...TELKOMNIKA JOURNAL
A new microwave planar sensor based on a complementary split ring resonator (CSRR) with an air hole in the substrate of the structure is introduced for precise measurement of material permittivity. The hole is made in the substrate of the planar microstrip line. In the sensitivity analysis, the CSRR structure with the hole proves considerably more sensitive than the CSRR structure without it, and is therefore more suitable for the sensor design. The sensor, in the form of CSRRs operating over a 1.74–3.4 GHz band, is described. At resonance, the electric field directed through the plane of the CSRR is found to be highly sensitive to a sample placed on the sensor. The minimum transmission frequency of the sensor shifts from 3.4 to 1.74 GHz as the sample permittivity varies from 1 to 10. A numerical model is introduced for computing the system resolution as a function of resonance frequency and sample permittivity using an electromagnetic simulator. The proposed sensor is found to provide a 35% increase in sensitivity over a conventional sensor for the same specimen permittivity.
Recent joint surgery studies reveal increased revisions and resurfacing of metal-on-metal hip joints. Metal-on-metal hip implants were developed more than thirty years ago, and their application has been refined partly by the availability of advanced manufacturing techniques and partly by advancements in materials science and engineering. Development of composite materials may provide greater durability to metal-on-metal hip implants. This review article surveys the latest literature on metal-on-metal hip implants and their various modeling techniques. A number of methods are used for convergence and numerical solution to investigate the performance of metal-on-metal hip implants and obtain accurate, stable solutions. This paper presents analyses by various researchers of metal-on-metal hip implants covering wear, lubrication, fatigue, bio-tribo-corrosion, design, toxicity and resurfacing. In vivo and in vitro studies show that all these methods have limitations, and more insight is needed into lubrication analysis, bearing geometry, materials and input parameters. The information provided in this work is intended as an aid in the assessment of metal-on-metal hip joints.
Background: Hospitals commit significant tangible and intangible resources to a concurred plan through the scheduling of surgery on the OT list. Postponement decreases efficiency by reducing throughput, leading to wastage of resources and a burden on the nation, while patients and their families face the economic and emotional implications of the postponement. Since the postponement rate is a quality indicator, a control-check mechanism could be developed from the results. Postponement of elective scheduled operations results in inefficient use of operating room (OR) time on the day of surgery and causes inconvenience to patients and families. Moreover, day-of-surgery (DOS) postponement creates a logistic and financial burden associated with extended hospital stay and repetition of pre-operative preparations, extending in some cases to repetition of investigations, causing escalated costs, wasted time and reduced income. Methodology: A cross-sectional study was done in the operation theaters of a tertiary care hospital, covering all ten General Surgery operation theaters. Data on scheduled, performed and postponed surgeries were collected from all the operation theaters from March 1st to September 30th, 2018. A questionnaire was developed to find the reasons for postponement among all the hospital's stakeholders (surgeons, anesthetists, nursing officers), and the data were further evaluated by time-series analysis of operation theater scheduling using the moving-average technique. Results: In total, 958 surgeries were scheduled, 772 were performed and 186 were postponed, a postponement rate of 19.42% in the cardiac surgery department during the study period. Exponential smoothing of the month-wise postponement-rate time series shows the dynamics of the operating suites. To test throughput, the postponement rate was plotted against the postponed surgeries, and regression analysis shows a perfect linear relationship.
Introduction: Postponement of elective scheduled operations results in inefficient use of operating room (OR) time on the day of surgery and causes inconvenience to patients and families. Moreover, day-of-surgery (DOS) postponement creates a logistic and financial burden associated with extended hospital stay and repetition of pre-operative preparations, extending in some cases to repetition of investigations, causing escalated costs, wasted time and reduced income. Methodology: A cross-sectional study was done in the operation theaters of a tertiary care hospital, covering all ten General Surgery operation theaters. Data on scheduled, performed and postponed surgeries were collected from all the operation theaters from March 1st to September 30th, 2018. A questionnaire was developed to find the reasons for postponement among all the hospital's stakeholders (surgeons, anesthetists, nursing officers), and the data were further evaluated by time-series analysis of operation theater scheduling using the moving-average technique. Results: In total, 2,466 surgeries were scheduled, 1,980 were performed and 486 were postponed in the general surgery department during the study period. The month-wise postponement forecast was in accordance with the performed surgeries, and on regression analysis the postponed surgeries were in a perfect linear relationship with the postponement rate.
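The moving-average technique mentioned above can be sketched in a few lines. The monthly postponement-rate values below are invented placeholders, not the study's data:

```python
# Simple moving average over a monthly postponement-rate series:
# each forecast value is the mean of one full window of past months.

def moving_average(series, window):
    """Return one averaged value per full window of the series."""
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

monthly_rate = [18.0, 21.0, 19.5, 20.5, 19.0, 20.0, 18.5]  # % postponed
print(moving_average(monthly_rate, 3))
```

A longer window smooths month-to-month noise at the cost of reacting more slowly to genuine shifts in theater utilization.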
Kesterite-type Cu2ZnSnS4 (CZTS) solar-cell absorber thin films have been prepared by Chemical Bath Deposition (CBD). UV–vis absorption spectra indicated that the band gap of the as-synthesized CZTS was about 1.68 eV, near the optimum value for photovoltaic solar conversion in a single-band-gap device. XRD confirmed polycrystalline CZTS thin films with the kesterite crystal structure, with an average crystallite size of 27 nm.
Multilevel inverters play a crucial part in high- and medium-voltage applications. Among the three main multilevel inverters used, the capacitor-clamped multilevel inverter (CCMLI) has an advantage with respect to voltage redundancies. This work proposes a switching pattern to improve the performance of the chosen H-bridge-type CCMLI over the conventional CCMLI. The PWM technique used in this work is Phase Opposition Disposition PWM (PODPWM). The performance of the proposed H-bridge-type CCMLI is verified through MATLAB-Simulink-based simulation. It has been observed that the THD is lower in the chosen CCMLI than in the conventional CCMLI.
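Phase Opposition Disposition PWM can be illustrated for a simple three-level leg: the carrier below zero is shifted 180 degrees relative to the carrier above zero, and the output level follows from comparing a sinusoidal reference against both. This is a generic sketch with assumed reference and carrier frequencies, not the paper's MATLAB-Simulink model:

```python
import math

def triangle(t, f):
    """Unit triangle wave in [0, 1] at frequency f."""
    x = (t * f) % 1.0
    return 2 * x if x < 0.5 else 2 * (1 - x)

def pod_level(t, f_ref=50.0, f_car=1000.0, m=0.8):
    """Output level (-1, 0, +1) from POD comparison of a sine reference."""
    ref = m * math.sin(2 * math.pi * f_ref * t)
    upper = triangle(t, f_car)                    # carrier band [0, 1]
    lower = -triangle(t + 0.5 / f_car, f_car)     # 180 deg shifted, band [-1, 0]
    if ref > upper:
        return 1
    if ref < lower:
        return -1
    return 0

# Sample one full 50 Hz cycle; all three levels should appear.
levels = {pod_level(n / 20000.0) for n in range(400)}
print(sorted(levels))   # [-1, 0, 1]
```

In a real CCMLI these three levels map to switch states of the H-bridge leg; the harmonic content (and hence THD) depends on how the carriers are disposed, which is the point of comparing PODPWM against the conventional scheme.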
In this paper, we introduce a practical mechanism for compressing a binary phase code modulation (BPCM) signal based on a 13-chip Barker code in the presence of additive white Gaussian noise (AWGN), using a digital matched filter (DMF) implementing a time-domain convolution of the input and reference signals on a Cyclone II EP2C70F896C6 FPGA from ALTERA, mounted on the DE2-70 education and development board. The parameters are: BPCM signal frequency f_IF = 2 MHz, sampling frequency f_SAM = 50 MHz, pulse period T = 200 µs, pulse width τ_S = 13 µs, chip width τ_CH = 1 µs, compression factor K_COM = 13, SNR_inp = 1/1, 1/2, 1/3, 1/4, 1/5, and processing gain factor SNR_out/SNR_inp = 11.14 dB.
The results of the filter operation are evaluated using a GDS-1052U digital oscilloscope to display the input and output signals for different SNR_inp.
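The core of the digital matched filter, time-domain correlation of the received signal against the 13-chip Barker reference, can be sketched in a few lines. The noise-free baseband input below is illustrative only, not the paper's FPGA implementation:

```python
import math

# Barker-13 autocorrelation has a main lobe of 13 and sidelobes of
# magnitude 1, which is what makes it attractive for pulse compression.
BARKER_13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]

def matched_filter(signal, code):
    """Correlate the received signal against the reference code."""
    n, m = len(signal), len(code)
    return [sum(signal[i + j] * code[j] for j in range(m))
            for i in range(n - m + 1)]

# Noise-free received pulse: the code embedded in a longer zero sequence.
rx = [0] * 5 + BARKER_13 + [0] * 5
out = matched_filter(rx, BARKER_13)

print(max(out))                          # 13: the compressed main lobe
print(round(10 * math.log10(13), 2))     # 11.14 dB processing gain
```

The processing gain of an N-chip code is 10·log10(N); for N = 13 this gives about 11.14 dB, matching the figure quoted in the abstract.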
Flooding is one of the most devastating natural
disasters in Nigeria. The impact of flooding on human activities
cannot be overemphasized. It can threaten human lives, their
property, environment and the economy. Different techniques
exist to manage and analyze the impact of flooding. Some of these
techniques have not been effective in management of flood
disaster. Remote sensing technique presents itself as an effective
and efficient means of managing flood disaster. In this study,
SPOT-10 image was used to perform land cover/ land use
classification of the study area. Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) imagery of 2010 was used to generate the Digital Elevation Model (DEM). The image
focal statistics were generated using the Spatial Analyst/
Neighborhood/Focal Statistics Tool in ArcMap. The contour map
was produced using the Spatial Analyst/ Surface/ Contour Tools.
The DEM generated from the focal statistics was reclassified into
different risk levels based on variation of elevation values. The
depression in the DEM was filled and used to create the flow
direction map. The flow accumulation map was produced using
the flow direction data as input image. The stream network and
watershed were equally generated and the stream vectorized. The
reclassified DEM, stream network and vectorized land cover
classes were integrated and used to analyze the impact of flood on
the classes. The result shows that 27.86% of the area studied will
be affected at very high risk flood level, 35.63% at high risk,
17.90% at moderate risk, 10.72% at low risk, and 7.89% at no
risk flood level. Built up area class will be mostly affected at very
high risk flood level while farmland will be affected at high risk
flood level. Oshoro, Imhekpeme, and Weppa communities will be
affected at very high risk flood inundation while Ivighe, Uneme,
Igoide and Iviari communities will be at risk at high risk flood
inundation level. It is recommended among others that buildings
that fall within the “Very High Risk” area should be identified
and occupants possibly relocated to other areas such as the “No
Risk” area.
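The DEM reclassification step described above amounts to binning elevation values into risk classes. A minimal sketch, with invented threshold values (the study's actual class breaks are not given here):

```python
# Hypothetical elevation thresholds (metres) separating the risk classes;
# the study derived its breaks from the actual DEM value range.
RISK_BREAKS = [(50, "very high"), (100, "high"), (150, "moderate"),
               (200, "low")]

def risk_class(elevation_m: float) -> str:
    """Map an elevation value to a flood-risk class: lower ground, higher risk."""
    for limit, label in RISK_BREAKS:
        if elevation_m < limit:
            return label
    return "no risk"

dem_row = [32, 75, 120, 180, 240]   # one row of DEM cells (metres)
print([risk_class(z) for z in dem_row])
# ['very high', 'high', 'moderate', 'low', 'no risk']
```

In ArcMap the same operation is performed raster-wide by the Reclassify tool; the share of cells falling in each class then gives the area percentages reported in the abstract.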
Without water, humans cannot live. Since time began,
we have lived by the water and vast tracts of waterless land have
been abandoned as it is too difficult to inhabit. At any given
moment, the earth's atmosphere contains 4,000 cubic miles of water, about 0.0012% of the 344 million cubic miles of water on earth. Nature maintains this ratio via evaporation and condensation, irrespective of the activities of man.
There is a certain need for an alternative to solve the water
scarcity. Obtaining water from the atmosphere is nothing new -
since the beginning of time, nature’s continuous hydrologic cycle
of evaporation and condensation in the form of rain or snow has
been the sole source and means of regenerating wholesome water
for all forms of life on earth.
An effective method to generate water is by the separation of
moisture present in air by condensation. In this study, the water
present in air is condensed on the surface of a container and then
collected in an external jacket provided on the container.
Insulations are provided to optimize the inner temperature of the
container.
Although uncommon, the method has certain advantages that make it a success: the process is economical, does not require many utilities, and also helps in further reducing the carbon footprint.
At every moment of operation, a Li-Ion battery must provide the power required by the user, have a long operating life, and offer high reliability in operation. Battery analysis and testing methods ensure that all these conditions imposed on batteries are met, with tests chosen according to their intended use.
The success rate of real estate projects is decreasing as projects grow in scale and in the number of participating entities, so it is necessary to study the risk factors involved. This paper focuses on the types of risk involved in such projects, risk factors, and risk management tools and techniques. Identification of project risk, in terms of the total cost of the project, has been divided among Technical, Financial, Socio-political and Statutory cost centers. Large real estate projects have to tackle the following issues: land acquisition; skilled-labour shortage; non-availability of skilled project managers; mechanization of the construction process to cater to growing demands; non-availability of supporting infrastructure; political issues such as instability of the government leading to regulatory issues; and social issues. Marketing forms an important part of these projects, as this is a one-time investment with a long purchase cycle, and the long development period means the same project may sit at different points in the real estate value cycle.
In the present scenario, carbon emission and sand mining are major concerns due to their hazardous effects on the environment and the serious imbalance they cause in the ecosystem. Various studies have been conducted on reducing these severe environmental effects by using byproducts such as copper slag as a partial replacement for fine aggregate, and different researchers have examined the many uses of copper slag as a replacing agent and its effect on the strength of concrete. This paper presents a comprehensive review of studies on the scope for replacing fine aggregate with copper slag in concrete.
Security is a concept similar to being cautious or alert against any danger; network security is the condition of being protected against danger or loss. Safety thus plays an important role in bank transactions, where disclosure of any data results in a big loss. We can define networking as the combination of two or more computers for the purpose of resource sharing, where resources include files, databases, emails, etc. It is the protection of these resources from unauthorized users that brought about the development of network security: measures incorporated to protect data during transmission and to ensure that the transmitted data are protected and authentic. Here, the security of online bank transactions has been improved by increasing the number of bits used when establishing the SSL connection and in the RSA asymmetric-key encryption, along with SHA-1 used for the digital signature that authenticates the user.
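The sign-and-verify flow described above can be illustrated with SHA-1 and textbook RSA. The tiny primes below are a toy example for exposition only; real SSL connections use key sizes of 2048 bits or more, and nothing here is production cryptography:

```python
import hashlib

# Toy RSA key pair with made-up primes (for illustration only).
p, q = 61, 53                 # toy primes
n = p * q                     # modulus, 3233
e = 17                        # public exponent
d = 2753                      # private exponent: e*d = 1 (mod (p-1)(q-1))

def sign(message: bytes) -> int:
    """SHA-1 the message, then apply the private key to the digest."""
    h = int.from_bytes(hashlib.sha1(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Recompute the digest and compare against the 'decrypted' signature."""
    h = int.from_bytes(hashlib.sha1(message).digest(), "big") % n
    return pow(signature, e, n) == h

sig = sign(b"transfer 100 to account 42")
print(verify(b"transfer 100 to account 42", sig))   # True
print(verify(b"transfer 999 to account 42", sig))
```

Increasing the bit length of n (and the hash strength; SHA-1 itself is deprecated today) is exactly the hardening knob the abstract refers to.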
Background: Septoplasty is a common surgical
procedure performed by otolaryngologists for the correction of
deviated nasal septum. This surgery may be associated with
numerous complications. To minimize these complications,
otolaryngologists frequently pack both nasal cavities with
different types of nasal packing. Despite all its advantages,
nasal packing is also associated with some disadvantages. To
avoid these issues, many surgeons use suturing techniques to
obviate the need for packing after surgery.
Objective: To determine the efficacy and safety of trans-septal
suture technique in preventing complications and decreasing
morbidity after septoplasty in comparison with nasal packing.
Patients and methods: Prospective comparative study. This
study was conducted in the department of Otolaryngology -
Head and Neck Surgery, Rizgary Teaching Hospital - Erbil,
from the 6th of May 2014 to the 30th of November 2014.
A total of 60 patients aged 18-45 years, undergoing septoplasty,
were included in the study. Before surgery, patients were
randomly divided into two equal groups. Group (A) with transseptal
suture technique was compared with group (B) in which
nasal packing with Merocel was done. Postoperative morbidity
in terms of pain, bleeding, postnasal drip, sleep disturbance,
dysphagia, headache and epiphora along with postoperative
complications including septal hematoma, septal perforation,
crustation and synechiae formation were assessed over a follow
up period of four weeks.
Results: Out of 60 patients, 37 were males (61.7%) and 23 were females (38.3%). Patients with nasal packing had significantly more postoperative pain (P < 0.05). There was no significant difference between the two groups with respect to nasal bleeding, septal hematoma, septal perforation, crustation and synechiae formation.
Conclusion: Septoplasty can be safely performed using transseptal
suturing technique without nasal packing.
The basic reason behind the need to
monitor water quality is to verify whether the examined
water quality is suitable for intended usage or not. This
study is conducted on Al -Shamiya al- sharqi drain in
Diwaniya city in Iraq to make valid assessment for the
level of parameters measured and to realize their effects
on irrigation. In order to assess the drainage water
quality for irrigation purposes with a high accuracy, the
Irrigation Water Quality Index (IWQI) will be examined
and upgraded (integrated with GIS) to make a
classification for drainage water. For this purpose, ten samples of drainage water were taken from ten different locations in the study area. The collected samples were analyzed chemically for the different parameters which affect water quality for irrigation: Calcium (Ca2+), Sodium (Na+), Magnesium (Mg2+), Chloride (Cl-), Potassium (K+), Bicarbonate (HCO3-), Nitrate (NO3-), Sulfate (SO42-), Phosphate (PO43-), Electrical Conductivity (EC), Total Dissolved Solids (TDS), Total Suspended Solids (TSS) and pH. The Sodium Adsorption Ratio (SAR) and Sodium Content (Na%) have also been calculated. Results suggest that the use of
GIS and Water Quality Index (WQI) methods could
provide an extremely interesting as well as efficient tool
to water resource management. Analysis of the IWQI maps confirms that 52% of the drainage water in the study area falls within the "Low restriction" (LR) class and 47% within the "Moderate restriction" (MR) class, while 1% is classified as "Severe restriction" (SR). The drainage water should therefore be used with highly permeable soil, with some constraints imposed on the types of plant according to their specified salt tolerance.
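The SAR and Na% figures mentioned above follow standard formulas over the major-cation concentrations, expressed in milliequivalents per litre (meq/L). A sketch with invented sample values, not the study's data:

```python
import math

def sar(na, ca, mg):
    """Sodium Adsorption Ratio: SAR = Na / sqrt((Ca + Mg) / 2), all in meq/L."""
    return na / math.sqrt((ca + mg) / 2)

def sodium_percent(na, k, ca, mg):
    """Sodium content: Na% = 100 * (Na + K) / (Na + K + Ca + Mg)."""
    return 100 * (na + k) / (na + k + ca + mg)

print(round(sar(na=4.0, ca=3.0, mg=1.0), 2))            # 2.83
print(round(sodium_percent(4.0, 0.2, 3.0, 1.0), 1))     # 51.2
```

Higher SAR and Na% flag a sodium hazard for soil structure, which is why both indices feed into the irrigation-water classification alongside EC and TDS.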
The cable-hoisting method and the rail cable-lifting method are widely used in the construction of suspension bridges. Taking a suspension bridge in Hunan as an example, this paper expounds the two construction methods and analyzes their respective merits and disadvantages.
The Baylis-Hillman reaction has been achieved on different organic motifs, but with completion times of three to six days. A micellar medium of CTAB in water, along with the organic base DABCO, has been used to effect the Baylis-Hillman reaction on the steroidal nucleus of Withaferin-A for the first time with different aromatic aldehydes within a day, synthesizing a library of BH adducts (W1a-W14a) and (W1b-W14b) as mixtures of two isomers, and W15 as a single compound. The isomers were separated on a column, and the major components were chosen for bio-evaluation. The cytotoxic activity of the synthesized compounds was screened against a panel of four cancer cell lines (Lung A-549, Breast MCF-7, Colon HCT-116 and Leukemia THP-1), with 5-fluorouracil and Mitomycin-C as references. All the compounds exhibited promising activity against the screened cell lines and were found to possess enhanced activity relative to the parent compound. BH adducts with aromatic systems bearing methoxy and nitro groups were found to be more active.
This paper presents the details of an experimental investigation carried out to obtain the desired fresh properties of SCC. Tests were performed on various mixtures to obtain the required SCC. In the present research work, 15% of the cement was replaced with class F fly ash. The mortar mix was prepared by varying the quantities of water and sand; varying percentages of coarse aggregate were then added to the mortar to obtain the desired SCC.
The batteries used in electric and hybrid vehicles consist of several cells, with voltages between 3.6 V and 4.2 V, in series or parallel configurations to obtain the voltages required for the operation of a hybrid electric vehicle. Because the malfunction of a single cell affects the behavior of the entire battery pack, the main function of the BMS is to protect individual cells against over-discharge, overload or overheating. This is done by correctly balancing the cells. In addition, the BMS estimates the battery's state of charge.
This project aims at using the Phase Disposition Multi-Carrier Pulse Width Modulation (PD-MCPWM) technique to reduce leakage current in a transformerless cascaded multilevel inverter for PV systems. The advantages of the transformerless PV inverter topology are its simple structure, low weight and higher efficiency; however, this topology provides a path for leakage current to flow through the parasitic capacitance formed between the PV module and the ground. The modulation technique reduces the leakage current with the added advantage of not requiring any extra components.
Many people in Africa depend on water from rivers and boreholes, but the purity of drinking water from these sources remains questionable. The Mudzira River, the longest river in the village of Vimtim in Mubi North Local Government Area of Adamawa State, was studied from September to December 2012 to ascertain the suitability of its water for human consumption and other related uses. Five study points, inlet (A, B), middle (C) and outlet (D, E), were adopted for monitoring the physico-chemical parameters using standard procedures. The mean temperature values were A (25.00 °C), B (24.50 °C), C (25.50 °C), D (24.00 °C) and E (24.00 °C). Average pH values were A (8.00), B (7.87), C (8.20), D (8.37) and E (8.13). The average conductivity values (µS cm-1) were A (73.90), B (73.11), D (74.00) and E (73.80). The average total dissolved solids of the samples were A (17.10), B (17.10), C (20.00), D (21.64) and E (21.60). The average turbidity values were A (47.00), B (47.00), C (50.00), D (53.00) and E (50.00). Average total hardness values were A (20.00 mg/L), B (20.00 mg/L), C (24.00 mg/L), D (20.00 mg/L) and E (20.00 mg/L). The average chloride contents were A (12 mg/L), B (16 mg/L), C (12 mg/L), D (16 mg/L) and E (16 mg/L). The average calcium contents were A (0.3 mg/L), B (0.4 mg/L), C (0.3 mg/L), D (0.3 mg/L) and E (0.2 mg/L). The average magnesium contents were A (12 mg/L), B (16 mg/L), C (16 mg/L), D (12 mg/L) and E (12 mg/L). The lead content of the Mudzira River water was negligible. The mean coliform counts were A (4), B (3), C (6), D (7) and E (4). The values of the parameters studied were within the WHO/NAFDAC recommended standards, except for total coliform levels. In conclusion, Vimtim residents consuming untreated water from the Mudzira River are potentially exposed to acute, sub-chronic or even chronic waterborne diseases such as typhoid fever, dysentery and diarrhea.
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
How world-class product teams are winning in the AI era by CEO and Founder, P...
Characterization Techniques of Metamaterials
International Journal of Technical Research and Applications, e-ISSN: 2320-8163,
www.ijtra.com, Volume 2, Issue 1 (Jan-Feb 2014), pp. 45-48
Characterization Techniques of Metamaterials
Mohit Anand
Department of Electrical Communication,
Indian Institute of Science,
Bangalore-560012, India
Abstract—This paper surveys the numerous techniques that
have been proposed over the years for metamaterial
characterization. These techniques are categorized into
analytical, field averaging, and experimental methods, each
providing ways to determine the complex permittivity,
complex permeability, and refractive index of metamaterials.
Index Terms—Metamaterials (MTM), Double negative materials
(DNG), Bi-anisotropy.
I. INTRODUCTION
Material characterization uses specialized techniques and
methods to investigate and describe the properties of a
material. The most general description of a linear medium
requires the determination of 36 frequency-dependent, complex
constitutive parameters. Bi-anisotropic media are the most
general linear media; their four material dyadics together
comprise the 36 parameters responsible for magneto-electric
behavior. Anisotropic, bi-isotropic, and isotropic media can be
considered subclasses of bi-anisotropic materials with 18, 4,
and 2 degrees of freedom, respectively. The extraction of this
large number of coefficients is complicated and not required
for most practical structures.
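The 36-parameter count follows from the bi-anisotropic constitutive relations, which (in one common convention) read:

```latex
\mathbf{D} = \bar{\bar{\varepsilon}}\cdot\mathbf{E} + \bar{\bar{\xi}}\cdot\mathbf{H},
\qquad
\mathbf{B} = \bar{\bar{\zeta}}\cdot\mathbf{E} + \bar{\bar{\mu}}\cdot\mathbf{H}
```

Each of the four material dyadics is a 3×3 complex matrix, giving 4 × 9 = 36 frequency-dependent parameters; for anisotropic media the magneto-electric dyadics vanish, leaving the 18 parameters of the permittivity and permeability dyadics.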
Fig. 1. Different types of materials.
Metamaterials provide properties that usually cannot be
found in nature, for example, simultaneously negative values
of permittivity, permeability, and refractive index. These
properties have found many applications in antennas, optical
communication, radar, microwaves, and biomedical
engineering.
The complete characterization of the effective medium is
obtained by retrieving all the components of the permeability
and permittivity tensors and accounting for both spatial and
temporal dispersion. The capability of characterizing the
anisotropy is especially useful in the design and realization of
metamaterials, which requires full control of all the
components of the constitutive dyadic parameters [1].
II. ANALYTICAL METHOD
Nowadays, an impressively large number of powerful
characterization techniques are used to solve analytical
research problems, especially those related to investigating
the properties of new materials for advanced applications.
Analytical models give insight into the relationship between
the physical properties and the geometrical characteristics of
metamaterials with complicated structures. Metamaterials can
be analyzed at two levels of abstraction:
Microstructures: described as a single unit cell of a periodic
lattice. The unit-cell analysis is based on the following three
steps:
(a) Extraction of effective constitutive parameters from the
scattering matrix with the method of parameter fitting of
dispersive models.
(b) Analysis of the dispersion diagrams obtained by the
solution of a periodic-boundary eigenvalue problem.
(c) Higher-order mode analysis based on the simulation of a
multimode scattering matrix.
The above-mentioned steps provide, respectively:
(a) Effective electric permittivity and magnetic permeability in
the frequency range of interest.
(b) Band structure of the periodic lattice with the frequency
range and type (forward or backward) of the propagating
mode.
(c) Frequency band where the homogenized model of the
metamaterial lattice is not valid due to the significant
contribution of higher-order modes.
Macrostructures: The macrostructure approach allows for the
observation, prediction, and visualization of various
interesting phenomena that are characteristic of metamaterials
and predicted from unit-cell-level simulations.
The macrostructure can be simulated in two forms:
(a) as a homogeneous effective structure, or
(b) as a rigorous detailed lattice.
Effective medium theory was proposed by Lord Rayleigh
in 1892 and further studied and modified by Lorentz,
Maxwell-Garnett, and Bruggeman; it is used to characterize
composite media. This theory compares the average field
propagating inside a composite medium with the field
propagating inside a homogeneous medium to derive the
electrical characteristics of the medium. The two main
techniques to approximate the effective permittivity and
conductivity of a composite medium are:
A. Bruggeman's Model: Effective medium theories and
approximations are developed by averaging the multiple
values of the constituents that make up the composite
material [2]. The Bruggeman approximation assumes that:
- particles can still be considered spherical or ellipsoidal;
- particles experience an ‘average dielectric environment’.
The Bergman model [3] approximates the effective permittivity
as:

δ (εi − εeff)/(εi + (n − 1) εeff) + (1 − δ)(εm − εeff)/(εm + (n − 1) εeff) = 0   (1)

and, for ellipsoidal inclusions, with the (n − 1) factor replaced
by the depolarization factors:

δ (εi − εeff)/(εeff + L (εi − εeff)) + (1 − δ)(εm − εeff)/(εeff + L (εm − εeff)) = 0   (2)

n — Euclidean spatial dimension
δ — fraction of each component
εeff — effective permittivity of the medium
εm — permittivity of the matrix
εi — permittivity of the ellipsoidal/spherical inclusions
L — appropriate doublet/triplet of depolarization factors,
which is governed by the ratios between the axes of the
ellipsoid.
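As an illustration, here is a minimal Python sketch of the two-phase Bruggeman approximation in three dimensions (n = 3, spherical inclusions), using the closed-form root of the quadratic that the mixing rule reduces to; the function name is illustrative:

```python
import cmath

def bruggeman_eps(eps_m, eps_i, delta_i):
    """Two-phase Bruggeman effective permittivity in 3D (spherical inclusions).

    Solves delta_i*(eps_i - e)/(eps_i + 2e) + (1 - delta_i)*(eps_m - e)/(eps_m + 2e) = 0,
    which reduces to the quadratic 2e^2 - b*e - eps_i*eps_m = 0.
    """
    delta_m = 1.0 - delta_i
    b = (3 * delta_i - 1) * eps_i + (3 * delta_m - 1) * eps_m
    # The '+' root is the physical one for dielectric mixtures: it reduces
    # to eps_m as delta_i -> 0 and to eps_i as delta_i -> 1. For strongly
    # lossy constituents the branch must be chosen so that Im(e) <= 0 (or
    # >= 0, depending on the time convention).
    return (b + cmath.sqrt(b**2 + 8 * eps_i * eps_m)) / 4
```

Note that, unlike Maxwell-Garnett, the Bruggeman rule treats both constituents symmetrically, which is why it captures percolation behavior at intermediate volume fractions.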
B. Maxwell-Garnett Model:
This technique of effective medium approximation [4]
consists of a matrix medium with permittivity εm and
inclusions with permittivity εi.
(εeff − εm)/(εeff + 2 εm) = δi (εi − εm)/(εi + 2 εm)   (3)

which solves explicitly to:

εeff = εm [εi + 2 εm + 2 δi (εi − εm)] / [εi + 2 εm − δi (εi − εm)]   (4)

εeff — effective permittivity of the medium
εm — permittivity of the matrix medium
εi — permittivity of the inclusions
δi — volume fraction of the inclusions
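The standard Maxwell-Garnett mixing rule for spherical inclusions has a closed form, so a sketch is a one-liner (the function name is illustrative; inputs may be complex):

```python
def maxwell_garnett_eps(eps_m, eps_i, delta_i):
    """Maxwell-Garnett effective permittivity for spherical inclusions
    of permittivity eps_i, volume fraction delta_i, in a matrix eps_m."""
    num = eps_i + 2 * eps_m + 2 * delta_i * (eps_i - eps_m)
    den = eps_i + 2 * eps_m - delta_i * (eps_i - eps_m)
    return eps_m * num / den
```

Because the rule is asymmetric in matrix and inclusion, it is accurate only for dilute inclusions; at high volume fractions the Bruggeman approximation is usually preferred.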
C. Higher order mode analysis: The homogenized effective
description and dispersion diagrams constitute valuable tools
that allow one to predict properties of a metamaterial from the
analysis of its single unit cell. However, the frequency ranges
of stop-bands and pass-bands obtained by eigenmode-solver
simulations do not always match the spectral bands of the
effective parameters retrieved from transmission and
reflection coefficients [5] [6] [7].
D. Bloch mode analysis: Metamaterials occupy a special niche
between homogeneous media and photonic crystals: they do
not rigorously satisfy the effective-medium limit and are
located conceptually between the two. For that reason, Bloch
analysis and the computation of band structures constitute
important tools in the modeling of metamaterials [8].
E. Polarization and susceptibility Technique: This method was
proposed in [9]; it uses the fact that the permittivity and
permeability describe the interaction of a material with electric
and magnetic fields. This behavior depends on the ability of the
electromagnetic fields to polarize the particles in that material.
Thus, in this method, the susceptibility of the material is used
to extract the effective material parameters (permittivity and
permeability).
F. Circuit analysis Technique: This technique was proposed in
[9]; it utilizes the circuit model of the material and uses those
values to extract the material parameters from electromagnetic
field equations. These field equations are mapped from the
circuit telegrapher’s equations via Ampere’s law and the
definition of potential. The permittivity and permeability
therefore relate directly to the per-unit-length capacitance and
inductance of a line [10].
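As a rough sketch of this mapping, assuming a TEM line whose geometric factor is unity (so that μ = L′ and ε = C′ up to the vacuum constants; the function name is illustrative):

```python
import math

MU0 = 4e-7 * math.pi     # vacuum permeability, H/m
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def line_to_material(L_per_m, C_per_m):
    """Map per-unit-length inductance L' and capacitance C' of a TEM line
    to effective relative permeability/permittivity (unit geometric factor)."""
    mu_r = L_per_m / MU0
    eps_r = C_per_m / EPS0
    z0 = math.sqrt(L_per_m / C_per_m)         # characteristic impedance, ohms
    v_p = 1.0 / math.sqrt(L_per_m * C_per_m)  # phase velocity, m/s
    return mu_r, eps_r, z0, v_p
```

The same picture underlies [10]: periodically loading a line with series capacitance and shunt inductance flips the signs of the effective L′ and C′, yielding a backward-wave (negative-index) line.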
III. FIELD AVERAGING METHODS
Field averaging is a popular approach for the extraction of
metamaterial constitutive parameters from transmission and
reflection characteristics. Appropriate averages of the local
fields are obtained from analytic or full-wave analysis within
the unit cell. This method provides accurate results when the
period of the unit cell is very small compared to the
wavelength. The main drawback of this technique is that it
does not take spatial dispersion into account and may fail to
correctly predict the scattering from a metamaterial slab.
Three versions of field averaging are available: Pendry’s,
Smith’s, and Acher’s field averaging methods. These
approaches are particularly valuable for the study of
metamaterials with gradient properties that can be modeled
in electromagnetic simulation software [11] [12] [13].
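Schematically, the retrieval reduces to ratios of averaged fields over the unit cell: in the Smith-Pendry scheme [11], E is averaged along cell edges and D over cell faces. A minimal sketch (the sampled values would come from an analytic or full-wave solution; names are illustrative):

```python
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def averaged_eps_r(d_face_samples, e_edge_samples):
    """Field-averaging retrieval along one axis of the unit cell:
    face-averaged D divided by EPS0 times edge-averaged E.
    Samples may be complex phasors from a frequency-domain solver."""
    d_avg = sum(d_face_samples) / len(d_face_samples)
    e_avg = sum(e_edge_samples) / len(e_edge_samples)
    return d_avg / (EPS0 * e_avg)
```

The analogous ratio of face-averaged B to edge-averaged H yields the effective permeability along that axis.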
IV. EXPERIMENTAL CHARACTERIZATION METHOD
Metamaterials can be experimentally characterized by the
following methods:
A. Nicolson-Ross-Weir Technique (NRW): This is the most
popular approach for extracting metamaterial constitutive
parameters from transmission and reflection characteristics,
and it is commonly used in laboratories as an experimental
way to find the effective parameters of a material sample
under test. The technique uses scattering parameters to
derive expressions for the impedance and admittance of
structures. These values determine the reflection and
transmission coefficients, from which the index of
refraction and wave vector, and in turn the permittivity and
permeability, are obtained [14] [15] [16].
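A minimal Python sketch of NRW extraction at normal incidence in free space (TEM, so no waveguide cutoff term), restricted to the principal logarithm branch; the forward slab model is included to generate test data, and all names are illustrative:

```python
import cmath

def slab_s_params(eps_r, mu_r, d, lam0):
    """Forward model: S11, S21 of a homogeneous slab of thickness d in
    free space at normal incidence (free-space wavelength lam0)."""
    n = cmath.sqrt(eps_r * mu_r)                  # refractive index
    z = cmath.sqrt(mu_r / eps_r)                  # normalized wave impedance
    r = (z - 1) / (z + 1)                         # single-interface reflection
    t = cmath.exp(-2j * cmath.pi * n * d / lam0)  # one-pass propagation factor
    den = 1 - r**2 * t**2
    return r * (1 - t**2) / den, t * (1 - r**2) / den

def nrw_extract(s11, s21, d, lam0):
    """NRW retrieval of (mu_r, eps_r). Principal log branch only, i.e.
    valid for electrically thin samples (n*d < lam0/2); thicker samples
    need branch tracking to resolve the ambiguity in the logarithm."""
    x = (s11**2 - s21**2 + 1) / (2 * s11)
    gamma = x + cmath.sqrt(x**2 - 1)
    if abs(gamma) > 1:                            # pick the physical root, |gamma| <= 1
        gamma = x - cmath.sqrt(x**2 - 1)
    t = (s11 + s21 - gamma) / (1 - (s11 + s21) * gamma)
    n = lam0 * cmath.log(1 / t) / (2j * cmath.pi * d)  # refractive index
    z = (1 + gamma) / (1 - gamma)                 # normalized impedance
    return n * z, n / z                           # (mu_r, eps_r)
```

The classical NRW procedure also becomes ill-conditioned when the sample thickness is a multiple of a half wavelength (S11 approaches zero), which is one source of the non-physical retrieval artifacts mentioned in the conclusion.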
B. Nicolson-Ross-Weir Variant Technique (NRW-
Variant): This technique is similar to the NRW technique,
but the relations used to extract the impedance and the index
of refraction are different [17]. In this method, the reflection
coefficient, transmission coefficient, and wave vector of the
incident wave are first determined; from these, the complex
refractive index is computed, and from the refractive index
and impedance the values of effective permittivity and
effective permeability are determined. Because the explicit
relations result in an unambiguous handedness of the
resulting material, this technique is called the NRW-variant
technique.
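One common form of such explicit relations, retrieving impedance and refractive index directly from the S-parameters of a slab (a sketch consistent with the retrieval in [17], on the principal log branch, with illustrative names):

```python
import cmath

def retrieve_z_n(s11, s21, d, lam0):
    """Retrieve normalized impedance z and refractive index n of a slab of
    thickness d from normal-incidence S-parameters (free-space wavelength lam0)."""
    z = cmath.sqrt(((1 + s11)**2 - s21**2) / ((1 - s11)**2 - s21**2))
    if z.real < 0:                           # a passive medium requires Re(z) >= 0
        z = -z
    t = s21 / (1 - s11 * (z - 1) / (z + 1))  # equals exp(-j*2*pi*n*d/lam0)
    n = 1j * lam0 * cmath.log(t) / (2 * cmath.pi * d)
    return z, n                              # then eps_r = n / z, mu_r = n * z
```

The sign choice on z fixes the handedness of n unambiguously, which is the property that gives the variant its name.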
C. Resonator Method: This method provides high
accuracy but is narrowband in nature; thus, an individual
measurement setup is required for the retrieval of the
constitutive parameters [18].
D. Free Space Method: This method requires very
expensive setups.
E. Waveguide Toolkit Method: This method is based on
the transmission and reflection coefficients of the structure.
It also requires expensive test setups.
F. Microstrip Topology: This method is low-cost and
wideband, but it works only if the effective permeability is
positive and hence is not suitable for DNG materials.
G. Stripline Topology: This method is low-cost and
wideband, and it works for all values of effective
permittivity, hence it is suitable for DNG materials.
V. CONCLUSION
In conclusion, we have seen that the exact evaluation of the
refractive index and constitutive parameters (permittivity,
permeability, etc.) of metamaterials is in fact very difficult, and
the classical methods of retrieving the constitutive parameters
have often led to non-physical results. By appropriately
modeling the spatial dispersion and anisotropy, we can
accurately predict the metamaterial constitutive parameters
using suitable techniques.
ACKNOWLEDGMENT
This work has been supported by Space Applications
Centre, Ahmedabad. The author wishes to thank Mr. K. J.
Vinoy, Associate Professor, Department of Electrical
Communication, Indian Institute of Science, Bangalore, for
useful discussions, valuable comments and guidance.
REFERENCES
[1] J.B. Pendry, D. Schurig, and D.R. Smith, “Controlling
electromagnetic fields”, Science, vol. 312, pp. 1780–1782, June,
2006.
[2] R. Landauer, “Electrical conductivity in inhomogeneous media,”
AIP Conf. Proc. 40, 2, 1978.
[3] D.J. Bergman and K.J. Dunn, “Bulk effective dielectric constant
of a composite with a periodic microgeometry”, Physical
Review B, Condens. Matter, vol. 45 no. 23, pp. 13262-13271,
1992.
[4] J. Jamnik , J.R. Kalnin , E.A. Kotomin , J. Maier, “Generalised
Maxwell-Garnett equation: application to electrical and chemical
transport.,” Phys Chem Chem Phys. 2006 Mar 21;8(11):1310-4.
Epub, Feb, 2006.
[5] S. Foteinopoulou and C.M. Soukoulis, “Negative refraction and
left-handed behavior in two-dimensional photonic crystals,”
Physical Review B, 67:235107(1–5), March, 2003.
[6] B. Gralak, S. Enoch, and G. Tayeb, “Anomalous refractive
properties of photonic crystals,” Journal of the Optical Society of
America A, 17(6):1012–1020, June, 2000.
[7] D. Seetharamdoo, R. Sauleau, K. Mahdjoubi, and A.-C.
Tarot, “Effective parameters of resonant negative refractive index
metamaterials: interpretation and validity,” Journal of Applied
Physics, 98:063505(1–4), September, 2005.
[8] A. Andryieuski, S. Ha, Andrey A. Sukhorukov, Y. S. Kivshar, A.
V. Lavrinenko, “ Bloch-mode analysis for retrieving effective
parameters of metamaterials,” Phys. Rev. B 86, 035127 , 2012
[9] D. R. Smith, D.C. Vier, N. Kroll, and S. Schultz, “Direct
calculation of permittivity and permeability for a left-handed
metamaterial.” Applied Physics Letters 77(14), pp.2246-2248,
October, 2000.
[10] George V. Eleftheriades, Ashwin K. Iyer, and Peter C. Kremer,
“Planar negative refractive index media using periodically l-c
loaded transmission lines.” IEEE Transactions on Microwave
Theory and Techniques. 50(12): p.2702-2712, December, 2002.
[11] D.R. Smith, J.B. Pendry, “Homogenization of metamaterials by
field averaging”, Journal Optical Soc. Am., vol. 23, no. 3, pp.
391-403, 2006.
[12] D. R. Smith,S. Schultz, P. Markos, and C.M. Soukoulis,
”Determination of effective permittivity and permeability of
metamaterials from reflection and transmission coefficients.”
Physical Review B. 65(195104):p.1-5,April, 2002.
[13] D. R. Smith, D.C. Vier,N. Kroll, and S. Schultz, “Direct
calculation of permittivity and permeability for a left-handed
metamaterial.” Applied Physics Letters 77(14):p.2246-
2248,October, 2000.
[14] James Baker-Jarvis, Eric J. Vanzura, and William A. Kissick.,
“Improved technique for determining complex permittivity with
the transmission/ reflection method.” IEEE Transactions on
Microwave Theory and Techniques. 38(8):p.1096-1103, August
1990.
[15] D.K. Ghodgaonakar,V.V. Varadan, and V.K. Varadan “Free
space measurement of complex permittivity and complex
permeability of magnetic materials at microwave frequencies,”
IEEE Transactions on Instrumentation and Measurement,39(2):
pp.387-394,April 1990.
[16] Richard W. Ziolkowski, “Design fabrication, and testing of
double negative metamaterials.” IEEE Transactions on Antenna
and Propagation. 51(7):p.1516-1529, July, 2003.
[17] D. R. Smith,S. Schultz, P. Markos, and C.M. Soukoulis,
“Determination of effective permittivity and permeability of
metamaterials from reflection and transmission coefficients .”
Physical Review B. 65(195104):p.1-5, April, 2002.
[18] Li, J.N, “Practical method for determining inductance and
capacitance of metamaterial resonators,” Electronics Letters, vol.
48, pp.225-227, February, 2012.