The present study analyzes water-alcohol separation by pervaporation through polymer membranes, modeled with an Artificial Neural Network and COMSOL Multiphysics. The influence of parameters such as volumetric flow rate, temperature, separation factor, and permeate flux on the efficiency of the dehydration process was analyzed with the Artificial Neural Network. The researchers used a feed-forward multilayer perceptron with a backpropagation algorithm and the Levenberg-Marquardt training function, with two inputs and two outputs. The tansig transfer function was used for the hidden layer and purelin for the output layer; five neurons were defined for the hidden layer. After data preprocessing, 70 percent of the data was allocated for training, 15 percent for validation, and 15 percent for testing. The outputs of the Artificial Neural Network model were compared with the measured pervaporation values for the separation of water from ethanol, acetone, and butanol. The results revealed that the proposed model performed well. Moreover, the COMSOL outputs for pervaporation of five different solvents were compared with the measured values, and the error percentage between the actual and modeled flux was calculated for the corresponding membranes. The COMSOL modeling gave error percentages of 3.049, 3.7, 3.51, 2.88, and 3.82 for the dehydration of acetone, butanol, ethanol, isopropanol, and methanol, respectively.
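As a concrete illustration of the network described above, here is a minimal sketch in Python/NumPy. The layer sizes and transfer functions (tanh standing in for tansig, identity for purelin) follow the abstract; the weights, the example feed conditions, and the assignment of physical quantities to inputs and outputs are placeholders, since the fitted parameters are not given in the abstract.

```python
import numpy as np

# Architecture from the abstract: 2 inputs -> 5 hidden neurons (tansig/tanh)
# -> 2 outputs (purelin/linear). The weights below are random placeholders;
# in the study they were fitted with Levenberg-Marquardt backpropagation.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(5, 2)), rng.normal(size=5)   # hidden layer
W2, b2 = rng.normal(size=(2, 5)), rng.normal(size=2)   # output layer

def mlp(x):
    """Forward pass: tanh hidden layer, linear output layer."""
    h = np.tanh(W1 @ x + b1)          # tansig transfer function
    return W2 @ h + b2                # purelin transfer function

def percent_error(actual, predicted):
    """Error percentage used to compare modeled flux with the measured value."""
    return 100.0 * abs(actual - predicted) / abs(actual)

# Hypothetical feed conditions: [volumetric flow rate, temperature]
y = mlp(np.array([1.5, 60.0]))        # -> [separation factor, permeate flux]
```

With trained weights in place of the random ones, `percent_error` reproduces the kind of error figures the abstract reports for each membrane.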
Study of Membrane Transport for Protein Filtration Using Artificial Neural Ne... - IJERDJOURNAL
ABSTRACT: Artificial Neural Networks (ANNs) are nonlinear mapping structures that function similarly to the human brain. They are especially powerful for modeling when the underlying data relationship is not known. ANNs can recognize and learn correlated patterns between input data sets and corresponding target values. After training, ANNs can be used to predict the output for new independent input data. ANNs are therefore well suited to modeling membrane processes such as ultrafiltration and microfiltration, allowing the permeate flux and membrane rejection to be estimated as functions of the process variables. The aim of modeling membrane transport for protein filtration is to analyze membrane systems by means of ANNs; to this end, different ANNs are developed in MATLAB. [1a][9]
Preparation and characterization of nimesulide loaded cellulose acetate hydro... - Jing Zang
This document summarizes a study that prepared nimesulide-loaded cellulose acetate hydrogen phthalate nanoparticles using the salting out technique. The effect of drug concentration and polymer concentration on nanoparticle size, shape, and uniformity was investigated. Increasing the polymer concentration decreased nanoparticle size and improved uniformity. Drug concentration did not affect size. Nanoparticles were characterized using SEM, zeta potential analysis, and photon correlation spectroscopy. The mean nanoparticle size was 548.2 nm with a zeta potential of -19.8 mV, indicating stability.
This document discusses the techniques of correlated cryo-light microscopy (CLM) and soft X-ray tomography (SXT) for visualizing cell architecture and molecular location. CLM provides high-resolution fluorescent imaging of cells at cryogenic temperatures to localize molecules, while SXT uses soft X-rays to visualize internal cell structure. The two techniques are performed sequentially on the same sample and the data is merged to provide a composite 3D view of both cell structure and molecular location within cells. Specimen preparation, 3D reconstruction, alignment using fiducial markers, and segmentation based on linear absorption coefficients are discussed.
THREE DIMENSIONAL ELECTRON MICROSCOPY AND IN SILICO - Chetan Meena
This document provides a review of recent advancements in three dimensional electron microscopy (3DTEM) and associated tools. It discusses specimen preparation techniques for 3DTEM like negative staining, cryo-freezing, and cryo-negative staining. It also describes single particle analysis, the process of reconstructing 3D models from 2D particle images. Important in silico tools are discussed that aid 3DTEM, including those for generating electron density maps, contrast transfer function correction, resolution measurement, and B-factor correction. Visualization software for electron density maps generated by single particle analysis are also summarized.
The document provides an overview of characterization techniques for nanoparticles. It discusses how characterization refers to studying the features, composition, structure and properties of materials. Nanoparticles are defined as particles between 1 to 100 nanometers in at least one dimension. Their small size results in unique physical, chemical and biological properties compared to bulk materials. A variety of characterization techniques are described including optical microscopy techniques like dynamic light scattering, electron microscopy techniques like scanning electron microscopy, and other methods like photon spectroscopy. The techniques allow analyzing properties of nanoparticles like size, shape, structure and chemical composition.
Methods of Protein structure determination - EL Sayed Sabry
This document summarizes several methods for determining protein structure: X-ray crystallography, nuclear magnetic resonance spectroscopy, and cryo-electron microscopy. X-ray crystallography involves growing protein crystals, exposing them to X-rays to generate diffraction patterns, and using the patterns to build 3D electron density maps of the protein. Nuclear magnetic resonance spectroscopy measures distances between atomic nuclei in soluble proteins by analyzing spectra from radiofrequency pulses applied in strong magnetic fields. Cryo-electron microscopy images frozen, hydrated protein samples with an electron microscope to determine large protein structures without the need for crystallization.
The document summarizes research on the effect of the speed of the support material on the structure of electrospun polyamide 6.6 (PA6.6) nanofiber webs. Nanofiber webs were produced at three different speeds and analyzed. It was found that decreasing the speed, and thus increasing the covering time, resulted in thicker nanofibers being formed. While the average fiber diameter did not change significantly, the distribution of fiber diameters shifted to include more large fibers. The document proposes using parameters such as the percentage of fibers in the first distribution peak and the average diameter of the two main peaks as better ways to characterize nanofiber web structure than the average diameter alone.
Design of fragment screening libraries (Feb 2010 version) - Peter Kenny
I have lectured on design of fragment screening libraries a number of times and, to be honest, my material is getting a bit dated. This presentation is from Feb 2010 when I was visiting CSIRO and the photo in the title slide was taken in Tierra del Fuego.
Study of magnetic and structural and optical properties of Zn doped Fe3O4 nan... - Nanomedicine Journal (NMJ)
Objective(s):
This paper describes the synthesis of a magnetic nanocomposite by the co-precipitation method.
Materials and Methods:
Magnetic ZnxFe3-xO4 nanoparticles with 0-14% zinc doping (x=0, 0.025, 0.05, 0.075, 0.1 and 0.125) were successfully synthesized by co-precipitation method. The prepared zinc-doped Fe3O4 nanoparticles were characterized by X-ray diffraction (XRD), transmission electron microscopy (TEM), Fourier transform infrared spectroscopy (FTIR), vibrating sample magnetometer (VSM) and UV-Vis spectroscopy.
Results:
Results obtained from the X-ray diffraction patterns revealed the formation of single-phase nanoparticles with a cubic inverse spinel structure, with sizes varying from 11.13 to 12.81 nm. The prepared nanoparticles also showed superparamagnetic behavior at room temperature and a high saturation magnetization, with a maximum of 74.60 emu/g for x=0.075. The change in Ms of pure magnetite nanoparticles after the addition of impurities was explained by two factors: "particle size" and "exchange interactions". Optical studies revealed that the band gaps of all Zn-doped NPs are higher than that of pure Fe3O4; as the doping percentage increases, the band gap decreases from 1.26 eV to 0.43 eV.
Conclusion:
Because of their superparamagnetic properties, these magnetic nanocomposite structures offer high potential for biosensing and biomedical applications.
X-ray crystallography allows us to determine protein structures at the atomic level by visualizing the electron density map generated from X-ray diffraction data of protein crystals. Several steps are involved including growing high quality protein crystals, mounting crystals in the X-ray beam to collect diffraction data, solving the phase problem to produce an electron density map, building and refining an atomic model that fits the map, and validating the final protein structure. This technique provides insights into protein function and enables structure-based drug design.
This document discusses how nanotechnology can be used to detect cancer. It describes how nanoparticles like gold nanoparticles, quantum dots, and nanorobots could be engineered to target cancer cells for detection and treatment. Gold nanoparticles in particular can accumulate in tumors due to leaky vasculature and be used for imaging. Quantum dots consisting of materials like cadmium selenide can be conjugated to biomarkers to image different parts of cells. Future nanorobots may help distinguish cell types, deliver targeted radiation treatment, and be retrieved after their task is complete, representing a promising new approach to medicine.
The document discusses various biomolecular interaction analysis (BIA) techniques for studying interactions between biomolecules and small molecules. It begins with an introduction to BIA and its importance for understanding biology, drug discovery, and diagnostics. The document then outlines and describes several BIA techniques categorized as biochemistry and biophysics methods, molecular methods, computer-aided techniques, and novel creative approaches. Specific techniques discussed in detail include fluorescence spectroscopy, circular dichroism spectroscopy, nuclear magnetic resonance spectroscopy, mass spectrometry, and isothermal titration calorimetry. The document provides information on the principles, applications, advantages, and limitations of each technique.
1) The document discusses various methods for determining the 3D structure of proteins, including x-ray crystallography, NMR spectroscopy, and cryo-electron microscopy.
2) X-ray crystallography involves purifying the protein, crystallizing it, collecting diffraction data from x-rays hitting the crystal, using this data to determine phases and calculate an electron density map, and building an atomic model through refinement.
3) NMR spectroscopy involves dissolving the purified protein and using nuclear magnetic resonance to measure distances between atomic nuclei, allowing the structure to be calculated.
This document provides an overview of protein structure analysis tools and techniques:
1) It describes exploring the Protein Data Bank (PDB) to view and analyze X-ray crystallography and NMR protein structures, comparing similar structures, and using tools like FoldX for in silico mutagenesis and homology modeling.
2) Key concepts covered include PDB file formats, atomic coordinates, B-factors, resolution, RMSD, and the principles of X-ray crystallography, NMR structure determination, and homology modeling.
3) Visualization software like YASARA, SwissPDBViewer and PyMOL are introduced for viewing protein structures from the PDB.
Turbidity is a measure of water quality. Excessive turbidity poses a threat to health and causes pollution. Most available mathematical models of water treatment plants do not capture turbidity, yet a reliable model is essential for its effective removal. This paper presents a comparison of Hammerstein-Wiener and neural network techniques for estimating turbidity in a water treatment plant. The models were validated using experimental data from the Tamburawa water treatment plant in Kano, Nigeria. Simulation results demonstrated that the neural network model outperformed the Hammerstein-Wiener model in estimating turbidity. The neural network model may serve as a valuable tool for predicting turbidity in the plant.
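For readers unfamiliar with the Hammerstein-Wiener structure compared in that study: it sandwiches a linear dynamic block between two static nonlinearities. Below is a minimal sketch with an assumed quadratic input nonlinearity, an assumed first-order linear block, and an assumed tanh output nonlinearity; none of these choices are the identified plant model from the paper.

```python
import numpy as np

# A Hammerstein-Wiener model chains: input nonlinearity f -> linear dynamics
# -> output nonlinearity g. The specific blocks below (quadratic f,
# first-order filter, saturating g) are illustrative placeholders.
def f(u):                      # static input nonlinearity
    return u + 0.1 * u**2

def g(x):                      # static output nonlinearity
    return np.tanh(x)

def simulate(u_seq, a=0.8, b=0.2):
    """Linear block x[k] = a*x[k-1] + b*f(u[k]), then output y[k] = g(x[k])."""
    x, ys = 0.0, []
    for u in u_seq:
        x = a * x + b * f(u)
        ys.append(g(x))
    return np.array(ys)

y = simulate(np.ones(50))      # step response of the sketched model
```

Identification then amounts to fitting the parameters of all three blocks to input-output data, which is what the comparison with a neural network model evaluates.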
Neural Network Model Development with Soft Computing Techniques for Membrane ... - IJECEIAES
Membrane bioreactors employ an efficient filtration technology for solid-liquid separation in wastewater treatment. Developing a membrane filtration model is significant because it can be used to predict filtration dynamics, which are later utilized in control development. Most available models are suitable only for monitoring: they are too complex, require many variables, and are not suitable for control system design. This work focuses on a simple time-series model of the membrane filtration process using a neural network technique. In this paper, a submerged membrane filtration model is developed using a recurrent neural network (RNN) trained with a genetic algorithm (GA), inertia weight particle swarm optimization (IWPSO), and a gravitational search algorithm (GSA). These optimization algorithms are compared in terms of accuracy and convergence speed when updating the weights and biases of the RNN for the optimal filtration model. The models are evaluated using three performance measures: mean square error (MSE), mean absolute deviation (MAD), and coefficient of determination (R2). From the results obtained, all methods yield satisfactory models, with the best results given by IWPSO.
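The three evaluation metrics named in that abstract have standard definitions, sketched here in Python/NumPy for reference:

```python
import numpy as np

# Standard forms of the three metrics used to score the filtration models.
def mse(y, yhat):
    """Mean square error."""
    return np.mean((y - yhat) ** 2)

def mad(y, yhat):
    """Mean absolute deviation between measured and predicted values."""
    return np.mean(np.abs(y - yhat))

def r2(y, yhat):
    """Coefficient of determination: 1 minus residual over total variance."""
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot
```

A perfect model gives MSE = 0, MAD = 0, and R2 = 1; the paper's comparison ranks GA, IWPSO, and GSA by how close the trained RNN gets to these values.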
Protein structure determination and our software tools - Mark Berjanskii
Protein structure determination and our software tools. Presentation is related to: biochemistry, bioinformatics, biology, biophysics, mark berjanskii, molecular biology, molecular dynamics, molecular modeling, nmr spectroscopy, protein nmr, public speaking, python programming, sparse data, structural biology, structure determination, teaching, web design, web development, web programming, Wishart group, hybrid data, X-ray crystallography, CryoEM, Mass Spectrometry
DNA makes up only half the volume of chromosomes - Anatol Alizar
The document describes a study that used a technique called 3D-CLEM (correlative light and electron microscopy) to analyze the structure of mitotic chromosomes at high resolution. Some of the key findings include:
1) 3D-CLEM revealed that prophase chromosomes have a smaller volume than metaphase chromosomes, likely due to the absence of a chromosome periphery structure at that stage of mitosis.
2) The extra volume observed in metaphase chromosomes compared to prophase chromosomes can be almost entirely accounted for by the nucleolar volume.
3) Analysis of wild-type and Ki-67 depleted chromosomes found that the periphery structure comprises 30-47% of the entire chromosome volume and more than 33%
X-ray Crystallography & Its Applications in Proteomics - Akash Arora
X-ray crystallography is a technique that uses X-rays to determine the atomic structure of crystals. It involves crystallizing molecules and bombarding them with X-rays, which produce a diffraction pattern. This pattern is used to deduce the molecular structure. X-ray crystallography has many applications in proteomics, including determining protein structures, studying protein interactions, and elucidating enzyme catalysis mechanisms. It provides atomic-level insights that advance understanding of protein function.
Proteomics: definition, general concept, significance - KAUSHAL SAHU
INTRODUCTION
GENERAL CONCEPT
WHY IS PROTEOMICS NECESSARY?
WHAT CAN PROTEOMICS ANSWER?
PROTEOMICS - ANALYSIS AND IDENTIFICATION OF PROTEINS
TWO-DIMENSIONAL SDS-PAGE
MASS SPECTROMETERS
SIGNIFICANCE OF THE STUDY AND ITS IMPORTANCE
APPLICATIONS
CHALLENGES
CONCLUSIONS
REFERENCES
This document describes a new microscopy technique called SPRAIPAINT that enables three-dimensional superresolution imaging of intracellular protein structures and the cell surface in living bacteria. SPRAIPAINT uses single molecule localization microscopy to image enhanced yellow fluorescent protein (eYFP) fusions within the cell and a dye binding to the cell surface sequentially using a single laser. This allows colocalization of an intracellular cytoskeletal protein (Crescentin-eYFP) with the cell membrane in living Caulobacter crescentus bacteria with 20-40 nm precision over a large field of view. Imaging is done using a double-helix point spread function microscope for three-dimensional localization. SPRAIPAINT provides a
This document summarizes a study that uses coarse-grained molecular dynamics (MD) simulations to model the coordination between competing RNA base pairs. The study uses two RNA systems, each with two separated complementary strands, to test the coarse-grained model. For the first system, the simulation showed little conformational change, but the second system indicated a base pairing interaction as the RMSD sharply decreased partway through the simulation, suggesting the strands bent back together. The coarse-grained model reduces computational expenses compared to all-atom MD while optimizing accuracy and speed for simulating macromolecules like RNA, which could aid drug development efforts.
Formulation and evaluation of nanoparticles as a drug delivery systems - Tarun Kumar Reddy
Nanomaterials fall into a size range similar to proteins and other macromolecular structures found inside living cells. As such, nanomaterials are poised to take advantage of existing cellular machinery to facilitate the delivery of drugs. Nanoparticles containing encapsulated, dispersed, absorbed or conjugated drugs have unique characteristics that can lead to enhanced performance in a variety of dosage forms.
Analysis of Thermoplastic by using Spectroscopic Technique - NidaSehar
This document provides an overview of 6 spectroscopic techniques used to analyze thermoplastics: infrared spectroscopy, ultraviolet-visible spectroscopy, Raman spectroscopy, atomic absorption spectroscopy, X-ray fluorescence spectroscopy, and nuclear magnetic resonance spectroscopy. It describes the basic principles, instrumentation, and applications of each technique for identifying polymers, functional groups, fillers, and contaminants in thermoplastic samples.
This document discusses characterizing polymeric membranes under large deformations using an artificial neural network model. It presents an experimental study of blowing circular thermoplastic ABS membranes using the free-blowing technique. A multilayer neural network is used to model the nonlinear behavior of the membrane under biaxial deformation. The neural network results are compared to experimental data and to a finite difference model using a hyperelastic Mooney-Rivlin model. The neural network accurately reproduces the membrane behavior with minimal error margins compared to the experimental measurements.
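For reference, the two-parameter Mooney-Rivlin hyperelastic model mentioned above uses the standard strain-energy density, written here in terms of the principal stretches; the constants C10 and C01 are material parameters fitted to the inflation data.

```latex
% Two-parameter Mooney-Rivlin strain-energy density (standard form)
W = C_{10}\,(I_1 - 3) + C_{01}\,(I_2 - 3),
\qquad
I_1 = \lambda_1^2 + \lambda_2^2 + \lambda_3^2,
\quad
I_2 = \lambda_1^2\lambda_2^2 + \lambda_2^2\lambda_3^2 + \lambda_3^2\lambda_1^2
```

The finite difference model evaluates this energy for the biaxially stretched membrane, while the neural network learns the same stress-stretch relationship directly from the free-blowing measurements.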
Launching digital biology, 12 May 2015, Bremenbioflux
Intro. It is not a secret that in biology laboratories hours of manual work are considered a compulsory part of the experiment. During a day of work, lab researchers have to pipette the right amounts of fluids into tubes, carry them from one machine to another, program and handle each machine individually, label and document each step carefully, and then convert the results to data and analyse them. For a simple routine experiment, each of these tasks is performed at least 10 times per day. Over the past decade, a big effort has been made to produce machines (e.g., pipetting robots) that automate some of the tasks in the lab. However, these machines were developed under an industrial mindset to maximize the throughput of a single task. Thus, they are large, task-specific, difficult to use (they usually come with dedicated drivers and software) and, most importantly, extremely expensive. A solution is the use of digital microfluidics to enable the advance from automated biology to digital biology. In my vision, a digital lab should be:
• fully integrated, running all the tasks on the same machine
• easy to use, with a web-based software for biological design of new experiments and hardware control
• general-purpose, allowing easy reconfiguration and design of new experiments
• cheap, offering open-source and do-it-yourself assembly kits
Talk. In the talk, I will present an overview of the road to digital biology, covering all the main aspects, from computer-aided design to hardware production and biological applications.
Hands on. Also, prepare for some real engineering action :). I will execute live a biochemical application (enzymatic reaction of β-galactosidase with Xgal) on my homemade digital biochip. We will then discuss the current challenges in the development process and everyone will get a chance to play with the device. And of course, I will happily consider any engineering advice or idea you have :).
Design of fragment screening libraries (Feb 2010 version)Peter Kenny
I have lectured on design of fragment screening libraries a number of times and, to be honest, my material is getting a bit dated. This presentation is from Feb 2010 when I was visiting CSIRO and the photo in the title slide was taken in Tierra del Fuego.
Study of magnetic and structural and optical properties of Zn doped Fe3O4 nan...Nanomedicine Journal (NMJ)
Objective(s):
This paper describes synthesizing of magnetic nanocomposite with co-precipitation
method.
Materials and Methods:
Magnetic ZnxFe3-xO4 nanoparticles with 0-14% zinc doping (x=0, 0.025, 0.05, 0.075, 0.1 and 0.125) were successfully synthesized by co-precipitation method. The prepared zinc-doped Fe3O4 nanoparticles were characterized by X-ray diffraction (XRD), transmission electron microscopy (TEM), Fourier transform infrared spectroscopy (FTIR), vibrating sample magnetometer (VSM) and UV-Vis spectroscopy.
Results:
results obtained from X-ray diffraction pattern have revealed the formation of single phase nanoparticles with cubic inverse spinal structures which size varies from 11.13 to 12.81 nm. The prepared nanoparticles have also possessed superparamagnetic properties at room temperature and high level of saturation magnetization with the maximum level of 74.60 emu/g for x=0.075. Ms changing in pure magnetite nanoparticles after impurities addition were explained based on two factors of “particles size” and “exchange interactions”. Optical studies results revealed that band gaps in all Zn-doped NPs are higher than pure Fe3O4. As doping percent increases, band gap value decreases from 1.26 eV to 0.43 eV.
Conclusion:
These magnetic nanocomposite structures since having superparamagnetic property
offer a high potential for biosensing and biomedical application.
X-ray crystallography allows us to determine protein structures at the atomic level by visualizing the electron density map generated from X-ray diffraction data of protein crystals. Several steps are involved including growing high quality protein crystals, mounting crystals in the X-ray beam to collect diffraction data, solving the phase problem to produce an electron density map, building and refining an atomic model that fits the map, and validating the final protein structure. This technique provides insights into protein function and enables structure-based drug design.
This document discusses how nanotechnology can be used to detect cancer. It describes how nanoparticles like gold nanoparticles, quantum dots, and nanorobots could be engineered to target cancer cells for detection and treatment. Gold nanoparticles in particular can accumulate in tumors due to leaky vasculature and be used for imaging. Quantum dots consisting of materials like cadmium selenide can be conjugated to biomarkers to image different parts of cells. Future nanorobots may help distinguish cell types, deliver targeted radiation treatment, and be retrieved after their task is complete, representing a promising new approach to medicine.
The document discusses various biomolecular interaction analysis (BIA) techniques for studying interactions between biomolecules and small molecules. It begins with an introduction to BIA and its importance for understanding biology, drug discovery, and diagnostics. The document then outlines and describes several BIA techniques categorized as biochemistry and biophysics methods, molecular methods, computer-aided techniques, and novel creative approaches. Specific techniques discussed in detail include fluorescence spectroscopy, circular dichroism spectroscopy, nuclear magnetic resonance spectroscopy, mass spectrometry, and isothermal titration calorimetry. The document provides information on the principles, applications, advantages, and limitations of each technique.
1) The document discusses various methods for determining the 3D structure of proteins, including x-ray crystallography, NMR spectroscopy, and cryo-electron microscopy.
2) X-ray crystallography involves purifying the protein, crystallizing it, collecting diffraction data from x-rays hitting the crystal, using this data to determine phases and calculate an electron density map, and building an atomic model through refinement.
3) NMR spectroscopy involves dissolving the purified protein and using nuclear magnetic resonance to measure distances between atomic nuclei, allowing the structure to be calculated.
This document provides an overview of protein structure analysis tools and techniques:
1) It describes exploring the Protein Data Bank (PDB) to view and analyze X-ray crystallography and NMR protein structures, comparing similar structures, and using tools like FoldX for in silico mutagenesis and homology modeling.
2) Key concepts covered include PDB file formats, atomic coordinates, B-factors, resolution, RMSD, and the principles of X-ray crystallography, NMR structure determination, and homology modeling.
3) Visualization software like YASARA, SwissPDBViewer and PyMOL are introduced for viewing protein structures from the PDB.
Turbidity is a measure of water quality. Excessive turbidity poses a threat to health and causes pollution. Most of the available mathematical models of water treatment plants do not capture turbidity. A reliable model is essential for effective removal of turbidity in the water treatment plant. This paper presents a comparison of Hammerstein Wiener and neural network technique for estimating of turbidity in water treatment plant. The models were validated using an experimental data from Tamburawa water treatment plant in Kano, Nigeria. Simulation results demonstrated that the neural network model outperformed the Hammerstein-Wiener model in estimating the turbidity. The neural network model may serve as a valuable tool for predicting the turbidity in the plant.
Neural Network Model Development with Soft Computing Techniques for Membrane ...IJECEIAES
Membrane bioreactors employ an efficient filtration technology for solid and liquid separation in the wastewater treatment process. Development of a membrane filtration model is significant, as such a model can be used to predict filtration dynamics, which are later utilized in control development. Most of the available models are only suitable for monitoring purposes, are too complex, require many variables, and are not suitable for control system design. This work focuses on a simple time-series model for the membrane filtration process using a neural network technique. In this paper, a submerged membrane filtration model is developed using a recurrent neural network (RNN) trained using a genetic algorithm (GA), inertia weight particle swarm optimization (IWPSO), and a gravitational search algorithm (GSA). These optimization algorithms are compared in terms of their accuracy and convergence speed in updating the weights and biases of the RNN for an optimal filtration model. The evaluation of the models is measured using three performance criteria: mean square error (MSE), mean absolute deviation (MAD), and coefficient of determination (R2). From the results obtained, all methods yield satisfactory models, with the best results given by IWPSO.
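The three performance measures named in this abstract (MSE, MAD, and R2) are straightforward to compute. The following Python sketch shows their standard definitions on invented sample data; the function name and values are illustrative, not taken from the paper.

```python
import numpy as np

def evaluate(y_true, y_pred):
    """Score model predictions with the three measures named in the
    abstract: mean square error, mean absolute deviation, and R^2."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mse = np.mean(err ** 2)                        # mean square error
    mad = np.mean(np.abs(err))                     # mean absolute deviation
    ss_res = np.sum(err ** 2)                      # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2) # total sum of squares
    r2 = 1.0 - ss_res / ss_tot                     # coefficient of determination
    return mse, mad, r2

# Invented flux measurements vs. model predictions
mse, mad, r2 = evaluate([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
```

Lower MSE and MAD indicate smaller errors, while R2 close to 1 indicates the model explains most of the variance in the data.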
Protein structure determination and our software tools Mark Berjanskii
DNA makes up only half the volume of chromosomes Anatol Alizar
The document describes a study that used a technique called 3D-CLEM (correlative light and electron microscopy) to analyze the structure of mitotic chromosomes at high resolution. Some of the key findings include:
1) 3D-CLEM revealed that prophase chromosomes have a smaller volume than metaphase chromosomes, likely due to the absence of a chromosome periphery structure at that stage of mitosis.
2) The extra volume observed in metaphase chromosomes compared to prophase chromosomes can be almost entirely accounted for by the nucleolar volume.
3) Analysis of wild-type and Ki-67 depleted chromosomes found that the periphery structure comprises 30-47% of the entire chromosome volume and more than 33%
X-ray Crystallography & Its Applications in Proteomics Akash Arora
X-ray crystallography is a technique that uses X-rays to determine the atomic structure of crystals. It involves crystallizing molecules and bombarding them with X-rays, which produce a diffraction pattern. This pattern is used to deduce the molecular structure. X-ray crystallography has many applications in proteomics, including determining protein structures, studying protein interactions, and elucidating enzyme catalysis mechanisms. It provides atomic-level insights that advance understanding of protein function.
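The diffraction geometry underlying X-ray crystallography is governed by Bragg's law, n·λ = 2d·sin(θ). A small illustrative sketch; the wavelength is the standard Cu Kα value, and the angle is chosen arbitrarily, not taken from this text.

```python
import math

def bragg_spacing(wavelength_nm, theta_deg, order=1):
    """Solve Bragg's law, n*lambda = 2*d*sin(theta), for the lattice
    spacing d that produces a reflection at glancing angle theta."""
    return order * wavelength_nm / (2.0 * math.sin(math.radians(theta_deg)))

# Cu K-alpha X-rays (~0.154 nm) producing a reflection at theta = 22.5 degrees
d = bragg_spacing(0.154, 22.5)   # lattice spacing in nm
```

Measuring the angles of many such reflections gives the spacings of planes in the crystal, which is the raw geometric information behind the diffraction pattern.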
Proteomics: definition, general concept, significance KAUSHAL SAHU
INTRODUCTION
GENERAL CONCEPT
WHY IS PROTEOMICS NECESSARY?
WHAT CAN PROTEOMICS ANSWER?
PROTEOMICS: ANALYSIS AND IDENTIFICATION OF PROTEINS
TWO-DIMENSIONAL SDS-PAGE
MASS SPECTROMETERS
SIGNIFICANCE OF THE STUDY AND ITS IMPORTANCE
APPLICATIONS
CHALLENGES
CONCLUSIONS
REFERENCES
This document describes a new microscopy technique called SPRAIPAINT that enables three-dimensional super-resolution imaging of intracellular protein structures and the cell surface in living bacteria. SPRAIPAINT uses single-molecule localization microscopy to image enhanced yellow fluorescent protein (eYFP) fusions within the cell and a dye binding to the cell surface sequentially, using a single laser. This allows colocalization of an intracellular cytoskeletal protein (Crescentin-eYFP) with the cell membrane in living Caulobacter crescentus bacteria with 20-40 nm precision over a large field of view. Imaging is done using a double-helix point spread function microscope for three-dimensional localization. SPRAIPAINT provides a
This document summarizes a study that uses coarse-grained molecular dynamics (MD) simulations to model the coordination between competing RNA base pairs. The study uses two RNA systems, each with two separated complementary strands, to test the coarse-grained model. For the first system, the simulation showed little conformational change, but the second system indicated a base pairing interaction as the RMSD sharply decreased partway through the simulation, suggesting the strands bent back together. The coarse-grained model reduces computational expenses compared to all-atom MD while optimizing accuracy and speed for simulating macromolecules like RNA, which could aid drug development efforts.
Formulation and evaluation of nanoparticles as a drug delivery systems Tarun Kumar Reddy
Nanomaterials fall into a size range similar to proteins and other macromolecular structures found inside living cells. As such, nanomaterials are poised to take advantage of existing cellular machinery to facilitate the delivery of drugs. Nanoparticles containing encapsulated, dispersed, absorbed or conjugated drugs have unique characteristics that can lead to enhanced performance in a variety of dosage forms.
Analysis of Thermoplastic by using Spectroscopic Technique NidaSehar
This document provides an overview of 6 spectroscopic techniques used to analyze thermoplastics: infrared spectroscopy, ultraviolet-visible spectroscopy, Raman spectroscopy, atomic absorption spectroscopy, X-ray fluorescence spectroscopy, and nuclear magnetic resonance spectroscopy. It describes the basic principles, instrumentation, and applications of each technique for identifying polymers, functional groups, fillers, and contaminants in thermoplastic samples.
This document discusses characterizing polymeric membranes under large deformations using an artificial neural network model. It presents an experimental study of blowing circular thermoplastic ABS membranes using free blowing technique. A multilayer neural network is used to model the non-linear behavior of the membrane under biaxial deformation. The neural network results are compared to experimental data and a finite difference model using a hyperelastic Mooney-Rivlin model. The neural network accurately reproduces the membrane behavior with minimal error margins compared to experimental measurements.
Launching digital biology, 12 May 2015, Bremenbioflux
Intro. It is not a secret that in biology laboratories hours of manual work are considered a compulsory part of the experiment. During a day of work, lab researchers have to pipette the right amounts of fluids into tubes, carry them from one machine to another, program and handle each machine individually, label and document each step carefully, and then convert the results to data and analyse them. For a simple routine experiment, each of the mentioned tasks is performed at least 10 times per day. Over the past decade, a big effort has been made to produce machines (e.g., pipetting robots) that would automate some of the tasks in the lab. However, these machines were developed under the industrial mindset of maximizing the throughput of a single task. Thus, these machines are large, task-specific, difficult to use (they usually come with dedicated drivers and software) and, most importantly, extremely expensive. A solution is the use of digital microfluidics to enable the advance from automated biology to digital biology. In my vision, a digital lab should be:
• fully integrated, running all the tasks on the same machine
• easy to use, with a web-based software for biological design of new experiments and hardware control
• general-purpose, allowing easy reconfiguration and design of new experiments
• cheap, offering open-source and do-it-yourself assembly kits
Talk. In the talk, I will present an overview of the road to digital biology, covering all the main aspects, from computer-aided design to hardware production and biological applications.
Hands on. Also, prepare for some real engineering action :). I will execute live a biochemical application (enzymatic reaction of β-galactosidase with Xgal) on my homemade digital biochip. We will then discuss the current challenges in the development process and everyone will get a chance to play with the device. And of course, I will happily consider any engineering advice or idea you have :).
A Multiset Rule Based Petri net Algorithm for the Synthesis and Secretary Pat...ijsc
Membrane computing is a branch of Natural computing aiming to abstract computing models from the structure and functioning of the living cell. A comprehensive introduction to membrane computing is meant to offer both computer scientists and non-computer scientists an up-to date overview of the field. In this paper, we consider a uniform way of treating objects and rules in P Systems with the help of Multiset rewriting rules. Here the synthesis and secretary pathway of glycoprotein in epithelial cells of small intestine is considered as an example. A natural and finite link is explored between Petri nets and membrane Computing. A Petri net (PN) algorithm combined with P Systems is implemented for the synthesis and secretary pathway of Glycoprotein. To capture the compartmentalization of P Systems, the Petri net is extended with localities and to show how to adopt the notion of a Petri net process accordingly. The algorithm uses symbolic representations of multisets of rules to efficiently generate all the regions associated with the membrane. In essence, this algorithm is built from transport route sharing a set of places modeling the availability of system resources. The algorithm when simulated shows a significant pathway of safe Petri nets.
Finite Element Simulation of Microfluidic Biochip for High Throughput Hydrody...TELKOMNIKA JOURNAL
In this paper, a microfluidic device capable of trapping a single cell in a high-throughput manner and at high trapping efficiency is designed simply through a concept of hydrodynamic manipulation. The microfluidic device is designed with a series of trap and bypass microchannel structures for trapping individual cells without the need for microwells, robotic equipment, external electric force, or surface modification. In order to investigate the single-cell trapping efficiency, a finite element model of the proposed design has been developed using ABAQUS-FEA software. Based on the simulation, the geometrical parameters and fluid velocity which affect the single-cell trapping are extensively optimized. After optimization of the trap and bypass microchannel structures via simulations, a single cell can be trapped at a desired location efficiently.
Sk microfluidics and lab on-a-chip-ch5stanislas547
This document discusses Lab-on-a-Chip (LOC) technology and its applications in biomedical fields. LOC systems integrate full laboratory functions onto a single microchip to handle extremely small fluid volumes. Key points:
- LOCs deal with fluid transport and analysis on the microscale and are a type of microfluidic device. They can perform complex analyses like DNA separation and detection.
- LOC research grew in the 1990s as groups developed micropumps and sensors to integrate fluid processing. Genomics and military applications further drove research.
- LOCs have advantages like low sample/reagent use, fast analysis, compact size, and potential for point-of-care medical diagnostics. Examples include
Modeling of Nitrogen Oxide Emissions in Fluidized Bed Combustion Using Artifi...Waqas Tariq
The reduction of harmful emissions is increasingly affecting modern-day energy production, while ever higher objectives are also set for the efficiency of combustion processes. It is therefore necessary to develop data analysis and modeling methods that can respond to these demands. This paper presents an overview of how the formation of nitrogen oxides (NOx) in a circulating fluidized bed (CFB) boiler was modeled using a sub-model-based artificial neural network (ANN) approach. In this approach, the process data is first processed using a self-organizing map (SOM) and k-means clustering to generate subsets representing the separate process states in the boiler. These primary process states represent the higher-level process conditions in the combustion, and can include, for example, start-ups, shutdowns, and idle times in addition to the normal process flow. However, the primary process states may contain secondary states that represent more subtle phenomena in the process, which are more difficult to observe. The data from secondary combustion conditions can carry information on, e.g., instabilities in the process. In this study, the aims were to identify these secondary process states and to show that in some cases the simulation accuracy can be improved by creating secondary sub-models. The results show that the presented approach can be a fruitful way to extract new information from combustion processes.
Osteo Intraorganelle Nanoporation under Electrical StimuliIJERD Editor
Nanoporation is a highly effective method to increase the permeability of intraorganelle membranes by using a series of pico-electric pulses. Using this technique, we can introduce specific drugs into the intraorganelles of rigid cells like osteoblasts, and it has various applications in medical science. The effect of phospholipids in response to external pico-electric fields, the behaviour of water dipoles in the complex electric field landscape of the membrane interface, and the reorganization of water dipoles in the pore formation process have been proposed in these studies. Pore characteristics such as lifetime, ion selectivity, size, and kinetics of formation, as well as the number of pores, are significant factors in the present study.
Quantic Analysis of the Adherence of a Gram-Negative Bacteria in A HEPA Filter IJAEMSJORNAL
It is known that Gram-negative bacteria (GNB) are the most frequent bacteria in hospital units. It is also known that GNB generate a greater number of nosocomial infections in critical areas. In the present work, the adhesion of the bacterial cell wall (BCW) to the compounds of the material layers of a high-efficiency (HEPA) filter was analyzed. The analysis was carried out by means of molecular simulation and quantum chemistry. The BCW and HEPA molecules were designed using Hyperchem software for simulation. The calculations of the quantum interactions of the molecules were carried out using the theory of the electron transfer coefficient (ETC). The analysis identified 4 to 6 compounds that are most likely to interact, possibly even through a chemical reaction. The compounds of the glass fibers are the ones that work best for the adhesion and destruction of the BCW.
This study aims to simulate and understand the transport properties of nanotubes used for controlled release applications. A 3D Monte Carlo model is used to simulate molecular release from halloysite clay nanotubes. Comparisons to experimental data show excellent matches. Release occurs in two phases: an initial burst followed by a longer saturation phase. Adding end caps to the nanotubes provides an effective way to control molecular release rates.
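The two-phase behaviour described above (an initial burst followed by slower saturation, throttled by end caps) can be illustrated with a toy one-dimensional Monte Carlo random walk. This is a deliberately simplified sketch of the idea, not the study's 3D halloysite model, and all parameters are invented.

```python
import random

def fraction_released(n_particles, n_steps, tube_len=50, capped=False, seed=1):
    """Toy 1D Monte Carlo of diffusive release from a nanotube: each
    molecule random-walks along the tube axis and escapes at an open end.
    Capping one end (capped=True) slows the overall release."""
    rng = random.Random(seed)
    released = 0
    for _ in range(n_particles):
        x = rng.randrange(tube_len)          # random start position inside the tube
        for _ in range(n_steps):
            x += rng.choice((-1, 1))         # unbiased diffusion step
            if x >= tube_len:                # right end is always open: escape
                released += 1
                break
            if x < 0:
                if capped:
                    x = 0                    # capped left end reflects the walker
                else:
                    released += 1            # open left end: escape
                    break
    return released / n_particles

open_frac = fraction_released(2000, 2000)
capped_frac = fraction_released(2000, 2000, capped=True)
```

Even in this crude model, molecules starting near an open end escape quickly (the burst phase), while the remainder leak out slowly, and capping one end reduces the released fraction, mirroring the control mechanism the study reports.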
Real-Time Simulation for Laser-Tissue Interaction ModelVictor Espigares
The document summarizes a three-layer model for real-time simulation of laser-tissue interaction:
1) A Monte Carlo simulation determines the irradiance distribution of light through tissue.
2) A bioheat transfer model uses the irradiance to calculate spatial and temporal temperature distributions.
3) Thermal damage is predicted using an Arrhenius model based on the temperature distributions.
The model is implemented in parallel using a hybrid communication approach to achieve real-time results for medical applications such as laser surgery planning.
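The Arrhenius damage model in step 3 accumulates thermal damage as Ω(t) = ∫ A·exp(−Ea/(R·T(t))) dt, with Ω ≥ 1 commonly read as irreversible injury. A minimal sketch follows; the coefficients are Henriques' classic skin-burn values, used here as illustrative placeholders rather than parameters from this document.

```python
import math

R = 8.314        # J/(mol K), universal gas constant
A = 3.1e98       # 1/s, frequency factor (Henriques' skin-burn value)
Ea = 6.28e5      # J/mol, activation energy (Henriques' skin-burn value)

def damage_integral(temps_K, dt):
    """Accumulate the Arrhenius damage integral over a sampled
    temperature history T(t) with a fixed time step dt (seconds)."""
    return sum(A * math.exp(-Ea / (R * T)) * dt for T in temps_K)

omega_hot  = damage_integral([343.15] * 50, dt=0.1)   # 5 s at 70 C
omega_body = damage_integral([310.15] * 50, dt=0.1)   # 5 s at 37 C
```

The strong exponential dependence on temperature is the point: a few seconds at 70 °C drives Ω far above 1, while body temperature contributes essentially nothing, which is why the temperature field from the bioheat model dominates the damage prediction.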
We present a microfluidic device that allows rapid and defined delivery of discrete and homogeneous reagents or samples to allow kinetic studies of surface-tethered biomolecules. We developed an electro-osmotic flow (EOF) based device consisting of an asymmetric Y-junction with an incorporated fast, pressure-driven valve, two embedded measurement electrodes, and reservoirs. The EOF is used to circumvent kinetic limitations on reagent transport to the surface of these tethered biomolecules due to the slow diffusion across parabolic concentration gradients in conventional pressure-driven flow devices. Using fluorescein isothiocyanate (FITC) as the pH-sensitive surface-immobilized biomolecule, we show that the reagent solution can be repeatedly exchanged within milliseconds, and that by using a synchronized triggering scheme we can monitor the reaction of our sensor biomolecules to the change of the environment on a time scale of 10 ms.
Prediction of Bioprocess Production Using Deep Neural Network Method TELKOMNIKA JOURNAL
Deep learning has enhanced state-of-the-art methods in genomics, allowing biological data to be analysed with high predictive accuracy. The training of neural networks with several hidden layers, which has been facilitated by deep learning, has attracted increased interest by achieving remarkable results in various fields. Thus, the extraction of bioprocess production can be implemented by pathway prediction in the genomic metabolic network of Escherichia coli, as metabolic engineering involves the manipulation of genes which have the potential to increase the yield of metabolite production. A mathematical model of this network is the foundation for the development of a computational procedure that directs genetic manipulations that would eventually lead to optimized bioprocess production. Because deep learning is well suited to genomics, modelling of the biological network can be implemented. Each layer reveals insights into the biological network, which enables pathway analysis to be carried out in order to extract the target bioprocess production. In this study, a deep neural network has been used to identify any set of gene-deletion models that offers optimal results in xylitol production and its growth yield.
IRJET- Overview of Artificial Neural Networks Applications in Groundwater...IRJET Journal
This document provides an overview of applications of artificial neural networks (ANNs) in groundwater studies. It discusses how ANNs mimic the human brain and can be used to model complex groundwater systems. It then summarizes several ways that ANNs have been successfully applied in groundwater hydrodynamics, water resources management, time series forecasting, and other areas. These include using ANNs to model coastal aquifers, predict groundwater levels, forecast water quality, and combine ANNs with other models for improved results. In summary, ANNs are a powerful tool for solving hydrogeological problems and have been widely used in groundwater research.
This document presents research using artificial neural networks to identify toxic gases in real time. A multi-layer perceptron neural network was trained using data from a multi-sensor system that detected hydrogen sulfide, nitrogen dioxide, and their mixture. Features extracted from the sensor responses were used as inputs to the neural network. The network was trained online using backpropagation and achieved 100% accuracy classifying gases during training and 96.6% accuracy during testing, with low error rates. This model achieved better performance than previous methods and can identify low concentrations of toxic gases in real time, which has applications for air quality monitoring and safety.
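As a toy illustration of the trained multi-layer perceptron described above, here is a minimal one-hidden-layer network trained by plain backpropagation on invented two-sensor data. The data, layer sizes, and learning rate are our own assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented "sensor" data: two readings per sample; class 1 when their sum is large.
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = (X.sum(axis=1) > 1.0).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 sigmoid units, one sigmoid output unit.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(2000):
    h = sigmoid(X @ W1 + b1)              # forward pass, hidden layer
    out = sigmoid(h @ W2 + b2)            # forward pass, output layer
    d_out = out - y                       # logistic-loss gradient at the output
    d_h = (d_out @ W2.T) * h * (1.0 - h)  # backpropagated hidden-layer error
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

accuracy = float(((out > 0.5) == (y > 0.5)).mean())
```

The real system would replace the invented features with the extracted sensor-response features and use one output per gas class, but the training loop (forward pass, error backpropagation, weight update) has the same shape.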
This document presents a method for detecting natural gas pipeline leaks using a binary matrix analyzer and neural network. The method involves gathering image data of pipelines, extracting binary data from the images using a matrix analyzer, inputting the binary data into the Matlab neural network toolbox to train and test an artificial neural network model. The trained neural network was able to detect pipeline leaks with 98.52% accuracy based on simulations. The method provides an intelligent system for automating natural gas pipeline leak detection as a safer and more cost-effective alternative to traditional inspection methods.
New Technique for Measuring and Controlling the Permeability of Polymeric Mem...Editor IJCATR
Membranes have wide uses in industrial and medical applications. Polymer membranes are important materials because of their high chemical resistance, but they have weak mechanical resistance against high pressures. Therefore, it was essential to devise a permeability measuring technique free from high-pressure application. The current work presents a modification of the permeability measuring technique for membranes, where an ionic salt of known concentration was added to water as a common solvent and the electrolysis current was measured behind the membrane. The electrolysis current was correlated to the flow rate of water across a polyvinyl alcohol (PVA) membrane. Some other problems arose, such as polarization at the electrodes and changes in electrolyte contents during the long duration of the slow process. Pulsed potential on the electrodes resolved these problems and other associated problems like current rush and the double-layer capacitance effect. An empirical equation was suggested to evaluate the permeability of polymer membranes by this modified method. Easy and accurate measurement of permeability helped the authors to change the permeability of PVA membranes by adding copper nanoparticles to the membrane to reduce its permeability, and adding silicon dioxide microparticles to the PVA membranes to increase it. The authors suggested a mechanism for these permeability changes; scanning electron microscope images of the filled PVA membranes supported the suggested mechanism.
This document presents a mathematical model for an electrochemically synthesized thin film of a conducting polymer used to immobilize an enzyme. The model describes the steady-state analysis of a mediated amperometric system involving substrate conversion to product via enzyme kinetics and reoxidation of reduced mediator at an electrode. Rate equations are developed and steady-state assumptions are applied to derive an expression for observed flux in terms of kinetic parameters and substrate concentration. The model can be used to predict system behavior and experimentally test characteristics.
Pentacene-Based Organic Field-Effect Transistors: Analytical Model and Simula...IDES Editor
Organic field-effect transistors (OFETs) have attracted much interest recently, and their proficiency, and hence their applications, are being enhanced increasingly. However, only analytical models of older field-effect transistors, developed for silicon-based transistors, and their associated numerical analyses have been used for such devices so far. Increasing the precision of such models and numerical methods is now essential in order to improve OFETs and propose more effective models and methods. This study aims at comparing the current analytical model, simulation methods, and experimental data, and their fitness with one another. Specifically, four aspects of the results of the three abovementioned approaches were examined comparatively: sub-threshold slope, on-state drain current, threshold voltage, and carrier mobility. We analyze the experimental data of OFETs made with pentacene as the organic material, along with various organic gate insulators including CyEP, PVP, PMMA, Parylene-C, and Polyimide, and then offer their results comparatively.
Similar to MODELING DEHYDRATION OF ORGANIC COMPOUNDS BY MEANS OF POLYMER MEMBRANES WITH HELP OF ARTIFICIAL NEURAL NETWORK AND COMSOL MULTIPHYSICS (20)
Hindi Varnamala (Hindi alphabet) PPT presentation: Hindi vowels and consonants, the Hindi alphabet with drawings, and Hindi Varnamala practice for children, by Dr. Mulla Adam Ali, Hindi language and literature, https://www.drmullaadamali.com
How to Add Chatter in the Odoo 17 ERP Module Celine George
In Odoo, the chatter is like a chat tool that helps you work together on records. You can leave notes and track things, making it easier to talk with your team and partners. Inside chatter, all communication history, activity, and changes will be displayed.
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...PECB
Denis is a dynamic and results-driven Chief Information Officer (CIO) with a distinguished career spanning information systems analysis and technical project management. With a proven track record of spearheading the design and delivery of cutting-edge Information Management solutions, he has consistently elevated business operations, streamlined reporting functions, and maximized process efficiency.
Certified as an ISO/IEC 27001: Information Security Management Systems (ISMS) Lead Implementer, Data Protection Officer, and Cyber Risks Analyst, Denis brings a heightened focus on data security, privacy, and cyber resilience to every endeavor.
His expertise extends across a diverse spectrum of reporting, database, and web development applications, underpinned by an exceptional grasp of data storage and virtualization technologies. His proficiency in application testing, database administration, and data cleansing ensures seamless execution of complex projects.
What sets Denis apart is his comprehensive understanding of Business and Systems Analysis technologies, honed through involvement in all phases of the Software Development Lifecycle (SDLC). From meticulous requirements gathering to precise analysis, innovative design, rigorous development, thorough testing, and successful implementation, he has consistently delivered exceptional results.
Throughout his career, he has taken on multifaceted roles, from leading technical project management teams to owning solutions that drive operational excellence. His conscientious and proactive approach is unwavering, whether he is working independently or collaboratively within a team. His ability to connect with colleagues on a personal level underscores his commitment to fostering a harmonious and productive workplace environment.
Date: May 29, 2024
Tags: Information Security, ISO/IEC 27001, ISO/IEC 42001, Artificial Intelligence, GDPR
-------------------------------------------------------------------------------
Find out more about ISO training and certification services
Training: ISO/IEC 27001 Information Security Management System - EN | PECB
ISO/IEC 42001 Artificial Intelligence Management System - EN | PECB
General Data Protection Regulation (GDPR) - Training Courses - EN | PECB
Webinars: https://pecb.com/webinars
Article: https://pecb.com/article
-------------------------------------------------------------------------------
For more information about PECB:
Website: https://pecb.com/
LinkedIn: https://www.linkedin.com/company/pecb/
Facebook: https://www.facebook.com/PECBInternational/
Slideshare: http://www.slideshare.net/PECBCERTIFICATION
A workshop hosted by the South African Journal of Science aimed at postgraduate students and early career researchers with little or no experience in writing and publishing journal articles.
LAND USE LAND COVER AND NDVI OF MIRZAPUR DISTRICT, UP RAHUL
This dissertation explores the particular circumstances of Mirzapur, a region located in the core of India. Mirzapur, with its varied terrains and abundant biodiversity, offers an optimal environment for investigating changes in vegetation cover dynamics. Our study utilizes advanced technologies such as GIS (Geographic Information Systems) and remote sensing to analyze the transformations that have taken place over the course of a decade.
The complex relationship between human activities and the environment has been the focus of extensive research and concern. As the global community grapples with swift urbanization, population expansion, and economic progress, the effects on natural ecosystems are becoming more evident. A crucial element of this impact is the alteration of vegetation cover, which plays a significant role in maintaining the ecological equilibrium of our planet.
Land serves as the foundation for all human activities and provides the necessary materials for these activities. As the most crucial natural resource, its utilization by humans results in different 'land uses,' which are determined by both human activities and the physical characteristics of the land.
The utilization of land is impacted by human needs and environmental factors. In countries like India, rapid population growth and the emphasis on extensive resource exploitation can lead to significant land degradation, adversely affecting the region's land cover.
Therefore, human intervention has significantly influenced land use patterns over many centuries, evolving its structure over time and space. In the present era, these changes have accelerated due to factors such as agriculture and urbanization. Information regarding land use and cover is essential for various planning and management tasks related to the Earth's surface, providing crucial environmental data for scientific, resource management, and policy purposes, and for diverse human activities.
Accurate understanding of land use and cover is imperative for the development planning of any area. Consequently, a wide range of professionals, including earth system scientists, land and water managers, and urban planners, are interested in obtaining data on land use and cover changes, conversion trends, and other related patterns. The spatial dimensions of land use and cover support policymakers and scientists in making well-informed decisions, as alterations in these patterns indicate shifts in economic and social conditions. Monitoring such changes with the help of advanced technologies like Remote Sensing and Geographic Information Systems is crucial for coordinated efforts across different administrative levels.
Changes in vegetation cover refer to variations in the distribution, composition, and overall structure of plant communities across different temporal and spatial scales. These changes can occur naturally.
Chemical Engineering: An International Journal (CEIJ), Vol. 1, No.1, 2017
MODELING DEHYDRATION OF ORGANIC
COMPOUNDS BY MEANS OF POLYMER
MEMBRANES WITH HELP OF ARTIFICIAL
NEURAL NETWORK AND COMSOL
MULTIPHYSICS
Mansoor Kazemimoghadam1*, Zahra Amiri2
1 Department of Chemical Engineering, Malek-Ashtar University of Technology,
2 Department of Chemical Engineering, South Tehran Branch, Islamic Azad University, Tehran, Iran
ABSTRACT
The present study analyzes water-alcohol separation by pervaporation with polymer membranes, with the help of an Artificial Neural Network and COMSOL Multiphysics. The influence of such parameters as volumetric flow rate, temperature, separation factor, and permeate flux on the efficiency of the
dehydration process was analyzed through the Artificial Neural Network. This study used a
feed-forward multilayer perceptron neural network with a back-propagation algorithm and the Levenberg-Marquardt function, with two inputs and two outputs. The Tansig transfer function was used for the hidden
layer and Purelin was used for the output layer; five neurons were defined for the hidden layer. After data
processing, 70 percent of the data was allocated for learning, 15 percent was allocated for validation, and
15 percent was allocated for testing. The output values of Artificial Neural Network modelling were
compared with the real values of pervaporation for separation of water from ethanol, acetone, and butanol.
The results revealed that the proposed model had a good performance. Moreover, the output of the COMSOL
software for pervaporation of five different alcohols was compared with the real values, and the error
percentage between the actual flux and the modeled value was calculated for the related
membranes. The results of COMSOL modeling showed that error percentages of 3.049, 3.7, 3.51, 2.88,
and 3.82 were achieved for the dehydration of acetone, butanol, ethanol, isopropanol,
and methanol, respectively.
KEYWORDS:
Modeling, Dehydration, Polymer membrane, COMSOL Multiphysics, Artificial Neural Network
1. INTRODUCTION
Traditional purification methods like distillation and extraction consume a great deal of energy.
Moreover, because the product is heated or mixed with a solvent, the final product may be
decomposed or impure. Furthermore, traditional purification systems usually cause environmental
pollution. As a result, searching for a more economic alternative method that can solve the
environmental pollution problems (while removing the previous restrictions) is absolutely reasonable.
Membrane separation is the most privileged among separation methods. In a membrane
separation process, the feed includes a mixture of two or more components. While passing
through a semi-permeable medium (the membrane), these components are partially purified.
The feed is divided into two streams in a membrane process: the part of the feed that does not
pass through the membrane is called the "retentate", and the part that passes through the membrane
is called the "permeate" [1]. Common membrane separation technology has such advantages as
lower energy consumption for separation, the ability to conduct the separation in situ, easy
access to the separated phases, separation by means of light and compact
equipment, easy installation and operation, and minimal requirements for control,
maintenance, and repair [2, 3]. The six major membrane separation processes are
microfiltration, ultrafiltration, reverse osmosis, electrodialysis, gas separation, and pervaporation
(used in such fields as water purification, chemical and food processing, medical applications, and
biological separations) [4].
Use of pervaporation for separation of organic compounds has attracted the attention of many
researchers in recent years. Pervaporation is one of the complex membrane separation
processes in which transfer through a non-porous (non)polymer takes place in three stages:
absorption, diffusion, and evaporation (desorption) [5]. As a result, selectivity and permeation
are generally based on the interaction between the membrane and the penetrating molecules, the size of
the permeating molecules, and the free volume of the membrane. This process is generally very good for
removing small amounts of impurity from a liquid mixture. In general, pervaporation is accompanied by
a phase change of the penetrating material from liquid to gas. The product that passes through the membrane
is withdrawn as low-pressure vapor from the other side of the membrane, and is
collected after condensing into liquid. In fact, this process is the familiar evaporation process in which a
membrane is placed between the liquid and gas phases. The presence of the membrane adds
selectivity to the process and increases its advantages. With the help of this
process, it is possible to separate two liquids from each other [6]. Due to such advantages as
excellent performance and high energy efficiency, this process has recently gained the attention of
many industries. In most pervaporation processes, the driving force is the pressure difference
between the feed stream and the permeate stream; a vacuum pump provides the driving force
for mass transfer of components [7].
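The separation factor used throughout this study is the standard pervaporation measure of selectivity; a minimal sketch of its textbook definition for a binary water/organic mixture (the function and variable names are ours, not from the paper):

```python
def separation_factor(x_water, y_water):
    """Standard pervaporation separation factor for a binary
    water/organic mixture: alpha = (y_w / y_org) / (x_w / x_org),
    where x is the water weight fraction in the feed and
    y is the water weight fraction in the permeate."""
    x_org = 1.0 - x_water
    y_org = 1.0 - y_water
    return (y_water / y_org) / (x_water / x_org)

# e.g. a feed with 10 wt% water enriched to 90 wt% water in the permeate
alpha = separation_factor(0.10, 0.90)
print(round(alpha, 1))  # 81.0
```

A separation factor of 1 means no selectivity; larger values mean the membrane preferentially passes water.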
The results of this study using the ANN showed suitable accuracy. The graphs of error
percentage between the real and modeled separation factor and flux for the related membranes in
pervaporation were drawn for the dehydration of ethanol, acetone, and butanol. Moreover, the error
percentage between the real flux and the flux modeled for each of the five alcohols with the related
membranes was obtained with COMSOL Multiphysics.
2. EXPERIMENTAL
2.1 Artificial Neural Network (ANN)
Recently, a number of studies have been conducted on data processing for problems for
which there is no known solution, or problems that are not easily solvable. The ANN pattern is inspired
by the neural system of living organisms, which consists of constituent units called neurons.
Most neurons are composed of three main parts: the cell body (which includes
the nucleus and other protective parts), dendrites, and the axon. The last two parts are the
communicative parts of the neuron. Figure 1 displays the structure of a neuron.
Figure 1. Major parts of a biological cell
Dendrites, as electrical signal receiving areas, are composed of cell fibers with rough surfaces
and many branched extensions; that is why they are called tree-like receiving networks. The
dendrites transfer the electrical signals to the cell nucleus. The cell body provides the required
energy for neuron activity, which can be easily modelled through an addition and a comparison with a
threshold level. Unlike dendrites, the axon has a smoother surface and fewer extensions. The axon is
longer and transfers the received electro-chemical signal from the cell nucleus to other neurons.
The confluence of a cell's axon and another cell's dendrites is called a synapse. Synapses are small
functional structural units that enable communication among neurons. Synapses have different types,
one of the most important of which is the chemical synapse.
An artificial neural cell is a mathematical model in which p represents an input signal. After
being strengthened or weakened by a parameter w (in mathematical terms, the weight
parameter), a signal with the value wp enters the neuron. In order to simplify the
mathematical model, it is assumed that the input signal is added to another signal with value b in
the nucleus. Before leaving the cell, the resulting signal, with the value wp + b, undergoes
another operation that is called the "transfer function" in technical terms. This operation is displayed
as a box in Figure 2 on which f is written. The input of this box is the wp + b signal and the
output is denoted by a. Mathematically, we have:
a = f(wp + b)
Figure 2. Mathematical model of a neuron
Putting together a great number of the above-mentioned cells produces a large neural network.
As a result, the network developer must assign values to a huge number of w and b parameters;
this process is called the learning process.
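The single-neuron computation a = f(wp + b) can be sketched in a few lines; a minimal illustration using the Tansig transfer function mentioned later in the paper (the function names below are ours):

```python
import math

def tansig(n):
    # Matlab's tansig: 2 / (1 + exp(-2n)) - 1, numerically equal to tanh(n)
    return 2.0 / (1.0 + math.exp(-2.0 * n)) - 1.0

def neuron(p, w, b):
    # a = f(w*p + b): weight the input, add the bias, apply the transfer function
    return tansig(w * p + b)

print(neuron(0.5, 2.0, -1.0))  # w*p + b = 0, and tansig(0.0) = 0.0
```

During learning, the values of w and b are the quantities adjusted by the training algorithm.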
Within the structure of neural networks, it is sometimes necessary to stack a number of neurons
in a layer. Moreover, it is possible to take advantage of groups of neurons in different layers to
increase the system's efficiency. In this situation, the network is still designed with a certain
number of inputs and outputs; the difference is that there is more than one layer
(instead of only one). In such a multi-layer network, the input layer is the
layer through which the inputs are given to the system, the output layer is the layer that delivers the
desired results, and the other layers are called hidden layers. Figure 3 displays a
neural network with three layers: an input layer, an output layer, and a hidden layer (only one layer
in this figure). By changing the number of hidden layers and the number of
neurons in each layer, it is possible to enhance the network's capabilities [8].
Figure 3. A schematic view of Neural Network and its constituent layers [8].
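The layered computation described above, for the 2-input, 5-hidden-neuron, 2-output architecture this study uses, can be sketched as a plain forward pass (the weights below are random placeholders, not the trained values from the paper):

```python
import math, random

def tansig(n):
    return math.tanh(n)  # Matlab's tansig is numerically equal to tanh

def forward(p, W1, b1, W2, b2):
    """Forward pass of a 2-5-2 feed-forward network like the one in the
    paper: tansig hidden layer, purelin (identity) output layer."""
    hidden = [tansig(sum(w * x for w, x in zip(row, p)) + b)
              for row, b in zip(W1, b1)]
    return [sum(w * h for w, h in zip(row, hidden)) + b
            for row, b in zip(W2, b2)]

random.seed(0)
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(5)]  # 5 hidden neurons, 2 inputs
b1 = [random.uniform(-1, 1) for _ in range(5)]
W2 = [[random.uniform(-1, 1) for _ in range(5)] for _ in range(2)]  # 2 output neurons
b2 = [random.uniform(-1, 1) for _ in range(2)]

out = forward([0.3, 0.7], W1, b1, W2, b2)  # e.g. scaled flow rate and temperature
print(len(out))  # 2
```

Training (here, Levenberg-Marquardt in Matlab) is what sets W1, b1, W2, and b2; the forward pass itself is this simple.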
2.2 COMSOL Multiphysics software
COMSOL Multiphysics is a modeling software that can carry out every phase of the
modeling process:
- drawing the geometrical structure of the model;
- meshing the model;
- defining the model's governing physics;
- solving the model; and
- displaying the results of the modeling graphically.
Due to the existence of default physical structures in this software, model creation can be done very
quickly. Using this software, it is possible to analyze a wide range of mechanical and electro-
magnetic structures. It is possible to define the materials' characteristics, source terms, and the
boundaries of objects as arbitrary functions of the independent variables.
2.3 Modeling dehydration of organic compounds by use of Neural Network
In this research, the influence of the ANN input parameters (volumetric flow rate and temperature)
on the network outputs (separation factor and flux), and thereby on the efficiency of the
dehydration process, was analyzed. Two ANNs were designed for analysis of the separation
factor and flux parameters. A feed-forward multilayer perceptron ANN with the Levenberg-Marquardt
function, two inputs, and two outputs was used. The Tansig transfer function was used for
the hidden layer, and Purelin was utilized for the output layer. Five neurons were defined for
the hidden layer. After data processing, 70 percent of the data was dedicated to learning, 15 percent
to validation, and the remaining 15 percent to testing. Such organic
compounds as ethanol, acetone, and butanol were selected in this research, and Matlab version
R2012a (7.14.0.739) was used. The results of the study are displayed in the figures below. Figure 4
displays a schematic view of a two-layer ANN with one hidden layer and one output layer. The inputs
are multiplied by a weight, and a bias factor (b) is added (bias is a fixed
value that is added to the input in order to increase the accuracy). Afterward, the result
passes through a function, and the resulting value is multiplied by a weight and added with a bias.
The final result passes through another function (with a different form and functionality), and the
output is produced. There are five neurons and two inputs in the first layer; the number of neurons in
the output layer equals the number of outputs.
Figure 4. A schematic view of the ANN
The following points about the algorithms must be considered:
- The Data Division compartment shuffles the defined data. This
compartment randomly assigns the Train, Validation, and Test data, so that there will be
samples from the whole data space.
- The Levenberg-Marquardt function was used in the training phase.
- The Mean Squared Error (MSE) function was used for performance measurement.
- The default settings were used for derivative calculation.
Figure 5. Algorithms compartment in ANN
The number of data points for modeling ethanol dehydration was 326. Using polydimethylsiloxane
and polyvinylidene fluoride membranes for the separation factor output, the following results
were achieved [9]:
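The random 70/15/15 data division described in this section can be sketched as a shuffle-and-slice over sample indices, in the spirit of Matlab's dividerand (the function name and seed are ours):

```python
import random

def divide_data(n_samples, seed=0):
    """Random 70/15/15 train/validation/test split of sample indices,
    mimicking Matlab's dividerand-style data division."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    n_train = int(0.70 * n_samples)
    n_val = int(0.15 * n_samples)
    return idx[:n_train], idx[n_train:n_train + n_val], idx[n_train + n_val:]

train, val, test = divide_data(326)  # the ethanol data set size used in the paper
print(len(train), len(val), len(test))  # 228 48 50
```

Shuffling before slicing is what guarantees that the three subsets sample the whole operating range rather than contiguous runs of the experiments.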
The whole procedure is displayed through status bars in the progress compartment. The
initial values are displayed on the left side of each status bar, and the current value is displayed on
the right side.
Epoch runs from iteration 0 to 1000; it means the weights were consecutively updated for up to 1000
iterations based on the Levenberg-Marquardt function while the training procedure was carried out. If the
iteration number reaches 1000, the procedure stops (here it stopped at 24). There was no limit for
time (but one could be set, for example, so that training stops after 30 seconds).
The performance value was 3.08 at the beginning; the closer this value gets to 0, the more acceptable the
error. Finally, the error reached 0.00403.
Gradient is the value of the derivatives of the error function. This variable
starts from 9.77, and is acceptable if it reaches 1.00e-05. Finally, it reached 0.00171.
Mu is one of the Levenberg-Marquardt algorithm parameters.
Validation check is the maximum number of times that network failure can be tolerated.
Figure 6. Graph of water-ethanol dehydration progress by polydimethylsiloxane polymer membrane
and polyvinylidene fluoride
The performance graph shows the epochs against the errors. As shown in Figure 7,
the network performance for Train, Validation, and Test has decreased to an acceptable level.
Epoch 18, marked with a circle, was the best validation performance; i.e. there were fewer
errors before the circle, and overtraining started after the circle.
Figure 7. The water-ethanol dehydration performance by polydimethylsiloxane polymer membrane and
polyvinylidene fluoride
The training state graph shows the different statuses in the training phase. The first graph is for the
gradient error function, the second is for Mu, and the third is for validation failures. Given that
the third graph reached 6 on the vertical axis and stopped, it shows failure. Moreover, the
validation fail graph shows that the network was stable for 18 iterations and then failed 6 times;
consequently, overtraining occurred.
Figure 8. State training graph for water-ethanol dehydration by polydimethylsiloxane polymer membrane
and polyvinylidene fluoride
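The two quantities monitored above, the MSE performance function and the validation-check counter that stops training after 6 consecutive failures, can be sketched as follows (function names and the example error sequence are ours):

```python
def mse(targets, outputs):
    # Mean Squared Error, the performance function used during training
    return sum((t - o) ** 2 for t, o in zip(targets, outputs)) / len(targets)

def stop_epoch(val_errors, max_fail=6):
    """Return the epoch at which training stops: the first epoch after which
    the validation error has failed to improve max_fail times in a row."""
    best, fails = float("inf"), 0
    for epoch, err in enumerate(val_errors):
        if err < best:
            best, fails = err, 0
        else:
            fails += 1
            if fails == max_fail:
                return epoch
    return len(val_errors) - 1

# validation error falls to a minimum, then rises: stop 6 epochs after the minimum
errors = [3.08, 1.0, 0.5, 0.2, 0.1, 0.12, 0.13, 0.14, 0.15, 0.16, 0.17, 0.2]
print(stop_epoch(errors))  # 10
```

This is the mechanism behind the "stopped at 24" behavior in the progress window: the best weights are those from the epoch with the minimum validation error, not from the final epoch.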
The regression graph demonstrates the regression separately for each data set. The horizontal axis displays
the target values (in fact, what is to be achieved), and the vertical axis displays the ANN output;
the graph is drawn based on these two parameters. If the ANN modeled
exactly, the points would lie on the y = t line (a line with a slope of 1 that passes through the
origin of coordinates). In order to statistically calculate the best line with the lowest error, the
linear equation obtained over all the data must be used:
y = 0.99 × t + 0.13
The result is better when the regression coefficient R is closer to 1. This shows the fitting desirability,
i.e. there is little difference between the target outputs and the ANN outputs in the modeling. In
general, the regression coefficient for all the data was calculated to be 0.99871, which is considered
a very good result.
Figure 9. Regression graph for water-ethanol dehydration by Polydimethylsiloxane polymer membrane and
Polyvinylidene fluoride
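The regression coefficient R reported in these plots is the Pearson correlation between targets and network outputs; a minimal sketch (the data below are illustrative, not the paper's 326 experimental points):

```python
import math

def regression_r(targets, outputs):
    """Pearson correlation coefficient R between target values and
    network outputs, as reported in Matlab's regression plots."""
    n = len(targets)
    mt = sum(targets) / n
    mo = sum(outputs) / n
    cov = sum((t - mt) * (o - mo) for t, o in zip(targets, outputs))
    st = math.sqrt(sum((t - mt) ** 2 for t in targets))
    so = math.sqrt(sum((o - mo) ** 2 for o in outputs))
    return cov / (st * so)

targets = [1.0, 2.0, 3.0, 4.0]
outputs = [0.99 * t + 0.13 for t in targets]  # points exactly on a fitted line y = 0.99*t + 0.13
print(round(regression_r(targets, outputs), 5))  # 1.0
```

Points lying exactly on any straight line give R = 1; scatter around the line pulls R below 1, which is why 0.99871 over all data indicates a close fit.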
Figure 10 displays the water-ethanol figure graph for polydimethylsiloxane polymer membrane
and polyvinylidene fluoride. According to this figure, as the temperature increases and the
volumetric flow rate decreases in ethanol dehydration, the separation factor first increases
and then decreases.
Figure 10. Figure graph for water-ethanol dehydration by polydimethylsiloxane polymer membrane and
polyvinylidene fluoride
The graph for calculating the error percentage between the real output and the modeling output is shown
below. Using the related formula, the error percentage between the real data and the modeling data can be
obtained; a lower error percentage is more desirable. As an example, a number of
randomly selected data cases had their error percentages calculated, and a comparison
between the separation factors of the real values and the modeling values was then conducted. The
results of the comparison revealed little difference between the real data and the
modeling data. As a result, the modeling was successful.
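The "related formula" for the comparisons in this and the following sections is the ordinary relative error between an experimental value and the model's value; a one-line sketch (the example numbers are illustrative, not the paper's data):

```python
def error_percentage(real, modeled):
    # relative error between an experimental value and the model's prediction
    return abs(real - modeled) / abs(real) * 100.0

# e.g. a real value of 100 modeled as 97 gives a 3 percent error
print(round(error_percentage(100.0, 97.0), 2))  # 3.0
```

The per-alcohol error percentages reported later (3.049, 3.7, 3.51, 2.88, 3.82) are of this form, averaged over the compared cases.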
As shown in Figure 11, as the temperature increases and the volumetric flow rate decreases, the
separation factor first increases and then decreases. The cause of this phenomenon is that the
driving force increases with temperature at first. However, as the temperature
continues to rise, the difference between the water-ethanol solubility and diffusion rates
decreases, and the separation factor declines accordingly.
Figure 11. A comparison between the error percentage of the real separation factor and the modeled separation
factor for water-ethanol dehydration by polydimethylsiloxane polymer membrane and polyvinylidene fluoride
The output of the overall flux in water-ethanol dehydration through polydimethylsiloxane
polymer membrane and polyvinylidene fluoride, with 326 data points, is as follows.
In the performance graph, the best validation performance was at the twenty-first iteration, and
overtraining started afterward.
Figure 12. Performance graph for water-ethanol dehydration by polydimethylsiloxane polymer
membrane and polyvinylidene fluoride
y = 0.99 × t + 92
Figure 13. Regression graph for water-ethanol dehydration by polydimethylsiloxane polymer
membrane and polyvinylidene fluoride
As shown in the figure below, with the increase of temperature and volumetric flow rate in
dehydration of ethanol, the total flux increases. With the increase of temperature, the driving
force of mass transfer and the saturated vapor pressure of the penetrating compounds in the
membrane increase, and the flux increases accordingly.
Figure 14. Figure graph for water-ethanol dehydration by polydimethylsiloxane polymer
membrane and polyvinylidene fluoride
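The increase of flux with feed temperature described above is commonly captured by an Arrhenius-type dependence of permeate flux on temperature; a sketch under that standard assumption (the pre-exponential factor and activation energy below are arbitrary illustrations, not fitted values from the paper):

```python
import math

R = 8.314  # J/(mol*K), universal gas constant

def arrhenius_flux(T, J0, Ea):
    """Arrhenius-type temperature dependence of permeate flux:
    J = J0 * exp(-Ea / (R*T)). J0 and Ea are fitted, membrane-specific
    parameters (the values used below are purely illustrative)."""
    return J0 * math.exp(-Ea / (R * T))

# flux grows monotonically with feed temperature
J_low = arrhenius_flux(313.15, J0=1.0e6, Ea=30_000.0)
J_high = arrhenius_flux(333.15, J0=1.0e6, Ea=30_000.0)
print(J_high > J_low)  # True
```

The same qualitative trend, higher temperature raising both the saturated vapor pressure and the diffusion rate, is what the flux figures in this and the following sections display.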
The figure below displays a comparison between the error percentages of the real outputs and the
modeling outputs. According to this figure, with increase of temperature and volumetric flow rate
in water-ethanol dehydration, the overall flux increases.
Figure 15. Comparison of the error percentage for the overall flux in reality and in modeling for water-ethanol
dehydration by polydimethylsiloxane polymer membrane and polyvinylidene fluoride
91 data points were used for the dehydration of acetone, and polyacrylonitrile and polyethylene
glycol membranes were utilized. The results for the separation factor are as below [10]:
The best validation performance in the performance graph was at the twenty-seventh iteration.
The regression coefficient for all the data in the regression graph was equal to 0.99946, which was a
very good result.
The graph for calculating the error percentage between the real output value and the modeling output
value is displayed below. As reflected in this graph, with increase of temperature and volumetric
flow rate in the dehydration of acetone, the separation factor decreases. This phenomenon can be
justified as follows: continuous increase of the feed temperature decreases the water-acetone
diffusion and solubility difference, and accordingly decreases the separation factor of acetone.
Figure 16. Comparison of the error percentage for the real and modeled separation factor in water-acetone
dehydration by polyacrylonitrile and polyethylene glycol membranes
The results of the overall flux output in the dehydration of acetone by polyacrylonitrile and polyethylene
glycol membranes are as follows:
The best validation performance in the performance graph was at the seventeenth iteration.
The regression coefficient for all data in the regression graph was calculated to be 0.99909, which is a
very good result.
The graph for calculating the error percentage between the real and modeled flux is as follows. It can
be seen that with increase of temperature and volumetric flow rate in the dehydration of acetone, the
overall flux increases. This phenomenon can be justified as follows: with increase of
temperature, the driving force of mass transfer and the saturated vapor pressure of the penetrating
compounds in the membrane increase, and the flux increases accordingly.
Figure 17. Comparison of the error percentage for the overall flux in reality and in modeling of acetone dehydration
by polyacrylonitrile and polyethylene glycol membranes
302 data objects were used for modeling butanol dehydration, and a polydimethylsiloxane
membrane was used. The results of the modeled separation factor are as follows [11]:
In the performance graph, the best validation performance was at the 151st iteration. The
regression coefficient for all the data was 0.9922, which was a good result. In the error percentage
graph for the real and modeled outputs of the butanol dehydration, it was observed that
with increase of temperature and decrease of volumetric flow rate, the separation factor first increased
and then decreased.
Figure 18. Comparison of the error percentage of real separation factor and modeled separation factor for
buthanol dehydration by polydimethylsiloxane membrane
For the dehydration of butanol by the polydimethylsiloxane membrane with the same 302 data points, the results for the flux output are as follows:
The best validation performance in the performance graph occurred at the 202nd iteration. The regression coefficient for all the data in the regression graph was 0.9998, which is a very good result. In the error-percentage graph for the real and modeled outputs of butanol dehydration, it was observed that as both the temperature and the volumetric flow rate increased, the overall flux increased.
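Per the study's description, each of these fits used a feed-forward MLP with two inputs (temperature and volumetric flow rate), five hidden neurons with the tansig transfer function, and two purelin outputs (separation factor and flux). A minimal forward-pass sketch with random placeholder weights (the actual weights were fitted with Levenberg-Marquardt in MATLAB):

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(5, 2)), rng.normal(size=5)  # input -> 5 hidden neurons
W2, b2 = rng.normal(size=(2, 5)), rng.normal(size=2)  # hidden -> 2 outputs

def forward(x):
    """x = [temperature, volumetric flow rate]; returns [separation factor, flux]."""
    h = np.tanh(W1 @ x + b1)  # tansig hidden layer
    return W2 @ h + b2        # purelin (linear) output layer

y = forward(np.array([333.15, 1.2]))  # hypothetical operating point
print(y.shape)  # two outputs
```

With trained weights in place of the random ones, this forward pass is exactly what produces the modeled separation factor and flux compared against the experiments above.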
Figure 19. Comparison of the error percentage between the real and modeled overall flux for butanol dehydration by a polydimethylsiloxane membrane
2.4 Modeling dehydration of organic compounds using COMSOL Multiphysics software
The organic compounds acetone, butanol, ethanol, isopropanol, and methanol were studied, and COMSOL Multiphysics version 4.2.0.150 was used for the data analysis. The procedure is described as follows:
The mesh is the starting point for the finite element method; it partitions the geometry into smaller and simpler units.
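As a toy illustration of what the mesher produces, consider partitioning a 1D membrane thickness into elements; COMSOL does the same in 2D/3D with triangular or tetrahedral elements. The thickness value below is a hypothetical placeholder:

```python
import numpy as np

L = 1e-4          # hypothetical membrane thickness in metres
n_elements = 8    # number of finite elements along the thickness
nodes = np.linspace(0.0, L, n_elements + 1)  # element end points
elements = [(nodes[i], nodes[i + 1]) for i in range(n_elements)]
print(len(elements))  # 8 small sub-intervals covering [0, L]
```

The governing transport equations are then assembled and solved on these small sub-domains rather than on the full geometry at once.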
Figure 20. Meshing the membrane module in the water- alcohol processing by different membranes
In this method, contour plots can be seen in the results section, including temperature (ht), mass fraction, flux (chcs), velocity (spf), and pressure (spf).
At a temperature of 313.15 K, a separation factor of 57, and a permeate flux of 1.023 kg/m²·h for dehydration of a 20 wt% aqueous acetone solution, the pressure was 70 Pa, and the membrane used was acrylonitrile and 2-hydroxyethyl methacrylate grafted polyvinyl alcohol [12].
In the temperature graph, the inlet temperature equals the ambient temperature and gradually increases across the membrane. Condensation then takes place at the membrane outlet due to the vacuum condition, so the temperature drops thermodynamically at the outlet and the stream becomes cold.
Figure 21. Temperature in water-acetone processing by Acrylonitrile and 2-Hydroxyethyl methacrylate grafted polyvinyl alcohol
As can be seen in the mass fraction graph below, the mass fraction is essentially constant across the module. The reason is that, once the membrane swelling and the saturation concentration of water on the membrane surface reach equilibrium, the flux and the water separation factor change only slightly as the water content of the feed increases.
Figure 22. Mass fraction in water-acetone processing by Acrylonitrile and 2-Hydroxyethyl methacrylate grafted polyvinyl alcohol
As shown in the figure, the flux is stable and low at the membrane inlet and walls due to the low driving force. However, the flux increases within the membrane as the pressure increases. The error percentage in the flux was 3.049.
Figure 23. Flux graph for water-acetone processing by Acrylonitrile and 2-Hydroxyethyl methacrylate grafted polyvinyl alcohol
In the velocity graph, the velocity at the inlet is low, and it decreases along the walls as well; however, the velocity increases within the membrane. The reason for this phenomenon can be the reduction of temperature across the membrane and the increase of water concentration in the membrane. The separation factor for water then increases due to the increase in the driving force for mass transfer.
Figure 24. Velocity graph for water-acetone processing by Acrylonitrile and 2-Hydroxyethyl methacrylate grafted polyvinyl alcohol
As observed in the figure, the inlet is at atmospheric pressure, and the pressure decreases across the membrane module. This can be attributed to the reduction of the driving force across the membrane.
Figure 25. Pressure graph for water-acetone processing by Acrylonitrile and 2-Hydroxyethyl methacrylate grafted polyvinyl alcohol
The error percentages for flux in the dehydration of butanol, ethanol, isopropanol, and methanol were as follows:
At a temperature of 333.15 K, a separation factor of 26.5, and a permeate flux of 3.07 kg/m²·h for dehydration of a 27.6 wt% butanol solution, the pressure was 33.33 Pa and the membrane used was a combination of polyvinyl alcohol and nylon 66 [13]. The error percentage in flux was 3.7.
At a temperature of 333.15 K, a separation factor of 43, and a permeate flux of 0.2 kg/m²·h for dehydration of a 95 wt% ethanol solution, the pressure was 40 Pa and the membrane used was polyvinyl alcohol [14]. The error percentage in flux was 3.51.
At a temperature of 343.15 K, a separation factor of 1002, and a permeate flux of 1.35 kg/m²·h for dehydration of a 50 wt% isopropanol solution, the pressure was 180 Pa and the membrane used was a combination of polyvinyl alcohol and polyelectrolyte [15]. The error percentage in flux was 2.88.
At a temperature of 303.15 K, a separation factor of 13, and a permeate flux of 1.8e-5 kg/m²·h for dehydration of a 10 wt% methanol solution, the pressure was 130 Pa and the membrane used was poly-3-hydroxybutyrate [16]. The error percentage in flux was 3.82.
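The error percentages quoted in this section follow the usual relative-error definition, |actual - modeled| / actual x 100. A short sketch of the arithmetic; the modeled flux below is a hypothetical value chosen only to illustrate the acetone case:

```python
def error_percent(actual, modeled):
    # relative error between experimental and simulated flux, in percent
    return abs(actual - modeled) / actual * 100.0

# actual flux 1.023 kg/m2.h (acetone case); 0.9918 is a hypothetical modeled value
e = error_percent(1.023, 0.9918)
print(round(e, 2))  # about 3.05, close to the reported 3.049
```

Applying the same formula to each alcohol's experimental and COMSOL fluxes yields the five error percentages listed above.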
2.5 Comparison of ANN and COMSOL in dehydration of acetone by polyacrylonitrile and polyethylene glycol
The error value in the ANN was 1.86, and the error value in COMSOL was 2.11. Given these error values, it can be concluded that both modeling methods are appropriate, and that the error percentage of the ANN is lower than that of COMSOL. As a result, the ANN is more accurate; the reason is that the ANN treats the problem in detail, whereas COMSOL treats it in general.
3. CONCLUSION
In this study, the dehydration of water-ethanol, water-acetone, and water-butanol mixtures by the pervaporation process was modeled with an ANN. The polymer membranes polydimethylsiloxane, polyvinylidene fluoride, polyacrylonitrile, and polyethylene glycol are hydrophilic membranes and are appropriate for separating low amounts of water from alcohols. Moreover, the ANN in this study yielded suitably low errors.
The dehydration of water-acetone, water-butanol, water-ethanol, water-isopropanol, and water-methanol mixtures by the pervaporation process was also modeled in COMSOL. The hydrophilic membranes used are well suited to separating low amounts of water from alcohols. Moreover, the COMSOL models in this study also yielded suitably low errors.
REFERENCES
[1] Seader, J. D. and Henley, E. J., Separation Process Principles, John Wiley and Sons, (1998).
[2] Srikanth, G., Membrane separation processes: technology and business opportunities, Water Conditioning & Purification, (2008).
[3] Baker, R. W., Membrane Technology and Applications, John Wiley & Sons, Ltd, (2004).
[4] Kujawski, W., Application of pervaporation and vapor permeation in environmental protection, Polish Journal of Environmental Studies, 9 (1), 13-26, (2000).
[5] Huang, R., Pervaporation Membrane Separation Processes, Elsevier, 11(120), 181-191, (1991).
[6] Feng, X. and Huang, R. Y. M., Liquid separation by membrane pervaporation: A review, Industrial & Engineering Chemistry Research, (1997).
[7] Mulder, M., Basic Principles of Membrane Technology, Second Edition, Kluwer, (1996).
[8] Rautenbach, R. and Albrecht, R., The separation potential of pervaporation: Part 1. Discussion of transport equations and comparison with reverse osmosis, Journal of Membrane Science, 1 (23), (1985).
[9] Xia Zhan, Ji-ding Li, Jian Chen and Jun-qi Huang, Pervaporation of ethanol/water mixtures with high flux by zeolite-filled PDMS/PVDF composite membranes, Chinese Journal of Polymer Science, 771-780, (2009).
[10] Qiang Zhao, Jinwen Qian, Quanfu An, Zhihui Zhu, Peng Zhang, Yunxiang Bai, Studies on pervaporation characteristics of polyacrylonitrile-b-poly(ethylene glycol)-b-polyacrylonitrile block copolymer membrane for dehydration of aqueous acetone solutions, Journal of Membrane Science, 311, 284-293, (2008).
[11] Elsayed A. Fouad, Xianshe Feng, Pervaporation separation of n-butanol from dilute aqueous solutions using silicalite-filled poly(dimethylsiloxane) membranes, Journal of Membrane Science, 339, 120-125, (2009).
[12] Merve Olukman, Oya Sanli, A novel in situ synthesized magnetite containing acrylonitrile and 2-hydroxyethyl methacrylate grafted poly(vinyl alcohol) nanocomposite membranes for pervaporation separation of acetone/water mixtures, Journal of Membrane Science, 1-10, (2015).