Analysing genome rearrangements is a problem from the vast domain of comparative genomics and computational biology. Several studies have shown that closely related species share essentially the same set of genes; however, their gene orders differ. These differences in gene order result from various large-scale evolutionary events, of which reversal is the most common rearrangement event. The problem of finding the shortest sequence of reversals that transforms one genome into another is called the sorting-by-reversals problem, and the length of such a sequence is the reversal distance between the two genomes. In comparative genomics, sorting-by-reversals algorithms are often used to propose evolutionary scenarios of large-scale genomic mutations between species. Following the first polynomial-time solution of this problem, several improvements have been published. In 2008, Braga et al. proposed an algorithm to enumerate the traces that sort a signed permutation by reversals; this algorithm has exponential complexity in both time and space. To handle the traces more efficiently, Baudet and Dias proposed a depth-first approach in 2010; however, one limitation of their algorithm is that it cannot count the number of solutions in each trace. In this paper we present an algorithm that lists the normal forms of each trace in a depth-first manner and counts the total number of solutions in the solution space.
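As a concrete illustration of the reversal operation itself (not the trace-enumeration algorithm of this paper), the sketch below applies a reversal to a signed permutation, flipping a segment and negating its signs, and computes the exact reversal distance by brute-force breadth-first search. The exponential search is purely illustrative for tiny permutations; practical algorithms rely on the polynomial-time Hannenhalli-Pevzner theory instead.

```python
from collections import deque

def reverse(perm, i, j):
    """Apply a reversal to perm[i..j]: the segment's order flips and every sign negates."""
    return perm[:i] + tuple(-g for g in reversed(perm[i:j + 1])) + perm[j + 1:]

def reversal_distance(start, target):
    """Exact reversal distance by breadth-first search over all reversals.
    Exponential in the permutation length, so only usable for tiny inputs."""
    start, target = tuple(start), tuple(target)
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        perm, d = frontier.popleft()
        if perm == target:
            return d
        n = len(perm)
        for i in range(n):
            for j in range(i, n):
                nxt = reverse(perm, i, j)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, d + 1))

print(reversal_distance((-1, 2, 3), (1, 2, 3)))  # → 1 (one single-element reversal fixes the sign)
```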
This document summarizes a framework for automatically extracting human protein-protein interaction data from biomedical literature. It describes benchmarking interaction datasets based on shared functional annotations and known physical interactions. It also outlines a method using a conditional random field tagger to identify protein names in text and two approaches for extracting interactions: co-citation analysis and learning interaction extractors from annotated sentences. Evaluation shows the extracted interactions have accuracy comparable to manually curated databases.
Sequence alignment involves identifying corresponding portions of biological sequences, such as DNA, RNA, and proteins, in order to analyze similarities and differences at the level of individual bases or amino acids. This can provide insights into structural, functional, and evolutionary relationships. Sequence alignment has many applications, including searching databases for similar sequences, constructing phylogenetic trees, and predicting protein structure. It works by designing an optimal correspondence between sequences that preserves the order of residues while maximizing matches and minimizing mismatches. Quantitative measures of sequence similarity, such as Hamming distance and Levenshtein distance, calculate the number of differences between aligned sequences.
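The two similarity measures named above are easy to state in code. Hamming distance counts mismatches between equal-length sequences; Levenshtein distance allows insertions and deletions as well, computed here with the standard dynamic-programming recurrence over two rows:

```python
def hamming(a, b):
    """Number of mismatched positions between two equal-length sequences."""
    if len(a) != len(b):
        raise ValueError("Hamming distance requires equal-length sequences")
    return sum(x != y for x, y in zip(a, b))

def levenshtein(a, b):
    """Minimum number of substitutions, insertions, and deletions turning a into b."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (x != y)))  # substitution or match
        prev = cur
    return prev[-1]

print(hamming("GATTACA", "GACTATA"))  # → 2
print(levenshtein("ACGT", "ACT"))     # → 1 (delete the G)
```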
International Journal of Computational Engineering Research (IJCER) is an international, English-language, monthly online journal. It publishes original research work that contributes significantly to scientific knowledge in engineering and technology.
This document presents a genetic algorithm approach for learning classification rules from data. The key aspects of the approach are:
1. Binary encoding is used to represent classification rules, with bits indicating attribute values. Rule consequents are determined by the majority class of training examples matched.
2. The fitness function considers error rate, entropy measure, rule consistency, and hole ratio to evaluate rule sets. Error rate measures accuracy, entropy favors homogeneous rule matches, consistency penalizes ambiguous rules, and hole ratio measures coverage.
3. Adaptive asymmetric mutation is applied, with the mutation probability self-adjusting during the algorithm run. Crossover also utilizes two-point crossover of rules.
4. The approach is
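A hedged sketch of the operators from items 1 and 3 above (the composite fitness function of item 2 is omitted): a rule is a bit vector over attribute values, offspring are formed by two-point crossover, and mutation is asymmetric, with separate 0-to-1 and 1-to-0 flip probabilities that the full algorithm would self-adjust during the run. The bit layout and rates here are illustrative assumptions, not the paper's actual implementation.

```python
import random

random.seed(1)

def two_point_crossover(a, b):
    """Swap the segment between two random cut points (item 3 above)."""
    i, j = sorted(random.sample(range(len(a)), 2))
    return a[:i] + b[i:j] + a[j:], b[:i] + a[i:j] + b[j:]

def mutate(bits, p_0to1, p_1to0):
    """Asymmetric mutation: 0→1 and 1→0 flips use different probabilities."""
    return [1 - bit if random.random() < (p_0to1 if bit == 0 else p_1to0) else bit
            for bit in bits]

rule = [1, 0, 1, 1, 0, 0]   # one bit per attribute value (hypothetical encoding)
other = [0, 1, 1, 0, 1, 0]
child1, child2 = two_point_crossover(rule, other)
print(child1, child2)
print(mutate(rule, p_0to1=0.05, p_1to0=0.2))
```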
A multimodal brain imaging study of repetition suppression in the human visual cortex
This study combined fMRI, MEG, and behavioral measures to investigate the neural mechanisms underlying repetition suppression in the human visual cortex. Using an orientation discrimination task with Gabor patch stimuli presented at interstimulus intervals of 200 or 600 ms, the study found:
1) Behaviorally, subjects had impaired ability to discriminate orientation at the 200 ms interval for repeated (SAME) vs different (DIFF) stimuli.
2) In fMRI, a suppressed BOLD response was observed for SAME vs DIFF stimuli at the 200 ms interval, indicating repetition suppression.
3) In MEG, peak amplitude was suppressed for SAME vs DIFF stimuli at the 200 ms interval, consistent with neuronal repetition suppression.
SIMILARITY ANALYSIS OF DNA SEQUENCES BASED ON THE CHEMICAL PROPERTIES OF NUCLEOTIDES
Approaches to DNA sequence similarity analysis have been based on the representation and the frequency of sequence components; however, the position of a component within the sequence is also important information, and insufficient information in sequence representations is a major cause of poor similarity results. Based on three classifications of the DNA bases according to their chemical properties, the frequencies and average positions of group mutations are grouped into two twelve-component vectors, and the Euclidean distances among the introduced vectors are applied to compare the coding sequences of the first exon of the beta-globin gene of 11 species.
Comparative Study of Morphological, Correlation, Hybrid and DCSFPSS based Mor…
This paper presents a comparative study of two basic approaches, the Morphological Approach (MA) and the Correlation Approach (CA), and of three algorithms that modify the basic approaches, for the detection of micro-natured defects occurring in plain-weave fabrics. A hybrid of CA followed by MA was developed and shown to overcome the drawbacks of the basic methods. As automating MA using the DC Suppressed Fourier Power Spectrum Sum (DCSFPSS), giving DCSFPSSMA, could not improve the Overall Detection Accuracy (ODA) for micro-natured defects, automation of the modified Hybrid Approach (HA) was proposed, leading to the development of the Tribrid Approach (TA). The modified hybrid approach involves a cascade of CA and MA, both automated using DCSFPSS. The texture periodicity of defect-free fabric was obtained using DCSFPSS, and this was extended to the design and extraction of a defect-independent template for CA and to the design of the structuring-element size for the morphological filtering process. ODA was measured by adopting a simple binary-based defect-search algorithm as the last step of the experimentation. ODA was found to be ~100%/97.41%/98.7% for 247 samples of warp-break defect/double-pick/normal samples, and 96.1%/99% for 205 thick-place defect/normal samples, belonging to two different plain grey fabric classes. The robustness of the TA scheme was tested by comparing TA with two traditional algorithms, viz. CA and MA, with our previously proposed hybrid algorithm, and with DCSFPSSMA. The TA algorithm outperformed CA-only, MA-only, HA and DCSFPSSMA, yielding an overall ODA of more than 98% for the defect and defect-free samples of the different fabric classes. Moreover, this algorithm made it possible to recognize defect areas smaller than 1 mm², which has not yet been reported in the literature. We propose to use this method to grade grey fabric in a manner similar to the standard fabric grading system.
Development of Product Configurator for a Pressure Booster System
This paper outlines the process of developing a product configurator for a pressure booster system. The ideas presented here emerged while building such a configurator in the CS-Enterprise package. The paper chronicles the design aspects, configurator development planning, rule-based configurator development, and testing and validation, including best practices developed during the process. Possible further work is also discussed.
Structuring Ideation Map using Oriented Directed Acyclic Graph with Privacy P…
E-Brainstorming is a computerized version of idea sharing that replaces verbal communication; the productivity of the ideas generated is viewed as its dominant measure. In agent-based E-Brainstorming, an Idea Ontology was used to map users' knowledge to idea names and to the relationships between idea instances. In this paper, the Oriented Directed Acyclic Graph (ODAG) method is used to construct an ideation map of diverse ideas and their relationships. A Privacy Preference Ontology is integrated to provide privacy preferences for users' data, such as access control, condition, access space, and restriction. An Idea Knowledge Base is applied, which enfolds a collection of idea instances from different domains to represent a client's knowledge.
Performance Analysis of Continuous Flow Intersection in Mixed Traffic Condition
This document summarizes a study that evaluates the performance of a Continuous Flow Intersection (CFI) using computer simulation under mixed traffic conditions. The study compares the average delays of vehicles at a CFI to those at a Normal Flow Intersection (NFI) for different traffic volumes and proportions of right-turning traffic. The results show that the CFI has lower average delays than the NFI for all traffic scenarios tested, with delays reduced by 30-60% on average. The CFI design provides benefits without requiring additional land and can better utilize existing road infrastructure capacity.
VFT Application for Asynchronous Power Transfer
1) The VFT is a variable frequency transformer that can transfer power between asynchronous power systems by controlling the magnitude and direction of power flow. It functions similar to a conventional induction machine, with power systems connected to the stator and rotor windings.
2) Power flow is controlled by applying a torque to the rotor via a drive motor, which adjusts the rotor position relative to the stator. In one direction of torque, power flows from the stator to the rotor, and vice versa with opposite torque.
3) MATLAB simulation models the VFT as a wound rotor induction machine to study power transfer under different torque conditions between two asynchronous power systems of different voltages and frequencies.
Circular Waves in Thermoelastic Plates Bordered with Viscous Liquid
The paper concentrates on the study of the propagation of thermoelastic waves in a homogeneous, transversely isotropic, thermally conducting elastic plate bordered with layers (or half-spaces) of viscous liquid on both sides, in the context of non-classical theories of thermoelasticity. Complex secular equations for symmetric and antisymmetric wave motion of the circular plate, in completely separate terms, are derived. Finally, in order to illustrate the analytical results, the numerical solution is carried out for a transversely isotropic plate of cobalt material bordered with water, using the functional iteration method.
Simulating Performance Impacts of Bus Priority Measures
Public transport has an important role to play in providing reliable travel in congested conditions, as it makes excellent use of limited road space, carrying many more passengers than a private car for a given amount of road space. This paper studies and analyses various bus priority measures in terms of the change in delay, relative to a normal intersection, both for buses and for the whole traffic flow (buses and all other vehicles), using results from the VISSIM simulation software for various traffic volumes. These measures can be applied to give buses priority, making them a more attractive alternative to private vehicles and reducing road congestion.
This document describes an integrative activity for a group of students. The objective is for the students to research and disseminate information on topics such as the environment, education, or Maya culture using social networks. The work is divided into three roles: documentary researcher, experimenter, and ICT expert. Each student takes on a role to gather and share information that can transform lives.
Tag SNP selection using Quine-McCluskey optimization method-2
This document summarizes a research paper that proposes using the Quine-McCluskey optimization method to select tag SNPs. The paper begins with background on tag SNPs and linkage disequilibrium. It then provides an overview of the Quine-McCluskey method for Boolean minimization and describes how it can be applied to select a minimal set of tag SNPs that represent the variation in a larger set of SNPs. The proposed method generates minterms and prime implicants to select essential tag SNPs in a three-step process to find the minimum tag SNP set. Experimental results reportedly show the method selects a feasible and effective number of tag SNPs.
The document compares five evolutionary optimization algorithms: genetic algorithms, memetic algorithms, particle swarm optimization, ant colony systems, and shuffled frog leaping. It provides a brief description of each algorithm, including how they are inspired by natural processes and behaviors. It also includes pseudocode to facilitate implementing each algorithm. The document then presents benchmark comparisons of the five algorithms on continuous and discrete optimization problems in terms of processing time, convergence speed, and solution quality. It discusses the performance of evolutionary algorithms and provides guidelines for determining the best parameters for each.
The document summarizes a study that uses an information-based similarity index to classify the SARS coronavirus. Key points:
1) The study develops a novel alignment-free method to measure genetic sequence similarity based on word frequencies and information content.
2) The method is first validated on human influenza and mitochondrial DNA, correctly reconstructing known phylogenies.
3) The method is then applied to classify SARS coronavirus, finding it is most closely related to group 1 coronaviruses, with some matches to groups 2 and 3.
4) The information-based similarity index provides a new tool for large-scale genomic analysis without sequence alignment.
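The paper's exact information-based index is not reproduced here, but the alignment-free idea, comparing word (k-mer) frequency profiles instead of aligned positions, can be sketched as follows. A plain Euclidean distance between frequency vectors stands in for the published similarity measure; the sequences are made-up examples.

```python
from collections import Counter
from math import sqrt

def kmer_freqs(seq, k=3):
    """Relative frequencies of all overlapping k-mers (words) in a sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def freq_distance(s1, s2, k=3):
    """Euclidean distance between two k-mer frequency vectors; no alignment needed."""
    f1, f2 = kmer_freqs(s1, k), kmer_freqs(s2, k)
    words = set(f1) | set(f2)
    return sqrt(sum((f1.get(w, 0.0) - f2.get(w, 0.0)) ** 2 for w in words))

a = "ATGCGATACGCTTGA"
b = "ATGCGATACGCTTGC"   # one substitution at the end
c = "TTTTTTTTAAAAAAAA"  # very different composition
print(freq_distance(a, b) < freq_distance(a, c))  # → True
```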
Genetic Algorithm for the Traveling Salesman Problem using Sequential Constructive Crossover
This paper develops a new crossover operator, Sequential Constructive Crossover (SCX), for a genetic algorithm that generates high-quality solutions to the Traveling Salesman Problem (TSP). SCX constructs an offspring from a pair of parents using the better edges, on the basis of their values, that may be present in the parents' structure, while maintaining the sequence of nodes in the parent chromosomes. The efficiency of SCX is compared against some existing crossover operators, namely edge recombination crossover (ERX) and generalized N-point crossover (GNX), on some benchmark TSPLIB instances. Experimental results show that the new crossover operator is better than ERX and GNX.
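A simplified sketch of the SCX idea (the published operator has additional details not reproduced here): from the current city, take each parent's first not-yet-visited successor, and extend the offspring with whichever of the two is reached by the cheaper edge, falling back to the first unvisited city when a parent has no legal successor.

```python
def scx(parent1, parent2, dist):
    """Simplified Sequential Constructive Crossover for TSP tours.
    parent1/parent2 are city permutations; dist is a full distance matrix."""
    n = len(parent1)
    child = [parent1[0]]
    visited = {parent1[0]}
    while len(child) < n:
        cur = child[-1]
        candidates = []
        for parent in (parent1, parent2):
            i = parent.index(cur)
            # first city after cur in this parent that is still unvisited
            nxt = next((c for c in parent[i + 1:] if c not in visited), None)
            if nxt is None:  # fallback: first unvisited city overall
                nxt = next(c for c in parent1 if c not in visited)
            candidates.append(nxt)
        best = min(candidates, key=lambda c: dist[cur][c])  # keep the cheaper edge
        child.append(best)
        visited.add(best)
    return child

dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
print(scx([0, 1, 2, 3], [0, 3, 2, 1], dist))  # → [0, 1, 2, 3]
```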
Genetic algorithms are computational models inspired by biological evolution. They work by encoding potential solutions to a problem as strings called chromosomes. An initial population of random chromosomes is generated. The chromosomes are then evaluated and reproductive opportunities are allocated based on fitness, with better solutions more likely to reproduce. Operators like crossover and mutation combine parts of existing chromosomes to form new ones for the next generation. This process is repeated until a termination criterion is reached, with the goal of evolving better and better solutions over generations based on the principle of survival of the fittest.
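The loop just described (encode, evaluate, select by fitness, recombine, mutate, repeat) can be sketched on the classic OneMax toy problem, where fitness is simply the number of 1-bits. The parameters and tournament selection here are arbitrary illustrative choices.

```python
import random

random.seed(0)

def onemax_ga(n_bits=20, pop_size=30, generations=60, p_mut=0.02):
    """Minimal generational GA maximizing the number of 1-bits (OneMax)."""
    fitness = sum
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # binary tournament: fitter of two random individuals reproduces
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = random.randrange(1, n_bits)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = onemax_ga()
print(sum(best), "of 20 bits set")
```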
Genetic information is stored in DNA molecules as sequences of nucleotides. DNA exists as paired strands that run in opposite directions and are complementary to each other. The genome contains an organism's complete set of DNA and is organized into chromosomes. Genomes can differ between species through point mutations that change single nucleotides or genome rearrangements that modify multiple nucleotides. Rearrangements include reversals, translocations, fissions, and fusions that change the order of genes within and between chromosomes. The minimum number of edits needed to transform one genome into another, including point mutations and rearrangements, defines their edit distance and can provide insights into evolutionary relationships.
This document provides information about genetic algorithms including:
1. Definitions of genetic algorithms from Grefenstette and Goldberg that describe genetic algorithms as search algorithms based on biological evolution and natural selection.
2. An overview of genetic algorithms including the basic concepts of populations, chromosomes, genes, fitness functions, selection, crossover, and mutation.
3. Examples of genetic representations like binary encoding and permutation encoding.
4. Descriptions of genetic operators like selection, crossover, and mutation that maintain genetic diversity between generations.
This document discusses using genetic algorithms to optimize resource allocation in mobile networks. It reviews 4 papers on this topic from the 1980s to present. The first paper from 1987 proposed using genetic algorithms and classifier systems for machine learning. Later papers applied genetic algorithms to optimize wireless network topology in 1995, 3G mobile network planning in 2004, and 5G network energy optimization in 2016. All showed genetic algorithms can help optimize complex resource allocation problems in mobile networks.
Genomics is the study and application of genetic mapping, sequencing, and bioinformatics to analyze genomes. It includes structural genomics, which maps genomes, functional genomics which analyzes gene function, and comparative genomics which compares genomes across species. Comparative genomics enables insights from model organisms to be applied to other species through identifying commonalities and differences between genomes. High-throughput bioinformatics tools can be used to analyze bacterial and fungal genomes from public databases to identify potential drug targets.
An analogy of algorithms for tagging of single nucleotide polymorphism and ev…
1. The document discusses algorithms for selecting tag SNPs from large datasets of SNPs. Tag SNPs are a small subset of SNPs that can represent the larger dataset while reducing computational requirements.
2. It describes several algorithms studied - Gauss, Gauss-Jordan elimination, greedy algorithm, and binary optimization. Gauss and Gauss-Jordan algorithms use matrix operations to select tag SNPs. The greedy algorithm selects SNPs that distinguish the most haplotype patterns. Binary optimization uses fitness functions to evaluate windows of SNPs.
3. The algorithms are evaluated based on how well selected tag SNPs capture linkage disequilibrium in the full dataset, with the goal of finding an optimal small set of tag SNPs for analysis and computation.
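The greedy algorithm mentioned in item 2 can be sketched as a set-cover-style loop: repeatedly pick the SNP (column) that distinguishes the most still-confused haplotype pairs until every pair differs at some chosen SNP. The haplotype strings below are made-up examples, and real tag-SNP selection uses linkage-disequilibrium criteria rather than this simplification.

```python
from itertools import combinations

def greedy_tag_snps(haplotypes):
    """Greedy sketch: choose columns until every haplotype pair is distinguished."""
    n_snps = len(haplotypes[0])
    pairs = set(combinations(range(len(haplotypes)), 2))
    chosen = []
    while pairs:
        def resolved(s):
            # pairs whose haplotypes differ at SNP s
            return {p for p in pairs if haplotypes[p[0]][s] != haplotypes[p[1]][s]}
        best = max(range(n_snps), key=lambda s: len(resolved(s)))
        gained = resolved(best)
        if not gained:  # remaining pairs are identical haplotypes; stop
            break
        chosen.append(best)
        pairs -= gained
    return chosen

haps = ["00110",
        "01100",
        "10101",
        "11011"]
print(greedy_tag_snps(haps))  # → [0, 1]
```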
This document discusses genetic algorithms and provides an overview of their key concepts and components. It describes how genetic algorithms are inspired by Darwinian evolution and use techniques like selection, crossover and mutation to evolve solutions to optimization problems. It also outlines various parameters and strategies used in genetic algorithms, including chromosome representation, population size, selection methods, and termination criteria. A wide range of applications are mentioned where genetic algorithms have been applied successfully.
FINE GRAIN PARALLEL CONSTRUCTION OF NEIGHBOUR-JOINING PHYLOGENETIC TREES WITH…
In biological research, scientists often need to use information about species to infer the evolutionary relationships among them. These relationships are generally represented by a labeled binary tree, called the evolutionary tree (or phylogenetic tree). The phylogeny problem is computationally intensive and is thus well suited to a parallel computing environment. In this paper, a fast algorithm for constructing Neighbor-Joining phylogenetic trees has been developed; its CPU time is drastically reduced compared with sequential algorithms. The new algorithm includes three techniques. Firstly, a linear array A[N] is introduced to store the sum of every row of the distance matrix (the same as S_K), which eliminates many repeated (redundant) computations; the values A[i] are computed only once at the beginning of the algorithm and are updated by three elements in each iteration. Secondly, a very compact formula for the sum of all the branch lengths of OTUs (Operational Taxonomic Units) i and j has been designed. Thirdly, multiple parallel threads are used to compute the nearest-neighbor pair.
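The row-sum idea behind the first technique can be sketched for a single Neighbor-Joining step: with the row sums S_i precomputed, the standard criterion Q(i, j) = (n - 2) * D[i][j] - S_i - S_j costs O(1) per pair instead of O(n). The incremental three-element update and the threading from the paper are omitted here.

```python
def nj_pick_pair(D):
    """One Neighbor-Joining selection step: return the pair (i, j) minimizing
    Q(i, j) = (n - 2) * D[i][j] - S_i - S_j, with row sums precomputed once."""
    n = len(D)
    S = [sum(row) for row in D]  # the A[N] array; updated incrementally in the full algorithm
    best, best_q = None, float("inf")
    for i in range(n):
        for j in range(i + 1, n):
            q = (n - 2) * D[i][j] - S[i] - S[j]
            if q < best_q:
                best_q, best = q, (i, j)
    return best

# Classic 4-taxon additive distance matrix; taxa 0 and 1 are true neighbors.
D = [[0, 5, 9, 9],
     [5, 0, 10, 10],
     [9, 10, 0, 8],
     [9, 10, 8, 0]]
print(nj_pick_pair(D))  # → (0, 1)
```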
A Review On Genetic Algorithm And Its Applications
This document provides an overview of genetic algorithms and their applications. It begins with an introduction to genetic algorithms, explaining that they are inspired by Darwin's theory of evolution and use techniques like mutation and crossover to evolve solutions to problems. The document then covers biological concepts related to genetics like chromosomes, genes, alleles, and reproduction. It discusses how genetic algorithms represent potential solutions as chromosomes and use selection, crossover and mutation operators to evolve new solutions. The document also covers genetic algorithm parameters and applications to problems like the traveling salesman problem.
1. Statistical analysis of big data sets from microarrays and RNA-seq is used to identify differentially expressed genes. Heat maps and volcano plots are commonly used to visualize the data.
2. Gene ontology, gene set enrichment analysis, and transcription factor analysis are used to analyze lists of genes and identify biological processes, pathways, and regulatory relationships.
3. Networks can be constructed by integrating gene lists with protein-protein and gene regulatory interaction databases to build signaling, regulatory, and interaction networks for further analysis.
This document discusses various techniques for normalizing gene expression data from microarray experiments, including total intensity normalization, normalization using regression techniques, and normalization using ratio statistics. It also covers calculating distances between gene expression profiles using measures like Euclidean distance and Pearson correlation coefficient. Finally, it examines clustering analysis methods like hierarchical and non-hierarchical clustering that can group genes or samples based on similar expression patterns.
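The two distance measures named above behave quite differently on expression profiles, which is why the choice matters for clustering: Euclidean distance reflects absolute expression levels, while Pearson correlation reflects only the shape of the profile. A small sketch with made-up profiles:

```python
from math import sqrt

def euclidean(x, y):
    """Straight-line distance between two expression profiles."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def pearson(x, y):
    """Pearson correlation coefficient: 1.0 for profiles with the same
    shape, regardless of magnitude."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

g1 = [1.0, 2.0, 3.0, 4.0]
g2 = [2.0, 4.0, 6.0, 8.0]  # same pattern, double the magnitude
print(euclidean(g1, g2))    # → ~5.48: far apart in absolute terms
print(pearson(g1, g2))      # → 1.0: perfectly correlated shape
```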
Two-Stage Eagle Strategy with Differential Evolution
The document describes a two-stage optimization strategy called the Eagle Strategy (ES) that combines global and local search algorithms to improve search efficiency. It evaluates applying ES to differential evolution (DE), a popular evolutionary algorithm. ES first uses randomization like Levy flights for global exploration, then switches to DE for intensive local search around promising solutions. The authors validate ES-DE on test functions, finding it requires only 9.7-24.9% of the function evaluations of pure DE. They also apply it to real-world pressure vessel and gearbox design problems, achieving solutions with 14.9-17.7% fewer function evaluations than pure DE.
Structuring Ideation Map using Oriented Directed Acyclic Graph with Privacy P...IDES Editor
E-Brainstorming is a computerized version of sharing
ideas and it replaces verbal communication. The productivity
of ideas generated is viewed as the dominant measure of EBrainstorming.
In Agent-based E-Brainstorming, Idea
Ontology was used to map user’s knowledge with idea names
and relationships between idea instances. In this paper
Oriented Directed Acyclic Graph (ODAG) method is used to
construct the ideation map for diverse ideas and their
relationship. Privacy Preference Ontology is integrated to
provide privacy preference for user’s data like access control,
condition, access space and restriction. Here the Idea
Knowledge Base is applied and it enfolds a collection of idea
instances of different domains to denote a client’s knowledge.
Performance Analysis of Continuous Flow Intersection in Mixed Traffic Condition IDES Editor
This document summarizes a study that evaluates the performance of a Continuous Flow Intersection (CFI) using computer simulation under mixed traffic conditions. The study compares the average delays of vehicles at a CFI to those at a Normal Flow Intersection (NFI) for different traffic volumes and proportions of right-turning traffic. The results show that the CFI has lower average delays than the NFI for all traffic scenarios tested, with delays reduced by 30-60% on average. The CFI design provides benefits without requiring additional land and can better utilize existing road infrastructure capacity.
VFT Application for Asynchronous Power TransferIDES Editor
1) The VFT is a variable frequency transformer that can transfer power between asynchronous power systems by controlling the magnitude and direction of power flow. It functions similar to a conventional induction machine, with power systems connected to the stator and rotor windings.
2) Power flow is controlled by applying a torque to the rotor via a drive motor, which adjusts the rotor position relative to the stator. In one direction of torque, power flows from the stator to the rotor, and vice versa with opposite torque.
3) MATLAB simulation models the VFT as a wound rotor induction machine to study power transfer under different torque conditions between two asynchronous power systems of different voltages and frequencies.
Circular Waves in Thermoelastic Plates Bordered with Viscous Liquid IDES Editor
The paper concentrates on the study of the propagation of thermoelastic waves in a homogeneous, transversely isotropic, thermally conducting elastic plate bordered with layers (or half-spaces) of viscous liquid on both sides, in the context of non-classical theories of thermoelasticity. Complex secular equations for symmetric and antisymmetric wave motion of the circular plate, in completely separate terms, are derived. Finally, in order to illustrate the analytical results, a numerical solution is carried out for a transversely isotropic plate of cobalt material bordered with water, using the functional iteration method.
Simulating Performance Impacts of Bus Priority Measures IDES Editor
Public transport has an important role to play in the provision of reliable travel in congested conditions, as it makes excellent use of limited road space, carrying many more passengers than a private car for a given amount of road space. This paper studies and analyses various Bus Priority Measures in terms of the change in delay, relative to a normal intersection, both for buses and for the whole traffic flow (buses and all other vehicles), using results from the VISSIM simulation software for various traffic volumes. These measures can be applied to give buses priority, making them a more attractive alternative to private vehicles and reducing road congestion.
This document describes an integrative activity for a group of students. The objective is for the students to research and disseminate information on topics such as the environment, education or Maya culture using social networks. The work is divided into three roles: documentary researcher, experimenter and ICT expert. Each student takes on a role to gather and share information that can transform lives.
Tag SNP Selection using Quine-McCluskey Optimization Method-2 IAEME Publication
This document summarizes a research paper that proposes using the Quine-McCluskey optimization method to select tag SNPs. The paper begins with background on tag SNPs and linkage disequilibrium. It then provides an overview of the Quine-McCluskey method for Boolean minimization and describes how it can be applied to select a minimal set of tag SNPs that represent the variation in a larger set of SNPs. The proposed method generates minterms and prime implicants to select essential tag SNPs in a three-step process to find the minimum tag SNP set. Experimental results reportedly show the method selects a feasible and effective number of tag SNPs.
The document compares five evolutionary optimization algorithms: genetic algorithms, memetic algorithms, particle swarm optimization, ant colony systems, and shuffled frog leaping. It provides a brief description of each algorithm, including how they are inspired by natural processes and behaviors. It also includes pseudocode to facilitate implementing each algorithm. The document then presents benchmark comparisons of the five algorithms on continuous and discrete optimization problems in terms of processing time, convergence speed, and solution quality. It discusses the performance of evolutionary algorithms and provides guidelines for determining the best parameters for each.
The document summarizes a study that uses an information-based similarity index to classify the SARS coronavirus. Key points:
1) The study develops a novel alignment-free method to measure genetic sequence similarity based on word frequencies and information content.
2) The method is first validated on human influenza and mitochondrial DNA, correctly reconstructing known phylogenies.
3) The method is then applied to classify SARS coronavirus, finding it is most closely related to group 1 coronaviruses, with some matches to groups 2 and 3.
4) The information-based similarity index provides a new tool for large-scale genomic analysis without sequence alignment.
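The alignment-free idea of comparing word (k-mer) frequency profiles can be illustrated with a short sketch. This is a simplified stand-in, not the paper's exact information-based index: it builds k-mer frequency vectors and compares them with cosine similarity.

```python
import math
from collections import Counter

def kmer_profile(seq, k=3):
    """Relative frequency of each length-k word in the sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def cosine_similarity(p, q):
    """Cosine similarity between two sparse frequency profiles."""
    dot = sum(p.get(w, 0.0) * q.get(w, 0.0) for w in set(p) | set(q))
    norm_p = math.sqrt(sum(v * v for v in p.values()))
    norm_q = math.sqrt(sum(v * v for v in q.values()))
    return dot / (norm_p * norm_q)

a = kmer_profile("ATGCGATGACCTGA")
b = kmer_profile("ATGCGATGACCTGT")  # one substitution away from a
c = kmer_profile("GGGGGGGGGGGGGG")  # unrelated composition
```

Closely related sequences share most of their k-mers, so the profile of `a` scores much higher against `b` than against `c`, without any alignment step.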
Genetic Algorithm for the Traveling Salesman Problem using Sequential Constru... CSCJournals
This paper develops a new crossover operator, Sequential Constructive crossover (SCX), for a genetic algorithm that generates high quality solutions to the Traveling Salesman Problem (TSP). The sequential constructive crossover operator constructs an offspring from a pair of parents using better edges, on the basis of their values, that may be present in the parents' structure, maintaining the sequence of nodes in the parent chromosomes. The efficiency of the SCX is compared against some existing crossover operators, namely edge recombination crossover (ERX) and generalized N-point crossover (GNX), on some benchmark TSPLIB instances. Experimental results show that the new crossover operator is better than ERX and GNX.
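A minimal sketch of the sequential constructive idea, simplified from the paper's SCX: starting from the first city, each step looks up the next unvisited city after the current one in each parent (falling back to the lowest-numbered unvisited city) and keeps whichever of the two candidate edges is cheaper.

```python
def scx(parent1, parent2, dist):
    """Sequential constructive crossover sketch for TSP tours (lists of city indices)."""
    n = len(parent1)

    def next_legit(parent, city, visited):
        # first unvisited city appearing after `city` in this parent,
        # else the lowest-numbered unvisited city
        i = parent.index(city)
        for c in parent[i + 1:]:
            if c not in visited:
                return c
        return min(c for c in parent if c not in visited)

    tour, visited = [parent1[0]], {parent1[0]}
    while len(tour) < n:
        cur = tour[-1]
        a = next_legit(parent1, cur, visited)
        b = next_legit(parent2, cur, visited)
        nxt = a if dist[cur][a] <= dist[cur][b] else b  # keep the better edge
        tour.append(nxt)
        visited.add(nxt)
    return tour
```

Because every appended city is unvisited, the offspring is always a legal tour, which is the property that makes SCX attractive for permutation problems.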
Genetic algorithms are computational models inspired by biological evolution. They work by encoding potential solutions to a problem as strings called chromosomes. An initial population of random chromosomes is generated. The chromosomes are then evaluated and reproductive opportunities are allocated based on fitness, with better solutions more likely to reproduce. Operators like crossover and mutation combine parts of existing chromosomes to form new ones for the next generation. This process is repeated until a termination criterion is reached, with the goal of evolving better and better solutions over generations based on the principle of survival of the fittest.
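The loop described above can be sketched in a few lines; the population size, selection scheme and operator rates here are illustrative choices, not prescribed by the text.

```python
import random

def genetic_algorithm(fitness, length, pop_size=30, gens=60, pm=0.02):
    """Minimal binary GA: truncation selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(gens):
        ranked = sorted(pop, key=fitness, reverse=True)
        new_pop = [row[:] for row in ranked[:2]]                  # elitism: keep two best
        while len(new_pop) < pop_size:
            p1, p2 = random.choices(ranked[:pop_size // 2], k=2)  # fitter half reproduces
            cut = random.randrange(1, length)                     # one-point crossover
            child = [bit ^ (random.random() < pm)                 # bit-flip mutation
                     for bit in p1[:cut] + p2[cut:]]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

random.seed(0)                              # reproducible demo run
best = genetic_algorithm(sum, length=20)    # OneMax: fitness = number of 1-bits
```

On the toy OneMax problem the population quickly evolves towards the all-ones string, illustrating "survival of the fittest" over generations.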
Genetic information is stored in DNA molecules as sequences of nucleotides. DNA exists as paired strands that run in opposite directions and are complementary to each other. The genome contains an organism's complete set of DNA and is organized into chromosomes. Genomes can differ between species through point mutations that change single nucleotides or genome rearrangements that modify multiple nucleotides. Rearrangements include reversals, translocations, fissions, and fusions that change the order of genes within and between chromosomes. The minimum number of edits needed to transform one genome into another, including point mutations and rearrangements, defines their edit distance and can provide insights into evolutionary relationships.
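A reversal simply flips a contiguous segment of the gene order. The sketch below shows the operation and a simple greedy sort that places each element with one reversal; note this greedy scheme bounds, but does not generally achieve, the true reversal distance.

```python
def reverse_segment(perm, i, j):
    """Apply the reversal that flips positions i..j (inclusive)."""
    return perm[:i] + perm[i:j + 1][::-1] + perm[j + 1:]

def greedy_reversal_sort(perm):
    """Sort by reversals greedily; returns the sorted order and the reversal list."""
    perm = list(perm)
    reversals = []
    for i in range(len(perm)):
        j = perm.index(i + 1)        # where the element that belongs at slot i sits
        if j != i:
            perm = reverse_segment(perm, i, j)
            reversals.append((i, j))
    return perm, reversals
```

The greedy sort uses at most n-1 reversals, so it gives an upper bound on the reversal distance between two gene orders.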
This document provides information about genetic algorithms including:
1. Definitions of genetic algorithms from Grefenstette and Goldberg that describe genetic algorithms as search algorithms based on biological evolution and natural selection.
2. An overview of genetic algorithms including the basic concepts of populations, chromosomes, genes, fitness functions, selection, crossover, and mutation.
3. Examples of genetic representations like binary encoding and permutation encoding.
4. Descriptions of genetic operators like selection, crossover, and mutation that maintain genetic diversity between generations.
This document discusses using genetic algorithms to optimize resource allocation in mobile networks. It reviews 4 papers on this topic from the 1980s to present. The first paper from 1987 proposed using genetic algorithms and classifier systems for machine learning. Later papers applied genetic algorithms to optimize wireless network topology in 1995, 3G mobile network planning in 2004, and 5G network energy optimization in 2016. All showed genetic algorithms can help optimize complex resource allocation problems in mobile networks.
Genomics is the study and application of genetic mapping, sequencing, and bioinformatics to analyze genomes. It includes structural genomics, which maps genomes, functional genomics which analyzes gene function, and comparative genomics which compares genomes across species. Comparative genomics enables insights from model organisms to be applied to other species through identifying commonalities and differences between genomes. High-throughput bioinformatics tools can be used to analyze bacterial and fungal genomes from public databases to identify potential drug targets.
An analogy of algorithms for tagging of single nucleotide polymorphism and ev... IAEME Publication
1. The document discusses algorithms for selecting tag SNPs from large datasets of SNPs. Tag SNPs are a small subset of SNPs that can represent the larger dataset while reducing computational requirements.
2. It describes several algorithms studied - Gauss, Gauss-Jordan elimination, greedy algorithm, and binary optimization. Gauss and Gauss-Jordan algorithms use matrix operations to select tag SNPs. The greedy algorithm selects SNPs that distinguish the most haplotype patterns. Binary optimization uses fitness functions to evaluate windows of SNPs.
3. The algorithms are evaluated based on how well selected tag SNPs capture linkage disequilibrium in the full dataset, with the goal of finding an optimal small set of tag SNPs for analysis and computation.
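The greedy idea mentioned above — repeatedly pick the SNP that distinguishes the most still-ambiguous haplotype pairs — can be sketched as a set-cover loop; this is an illustration of the technique, not the paper's exact implementation.

```python
from itertools import combinations

def greedy_tag_snps(haplotypes):
    """haplotypes: list of equal-length strings over {'0','1'}, one column per SNP."""
    n_snps = len(haplotypes[0])
    # pairs of haplotypes that still need to be told apart
    pending = set(combinations(range(len(haplotypes)), 2))
    tags = []
    while pending:
        best_snp, best_resolved = None, set()
        for s in range(n_snps):
            if s in tags:
                continue
            resolved = {(i, j) for (i, j) in pending
                        if haplotypes[i][s] != haplotypes[j][s]}
            if len(resolved) > len(best_resolved):
                best_snp, best_resolved = s, resolved
        if best_snp is None:          # remaining pairs identical at every SNP
            break
        tags.append(best_snp)
        pending -= best_resolved
    return tags
```

Each chosen tag SNP removes the haplotype pairs it separates, so the returned subset distinguishes every pair that the full SNP set can distinguish, usually with far fewer columns.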
This document discusses genetic algorithms and provides an overview of their key concepts and components. It describes how genetic algorithms are inspired by Darwinian evolution and use techniques like selection, crossover and mutation to evolve solutions to optimization problems. It also outlines various parameters and strategies used in genetic algorithms, including chromosome representation, population size, selection methods, and termination criteria. A wide range of applications are mentioned where genetic algorithms have been applied successfully.
FINE GRAIN PARALLEL CONSTRUCTION OF NEIGHBOUR-JOINING PHYLOGENETIC TREES WITH... ijdpsjournal
In biological research, scientists often need to use information about species to infer the evolutionary relationships among them. These relationships are generally represented by a labeled binary tree, called the evolutionary (or phylogenetic) tree. The phylogeny problem is computationally intensive, and is thus well suited to a parallel computing environment. In this paper, a fast algorithm for constructing Neighbor-Joining phylogenetic trees has been developed; its CPU time is drastically reduced compared with sequential algorithms. The new algorithm includes three techniques. Firstly, a linear array A[N] is introduced to store the sum of every row of the distance matrix (the same as SK), which eliminates many repeated (redundant) computations; the values A[i] are computed only once at the beginning of the algorithm and are updated from three elements in each iteration. Secondly, a very compact formula for the sum of all the branch lengths of OTUs (Operational Taxonomic Units) i and j has been designed. Thirdly, multiple parallel threads are used to compute the nearest neighboring pair.
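The row-sum trick can be illustrated with the standard Neighbor-Joining pair-selection step: with the array A[i] precomputed once, the Q criterion for every pair needs only one multiplication and two subtractions, rather than re-summing matrix rows.

```python
def nj_pick_pair(d):
    """Pick the pair minimising the NJ criterion Q(i,j) = (n-2)*d[i][j] - A[i] - A[j]."""
    n = len(d)
    A = [sum(row) for row in d]          # row sums computed once, as in the paper
    best_q, best_pair = float("inf"), None
    for i in range(n):
        for j in range(i + 1, n):
            q = (n - 2) * d[i][j] - A[i] - A[j]
            if q < best_q:
                best_q, best_pair = q, (i, j)
    return best_pair
```

In a full NJ implementation this step runs once per iteration, with A updated incrementally after each join — which is exactly where the paper's redundancy elimination pays off.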
A Review On Genetic Algorithm And Its Applications Karen Gomez
This document provides an overview of genetic algorithms and their applications. It begins with an introduction to genetic algorithms, explaining that they are inspired by Darwin's theory of evolution and use techniques like mutation and crossover to evolve solutions to problems. The document then covers biological concepts related to genetics like chromosomes, genes, alleles, and reproduction. It discusses how genetic algorithms represent potential solutions as chromosomes and use selection, crossover and mutation operators to evolve new solutions. The document also covers genetic algorithm parameters and applications to problems like the traveling salesman problem.
1. Statistical analysis of big data sets from microarrays and RNA-seq is used to identify differentially expressed genes. Heat maps and volcano plots are commonly used to visualize the data.
2. Gene ontology, gene set enrichment analysis, and transcription factor analysis are used to analyze lists of genes and identify biological processes, pathways, and regulatory relationships.
3. Networks can be constructed by integrating gene lists with protein-protein and gene regulatory interaction databases to build signaling, regulatory, and interaction networks for further analysis.
This document discusses various techniques for normalizing gene expression data from microarray experiments, including total intensity normalization, normalization using regression techniques, and normalization using ratio statistics. It also covers calculating distances between gene expression profiles using measures like Euclidean distance and Pearson correlation coefficient. Finally, it examines clustering analysis methods like hierarchical and non-hierarchical clustering that can group genes or samples based on similar expression patterns.
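The two distance measures named above are simple to state; a sketch follows, with Pearson written as a similarity (it is commonly converted to a distance as 1 - r):

```python
import math

def euclidean(x, y):
    """Euclidean distance between two expression profiles."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def pearson_r(x, y):
    """Pearson correlation coefficient between two expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Pearson correlation is scale-invariant, so two co-regulated genes with different absolute expression levels still score r = 1, whereas Euclidean distance would place them far apart — this is why the choice of measure changes the clustering result.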
Two-Stage Eagle Strategy with Differential Evolution Xin-She Yang
The document describes a two-stage optimization strategy called the Eagle Strategy (ES) that combines global and local search algorithms to improve search efficiency. It evaluates applying ES to differential evolution (DE), a popular evolutionary algorithm. ES first uses randomization like Levy flights for global exploration, then switches to DE for intensive local search around promising solutions. The authors validate ES-DE on test functions, finding it requires only 9.7-24.9% of the function evaluations of pure DE. They also apply it to real-world pressure vessel and gearbox design problems, achieving solutions with 14.9-17.7% fewer function evaluations than pure DE.
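A compact sketch of the two-stage idea, under stated assumptions: Lévy-flight steps generated with Mantegna's method for the exploration stage, and a simplified DE/rand/1/bin update for the exploitation stage. Population sizes and rates here are illustrative, not the paper's settings.

```python
import math
import random

def levy_step(beta=1.5):
    """Heavy-tailed step length (Mantegna's algorithm)."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u, v = random.gauss(0, sigma), random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def eagle_strategy_de(f, dim, lo, hi, n=15, outer=3, de_iters=40, F=0.5, CR=0.9):
    clip = lambda x: min(hi, max(lo, x))
    best = None
    for _ in range(outer):
        # Stage 1: global exploration -- scatter a population with Levy flights
        pop = [[clip(random.uniform(lo, hi) + levy_step()) for _ in range(dim)]
               for _ in range(n)]
        # Stage 2: intensive local search with a simplified DE/rand/1/bin
        for _ in range(de_iters):
            for i in range(n):
                a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
                trial = [clip(a[d] + F * (b[d] - c[d])) if random.random() < CR
                         else pop[i][d] for d in range(dim)]
                if f(trial) < f(pop[i]):      # greedy DE selection
                    pop[i] = trial
        stage_best = min(pop, key=f)
        if best is None or f(stage_best) < f(best):
            best = stage_best
    return best
```

The outer loop alternates cheap, far-ranging Lévy exploration with focused DE refinement, which is the mechanism the paper credits for the reduced function-evaluation counts.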
Analysis of Genomic and Proteomic Sequence Using FIR Filter IJMER
Bioinformatics is a field of science that applies techniques from mathematics, informatics, statistics, computer science, artificial intelligence, chemistry, and biochemistry to solve biological problems, usually at the molecular level. Digital Signal Processing (DSP) applications in genomic sequence analysis have received great attention in recent years. DSP principles are used to analyse genomic and proteomic sequences. The DNA sequence is mapped into digital signals in the form of binary indicator sequences, and signal processing techniques such as digital filtering are applied to genomic sequences to identify protein coding regions. The frequency response of genomic sequences is used to solve many optimization problems in science, medicine and other applications. The aim of this paper is to describe a method of generating the Finite Impulse Response (FIR) of a genomic sequence. The same DNA sequence is converted into a proteomic sequence using transcription and translation, and a digital filtering technique (an FIR filter) is applied to obtain the frequency response. The frequency response is the same for both the gene and the proteomic sequence.
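The binary indicator mapping mentioned above is straightforward: each base gets its own 0/1 sequence, which can then be fed to a digital filter or a DFT.

```python
def indicator_sequences(dna):
    """Map a DNA string to four binary indicator sequences, one per base."""
    return {base: [1 if ch == base else 0 for ch in dna.upper()] for base in "ACGT"}

seqs = indicator_sequences("ATGGCA")
# at every position exactly one of the four indicators is 1
```

Filtering each indicator sequence (e.g. with a band-pass FIR around the period-3 component) and combining the outputs is the standard DSP route to highlighting protein-coding regions.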
Particle Swarm Optimization for Gene Cluster Identification Editor IJCATR
An understanding of gene regulation is the most basic need for the classification of genes within DNA. Genes within the DNA are grouped together into clusters, also known as transcription units, for the purposes of constructing and regulating gene expression and synthesizing proteins. This knowledge further contributes essential information for drug design and for determining the protein functions of newly sequenced genomes. It is possible to use diverse biological information across multiple genomes as input to the classification problem. The purpose of this work is to show that Particle Swarm Optimization may provide more efficient classification than other algorithms. To validate the approach, the complete E. coli genome is taken as the benchmark genome.
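A minimal continuous PSO sketch for context; the gene-cluster encoding itself is problem-specific and not shown, and the inertia and acceleration coefficients are common textbook values, not the paper's.

```python
import random

def pso(f, dim, lo, hi, n=20, iters=80, w=0.7, c1=1.5, c2=1.5):
    """Global-best PSO minimising f over the box [lo, hi]^dim."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]                      # personal bests
    gbest = min(pbest, key=f)[:]                     # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i][:]
    return gbest
```

Each particle is pulled towards both its own best position and the swarm's best, which gives PSO the fast, communication-driven convergence that classification applications exploit.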
Survey and Evaluation of Methods for Tissue Classification perfj
This document summarizes and compares several methods for classifying tissue samples using gene expression data. It first describes a general framework for tissue classification involving data evaluation, method selection, cross-validation, and significance testing. It then discusses various gene selection methods for identifying informative genes correlated with phenotypes, including t-tests, Golub's method, Dudoit's method, TNoM, and Park's method. Finally, it outlines several classification methods that can be used to construct predictors for classifying unlabeled samples, including nearest neighbor classification and linear discriminant analysis.
Survey of softwares for phylogenetic analysis Arindam Ghosh
The document discusses the process of phylogenetic analysis using cytochrome c oxidase subunit 1 (COX1) gene sequences from several organisms: human, bovine, zebrafish, pig, and sheep. It provides the COX1 protein sequences for each organism downloaded from UniProt. The sequences will be aligned using Clustal Omega and a phylogenetic tree will be constructed using Clustal W2 to analyze the evolutionary relationships between the organisms.
Similar to Exploring the Solution Space of Sorting by Reversals: A New Approach
Power System State Estimation - A Review IDES Editor
This document provides a review of power system state estimation techniques. It discusses both static and dynamic state estimation algorithms. For static state estimation, it covers weighted least squares, decoupled, and robust estimation methods. Weighted least squares is commonly used but can have numerical instability issues. Decoupled state estimation approximates the gain matrix for faster computation. Robust estimation uses M-estimators and other techniques to handle outliers and bad data. Dynamic state estimation applies Kalman filtering, leapfrog algorithms, and other methods to continuously monitor system states over time.
Artificial Intelligence Technique based Reactive Power Planning Incorporating... IDES Editor
This document summarizes a research paper that proposes using artificial intelligence techniques and FACTS controllers for reactive power planning in real-time power transmission systems. The paper formulates the reactive power planning problem and incorporates flexible AC transmission system (FACTS) devices like static VAR compensators (SVC), thyristor controlled series capacitors (TCSC), and unified power flow controllers (UPFC). Evolutionary algorithms like evolutionary programming (EP) and differential evolution (DE) are applied to find the optimal locations and settings of the FACTS controllers to minimize losses and costs. Simulation results on IEEE 30-bus and 72-bus Indian test systems show that UPFC performs best in reducing losses compared to SVC and TCSC.
Design and Performance Analysis of Genetic based PID-PSS with SVC in a Multi-... IDES Editor
Damping of power system oscillations with the help of the proposed optimal Proportional Integral Derivative Power System Stabilizer (PID-PSS) and Static Var Compensator (SVC)-based controllers is thoroughly investigated in this paper. This study presents robust tuning of PID-PSS and SVC-based controllers using Genetic Algorithms (GA) in multi-machine power systems, considering a detailed model of the generators (model 1.1). The effectiveness of FACTS-based controllers in general, and the SVC-based controller in particular, depends upon their proper location; modal controllability and observability are used to locate the SVC-based controller. The performance of the proposed controllers is compared with a conventional lead-lag power system stabilizer (CPSS) and demonstrated on the 10-machine, 39-bus New England test system. Simulation studies show that the proposed genetic-based PID-PSS with SVC-based controller provides better performance.
Optimal Placement of DG for Loss Reduction and Voltage Sag Mitigation in Radi... IDES Editor
The need to operate the power system economically and with optimum voltage levels has led to increased interest in Distributed Generation. In order to reduce power losses and improve the voltage in the distribution system, distributed generators (DGs) are connected to load buses. To reduce the total power losses in the system, the most important step is to identify the proper locations and sizes of the DGs. This paper presents a new methodology using a population-based metaheuristic, the Artificial Bee Colony algorithm (ABC), for the placement of DGs in radial distribution systems to reduce real power losses, improve the voltage profile and mitigate voltage sags. Loss reduction is an important factor for utility companies because it is directly proportional to company benefits in a competitive electricity market, while meeting better power quality standards is equally important for customer orientation. In this paper an ABC algorithm is developed to achieve these goals together. In order to evaluate the sag mitigation capability of the proposed algorithm, the voltage at voltage-sensitive buses is investigated. An existing 20 kV network has been chosen as the test network, and results for the proposed method in the radial distribution system are compared.
Line Losses in the 14-Bus Power System Network using UPFC IDES Editor
Controlling power flow in modern power systems can be made more flexible by the use of recent developments in power electronics and computing control technology. The Unified Power Flow Controller (UPFC) is a Flexible AC Transmission System (FACTS) device that can control all three system variables, namely line reactance, and the magnitude and phase angle difference of the voltage across the line. The UPFC provides a promising means to control power flow in modern power systems. Essentially, the performance depends on proper control settings, achievable through a power flow analysis program. This paper presents a reliable method to meet these requirements by developing a Newton-Raphson based load flow calculation through which the control settings of the UPFC can be determined for a pre-specified power flow between the lines. The proposed method keeps the Newton-Raphson Load Flow (NRLF) algorithm intact and needs only a little modification in the Jacobian matrix. A MATLAB program has been developed to calculate the control settings of the UPFC and the power flow between the lines after the load flow has converged. Case studies have been performed on the IEEE 5-bus and 14-bus systems to show that the proposed method is effective. These studies indicate that the method maintains the basic NRLF properties, such as fast computational speed, a high degree of accuracy and a good convergence rate.
Study of Structural Behaviour of Gravity Dam with Various Features of Gallery... IDES Editor
The size and shape of an opening in a dam cause stress concentration, as well as stress variation in the rest of the dam cross-section. The gravity method of analysis does not consider the size of the opening or the elastic properties of the dam material. The objective of this study is therefore to apply the Finite Element Method, which accounts for the size of the opening, the elastic properties of the material, and the stress distribution due to the geometric discontinuity in the dam cross-section. Stress concentration inside the dam increases with the opening, which can result in failure of the dam; hence it is necessary to analyse large openings inside the dam. The analysis is carried out by keeping the percentage area of the opening constant while varying its size and shape. For this purpose a section of the Koyna Dam is considered. The dam is defined as a plane strain element in FEM, based on geometry and loading conditions, so a 2D plane strain analysis is carried out. The results obtained are then compared to determine the most efficient way of providing a large opening in a gravity dam.
Assessing Uncertainty of Pushover Analysis to Geometric Modeling IDES Editor
Pushover Analysis is a popular tool for the seismic performance evaluation of existing and new structures. It is a nonlinear static procedure in which monotonically increasing loads are applied to the structure until it can no longer resist further load. The strength of the concrete and steel adopted for the analysis may not be the same in the real structure as constructed, and pushover analysis results are very sensitive to the material model, the geometric model, the location of plastic hinges and, in general, to the procedure followed by the analyst. In this paper an attempt has been made to assess uncertainty in pushover analysis results by considering user-defined hinges, with the frame modeled both as a bare frame and as a frame with the slab modeled as a rigid diaphragm. The uncertain parameters considered include the strength of the concrete, the strength of the steel and the cover to the reinforcement, which are randomly generated and incorporated into the analysis. The results are then compared with experimental observations.
Secure Multi-Party Negotiation: An Analysis for Electronic Payments in Mobile...IDES Editor
This document summarizes and analyzes secure multi-party negotiation protocols for electronic payments in mobile computing. It presents a framework for secure multi-party decision protocols using lightweight implementations. The main focus is on synchronizing security features to avoid agreement manipulation and reduce user traffic. The paper describes negotiation between an auctioneer and bidders, showing multiparty security is better than existing systems. It analyzes the performance of encryption algorithms like ECC, XTR, and RSA for use in the multiparty negotiation protocols.
Selfish Node Isolation & Incentivation using Progressive Thresholds IDES Editor
The problems associated with selfish nodes in MANETs are addressed by a collaborative watchdog approach, which reduces the detection time for selfish nodes and thereby improves the performance and accuracy of watchdogs [1]. Related works make use of credit-based systems, reputation-based mechanisms, and pathrater and watchdog mechanisms to detect such selfish nodes. In this paper we follow a collaborative watchdog approach that reduces the detection time for selfish nodes and also removes such nodes based on progressively assessed thresholds. The thresholds give a node a chance to stop misbehaving before it is permanently deleted from the network; a node passes through several isolation stages before it is permanently removed. A modified version of the AODV protocol is used here, which allows the simulation of selfish nodes in NS2 by adding or modifying log files in the protocol.
Various OSI Layer Attacks and Countermeasure to Enhance the Performance of WS... IDES Editor
Wireless sensor networks are networks with a non-wired infrastructure and a dynamic topology. In the OSI model each layer is prone to various attacks, which degrade the performance of the network. In this paper several attacks on four layers of the OSI model are discussed, and a security mechanism is described to prevent a network-layer attack, the wormhole attack. In a wormhole attack, two or more malicious nodes create a covert channel that attracts traffic towards itself by advertising a low-latency link, and then start dropping and replaying packets in the multi-path route. This paper proposes a promiscuous-mode method to detect and isolate the malicious node during a wormhole attack, using the Ad hoc On-demand Distance Vector routing protocol (AODV) with an omnidirectional antenna. In the implemented methodology, nodes that are not participating in multi-path routing generate an alarm message during a delay, and the malicious node is then detected and isolated from the network. We also note that not only the same kinds of attacks but also the same kinds of countermeasures can appear in multiple layers; for example, misbehavior detection techniques can be applied to almost all the layers discussed.
Responsive Parameter based an AntiWorm Approach to Prevent Wormhole Attack in... IDES Editor
Recent advancements in wireless technology and its widespread deployment have brought remarkable gains in efficiency in the corporate, industrial and military sectors. The increasing popularity and usage of wireless technology is creating a need for more secure wireless ad hoc networks. This paper researches and develops a new protocol that prevents wormhole attacks on an ad hoc network. A few existing protocols detect wormhole attacks, but they require highly specialized equipment not found on most wireless devices. This paper develops a defense against wormhole attacks, an anti-worm protocol based on responsive parameters, that does not require a significant amount of specialized equipment, tight clock synchronization, or GPS dependencies.
Cloud Security and Data Integrity with Client Accountability Framework IDES Editor
This document summarizes a proposed cloud security and data integrity framework that provides client accountability. The framework aims to address issues like lack of user control over cloud data, need for data transparency and tracking, and ensuring data integrity. It proposes using JAR (Java Archive) files for data sharing due to benefits like portability. The framework incorporates client-side verification using MD5 hashing, digital signature-based authentication of JAR files, and use of HMAC to ensure data integrity. It also uses password-based encryption of log files to keep them tamper-proof. The framework is intended to provide both accountability and security for data sharing in cloud environments.
Genetic Algorithm based Layered Detection and Defense of HTTP Botnet IDES Editor
An HTTP botnet uses the HTTP protocol to create a chain of botnets, thereby compromising other systems. By using the HTTP protocol and port number 80, attacks can not only be hidden but can also pass through the firewall without being detected. DPR-based detection leads to better analysis of botnet attacks [3]; however, it provides only probabilistic detection of the attacker and is also time-consuming and error-prone. This paper proposes a Genetic Algorithm based layered approach for detecting as well as preventing botnet attacks. The paper reviews a p2p firewall implementation, which forms the basis of filtering. Performance evaluation is done based on precision, F-value and probability. The layered approach reduces the computation and the overall time requirement [7], and the genetic algorithm promises a low false positive rate.
Enhancing Data Storage Security in Cloud Computing Through Steganography IDES Editor
This document summarizes a research paper that proposes a method for enhancing data security in cloud computing through steganography. The method hides user data in digital images stored on cloud servers. When data needs to be accessed, it is extracted from the images. The document outlines the cloud architecture and security issues addressed. It then describes the proposed system architecture, security model, and data storage and retrieval process. Data is partitioned and hidden in multiple images to improve security. The goal is to prevent unauthorized access to user data stored on cloud servers.
The main tasks of a Wireless Sensor Network
(WSN) are collecting data from its nodes and communicating
this data to the base station (BS). The protocols used for
communication among the WSN nodes, and between the WSN
and the BS, must consider the resource constraints of the nodes:
battery energy, computational capability and memory. WSN
applications involve unattended operation of the network
over an extended period of time, so efficient routing protocols
must be adopted to extend the lifetime of a WSN. The
proposed low-power routing protocol, based on a tree-based
network structure, reliably forwards the measured data towards
the BS using TDMA. An energy consumption analysis of a
WSN using this protocol is also carried out. The network is
found to be energy efficient, with an average duty cycle of
0.7% for the WSN nodes. The OMNeT++ simulation platform,
together with the MiXiM framework, is used.
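The kind of energy analysis described above can be sketched from the reported 0.7% duty cycle. The radio currents and battery capacity below are hypothetical illustration values, not figures from the paper:

```python
def average_current_ma(duty_cycle: float, active_ma: float, sleep_ma: float) -> float:
    """Average current draw for a node awake a fraction `duty_cycle` of each TDMA frame."""
    return duty_cycle * active_ma + (1 - duty_cycle) * sleep_ma

def lifetime_days(battery_mah: float, avg_ma: float) -> float:
    """Ideal battery lifetime (no self-discharge, constant average draw)."""
    return battery_mah / avg_ma / 24

# Hypothetical node: 20 mA when active, 5 uA asleep, 0.7% duty cycle, 2400 mAh battery.
avg = average_current_ma(0.007, 20.0, 0.005)
print(round(avg, 3))  # ~0.145 mA average draw
print(round(lifetime_days(2400, avg)))  # lifetime on the order of two years
```

The point of a sub-1% duty cycle is visible directly: the sleep current barely contributes, so average draw, and hence lifetime, is dominated by `duty_cycle * active_ma`.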
Permutation of Pixels within the Shares of Visual Cryptography using KBRP for... (IDES Editor)
The security of authentication for internet-based
co-banking services should not be exposed to high risk.
Passwords are highly vulnerable to virus attacks owing to
the lack of strongly embedded security methods. To make
passwords more secure, people are generally compelled to
select jumbled character-based passwords, which are not only
less memorable but equally prone to compromise. The use of
multiple distributed shares has been studied as a solution to
the authentication problem, through algorithms based on pixel
thresholding in image processing and visual cryptography,
where a subset of the shares is used to recover the original
image for authentication via a correlation function [1][2]. The
main disadvantage of that work is the plain storage of the
shares; moreover, one of the shares is supplied to the customer,
opening the possibility of misuse by a third party. This paper
proposes a technique for scrambling the pixels within the
shares by key-based random permutation (KBRP) before
authentication is attempted. The total number of shares to
be created depends on the multiplicity of ownership of the
account. This method minimizes the customers' uncertainty
regarding the security, storage and retrieval of their half of
the shares.
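The scrambling step can be sketched as a permutation of pixel positions derived deterministically from a secret key. The derivation below uses a seeded shuffle as a stand-in; the paper's KBRP algorithm constructs its permutation differently, and the key and pixel values are hypothetical:

```python
import random

def key_based_permutation(key: str, n: int) -> list:
    """Derive a deterministic permutation of range(n) from a secret key.
    (A stand-in for KBRP: same key and n always yield the same permutation.)"""
    rng = random.Random(key)
    perm = list(range(n))
    rng.shuffle(perm)
    return perm

def scramble(share_pixels: list, perm: list) -> list:
    """Reorder the share's pixels according to the permutation."""
    return [share_pixels[p] for p in perm]

def unscramble(scrambled: list, perm: list) -> list:
    """Invert the permutation to restore the original pixel order."""
    out = [0] * len(perm)
    for i, p in enumerate(perm):
        out[p] = scrambled[i]
    return out

share = [10, 20, 30, 40, 50]                 # hypothetical share pixels
perm = key_based_permutation("account-key", len(share))
assert unscramble(scramble(share, perm), perm) == share
```

A stored or customer-held share scrambled this way is useless to a third party without the key, which is the property the paper relies on.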
This paper presents a trifocal Rotman lens design
approach. The effects of focal ratio and element spacing on
the performance of the Rotman lens are described. A three-beam
prototype feeding a 4-element antenna array operating in L-band
has been simulated using the RLD v1.7 software. Simulated
results show that the lens has a return loss of
-12.4 dB at 1.8 GHz. The variation of beam-to-array-port phase
error with changes in focal ratio and element spacing has also
been investigated.
Band Clustering for the Lossless Compression of AVIRIS Hyperspectral Images (IDES Editor)
Hyperspectral images can be efficiently compressed
with a linear predictive model, such as the one used in the
SLSQ algorithm. In this paper we exploit this predictive model
on the AVIRIS images by identifying, through an off-line
approach, a common subset of bands that are not spectrally
related to any other band. These bands are not useful as
prediction references for the SLSQ 3-D predictive model and
must be encoded with other prediction strategies, which
consider only spatial correlation. We obtained this subset by
clustering the AVIRIS bands via the clustering-by-compression
approach. The main result of this paper is the list of bands,
for the AVIRIS images, that are unrelated to the others. The
clustering trees obtained for AVIRIS, and the relationships
among bands they depict, are also an interesting starting point
for future research.
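The clustering-by-compression approach mentioned above is built on the normalized compression distance (NCD), which measures similarity by how much better two objects compress together than apart. A minimal sketch with `zlib` as the compressor (the paper may use a different compressor, and the byte strings are illustrative, not AVIRIS data):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for related data, near 1 for unrelated."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"spectral band sample " * 50   # two highly similar "bands"
b = b"spectral band sample " * 50
c = bytes(range(256)) * 4           # dissimilar, poorly compressible data
assert ncd(a, b) < ncd(a, c)
```

Computing NCD between every pair of bands yields a distance matrix from which a clustering tree can be built; bands that end up far from every cluster are the ones flagged as unrelated.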
Microelectronic Circuit Analogous to Hydrogen Bonding Network in Active Site ... (IDES Editor)
A microelectronic circuit of block-elements
functionally analogous to two hydrogen bonding networks is
investigated. The hydrogen bonding networks are extracted
from the β-lactamase protein and are formed in its active site.
Each hydrogen bond of the network is described, in an
equivalent electrical circuit, by a three- or four-terminal
block-element. Each block-element is coded in Matlab. Static
and dynamic analyses are performed. The resulting
microelectronic circuit analogous to the hydrogen bonding
network operates as a current mirror, a sine pulse source, a
triangular pulse source and a signal modulator.
Texture Unit based Monocular Real-world Scene Classification using SOM and KN... (IDES Editor)
In this paper a method is proposed to discriminate
real-world scenes into natural and manmade scenes of similar
depth. The global roughness of a scene image varies as a
function of image depth: increasing image depth increases
roughness in manmade scenes, whereas natural scenes exhibit
smoother behavior at greater image depth. This particular
arrangement of pixels in the scene structure is well captured
by the local texture information at a pixel and its neighborhood.
Our proposed method analyses the local texture information of
a scene image using a texture unit matrix. For the final
classification we use both supervised and unsupervised
learning, with a K-Nearest Neighbor classifier (KNN) and a
Self-Organizing Map (SOM) respectively. The technique's very
low computational complexity makes it useful for online
classification.
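The texture unit representation mentioned above codes each of the 8 neighbors of a pixel as 0, 1 or 2 depending on whether it is darker than, equal to, or brighter than the center, giving a ternary number in [0, 6560]. A minimal sketch of that coding, assuming a clockwise neighbor ordering (the paper's exact ordering convention may differ):

```python
def texture_unit(neighborhood) -> int:
    """Texture unit number of a 3x3 pixel neighborhood (ternary coding vs. the center)."""
    center = neighborhood[1][1]
    # Neighbor coordinates, clockwise from top-left.
    coords = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    ntu = 0
    for k, (r, c) in enumerate(coords):
        v = neighborhood[r][c]
        code = 0 if v < center else (1 if v == center else 2)
        ntu += code * 3 ** k  # base-3 digit for neighbor k
    return ntu  # value in [0, 6560]

flat = [[5, 5, 5], [5, 5, 5], [5, 5, 5]]  # perfectly smooth patch: all neighbors equal
print(texture_unit(flat))  # 3280 = sum of 1 * 3**k for k in 0..7
```

The texture unit matrix of an image is then the map of this value over all interior pixels; its distribution is what separates the rough manmade scenes from the smoother natural ones before the KNN/SOM stage.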