This document proposes a new hybrid optimization algorithm called ACO-PSO for solving dynamic travelling salesman problems (DTSP). It combines ant colony optimization (ACO) and particle swarm optimization (PSO). ACO is used to find paths between cities, while PSO is used to tune the ACO parameters and balance global and local search. The algorithm is tested on DTSP and shows good performance, finding close-to-optimal solutions. Metaheuristic algorithms like ACO and PSO are well-suited for combinatorial optimization problems like DTSP due to their flexibility, speed and ability to find global solutions.
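The summary above leaves the coupling between the two metaheuristics at a high level. A minimal sketch of one plausible reading is given below, in which each PSO particle encodes an ACO parameter set (alpha, beta, evaporation rate rho) and its fitness is the best tour length ACO achieves with those parameters on the current city layout; the city coordinates, swarm size, iteration counts, and parameter bounds are illustrative assumptions, not values from the paper. In a dynamic setting, the distance matrix would simply be rebuilt and the swarm re-evaluated whenever the cities change.

```python
import math
import random

# Illustrative snapshot of a DTSP instance: random city coordinates (assumed).
random.seed(1)
CITIES = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(15)]
N = len(CITIES)
DIST = [[math.dist(a, b) for b in CITIES] for a in CITIES]

def aco_tour_length(alpha, beta, rho, n_ants=10, n_iter=20):
    """Basic ant system with the given parameters; returns the best tour length found."""
    tau = [[1.0] * N for _ in range(N)]                  # pheromone matrix
    best = float("inf")
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            start = random.randrange(N)
            tour, unvisited = [start], set(range(N)) - {start}
            while unvisited:
                i = tour[-1]
                cand = list(unvisited)
                weights = [(tau[i][j] ** alpha) * ((1.0 / DIST[i][j]) ** beta) for j in cand]
                j = random.choices(cand, weights=weights)[0]
                tour.append(j)
                unvisited.remove(j)
            length = sum(DIST[tour[k]][tour[(k + 1) % N]] for k in range(N))
            tours.append((length, tour))
            best = min(best, length)
        for i in range(N):                                # evaporation ...
            for j in range(N):
                tau[i][j] *= 1.0 - rho
        for length, tour in tours:                        # ... then deposit
            for k in range(N):
                a, b = tour[k], tour[(k + 1) % N]
                tau[a][b] += 1.0 / length
                tau[b][a] += 1.0 / length
    return best

# PSO over the ACO parameter vector (alpha, beta, rho); bounds are assumptions.
LOW, HIGH = [0.5, 1.0, 0.1], [3.0, 5.0, 0.9]
swarm = [[random.uniform(l, h) for l, h in zip(LOW, HIGH)] for _ in range(5)]
vel = [[0.0] * 3 for _ in swarm]
pbest = [p[:] for p in swarm]
pbest_val = [aco_tour_length(*p) for p in swarm]
g = min(range(len(swarm)), key=lambda k: pbest_val[k])
gbest, gbest_val = pbest[g][:], pbest_val[g]

for _ in range(8):
    for k, p in enumerate(swarm):
        for d in range(3):                                # standard PSO velocity/position update
            vel[k][d] = (0.7 * vel[k][d]
                         + 1.5 * random.random() * (pbest[k][d] - p[d])
                         + 1.5 * random.random() * (gbest[d] - p[d]))
            p[d] = min(max(p[d] + vel[k][d], LOW[d]), HIGH[d])
        val = aco_tour_length(*p)
        if val < pbest_val[k]:
            pbest[k], pbest_val[k] = p[:], val
            if val < gbest_val:
                gbest, gbest_val = p[:], val

print("tuned (alpha, beta, rho):", [round(x, 2) for x in gbest],
      "best tour length:", round(gbest_val, 1))
```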
Artificial Intelligence in Robot Path Planning (iosrjce)
IOSR Journal of Computer Engineering (IOSR-JCE) is a double-blind peer-reviewed international journal that provides rapid publication (within a month) of articles in all areas of computer engineering and its applications. The journal welcomes high-quality papers on theoretical developments and practical applications in computer technology. Original research papers, state-of-the-art reviews, and high-quality technical notes are invited for publication.
A Genetic Algorithm on Optimization Test Functions (IJMERJOURNAL)
ABSTRACT: Genetic Algorithms (GAs) have become increasingly useful over the years for solving combinatorial problems. Though they are generally accepted to be good performers among metaheuristic algorithms, most works have concentrated on the application of GAs rather than their theoretical justification. In this paper, we examine and justify the suitability of Genetic Algorithms for solving complex, multi-variable and multi-modal optimization problems. To achieve this, a simple Genetic Algorithm was used to solve four standard, complex optimization test functions, namely the Rosenbrock, Schwefel, Rastrigin and Shubert functions. These functions are benchmarks for testing how well an optimization procedure approaches a global optimum. We show that the method converges quickly to the global optima and that the optimal values for the Rosenbrock, Rastrigin, Schwefel and Shubert functions are zero (0), zero (0), -418.9829 and -14.5080, respectively.
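Two of the four benchmarks have compact closed forms; a minimal sketch of the usual d-dimensional Rosenbrock and Rastrigin definitions is given below for reference (the GA itself, and the Schwefel and Shubert variants used in the paper, are not reproduced). In these standard formulations both functions have a global minimum of 0, at (1, ..., 1) and at the origin respectively.

```python
import math

def rosenbrock(x):
    """Standard Rosenbrock function; global minimum 0 at x = (1, ..., 1)."""
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))

def rastrigin(x):
    """Standard Rastrigin function; global minimum 0 at the origin."""
    return 10.0 * len(x) + sum(xi ** 2 - 10.0 * math.cos(2.0 * math.pi * xi) for xi in x)

print(rosenbrock([1.0, 1.0, 1.0]), rastrigin([0.0, 0.0, 0.0]))   # both print 0.0
```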
MARKOV CHAIN AND ADAPTIVE PARAMETER SELECTION ON PARTICLE SWARM OPTIMIZER (ijsc)
Particle Swarm Optimizer (PSO) is a complex stochastic process, so analysing the stochastic behavior of PSO is not easy. The choice of parameters plays an important role, since it is critical to the performance of PSO. As far as our investigation is concerned, most of the relevant research is based on computer simulations and few studies take a theoretical approach. In this paper, a theoretical approach is used to investigate the behavior of PSO. Firstly, a state of PSO is defined which contains all the information needed for the future evolution. The memory-less property of this state is then investigated and proved. Secondly, by using the concept of the state and suitably dividing the whole PSO process into a countable number of stages (levels), a stationary Markov chain is established. Finally, based on the properties of a stationary Markov chain, an adaptive method for parameter selection is proposed.
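The abstract does not restate the update rules whose parameters such an adaptive scheme would select; for reference, the standard PSO velocity and position updates, with inertia weight $w$, acceleration coefficients $c_1$, $c_2$, and $r_1$, $r_2$ drawn uniformly from $[0,1]$, are:

```latex
\begin{aligned}
v_i^{t+1} &= w\, v_i^{t} + c_1 r_1 \bigl(p_i^{\mathrm{best}} - x_i^{t}\bigr) + c_2 r_2 \bigl(g^{\mathrm{best}} - x_i^{t}\bigr),\\
x_i^{t+1} &= x_i^{t} + v_i^{t+1}.
\end{aligned}
```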
Improved optimization of numerical association rule mining using hybrid parti... (IJECEIAES)
Particle Swarm Optimization (PSO) has been applied to solve optimization problems in various fields, such as Association Rule Mining (ARM) of numerical problems. However, PSO often becomes trapped in local optima. Consequently, the results do not represent the overall optimum solutions. To address this limitation, this study aims to combine PSO with the Cauchy distribution (PARCD), which is expected to increase the global optimal value of the expanded search space. Furthermore, this study uses multiple objective functions, i.e., support, confidence, comprehensibility, interestingness and amplitude. In addition, the proposed method was evaluated using benchmark datasets, such as the Quake, Basket ball, Body fat, Pollution, and Bolt datasets. Evaluation results were compared to the results obtained by previous studies. The results indicate that the overall values of the objective functions obtained using the proposed PARCD approach are satisfactory.
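Of the five objective measures listed, support and confidence have standard definitions in association rule mining; for a rule X => Y over N transactions they are given below (comprehensibility, interestingness and amplitude follow paper-specific formulas that the summary does not reproduce).

```latex
\operatorname{supp}(X \Rightarrow Y) = \frac{\lvert \{\, t : X \cup Y \subseteq t \,\} \rvert}{N},
\qquad
\operatorname{conf}(X \Rightarrow Y) = \frac{\operatorname{supp}(X \cup Y)}{\operatorname{supp}(X)}.
```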
Comparison between the genetic algorithms optimization and particle swarm opt... (IAEME Publication)
The document compares the genetic algorithms optimization and particle swarm optimization methods for designing close range photogrammetry networks. It presents the genetic algorithm and particle swarm optimization as two popular meta-heuristic algorithms inspired by natural evolution and collective animal behavior, respectively. The document develops mathematical models representing the genetic algorithm and particle swarm optimization for close range photogrammetry network design and evaluates them in a test field to reinforce the theoretical aspects.
AUTOMATIC GENERATION AND OPTIMIZATION OF TEST DATA USING HARMONY SEARCH ALGOR... (csandit)
Software testing is a primary phase performed during software development; it is carried out by executing a sequence of test inputs and comparing against expected outputs. The Harmony Search (HS) algorithm is based on the improvisation process of music. In comparison to other algorithms, HS has gained popularity and shown strength in the field of evolutionary computation. When musicians compose a harmony, they try different possible combinations of pitches stored in harmony memory, and optimization proceeds by adjusting the input pitches to generate a better harmony. The test case generation process is used to identify test cases with resources and also identifies critical domain requirements. In this paper, the role of the Harmony Search meta-heuristic technique in generating random test data, and in optimizing that test data, is analyzed. Test data are generated and optimized by applying Harmony Search to a case study, a withdrawal task in a bank ATM. It is observed that the algorithm generates suitable test cases as well as test data; the paper also gives brief details of the Harmony Search method as used for test data generation and optimization.
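The summary describes Harmony Search only in musical terms; below is a minimal generic sketch of its mechanics (harmony memory, the memory-considering rate HMCR, and the pitch adjustment rate PAR), applied to a toy quadratic rather than to ATM test-data generation, so the objective function and all parameter values are assumptions.

```python
import random

def harmony_search(objective, dim, bounds, hms=10, hmcr=0.9, par=0.3,
                   bandwidth=0.1, iterations=500):
    """Generic Harmony Search: evolve a memory of candidate solutions (harmonies)."""
    low, high = bounds
    memory = [[random.uniform(low, high) for _ in range(dim)] for _ in range(hms)]
    scores = [objective(h) for h in memory]
    for _ in range(iterations):
        new = []
        for d in range(dim):
            if random.random() < hmcr:                 # take a pitch from memory...
                value = random.choice(memory)[d]
                if random.random() < par:              # ...and possibly adjust it
                    value += random.uniform(-bandwidth, bandwidth)
            else:                                      # otherwise improvise a new pitch
                value = random.uniform(low, high)
            new.append(min(max(value, low), high))
        new_score = objective(new)
        worst = scores.index(max(scores))
        if new_score < scores[worst]:                  # replace the worst harmony
            memory[worst], scores[worst] = new, new_score
    best = scores.index(min(scores))
    return memory[best], scores[best]

# Toy usage: minimise a sum of squares (a stand-in for a test-data fitness function).
solution, value = harmony_search(lambda x: sum(v * v for v in x), dim=3, bounds=(-5, 5))
print(solution, value)
```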
Application of Genetic Algorithm and Particle Swarm Optimization in Software ... (IOSR Journals)
This document discusses using genetic algorithms and particle swarm optimization techniques to optimize software testing by finding the most error-prone paths in a program. It begins by providing background on software testing and the need for automated techniques. It then describes how genetic algorithms and particle swarm optimization work as meta-heuristic search techniques that can be applied to the problem of generating optimal test cases. The document presents pseudocode for each algorithm and provides a sample implementation of genetic algorithms to optimize a mathematical function. It similarly provides an overview of implementing particle swarm optimization to minimize another mathematical function. The goal is to generate test cases using these algorithms and do a comparative study of their effectiveness.
The document proposes a particle swarm inspired cuckoo search algorithm to improve the balance of exploration and exploitation in the standard cuckoo search algorithm. It adds neighborhood information from the global best solutions to increase diversity and uses two new search strategies with a random probability rule to balance exploration and exploitation. The algorithm is tested on 30 benchmark functions and two real-world problems, and shows better performance than the standard cuckoo search, particle swarm optimization, and other state-of-the-art algorithms.
A comprehensive review on hybrid network traffic prediction model (IJECEIAES)
Network traffic is a typical nonlinear time series. As such, traditional linear and nonlinear models are inadequate to describe the multi-scale characteristics of traffic, thus compromising the prediction accuracy. Therefore, the research to date has tended to focus on hybrid models rather than the traditional linear and non-linear ones. Generally, a hybrid model adopts two or more methods as combined modelling to analyze and then predict the network traffic. Against this backdrop, this paper will review past research conducted on hybrid network traffic prediction models. The review concludes with a summary of the strengths and limitations of existing hybrid network prediction models which use optimization and decomposition techniques, respectively. These two techniques have been identified as major contributing factors in constructing a more accurate and faster-responding hybrid network traffic prediction model.
A Heuristic Approach for optimization of Non Linear process using Firefly Alg... (IJERA Editor)
A comparison study of the Firefly Algorithm (FA) and the Bacterial Foraging Algorithm (BFO) is carried out by applying them to a nonlinear pH neutralization process. In process control engineering, the Proportional-Integral-Derivative (PID) controller tuning parameters decide the performance of the controller and thus the performance of the plant. The FA and BFO algorithms are applied to obtain the optimum values of the controller parameters. Performance indicators such as servo response and regulatory response tests are carried out to evaluate the efficiency of the heuristic-algorithm-based controllers. The error minimization criteria, namely Integral Absolute Error (IAE), Integral Square Error (ISE), Integral Time Square Error (ITSE) and Integral Time Absolute Error (ITAE), together with the time-domain specifications rise time, peak overshoot and settling time, are considered for the study of controller performance. The study indicates that the FA-tuned PID controller provides marginally better set point tracking, load disturbance rejection, time-domain specifications and error minimization for the nonlinear pH neutralization process compared to the BFO-tuned PID controller.
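For reference, the error criteria named above have the standard integral definitions, with $e(t)$ the deviation between setpoint and measured output over the evaluation horizon $T$:

```latex
\mathrm{IAE}=\int_0^{T}\lvert e(t)\rvert\,dt,\qquad
\mathrm{ISE}=\int_0^{T} e(t)^{2}\,dt,\qquad
\mathrm{ITAE}=\int_0^{T} t\,\lvert e(t)\rvert\,dt,\qquad
\mathrm{ITSE}=\int_0^{T} t\,e(t)^{2}\,dt.
```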
This document discusses using particle swarm optimization (PSO) to design optimal close-range photogrammetry networks. PSO is introduced as a heuristic optimization algorithm inspired by bird flocking behavior that can be used to solve complex optimization problems. The document then provides an overview of close-range photogrammetry network design and the four design stages. It explains that PSO will be used to optimize the first stage of determining optimal camera station positions. Mathematical models of PSO for close-range photogrammetry network design are developed. Experimental tests are carried out to develop a PSO algorithm that can determine optimum camera positions and evaluate the accuracy of the developed network.
Machine Learning and Model-Based Optimization for Heterogeneous Catalyst Desi... (Ichigaku Takigawa)
2nd ICReDD International Symposium—Toward Interdisciplinary Research Guided by Theory and Calculation
Nov. 27 (wed) - Nov. 29 (fri), 2019
https://www.icredd.hokudai.ac.jp/event/1229
AUTOMATED INFORMATION RETRIEVAL MODEL USING FP GROWTH BASED FUZZY PARTICLE SW... (ijcseit)
Mining relevant facts from the web at the time of need is a difficult task. Research in diverse fields is fine-tuning methodologies toward this goal, extracting the information most relevant to a user's search query. The methodology proposed in this paper finds ways to ease the search complexity, tackling the main issues that hinder the performance of traditional approaches. It finds all possible semantically relatable frequent sets with the FP-Growth algorithm; the outcome then feeds a bio-inspired Fuzzy PSO that finds the optimal attractor points around which the web documents are clustered, meeting the requirements of the search query without losing relevance. On the whole, the proposed system optimizes an objective function that minimizes intra-cluster differences and maximizes inter-cluster distances while retaining all possible relationships with the search context. The major contribution is that the system finds all possible combinations matching the user search transaction, thereby making the results more meaningful. These relatable sets form the set of particles for both Fuzzy Clustering and PSO, so the method remains unbiased and naturally accommodates new additions that follow the herd behaviour. Evaluations reveal that the proposed methodology fares well as an optimized and effective enhancement over conventional approaches.
Re-Mining Association Mining Results Through Visualization, Data Envelopment ... (ertekg)
Download link > https://ertekprojects.com/gurdal-ertek-publications/blog/re-mining-association-mining-results-through-visualization-data-envelopment-analysis-and-decision-trees/
Re-mining is a general framework which suggests the execution of additional data mining steps based on the results of an original data mining process. This study investigates the multi-faceted re-mining of association mining results, develops and presents a practical methodology, and shows the applicability of the developed methodology through real world data. The methodology suggests re-mining using data visualization, data envelopment analysis, and decision trees. Six hypotheses, regarding how re-mining can be carried out on association mining results, are answered in the case study through empirical analysis.
The Effect of Genetic Algorithm Parameters Tuning for Route Optimization in T... (Muhammad Irfan Kemal)
This study analyzed the effect of genetic algorithm parameter tuning on route optimization for the travelling salesman problem. It used a full factorial design to analyze the effects of population size, crossover probability, mutation probability, and number of iterations on distribution mileage. Testing showed the factors and some interactions significantly affected mileage. The combination of population=90, crossover probability=1.00, mutation probability=0.010, and iterations=800 generated the shortest mean route. This combination was concluded to be the best for optimizing distribution routes in this context.
A hybrid constructive algorithm incorporating teaching-learning based optimiz... (IJECEIAES)
The document describes a hybrid algorithm that combines a modified multiple operations using statistical tests (MMOST) constructive algorithm with an improved teaching-learning based optimization (ITLBO) algorithm for neural network training. The hybrid algorithm simultaneously optimizes the neural network structure and weights. The MMOST algorithm constructs different network structures, while the ITLBO algorithm finds the optimal weights for each structure. The hybrid algorithm, called MCO-ITLBO, is tested on classification and time series prediction problems and is shown to outperform other algorithms in terms of error rates and network complexity. Experimental results demonstrate that the MCO-ITLBO algorithm provides better performance than algorithms using only constructive or training methods.
Text documents clustering using modified multi-verse optimizer (IJECEIAES)
In this study, a multi-verse optimizer (MVO) is utilised for the text document clustering (TDC) problem. TDC is treated as a discrete optimization problem, and an objective function based on the Euclidean distance is applied as similarity measure. TDC is tackled by the division of the documents into clusters; documents belonging to the same cluster are similar, whereas those belonging to different clusters are dissimilar. MVO, which is a recent metaheuristic optimization algorithm established for continuous optimization problems, can intelligently navigate different areas in the search space and search deeply in each area using a particular learning mechanism. The proposed algorithm is called MVOTDC, and it adopts the convergence behaviour of MVO operators to deal with discrete, rather than continuous, optimization problems. For evaluating MVOTDC, a comprehensive comparative study is conducted on six text document datasets with various numbers of documents and clusters. The quality of the final results is assessed using precision, recall, F-measure, entropy, accuracy, and purity measures. Experimental results reveal that the proposed method performs competitively in comparison with state-of-the-art algorithms. Statistical analysis is also conducted and shows that MVOTDC can produce significant results in comparison with three well-established methods.
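The summary states only that the objective is Euclidean-distance based; one typical formulation (an assumption, since the exact function is not quoted) assigns each document vector to a cluster and minimises the total within-cluster distance to the cluster centroids:

```latex
\min \sum_{k=1}^{K} \sum_{d_i \in C_k} \bigl\lVert d_i - c_k \bigr\rVert_2 ,
\qquad
c_k = \frac{1}{\lvert C_k \rvert} \sum_{d_i \in C_k} d_i .
```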
In the present day, a huge amount of data is generated every minute and transferred frequently. Although the data is sometimes static, most commonly it is dynamic and transactional; newly generated data is constantly added to the existing data. To discover knowledge from this incremental data, one approach is to run the algorithm repeatedly on the modified data sets, which is time-consuming. To analyze the datasets properly, the construction of an efficient classifier model is also necessary; the objective of such a classifier is to assign unlabeled data to appropriate classes. The paper proposes a dimension reduction algorithm that can be applied in a dynamic environment to generate a reduced attribute set as a dynamic reduct, and an optimization algorithm that uses the reduct to build the corresponding classification system. The method analyzes the new dataset when it becomes available and modifies the reduct accordingly to fit the entire dataset, from which interesting optimal classification rule sets are generated. The concepts of discernibility relation, attribute dependency and attribute significance from Rough Set Theory are integrated for the generation of the dynamic reduct set, and optimal classification rules are selected using a PSO method, which not only reduces the complexity but also helps to achieve higher accuracy of the decision system. The proposed method has been applied to benchmark datasets collected from the UCI repository; dynamic reducts are computed and optimal classification rules are generated from them. Experimental results show the efficiency of the proposed method.
Resource Allocation Using Metaheuristic Search (csandit)
This document discusses using metaheuristic search techniques to solve resource allocation and scheduling problems that are common in software development projects. It evaluates the performance of three algorithms - simulated annealing, tabu search, and genetic algorithms - on test problems representative of resource constrained project scheduling problems (RCPSP). The experimental results found that all three metaheuristics can solve such problems effectively, with genetic algorithms performing slightly better overall than the other two techniques.
1) Molecular modeling techniques such as molecular mechanics, quantum mechanics, and energy minimization methods are used in computer-aided drug design to understand drug-receptor interactions and design new drug molecules.
2) The goal of target-based drug design is to identify or create novel molecules that bind to a selected target and elicit a biological response through techniques like molecular docking, de novo design, and virtual screening.
3) Computer-aided drug design uses molecular modeling to represent molecules in 3D and relate their structure and conformation to energy through mathematical equations in order to optimize properties and design new drugs.
Using particle swarm optimization to solve test functions problems (riyaniaes)
In this paper, benchmark functions are used to evaluate and check the particle swarm optimization (PSO) algorithm. The functions utilized are two-dimensional but are selected with different difficulties and different models. To demonstrate the capability of PSO, it is compared with a genetic algorithm (GA); the two algorithms are compared in terms of objective function values and standard deviation. Several runs were performed to obtain convincing results, the parameters were chosen properly, and the Matlab software was used. The suggested algorithm can solve engineering problems of different dimensions and outperforms the others in terms of accuracy and speed of convergence.
Improvement of genetic algorithm using artificial bee colony (journalBEEI)
This document proposes using an artificial bee colony algorithm to improve a genetic algorithm. It does this by generating the initial population for the genetic algorithm rather than using random generation. The proposed method is tested on random number generation and the travelling salesman problem. For random number generation, five statistical tests are used to evaluate fitness, with the goal of generating random numbers that pass all tests. For the travelling salesman problem, fitness is based on minimizing the total distance travelled. The results show the proposed method performs better than the traditional genetic algorithm in terms of mean iterations, execution time, error rate, and finding the shortest route.
The location-routing problem is a relatively new branch of logistics systems. Its objective is to determine suitable locations for constructing distribution warehouses and proper transportation routing from the warehouses to the customers. In this study, the location-routing problem is investigated considering a fuzzy service time window for each customer. Another important issue in this regard is the existence of congested periods during the service time while distributing goods to the customers; this causes delays in serving customers and imposes additional costs on the distribution system. Thus, we provide a mathematical model for designing an optimal distribution system. Since the vehicle location-routing problem is NP-hard, a solution method using a genetic meta-heuristic algorithm was developed, and the optimal servicing sequence for the vehicles and the optimal locations for the warehouses were determined through an example.
This document discusses a statistical approach for classifying and identifying DDoS attacks using the UCLA dataset. It proposes extracting features from network traffic such as packet count, average packet size, time interval variance, and packet size variance. A packet classification algorithm first classifies packets as normal or attacks. For uncertain cases, a K-NN classifier is used. Then the types of DDoS attacks, including flooding and scanning attacks, are identified based on the feature values. The proposed approach is evaluated using the UCLA dataset and shows mathematical calculations for feature extraction. In conclusion, the statistical approach and packet classification algorithm are effective for classifying common DDoS flooding and scanning attacks.
This document describes the construction and selection of single sampling quick switching variables systems for given control limits that involve minimum sum of risks. It provides the procedure for finding the single sampling quick switching variables system that has the minimum sum of producer's and consumer's risk for a specified acceptable quality level and limiting quality level. A table is constructed that can be used to select a quick switching variables sampling system for given values of AQL and LQL that has the minimum sum of risks. The document also discusses how to design a quick switching variables sampling system with an unknown standard deviation that involves minimum sum of risks.
The document discusses web usage mining, which involves automatically discovering patterns in how users access and interact with web pages on a website by analyzing web server log files. It describes the three main stages of the web usage mining process: data collection and preprocessing, pattern discovery, and pattern analysis. In the preprocessing stage, user access data is cleaned and organized into user sessions. Statistical and machine learning algorithms are then used to find hidden patterns in user behavior. Discovered patterns can be used by applications like recommendation engines. The document provides details on gathering and preprocessing usage data, including identifying unique users and constructing user sessions from server logs. It also discusses applying sequential pattern mining algorithms to discover frequent traversal patterns between pages within user sessions.
This document summarizes a study on automatically detecting boundaries and regions of interest in ultrasound images of focal liver lesions. The researchers used texture analysis and gradient vector flow snakes to extract boundaries after reducing speckle noise. They tested several noise filters and found median filtering worked best, achieving the highest PSNR. Texture analysis via gray-level co-occurrence matrix extraction detected regions more accurately than range or standard deviation filters. Morphological operations and seed point determination were then used to generate the final region of interest. The proposed automatic method facilitates ultrasound image segmentation and analysis of focal liver lesions.
This document presents a human identification system using gait recognition. The system first detects moving subjects in video sequences and extracts silhouettes using background subtraction. It then calculates motion parameters like joint angles and gait velocity from the silhouettes using Speeded Up Robust Features (SURF) descriptors. These motion parameters are classified using a Meta-sample based sparse representation method, achieving an overall classification rate of 94.6782% on a test dataset. The system provides view-invariant human identification through gait analysis.
This document discusses decision strategies for vertical handovers in heterogeneous wireless networks. It begins by introducing the concepts of vertical handovers and heterogeneous networks. It then discusses some key aspects of handover management including the three phase process (information gathering, decision, execution), types of handovers, and control mechanisms. Several vertical handover decision strategies are then summarized, including those based on decision functions, user-centric approaches, and multiple attribute decision making. The strategies aim to select the optimal network by evaluating different criteria like network conditions, user preferences, quality of service, and applying weighting and algorithms. The document provides an overview of recent research on improving handover decisions between different wireless technologies.
1) The document discusses a system called Web Gate Keeper that provides intrusion prevention for multi-tier web applications. It tracks user sessions to control access between the web server and database server.
2) Previously, intrusion prevention systems were developed separately for web servers and database servers, but this system aims to prevent intrusions across both simultaneously through session tracking and control.
3) The system architecture includes server 1 for session validation and tracking, and servers 2 and 3 host the actual web application and restrict database access only to those servers.
This document discusses and compares signature-based and behavior-based anti-malware approaches. Signature-based detection identifies malware by matching patterns in software to known malware signatures but is susceptible to evasion and cannot detect new malware. Behavior-based detection monitors program behaviors and flags anomalous behaviors as potentially malicious, but it can produce false positives and be evaded through mimicry attacks. The document also describes specification-based monitoring, a behavior-based technique that mediates program events according to security policies.
This document discusses various techniques for image retrieval, including text-based, content-based, and hybrid approaches. Content-based image retrieval (CBIR) extracts visual features like color, texture, shape from images and is able to retrieve similar images to a query image. CBIR systems segment images, extract features, search databases, and return results. CBIR has advantages over text-based retrieval but challenges remain around the semantic gap between low-level features and high-level concepts. The document also discusses evaluating retrieval performance and promising future research directions like reducing the semantic gap.
This document summarizes research on palmprint identification. It begins by introducing palmprint biometrics and principal line features. It then summarizes several existing approaches that extract principal lines using techniques like finite radon transform, gradient images, and morphological operators. The proposed approach is described which uses Canny edge detection to extract principal lines based on edge direction. It preprocesses images, applies Canny edge detection, divides the output into blocks to generate templates, and performs matching. Experimental results on a public database achieve an accuracy of 86% for personal identification.
A NEW APPROACH IN DYNAMIC TRAVELING SALESMAN PROBLEM: A HYBRID OF ANT COLONY ... (ijmpict)
Nowadays, swarm intelligence-based algorithms are widely used to optimize the dynamic traveling salesman problem (DTSP). In this paper, we use a mixed method of Ant Colony Optimization (ACO) and gradient descent to optimize the DTSP, which differs from the standard ACO algorithm in its evaporation rate and innovative data. This approach prevents premature convergence, escapes local optima, and makes it possible for the algorithm to find better solutions. The proposed combination of gradient descent and ACO shows significantly improved route optimization in comparison to some former methods.
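The abstract singles out the evaporation rate as the point of departure from standard ACO; for context, in the standard ant system the city-selection probability and pheromone update (with evaporation rate $\rho$) are given below, while the gradient-descent modification itself is not specified in the abstract:

```latex
p_{ij}^{k} = \frac{[\tau_{ij}]^{\alpha}\,[\eta_{ij}]^{\beta}}
                  {\sum_{l \in \mathcal{N}_i^{k}} [\tau_{il}]^{\alpha}\,[\eta_{il}]^{\beta}},
\qquad
\tau_{ij} \leftarrow (1-\rho)\,\tau_{ij} + \sum_{k} \Delta\tau_{ij}^{k},
\qquad
\Delta\tau_{ij}^{k} = \frac{Q}{L_{k}},
```

where $\eta_{ij} = 1/d_{ij}$ is the heuristic visibility, $L_k$ is the length of ant $k$'s tour, and $Q$ is a constant.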
AN IMPROVED MULTIMODAL PSO METHOD BASED ON ELECTROSTATIC INTERACTION USING NN... (ijaia)
In this paper, an improved multimodal optimization (MMO) algorithm, called LSEPSO, is proposed. LSEPSO combines the Electrostatic Particle Swarm Optimization (EPSO) algorithm with a local search method and makes some modifications to them. It has been shown to improve the global and local optima finding ability of the algorithm. The algorithm uses a modified local search to improve each particle's personal best, using n nearest neighbours instead of the single nearest neighbour. Then, by creating n new points between each particle and its n nearest particles, it tries to find a point that can replace the particle's personal best. This method prevents particle attenuation and stops neighbours from simply following a specific particle. Tests performed on a number of benchmark functions clearly demonstrate that the improved algorithm is able to solve MMO problems and outperforms the other algorithms tested in this article.
Convergence tendency of genetic algorithms and artificial immune system in so... (ijcsity)
Advances in evolutionary algorithms (EAs) and intelligent optimization methods have brought major progress in solving optimization problems. The application of evolutionary algorithms is not limited to combinatorial optimization; it extends broadly to continuous optimization as well. In this paper we analyze the Genetic Algorithm (GA) and the Artificial Immune System (AIS) algorithm, both of which are able to escape local optima and speed up convergence to the global optimum, and we study their application to Solving Continuous Optimization Functions (SCOFs) to show their efficiency. Because SCOFs involve many variables and high-dimensional spaces, classic optimization methods are generally inefficient and costly; in other words, classic methods applied to SCOFs usually end in a local optimum. A possible remedy is to use EAs, which have a high probability of reaching the global optimum. The results in this paper show that GA is more efficient than AIS at reaching the optimum in SCOFs.
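For readers who want to reproduce this kind of experiment, a compact real-coded GA on the Rastrigin benchmark is sketched below; the operators (truncation selection, blend crossover, Gaussian mutation) and the parameter values are ordinary textbook choices, not the ones used in the paper.

```python
import numpy as np

def rastrigin(x):
    """Standard Rastrigin benchmark; global minimum 0 at the origin."""
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def simple_ga(f, dim=10, pop_size=60, gens=300, bound=5.12, pm=0.1, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-bound, bound, (pop_size, dim))
    for _ in range(gens):
        fit = np.array([f(ind) for ind in pop])
        parents = pop[np.argsort(fit)[: pop_size // 2]]      # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            w = rng.random(dim)
            child = w * a + (1 - w) * b                      # blend crossover
            mut = rng.random(dim) < pm
            child[mut] += rng.normal(0, 0.3, mut.sum())      # Gaussian mutation
            children.append(np.clip(child, -bound, bound))
        pop = np.vstack([parents, children])
    fit = np.array([f(ind) for ind in pop])
    return pop[np.argmin(fit)], fit.min()

best_x, best_val = simple_ga(rastrigin)
```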
EVALUATION THE EFFICIENCY OF ARTIFICIAL BEE COLONY AND THE FIREFLY ALGORITHM ...ijcsa
Meta-heuristic algorithms are now used widely to solve continuous optimization problems. In this paper the Artificial Bee Colony (ABC) algorithm and the Firefly Algorithm (FA) are evaluated. To demonstrate the efficiency of the algorithms and analyze them further, they are tested on continuous optimization problems characterized by large search spaces and closely spaced near-optimal points.
The efficiency of the ABC algorithm and FA in solving continuous optimization problems is thus presented, and the algorithms are compared in terms of accuracy in reaching the optimum, computation time, and the reliability of the solutions found.
EVOLUTIONARY COMPUTING TECHNIQUES FOR SOFTWARE EFFORT ESTIMATIONijcsit
Reliable and accurate estimation of software effort has always been a matter of concern for industry and academia. Numerous estimation models have been proposed by researchers, but no model is suitable for all types of datasets and environments. Since the motive of an estimation model is to minimize the gap between actual and estimated effort, the effort estimation process can be viewed as an optimization problem of tuning the model parameters. In this paper, evolutionary computing techniques, including bee colony optimization, particle swarm optimization and ant colony optimization, have been employed to tune the parameters of the COCOMO model. The performance of these techniques has been analysed using established performance measures. The results obtained have been validated using data from interactive voice response (IVR) projects. The evolutionary techniques have been found to be more accurate than existing estimation models.
The document summarizes research using evolutionary computing techniques like particle swarm optimization, ant colony optimization, and bee colony optimization to improve software effort estimation compared to the COCOMO model. The techniques are applied to tune the parameters of the COCOMO model. The models are validated using a dataset of 48 interactive voice response projects, and the bee colony optimization technique produces the lowest mean magnitude of relative error (MMRE) of 0.11 and root mean squared error (RMSE) of 7.85, outperforming the other techniques and COCOMO model. Evolutionary computing techniques are found to provide more accurate effort estimates than existing estimation models.
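A minimal sketch of the optimisation framing described above, assuming the basic COCOMO form effort = a * KLOC^b and MMRE as the objective; the search bounds, swarm settings, and dataset are placeholders rather than the study's actual values.

```python
import numpy as np

def mmre(params, kloc, actual):
    """Mean magnitude of relative error for COCOMO-style effort = a * size^b."""
    a, b = params
    est = a * kloc ** b
    return np.mean(np.abs(actual - est) / actual)

def pso_tune(kloc, actual, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Tune (a, b) with a plain global-best PSO; bounds are assumed, not the paper's."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array([0.5, 0.5]), np.array([10.0, 2.0])
    x = rng.uniform(lo, hi, (n, 2)); v = np.zeros_like(x)
    pbest = x.copy(); pval = np.array([mmre(p, kloc, actual) for p in x])
    g = pbest[pval.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random((n, 2)), rng.random((n, 2))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([mmre(p, kloc, actual) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()]
    return g, pval.min()
```

In use, `pso_tune` would be called with the project sizes (in KLOC) and actual efforts of the IVR dataset; the other evolutionary techniques only change the search loop, not the objective.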
The optimization of running queries in relational databases using ant colony ...ijdms
The issue of optimizing queries is a cost-sensitive process, and the number of permutations grows exponentially with the number of tables joined in a query. On one hand, compared with the other operators in a relational database, the join operator is the most difficult and complicated to optimize for reduced runtime; accordingly, various algorithms have been proposed to solve this problem. On the other hand, the success of any database management system (DBMS) depends on exploiting the query model. In the current paper, a heuristic ant algorithm is proposed to solve this problem and improve the runtime of the join operation. Experiments and observed results reveal the efficiency of this algorithm compared with similar algorithms.
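To make the idea concrete, here is a hedged sketch of an ant-style constructor for join ordering: each ant builds a permutation of the tables, an externally supplied cost model plays the role of tour length, and pheromone is reinforced on consecutive table pairs of cheap plans. The cost model, parameters, and names are assumptions for illustration, not the paper's design.

```python
import numpy as np

def ant_join_order(cost_of_order, n_tables, n_ants=15, iters=100, rho=0.1, seed=0):
    """cost_of_order: callable mapping a table permutation to an estimated plan cost."""
    rng = np.random.default_rng(seed)
    tau = np.ones((n_tables, n_tables))
    best_order, best_cost = None, float("inf")
    for _ in range(iters):
        orders = []
        for _ in range(n_ants):
            order = [int(rng.integers(n_tables))]
            while len(order) < n_tables:
                i = order[-1]
                mask = np.ones(n_tables, bool); mask[order] = False
                w = tau[i] * mask
                order.append(int(rng.choice(n_tables, p=w / w.sum())))
            orders.append(order)
        tau *= 1.0 - rho                                  # evaporation
        for order in orders:
            c = cost_of_order(order)
            if c < best_cost:
                best_cost, best_order = c, order
            for a, b in zip(order, order[1:]):            # reinforce good join sequences
                tau[a, b] += 1.0 / (1.0 + c)
    return best_order, best_cost
```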
Solving multiple sequence alignment problems by using a swarm intelligent op...IJECEIAES
In this article, multiple sequence alignment is addressed with an improved, swarm-intelligence-based particle swarm optimization (PSO). PSO is a stochastic heuristic for discrete optimization and realistic estimation; it is a nature-inspired technique based on swarm intelligence and swarm movement, and each candidate solution is encoded in the way chromosomes are encoded in a genetic algorithm (GA). The fitness function is designed around the objective function so as to maximize the well-aligned components of the sequences and reduce the poorly aligned ones. A public benchmark data set, BAliBASE, is used to assess the performance of the proposed system and to reveal where PSO must adapt to perform better. The proposed system is compared with several existing approaches, such as DIALIGN, PILEUP8, hidden Markov model training (HMMT), the rubber band technique-genetic algorithm (RBT-GA), and ML-PIMA. In many cases the experimental results of the proposed system compare favourably with these existing approaches.
A hybrid optimization algorithm based on genetic algorithm and ant colony opt...ijaia
In optimization problems, the Genetic Algorithm (GA) and Ant Colony Optimization (ACO) are known as good alternative techniques. GA is designed around the natural evolution process, while ACO is inspired by the foraging behaviour of ant species. This paper presents a hybrid GA-ACO for the Travelling Salesman Problem (TSP), called Genetic Ant Colony Optimization (GACO). In this method, GA observes and preserves the fittest ant in each cycle of every generation, and only unvisited cities are assessed by ACO. Experimental results show that GACO's performance is significantly improved while its time complexity remains roughly equal to that of GA and ACO.
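The abstract does not spell out the GA operators, so the fragment below only illustrates the preservation step it does describe: after each cycle the fittest ant's tour is carried over as an elite individual. Everything else (how tours are constructed, how GA recombines them) is deliberately left abstract, and the names are illustrative.

```python
def gaco_cycle(n_ants, construct_tour, tour_length, elite=None):
    """One GACO-style cycle (illustrative): ACO builds the tours, and the fittest
    ant's tour is preserved across cycles as an elite individual, GA-style."""
    tours = [construct_tour() for _ in range(n_ants)]
    if elite is not None:
        tours.append(elite)                 # re-inject the fittest tour from the last cycle
    best = min(tours, key=tour_length)      # survival of the fittest tour
    return best                             # becomes the elite passed to the next cycle
```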
Algorithms And Optimization Techniques For Solving TSPCarrie Romero
The document discusses three algorithms - simulated annealing, ant colony optimization, and genetic algorithm - for solving the traveling salesman problem (TSP). It analyzes each algorithm's approach, parameters used, and results of experiments on 15 and 50 randomly generated cities. Simulated annealing had average distances of 4.1341 and 20.1316 units for 15 and 50 cities respectively. Ant colony optimization yielded average distances of 3.9102 units for 15 cities, running faster than simulated annealing. Genetic algorithm was tested on 15 cities in Brazil.
The document presents a particle swarm inspired cuckoo search algorithm for real parameter optimization. It combines two algorithms: cuckoo search and particle swarm optimization. In cuckoo search, agents find new solutions using Lévy flights. The proposed algorithm adds the global best solution from particle swarm optimization to enhance exploitation. It balances exploration and exploitation through two new search strategies with random probabilities. The algorithm is tested on benchmark functions and two real-world problems, showing better performance than other algorithms.
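A loose sketch of the combination the summary describes: a cuckoo-search move driven by Lévy flights, with a PSO-style pull toward the global best mixed in with some probability. Mantegna's Lévy approximation and the mixing probability p_gbest are standard or illustrative choices, not the paper's exact update rules.

```python
import numpy as np
from math import gamma

def levy_step(dim, beta=1.5, rng=None):
    """Mantegna's approximation of a Lévy-stable step."""
    rng = rng or np.random.default_rng()
    sigma = (gamma(1 + beta) * np.sin(np.pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

def new_solution(x, gbest, alpha=0.01, p_gbest=0.5, rng=None):
    """Cuckoo move: a Lévy flight, plus (with some probability) a PSO-like pull toward gbest."""
    rng = rng or np.random.default_rng()
    step = alpha * levy_step(x.size, rng=rng)
    if rng.random() < p_gbest:
        step += rng.random() * (gbest - x)      # exploitation around the global best
    return x + step
```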
This document discusses various artificial intelligence techniques for robot path planning, including ant colony optimization. It provides background on particle swarm optimization, genetic algorithms, tabu search, simulated annealing, reactive search optimization, and ant colony algorithms. It then proposes a solution for robotic path planning that uses ant colony optimization. The proposed solution involves defining a source and destination point for the robot, moving it forward one step at a time while checking for obstacles, having it take three steps back if an obstacle is encountered, and applying ant colony optimization algorithms to help the robot find an optimal path to bypass obstacles and reach the destination point.
The behaviour of ACS-TSP algorithm when adapting both pheromone parameters us...IJECEIAES
In this paper, an evolved ant colony system (ACS) is proposed by dynamically adapting the parameters responsible for the decay of the pheromone trails, ξ and ρ, using a fuzzy logic controller (FLC), applied to the travelling salesman problem (TSP). The purpose of the proposed method is to understand the effect of both parameters ξ and ρ on the performance of the ACS in terms of solution quality and convergence speed towards the best solutions, by studying the behaviour of the ACS algorithm during this adaptation. The adaptive ACS is compared with the standard one. Computational results show that the adaptive ACS with dynamic adaptation of the local pheromone parameter ξ is more effective than the standard ACS.
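For context, ξ and ρ are the coefficients in the standard ACS pheromone updates, conventionally written as follows (the usual Dorigo-Gambardella formulation, reproduced for reference rather than quoted from the paper):

```latex
% Local pheromone update (applied as an ant crosses edge (i, j)):
\tau_{ij} \leftarrow (1 - \xi)\,\tau_{ij} + \xi\,\tau_0
% Global pheromone update (applied to edges of the best-so-far tour of length L_{best}):
\tau_{ij} \leftarrow (1 - \rho)\,\tau_{ij} + \rho\,\Delta\tau_{ij},
\qquad \Delta\tau_{ij} = \frac{1}{L_{best}}
```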
Optimized Robot Path Planning Using Parallel Genetic Algorithm Based on Visib...IJERA Editor
An analysis is made of optimized path planning for a mobile robot using a parallel genetic algorithm. The parallel genetic algorithm (PGA) is applied on top of the visible-midpoint approach to find the shortest path for the mobile robot. The hybrid of these two algorithms provides a better optimized solution for a smooth and shortest path. In this problem, the visible-midpoint approach is used to avoid local minima effectively; it yields optimum paths that always consist of collision-free trajectories. The proposed hybrid parallel genetic algorithm converges very fast to the shortest route from source to destination because the population is shared: the total population is partitioned into a number of subgroups to perform the parallel GA, with a master thread acting as the centre of information exchange and performing selection with fitness evaluation. The cell-to-cell crossover makes the algorithm significantly better, and the problem converges quickly within a small number of iterations.
A Survey of Solving Travelling Salesman Problem using Ant Colony OptimizationIRJET Journal
This document summarizes research on solving the travelling salesman problem (TSP) using ant colony optimization (ACO). It first provides background on TSP and describes how ACO mimics real ants finding food to solve optimization problems. The document then reviews several papers that have applied ACO to TSP and compared it to other algorithms. It finds that ACO generally performs better than genetic algorithms at finding optimal solutions to TSP as the number of cities increases. Finally, it proposes studying the effects of different ACO parameters on finding optimal TSP solutions.
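The ACO mechanism these surveyed papers build on is usually summarised by the random-proportional transition rule; the standard textbook form is reproduced below for reference, with the heuristic term taken as the inverse distance, as is conventional for TSP:

```latex
p_{ij}^{k} \;=\; \frac{[\tau_{ij}]^{\alpha}\,[\eta_{ij}]^{\beta}}
                      {\sum_{l \in \mathcal{N}_i^k} [\tau_{il}]^{\alpha}\,[\eta_{il}]^{\beta}},
\qquad \eta_{ij} = \frac{1}{d_{ij}}
```

Here τ is the pheromone trail, η the heuristic desirability, and α, β the weights whose tuning the survey proposes to study.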
The document proposes a hybrid algorithm combining genetic algorithm and cuckoo search optimization to solve job shop scheduling problems. It aims to minimize makespan (completion time of all jobs) by scheduling jobs on machines. The genetic algorithm is used to explore the search space but can get trapped in local optima. Cuckoo search optimization performs local search faster than genetic algorithm and helps avoid local optima. Experimental results on benchmark problems show the hybrid algorithm yields better solutions in terms of makespan and runtime compared to genetic algorithm and ant colony optimization algorithms.
International Journal of Engineering and Science Invention (IJESI) is an international journal intended for professionals and researchers in all fields of computer science and electronics. IJESI publishes research articles and reviews across the whole field of engineering, science and technology, covering new teaching methods, assessment, validation and the impact of new technologies, and it continues to provide information on the latest trends and developments in this ever-expanding subject. Papers are selected through double peer review to ensure originality, relevance, and readability. The articles published in the journal can be accessed online.
A COMPARISON BETWEEN SWARM INTELLIGENCE ALGORITHMS FOR ROUTING PROBLEMSecij
The travelling salesman problem (TSP) is a popular combinatorial routing problem belonging to the class of NP-hard problems. Many approaches have been proposed for TSP. Among them, swarm intelligence (SI) algorithms can effectively reach near-optimal tours of minimum length while attempting to avoid becoming trapped in local minima. How well a given SI algorithm performs depends on the nature of the problem, and to our knowledge no prior article has thoroughly compared the performance of SI algorithms on TSP. In this paper, four common SI algorithms are used to solve TSP in order to compare their performance: the genetic algorithm, particle swarm optimization, ant colony optimization, and the artificial bee colony. For each algorithm, various parameters and operators were tested and the best values selected. Experiments over several benchmarks from TSPLIB show that
the artificial bee colony algorithm is the best of the four SI-based methods for solving routing problems like TSP.
Electrically small antennas: The art of miniaturizationEditor IJARCET
We live in a technological era in which we prefer portable devices to fixed ones; we are freeing ourselves from wires and becoming accustomed to a wireless world. What makes a device portable? The physical (mechanical) dimensions of the device, certainly, but its electrical dimensions are of equal importance. Reducing the physical dimensions of an antenna produces a small antenna, but not necessarily an electrically small one. There are several definitions of an electrically small antenna; the most widely used constrains the product ka, where k is the wave number, equal to 2π/λ, and a is the radius of the imaginary sphere circumscribing the maximum dimension of the antenna. As present-day electronic devices continue to shrink, designers have increasingly concentrated on electrically small antenna (ESA) designs to reduce the size of the antenna within the overall electronic system. Researchers in many fields, including RF and microwave engineering, biomedical technology and national intelligence, can benefit from electrically small antennas as long as the designed ESA meets the system requirements.
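Definitions of "electrically small" vary across the literature, so rather than asserting this abstract's exact threshold, the block below states the general form of the criterion; both ka < 1 (the radiansphere criterion) and the stricter ka ≤ 0.5 appear in common use:

```latex
k = \frac{2\pi}{\lambda}, \qquad
\text{an antenna is electrically small when } ka \text{ is small}
\;\;(\text{commonly } ka < 1 \text{ or } ka \le 0.5),
```

where a is the radius of the smallest sphere circumscribing the antenna.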
This document provides a comparative study of two-way finite automata and Turing machines. Some key points:
- Two-way finite automata are similar to read-only Turing machines in that they have a finite tape that can be read in both directions, but cannot write to the tape.
- Turing machines have an infinite tape that can be read from and written to, allowing them to recognize recursively enumerable languages.
- Both models are examined in their ability to accept the regular language L = {a^n b^m | m, n > 0}.
- The time complexity of a two-way finite automaton for this language is O(n²) because it makes two passes over the input.
This document analyzes and compares the performance of the AODV and DSDV routing protocols in a vehicular ad hoc network (VANET) simulation. Simulations were conducted using NS-2, SUMO, and MOVE simulators for a grid map scenario with varying numbers of nodes. The results show that AODV performed better than DSDV in terms of throughput and packet delivery fraction, while DSDV had lower end-to-end delays. However, neither protocol was found to be fully suitable for the highly dynamic VANET environment. The document concludes that further work is needed to develop improved routing protocols optimized for VANETs.
This document discusses the digital circuit layout problem and approaches to solving it using graph partitioning techniques. It begins by introducing the digital circuit layout problem and how it has become more complex with increasing circuit sizes. It then discusses how the problem can be decomposed into subproblems using graph partitioning to assign geometric coordinates to circuit components. The document reviews several traditional approaches to solve the problem, such as the Kernighan-Lin algorithm, and discusses their limitations for larger circuit sizes. It also discusses more recent approaches using evolutionary algorithms and concludes by analyzing the contributions of various approaches.
This document summarizes various data mining techniques that have been used for intrusion detection systems. It first describes the architecture of a data mining-based IDS, including sensors to collect data, detectors to evaluate the data using detection models, a data warehouse for storage, and a model generator. It then discusses supervised and unsupervised learning approaches that have been applied, including neural networks, support vector machines, K-means clustering, and self-organizing maps. Finally, it reviews several related works applying these techniques and compares their results, finding that combinations of approaches can improve detection rates while reducing false alarms.
This document provides an overview of speech recognition systems and recent progress in the field. It discusses different types of speech recognition including isolated word, connected word, continuous speech, and spontaneous speech. Various techniques used in speech recognition are also summarized, such as simulated evolutionary computation, artificial neural networks, fuzzy logic, Kalman filters, and Hidden Markov Models. The document reviews several papers published between 2004-2012 that studied speech recognition methods including using dynamic spectral subband centroids, Kalman filters, biomimetic computing techniques, noise estimation, and modulation filtering. It concludes that Hidden Markov Models combined with MFCC features provide good recognition results for large vocabulary, speaker-independent, continuous speech recognition.
This document discusses integrating two assembly lines, Line A and Line B, based on lean line design concepts to reduce space and operators. It analyzes the current state of the lines using tools like takt time analysis and MTM/UAS studies. Improvements are identified to eliminate waste, including methods improvements, workplace rearrangement, ergonomic changes, and outsourcing. Paper kaizen is conducted and work elements are retimed. The goal is to integrate the lines to better utilize space and manpower while meeting manufacturing standards.
This document summarizes research on the exposure of microwaves from cellular networks. It describes how microwaves interact with biological systems and discusses measurement techniques and safety standards regarding microwave exposure. While some studies have alleged health hazards from microwaves, independent reviews by health organizations have found no evidence that exposure to microwaves below international safety limits causes harm. The document concludes that with precautions like limiting exposure time and using phones with lower SAR ratings, microwaves from cell phones pose minimal health risks.
This document summarizes a research paper that examines the effect of feature reduction in sentiment analysis of online reviews. It uses principal component analysis to reduce the number of features (product attributes) from a dataset of 500 camera reviews labeled as positive or negative. Two models are developed: one using the original set of 95 product attributes, and one using the reduced set. Support vector machines and naive Bayes classifiers are applied to both models and their performance is evaluated to determine whether classification accuracy can be maintained while using fewer features. The results show it is possible to achieve similar accuracy levels with fewer features, improving computational efficiency.
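A minimal sketch of the pipeline this summary describes, feature reduction with PCA followed by SVM classification, using scikit-learn; the random arrays stand in for the 500-review dataset, and the number of retained components is an illustrative choice rather than the study's.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# X: review-by-attribute matrix (e.g. 500 reviews x 95 attributes), y: positive/negative labels.
# Random placeholders stand in for the real camera-review dataset here.
X, y = np.random.rand(500, 95), np.random.randint(0, 2, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pca = PCA(n_components=20)                         # reduce 95 attributes to 20 components (illustrative)
X_tr_red, X_te_red = pca.fit_transform(X_tr), pca.transform(X_te)

clf = SVC(kernel="linear").fit(X_tr_red, y_tr)     # same split could feed a naive Bayes model instead
print("accuracy on reduced features:", accuracy_score(y_te, clf.predict(X_te_red)))
```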
This document provides a review of multispectral palm image fusion techniques. It begins with an introduction to biometrics and palm print identification. Different palm print images capture different spectral information about the palm. The document then reviews several pixel-level fusion methods for combining multispectral palm images, finding that Curvelet transform performs best at preserving discriminative patterns. It also discusses hardware for capturing multispectral palm images and the process of region of interest extraction and localization. Common fusion methods like wavelet transform and Curvelet transform are also summarized.
This document describes a vehicle theft detection system that uses radio frequency identification (RFID) technology. The system involves embedding an RFID chip in each vehicle that continuously transmits a unique identification signal. When a vehicle is stolen, the owner reports it to the police, who upload the vehicle's information to a central database. Police vehicles are equipped with RFID receivers. If a stolen vehicle passes within range of a receiver, the receiver detects the vehicle's ID signal and displays its details on a tablet. This allows police to quickly identify and recover stolen vehicles. The system aims to make it difficult for thieves to hide a vehicle's identity and allows vehicles to be tracked globally wherever the detection system is implemented.
This document discusses and compares two techniques for image denoising using wavelet transforms: Dual-Tree Complex DWT and Double-Density Dual-Tree Complex DWT. Both techniques decompose an image corrupted by noise using filter banks, apply thresholding to the wavelet coefficients, and reconstruct the image. The Double-Density Dual-Tree Complex DWT yields better denoising results than the Dual-Tree Complex DWT as it produces more directional wavelets and is less sensitive to shifts and noise variance. Experimental results on test images demonstrate that the Double-Density method achieves higher peak signal-to-noise ratios, especially at higher noise levels.
This document compares the k-means and grid density clustering algorithms. It summarizes that grid density clustering determines dense grids based on the densities of neighboring grids, and is able to handle different shaped clusters in multi-density environments. The grid density algorithm does not require distance computation and is not dependent on the number of clusters being known in advance like k-means. The document concludes that grid density clustering is better than k-means clustering as it can handle noise and outliers, find arbitrary shaped clusters, and has lower time complexity.
This document proposes a method for detecting, localizing, and extracting text from videos with complex backgrounds. It involves three main steps:
1. Text detection uses corner metric and Laplacian filtering techniques independently to detect text regions. Corner metric identifies regions with high curvature, while Laplacian filtering highlights intensity discontinuities. The results are combined through multiplication to reduce noise.
2. Text localization then determines the accurate boundaries of detected text strings.
3. Text binarization filters background pixels to extract text pixels for recognition. Thresholding techniques are used to convert localized text regions to binary images.
The method exploits different text properties to detect text using corner metric and Laplacian filtering. Combining the results improves the reliability of text detection in complex backgrounds.
This document describes the design and implementation of a low power 16-bit arithmetic logic unit (ALU) using clock gating techniques. A variable block length carry skip adder is used in the arithmetic unit to reduce power consumption and improve performance. The ALU uses a clock gating circuit to selectively clock only the active arithmetic or logic unit, reducing dynamic power dissipation from unnecessary clock charging/discharging. The ALU was simulated in VHDL and synthesized for a Xilinx Spartan 3E FPGA, achieving a maximum frequency of 65.19MHz at 1.98mW power dissipation, demonstrating improved performance over a conventional ALU design.
This document describes using particle swarm optimization (PSO) and genetic algorithms (GA) to tune the parameters of a proportional-integral-derivative (PID) controller for an automatic voltage regulator (AVR) system. PSO and GA are used to minimize the objective function by adjusting the PID parameters to achieve optimal step response with minimal overshoot, settling time, and rise time. The results show that PSO provides high-quality solutions within a shorter calculation time than other stochastic methods.
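A sketch of how PID tuning can be posed as an objective for PSO or GA: simulate the closed-loop step response for candidate gains and return a time-weighted error (ITAE). The plant below is a simple first-order placeholder, not the actual AVR model, and the resulting cost function can be handed to any PSO, for instance the one sketched earlier for COCOMO tuning.

```python
def step_cost(gains, dt=0.001, t_end=2.0):
    """ITAE-style cost of a unit-step response for a PID loop around a simple
    first-order stand-in plant (dy/dt = -y + u); the real AVR has richer dynamics."""
    kp, ki, kd = gains
    y, i_err, prev_err, cost = 0.0, 0.0, 0.0, 0.0
    for k in range(int(t_end / dt)):
        err = 1.0 - y                                  # unit-step reference minus output
        i_err += err * dt
        d_err = (err - prev_err) / dt if k else 0.0    # skip the derivative kick at t = 0
        u = kp * err + ki * i_err + kd * d_err         # PID control law
        y += dt * (-y + u)                             # Euler step of the placeholder plant
        prev_err = err
        cost += (k * dt) * abs(err) * dt               # ITAE: time-weighted absolute error
    return cost

# e.g. hand step_cost to a PSO or GA, searching kp, ki, kd over modest ranges such as
# [0, 2] x [0, 2] x [0, 0.5], and keep the gains with the lowest cost.
```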
This document discusses implementing trust negotiations in multisession transactions. It proposes a framework that supports voluntary and unexpected interruptions, allowing negotiating parties to complete negotiations despite temporary unavailability of resources. The Trust-x protocol addresses issues related to validity, temporary loss of data, and extended unavailability of one negotiator. It allows a peer to suspend an ongoing negotiation and resume it with another authenticated peer. Negotiation portions and intermediate states can be safely and privately passed among peers to guarantee stability for continued suspended negotiations. An ontology is also proposed to provide formal specification of concepts and relationships, which is essential in complex web service environments for sharing credential information needed to establish trust.
This document discusses and compares various nature-inspired optimization algorithms for resolving the mixed pixel problem in remote sensing imagery, including Biogeography-Based Optimization (BBO), Genetic Algorithm (GA), and Particle Swarm Optimization (PSO). It provides an overview of each algorithm, explaining key concepts like migration and mutation in BBO. The document aims to prove that BBO is the best algorithm for resolving the mixed pixel problem by comparing it to other evolutionary algorithms. It also includes figures illustrating concepts like the species model and habitat in BBO.
This document discusses principal component analysis (PCA) for face recognition. It begins with an introduction to face recognition and PCA. PCA works by calculating eigenvectors from a set of face images, which represent the principal components that account for the most variance in the image data. These eigenvectors are called "eigenfaces" and can be used to reconstruct the face images. The document then discusses how the system is implemented, including preparing a face database, normalizing the training images, calculating the eigenfaces/principal components, projecting the face images into this reduced space, and recognizing faces by calculating distances between projected test images and training images.
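A compressed sketch of the eigenfaces pipeline summarised above: mean-centre the flattened training images, obtain the principal components (eigenfaces) via SVD, project faces into the reduced space, and recognise by nearest projection. Image loading, sizes, and the number of components are placeholders.

```python
import numpy as np

def train_eigenfaces(faces, n_components=20):
    """faces: (n_images, n_pixels) matrix of flattened, equal-sized training images."""
    mean = faces.mean(axis=0)
    centered = faces - mean
    # SVD yields the principal directions (eigenfaces) without forming the covariance matrix.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    eigenfaces = vt[:n_components]
    projections = centered @ eigenfaces.T          # training images in eigenface space
    return mean, eigenfaces, projections

def recognize(face, mean, eigenfaces, projections, labels):
    """Project a test face and return the label of the closest training projection."""
    p = (face - mean) @ eigenfaces.T
    dists = np.linalg.norm(projections - p, axis=1)
    return labels[int(np.argmin(dists))]
```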
This document summarizes research on using wireless sensor networks to detect mobile targets. It discusses two optimization problems: 1) maximizing the exposure of the least exposed path within a sensor budget, and 2) minimizing sensor installation costs while ensuring all paths have exposure above a threshold. It proposes using tabu search heuristics to provide near-optimal solutions. The research also addresses extending the models to consider wireless connectivity, heterogeneous sensors, and intrusion detection using a game theory approach. Experimental results show the proposed mobile replica detection scheme can rapidly detect replicas with no false positives or negatives.