In this paper, a multi-objective NSGA-II algorithm is proposed for the dynamic job-shop scheduling problem (DJSP) with random job arrivals and machine breakdowns. In DJSP, schedule disruptions are usually inevitable due to various unexpected events. To handle this problem, it is necessary to select appropriate key machines at the beginning of the simulation instead of selecting them at random. Thus, this paper applies a social network analysis method to identify the key machines of the addressed DJSP. With the identified key machines, the effectiveness and stability of scheduling, i.e., the makespan and starting-time deviations of this computationally complex NP-hard problem, are optimized with the proposed multi-objective hybrid NSGA-II algorithm. Several experimental studies have been conducted, and comparisons with the classical multi-objective NSGA-II algorithm have been made to demonstrate the efficiency of the proposed approach. The experimental results illustrate that the proposed method is very effective under various shop-floor conditions.
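As a rough illustration of the machinery behind such a multi-objective algorithm, the core of NSGA-II is non-dominated sorting of candidate solutions by their objective vectors, here hypothetical (makespan, starting-time deviation) pairs. This is a minimal sketch of the generic sorting step, not the authors' implementation:

```python
# Sketch of NSGA-II-style non-dominated sorting for two minimization
# objectives, e.g. (makespan, starting-time deviation). Illustrative only.

def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Partition objective vectors into Pareto fronts (front 0 is best)."""
    fronts = []
    remaining = list(range(len(points)))
    while remaining:
        # A point is in the current front if nothing left dominates it.
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

if __name__ == "__main__":
    # Hypothetical (makespan, deviation) values for five candidate schedules
    objs = [(10, 5), (8, 7), (9, 6), (12, 4), (11, 8)]
    print(non_dominated_sort(objs))  # → [[0, 1, 2, 3], [4]]
```

The first front contains the mutually non-dominated trade-offs between the two objectives; NSGA-II additionally uses crowding distance within fronts, which is omitted here.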
A Model for Optimum Scheduling Of Non-Resumable Jobs on Parallel Production M...IOSR Journals
This document presents a mathematical model for optimizing the scheduling of non-resumable jobs on parallel production machines that are subject to multiple fixed periods of unavailability. The problem is formulated as an integer linear programming model to minimize the makespan. Two algorithms are proposed to solve the problem - the first uses the longest processing time rule to generate an initial solution and upper bound, and the second is a greedy algorithm that iteratively achieves an optimal solution by backtracking. The algorithms are implemented in software and validated using numerical examples and industrial case studies, showing makespan reductions of 49% and 40% compared to actual industry scheduling.
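The longest-processing-time (LPT) rule used above to generate an initial solution and upper bound can be sketched as a generic list scheduler on identical parallel machines. The fixed unavailability periods of the paper's model are omitted, so this is only an assumption-based simplification:

```python
import heapq

def lpt_makespan(jobs, m):
    """Assign jobs (processing times) to m identical machines by the
    longest-processing-time rule; return (makespan, assignment)."""
    # Machines tracked as a min-heap of (current load, machine index).
    loads = [(0, i) for i in range(m)]
    heapq.heapify(loads)
    assignment = [[] for _ in range(m)]
    for p in sorted(jobs, reverse=True):   # longest job first
        load, i = heapq.heappop(loads)     # least-loaded machine so far
        assignment[i].append(p)
        heapq.heappush(loads, (load + p, i))
    return max(load for load, _ in loads), assignment

if __name__ == "__main__":
    cmax, plan = lpt_makespan([7, 5, 4, 3, 3, 2], 2)
    print(cmax, plan)  # → 12 [[7, 3, 2], [5, 4, 3]]
```

The resulting makespan is a valid upper bound for any exact method, which matches the role LPT plays in the paper's first algorithm.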
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
Scheduling By Using Fuzzy Logic in ManufacturingIJERA Editor
This paper presents the scheduling process in a furniture manufacturing unit and gives a fuzzy logic application in a flexible manufacturing system (FMS). An FMS is a production system consisting of multipurpose numerically controlled machines. In this project, scheduling in the FMS has been done using the fuzzy logic tool in MATLAB. The fuzzy-logic-based scheduling model in this paper deals with job selection and best alternative route selection under multiple machine criteria, with two criteria used for job sequencing and routing rules. The model is applicable to the scheduling of any manufacturing industry.
An Implementation on Effective Robot Mission under Critical Environemental Co...IJERA Editor
Software engineering is the application of engineering to the design, development, implementation, testing, and maintenance of software in a systematic method. A software engineer, or programmer, writes new software or changes existing software, and compiles it using methods that improve its quality. Nowadays, robotics also plays an important role in automation concepts, but several challenges arise when robots operate in critical environments. Motion planning and task planning are two fundamental problems in robotics that have been addressed from different perspectives. To resolve this, temporal-logic-based approaches that automatically generate controllers have been shown to be useful for mission-level planning of motion, surveillance, and navigation, among others. These approaches critically rely on the validity of the environment models used for synthesis. Yet simplifying assumptions are inevitable to reduce complexity and provide mission-level guarantees; no plan can guarantee results in a model of a world in which everything can go wrong. In this paper, we show how our approach, which reduces reliance on a single model by introducing a stack of models, can endow systems with incremental guarantees based on increasingly strengthened assumptions, supporting graceful degradation when the environment does not behave as expected, and progressive enhancement when it does.
System model.Chapter One(GEOFFREY GORDON)Towfiq218
This document provides an overview of system modeling concepts. It defines what a system is and basic system components like entities, attributes, and activities. It discusses different types of systems like open vs closed systems, stochastic vs deterministic activities, and continuous vs discrete systems. It also covers various types of models like physical, mathematical, static, and dynamic models. Specific examples are provided to illustrate concepts like static and dynamic physical and mathematical models. Principles of modeling like block-building, relevance, accuracy, and aggregation are also covered.
Parallel Line and Machine Job Scheduling Using Genetic AlgorithmIRJET Journal
This document summarizes research on using a genetic algorithm to solve a parallel line and machine job scheduling problem. The objective is to minimize the makespan (completion time) of all jobs. A genetic algorithm is applied that represents possible job schedules as chromosomes. The fitness function evaluates schedules based on makespan. Genetic operators like crossover and mutation create new schedules from existing ones. The algorithm was tested on sample job data using a MATLAB program. Results showed the genetic algorithm approach can find optimal job allocations and schedules that minimize makespan for parallel line scheduling problems.
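The chromosome-based crossover mentioned in the summary can be illustrated with order crossover (OX), a standard operator for permutation-encoded schedules. This is a hypothetical textbook sketch, not the paper's exact operator:

```python
import random

def order_crossover(p1, p2, rng):
    """OX: copy a random slice from parent 1, then fill the remaining
    positions with the missing jobs in the order they appear in parent 2.
    Always yields a valid permutation (each job scheduled exactly once)."""
    n = len(p1)
    a, b = sorted(rng.sample(range(n), 2))
    child = [None] * n
    child[a:b + 1] = p1[a:b + 1]
    fill = [g for g in p2 if g not in child]
    for i in range(n):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

if __name__ == "__main__":
    child = order_crossover([0, 1, 2, 3, 4], [4, 3, 2, 1, 0], random.Random(0))
    print(child)
    assert sorted(child) == [0, 1, 2, 3, 4]  # still a valid job permutation
```

Preserving permutation validity is the point of OX: a naive one-point crossover on job sequences would duplicate some jobs and drop others.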
Efficient dispatching rules based on data mining for the single machine sched...csandit
In manufacturing, the solutions found for scheduling problems and the human expert's experience are very important. They can be transformed using Artificial Intelligence techniques into knowledge, and this knowledge can be used to solve new scheduling problems. In this paper we use Decision Trees for the generation of new Dispatching Rules for a Single Machine shop solved using a Genetic Algorithm. Two heuristics are proposed to use the new Dispatching Rules, and a comparative study with other Dispatching Rules from the literature is presented.
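A dispatching rule of the kind being compared can be illustrated with the classic shortest-processing-time (SPT) rule for a single machine, which minimizes total completion time. This is a generic baseline sketch, not one of the paper's learned rules:

```python
def spt_sequence(jobs):
    """jobs: dict of job name -> processing time. Return the SPT order
    and the resulting total completion time (sum over jobs)."""
    order = sorted(jobs, key=jobs.get)  # dispatch shortest job first
    total, clock = 0, 0
    for j in order:
        clock += jobs[j]   # job j completes at time `clock`
        total += clock
    return order, total

if __name__ == "__main__":
    order, total = spt_sequence({"A": 4, "B": 1, "C": 2})
    print(order, total)  # → ['B', 'C', 'A'] 11
```

Learned rules generalize this idea: instead of a fixed priority key like processing time, a decision tree chooses which job attribute decides the next dispatch.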
Control chart pattern recognition using k mica clustering and neural networksISA Interchange
Automatic recognition of abnormal patterns in control charts has seen increasing demands nowadays in manufacturing processes. This paper presents a novel hybrid intelligent method (HIM) for recognition of the common types of control chart pattern (CCP). The proposed method includes two main modules: a clustering module and a classifier module. In the clustering module, the input data is first clustered by a new technique. This technique is a suitable combination of the modified imperialist competitive algorithm (MICA) and the K-means algorithm. Then the Euclidean distance of each pattern is computed from the determined clusters. The classifier module determines the membership of the patterns using the computed distance. In this module, several neural networks, such as the multilayer perceptron, probabilistic neural networks, and the radial basis function neural networks, are investigated. Using the experimental study, we choose the best classifier in order to recognize the CCPs. Simulation results show that a high recognition accuracy, about 99.65%, is achieved.
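The clustering half of the hybrid method rests on the standard K-means updates that MICA is used to seed. A plain-Python sketch on 1-D data, an assumption-based simplification of the paper's clustering module:

```python
def kmeans_1d(data, centers, iters=20):
    """Plain K-means on 1-D points: assign each point to its nearest
    center, then move each center to the mean of its cluster."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for x in data:
            i = min(range(len(centers)), key=lambda i: abs(x - centers[i]))
            clusters[i].append(x)
        # Empty clusters keep their previous center.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

if __name__ == "__main__":
    centers, clusters = kmeans_1d([1.0, 1.2, 0.8, 9.0, 9.5, 8.5], [0.0, 10.0])
    print(centers)
```

In the paper's pipeline, the Euclidean distance of each pattern to these converged centers becomes the feature vector handed to the neural-network classifier.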
A case study on Machine scheduling and sequencing using Meta heuristicsIJERA Editor
Modern manufacturing systems are constantly increasing in complexity and becoming more agile in nature. In such systems it has become crucial to check the feasibility of machine scheduling and sequencing, because effective scheduling and sequencing can increase productivity through maximum utilization of available resources. However, as the number of machines increases, traditional scheduling methods, e.g. Johnson's rule, become ineffective due to the limitations involved in exhaustive enumeration. For such NP-hard problems, meta-heuristics have become the preferred choice because of their multi-solution and strong neighbourhood search capability within a reasonable time.
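Johnson's rule, cited above as the traditional method that breaks down for larger instances, can be sketched for its home turf, the two-machine flow shop; a generic textbook sketch with hypothetical job data:

```python
def johnsons_rule(jobs):
    """jobs: dict of job name -> (time on M1, time on M2). Return the
    sequence minimizing makespan on a two-machine flow shop."""
    front, back = [], []
    for name, (t1, t2) in sorted(jobs.items(), key=lambda kv: min(kv[1])):
        if t1 <= t2:
            front.append(name)    # shorter M1 time: schedule early
        else:
            back.insert(0, name)  # shorter M2 time: schedule late
    return front + back

def flow_shop_makespan(seq, jobs):
    """Completion time of the last job on M2 for a given sequence."""
    end1 = end2 = 0
    for name in seq:
        t1, t2 = jobs[name]
        end1 += t1
        end2 = max(end1, end2) + t2
    return end2

if __name__ == "__main__":
    jobs = {"A": (3, 6), "B": (5, 2), "C": (1, 2), "D": (6, 6)}
    seq = johnsons_rule(jobs)
    print(seq, flow_shop_makespan(seq, jobs))  # → ['C', 'A', 'D', 'B'] 18
```

The rule is exact only for two machines; beyond that the problem is NP-hard, which is the gap the meta-heuristics in the case study are meant to fill.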
This document describes an experiment using a least squares method to identify the dynamic model of a level control system in a didactic plant. The plant uses a Foundation Fieldbus communication protocol. The experiment applies a PRBS signal to excite the system and records the input and output signals. It then uses a non-recursive least squares estimator to identify the system and approximate its behavior with a second order transfer function. The results showed that the identification technique was able to accurately model the dynamic response of the level control loop.
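The non-recursive (batch) least-squares estimator described above can be sketched for a first-order discrete model y[k+1] = a·y[k] + b·u[k], solving the 2×2 normal equations directly. This is a pure-Python simplification with synthetic data standing in for the plant's PRBS records:

```python
import random

def identify_first_order(u, y):
    """Batch least squares for y[k+1] = a*y[k] + b*u[k].
    Solves the 2x2 normal equations (Phi^T Phi) theta = Phi^T Y by Cramer's
    rule, where each regressor row is [y[k], u[k]]."""
    n = len(u) - 1
    syy = sum(y[k] * y[k] for k in range(n))
    syu = sum(y[k] * u[k] for k in range(n))
    suu = sum(u[k] * u[k] for k in range(n))
    ry = sum(y[k] * y[k + 1] for k in range(n))
    ru = sum(u[k] * y[k + 1] for k in range(n))
    det = syy * suu - syu * syu
    a = (ry * suu - ru * syu) / det
    b = (ru * syy - ry * syu) / det
    return a, b

if __name__ == "__main__":
    # Simulate a known system a=0.9, b=0.5 excited by a +/-1 PRBS-like input.
    rng = random.Random(1)
    u = [rng.choice([-1.0, 1.0]) for _ in range(200)]
    y = [0.0]
    for k in range(199):
        y.append(0.9 * y[k] + 0.5 * u[k])
    print(identify_first_order(u, y))  # recovers approximately (0.9, 0.5)
```

With noise-free data the estimator recovers the true parameters exactly; the didactic-plant experiment adds measurement noise and fits a second-order transfer function instead, but the normal-equation mechanics are the same.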
In some industries, such as foundries, it is not technically feasible to interrupt a processor between jobs. This restriction gives rise to a scheduling problem called no-idle scheduling. This paper deals with scheduling of no-idle open shops to minimize the maximum completion time of jobs, called the makespan. The problem is first mathematically formulated by three different mixed integer linear programming models. Since open shop scheduling problems are NP-hard, only small instances can be solved to optimality using these models. Thus, to solve large instances, two meta-heuristics based on simulated annealing and genetic algorithms are developed. A complete numerical experiment is conducted and the developed models and algorithms are compared. The results show that the genetic algorithm outperforms simulated annealing.
A Simulation Based Approach for Studying the Effect of Buffers on the Perform...inventionjournals
International Journal of Engineering and Science Invention (IJESI) is an international journal intended for professionals and researchers in all fields of computer science and electronics. IJESI publishes research articles and reviews within the whole field Engineering Science and Technology, new teaching methods, assessment, validation and the impact of new technologies and it will continue to provide information on the latest trends and developments in this ever-expanding subject. The publications of papers are selected through double peer reviewed to ensure originality, relevance, and readability. The articles published in our journal can be accessed online.
Fuzzy Retrial Queues with Priority using DSW Algorithmijceronline
In this paper we study a priority queueing model in a fuzzy environment. It optimizes a fuzzy priority queueing model (preemptive priority, non-preemptive priority) in which the arrival rate, service rate, and retrial rate are fuzzy numbers. An approximate method of extension, namely the DSW (Dong, Shah and Wong) algorithm, is used to define membership functions of the performance measures of the priority queueing system. The DSW algorithm is based on the α-cut representation of fuzzy sets in standard interval analysis. A numerical example is also illustrated to check the validity of the model.
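The α-cut interval arithmetic at the heart of the DSW algorithm can be sketched on a simple M/M/1 utilization measure ρ = λ/μ with triangular fuzzy arrival and service rates; the numbers are hypothetical and the paper's priority-queue performance measures are more involved:

```python
def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (l, m, r) at level alpha."""
    l, m, r = tri
    return (l + alpha * (m - l), r - alpha * (r - m))

def dsw_ratio(lam_tri, mu_tri, alphas):
    """DSW-style propagation of rho = lam/mu: at each alpha level, apply
    standard interval division to the two alpha-cut intervals
    (assumes the mu interval excludes zero)."""
    cuts = {}
    for a in alphas:
        (l1, r1), (l2, r2) = alpha_cut(lam_tri, a), alpha_cut(mu_tri, a)
        candidates = [l1 / l2, l1 / r2, r1 / l2, r1 / r2]
        cuts[a] = (min(candidates), max(candidates))
    return cuts

if __name__ == "__main__":
    # Fuzzy arrival rate ~ (2, 3, 4), fuzzy service rate ~ (5, 6, 7)
    cuts = dsw_ratio((2, 3, 4), (5, 6, 7), [0.0, 0.5, 1.0])
    print(cuts)
```

At α = 1 the intervals collapse to the crisp modes (ρ = 3/6 = 0.5), and the widening intervals at lower α trace out the membership function of ρ, which is exactly how DSW builds membership functions for queue performance measures.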
Novel Heuristics for no-Wait Two Stage Multiprocessor Flow Shop with Probable...IRJET Journal
This document presents a study on scheduling algorithms for a two-stage manufacturing flow shop problem with rework and sequence-dependent setup times. It introduces novel heuristic algorithms like discrete particle swarm optimization, adapted imperialist competitive algorithm, and adapted invasive weed optimization. These algorithms are evaluated based on their ability to minimize the makespan (total completion time) for a given set of jobs in the flow shop. The adapted invasive weed optimization approach achieved superior performance compared to the other algorithms in computational experiments on this scheduling problem.
A LITERATURE SURVEY OF COGNITIVE COMPLEXITY METRICS FOR STATECHART DIAGRAMSijseajournal
Statechart diagrams have inherent complexity which keeps increasing every time the diagrams are modified. This complexity poses problems in comprehending statechart diagrams. The study of cognitive complexity has over the years provided valuable information for the design of improved software systems. Researchers have proposed numerous metrics that have been used to measure and therefore control the complexity of software. However, there is inadequate literature related to cognitive complexity metrics that can apply to measure statechart diagrams. In this study, a literature survey of statechart diagrams is conducted to investigate if there are any gaps in the literature. Initially, a description of UML and statechart diagrams is presented, followed by the complexities associated with statechart diagrams and finally an analysis of existing cognitive complexity metrics and metrics related to statechart diagrams. Findings indicate that metrics that employ cognitive weights to measure statechart diagrams are lacking.
Role of Simulation in Deep Drawn Cylindrical PartIJSRD
Simulation is widely used in the forming industry due to its speed and lower cost, and it has proven effective in predicting formability and springback behavior. The purpose of finite element simulation in the sheet metal forming process is to minimize time and cost in the design phase by predicting key outcomes such as the final shape of the part, the possibility of various defects, and the flow of material. Such simulation is most useful and efficient when it is performed in the early stage of design by designers, rather than by analysis specialists after the detailed design is complete. Its accuracy depends on knowledge of material properties, boundary conditions, and processing parameters. In industry today, numerical sheet metal forming simulation is a very important tool for reducing lead time and improving part quality. In this paper, a finite element model for the deep drawing of cylindrical cups is constructed, simulation results are obtained using different simulation parameters, i.e. punch velocity, coefficient of friction, and blank holder force of the FE mesh elements, and these results are compared with experimental work.
This document provides an overview of various operations research (OR) models, including: linear programming, network flow programming, integer programming, nonlinear programming, dynamic programming, stochastic programming, combinatorial optimization, stochastic processes, discrete time Markov chains, continuous time Markov chains, queuing, and simulation. It describes the basic components and applications of each model type at a high level.
Modularizing Arcihtectures Using Dendrograms1victor tang
The document describes a 5-step algorithm for modularizing product architectures using dendrograms. The algorithm groups functions into modules based on their inputs and outputs. It defines a distance metric between modules based on the distances between their inputs and outputs. The distances are used to cluster the modules into a hierarchical dendrogram that can identify common modules across products for platform development. The algorithm is applied as an example to modularize four different products.
THE IMPLICATION OF STATISTICAL ANALYSIS AND FEATURE ENGINEERING FOR MODEL BUI...ijcseit
This document discusses various statistical analysis and feature engineering techniques that can be used for model building in machine learning algorithms. It describes how proper feature extraction through techniques like correlation analysis, principal component analysis, recursive feature elimination, and feature importance can help improve the accuracy of machine learning models. The document provides examples of applying different feature selection methods like univariate selection, recursive feature elimination, and principal component analysis on a diabetes dataset. It also explains the mathematics behind principal component analysis and how feature importance is estimated using an extra trees classifier. Overall, the document emphasizes how statistical analysis and feature engineering are important for effective model building in machine learning.
Burr Type III Software Reliability Growth ModelIOSR Journals
This document presents the Burr Type III software reliability growth model based on non-homogeneous Poisson process (NHPP) using time domain data. The maximum likelihood estimation method is used to estimate unknown parameters in the model from ungrouped failure data. Goodness of fit is also calculated to assess how well the mathematical model fits the data. Parameter estimation is performed on real software failure data sets, and analysis of reliability is presented for the given data sets.
A Survey on Wireless Network SimulatorsjournalBEEI
The Network simulator helps the developer to create and simulate new models on an arbitrary network by specifying both the behavior of the network nodes and the communication channels. It provides a virtual environment for an assortment of desirable features such as modeling a network based on a specific criteria and analyzing its performance under different scenarios. This saves cost and time required for testing the functionality and the execution of network. This paper has surveyed various Wireless Network Simulators and compared them.
This document presents a methodology for evaluating, ranking, and selecting parameters for an electric discharge machining (EDM) system using a multi-attribute decision making (MADM) approach. Attributes related to the EDM system are identified and classified. A three-stage selection procedure is used that includes normalizing data, determining attribute weights, and calculating closeness coefficients to rank alternatives. Three tools are ranked for a D3 workpiece using TOPSIS and two graphical techniques, with copper ranking highest as the best tool. The methodology provides a systematic way to optimize EDM parameters and validate experimental results.
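The closeness coefficients used in the ranking stage can be sketched with a generic TOPSIS implementation (vector normalization, then weighted distances to the ideal and anti-ideal solutions). The data below is hypothetical, not the paper's EDM attribute table:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution.
    matrix[i][j]: score of alternative i on criterion j;
    benefit[j]: True if higher is better on criterion j."""
    m, n = len(matrix), len(matrix[0])
    # Vector normalization, then apply criterion weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(v[i][j] for i in range(m)) if benefit[j]
             else min(v[i][j] for i in range(m)) for j in range(n)]
    worst = [min(v[i][j] for i in range(m)) if benefit[j]
             else max(v[i][j] for i in range(m)) for j in range(n)]
    # Closeness coefficient: distance-to-worst / (distance-to-ideal + distance-to-worst).
    return [math.dist(v[i], worst) / (math.dist(v[i], ideal) + math.dist(v[i], worst))
            for i in range(m)]

if __name__ == "__main__":
    # Two hypothetical criteria: material removal rate (benefit), tool wear (cost)
    scores = topsis([[9, 2], [7, 4], [5, 1]], [0.5, 0.5], [True, False])
    print(scores)  # the highest score identifies the best tool material
```

A higher coefficient means the alternative sits closer to the ideal and farther from the anti-ideal, which is the basis on which copper ranks highest in the case study.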
An efficient simulated annealing algorithm for a No-Wait Two Stage Flexible F...ijait
In this paper, the no-wait two-stage flexible flow shop scheduling problem (FFSSP) is solved using two metaheuristic algorithms. This problem with the minimum-makespan performance measure is NP-hard. The proposed algorithms are Simulated Annealing and a Genetic Algorithm. The results are analyzed in terms of the relative percentage deviation of makespan. The performance of the proposed algorithms is studied and compared with that of the MDA algorithm. For this purpose, a number of problems of different sizes are solved, and the results identify the more effective algorithm. This is followed by an outline of the study, concluding remarks, and suggestions of potential areas for further research.
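The simulated annealing half of such comparisons can be sketched for a permutation flow shop with the makespan objective; this is a generic swap-neighbourhood SA with made-up job data and cooling parameters, not the paper's tuned settings:

```python
import math
import random

def makespan(seq, times):
    """times[j][k]: processing time of job j on machine k (machines in order)."""
    m = len(times[0])
    finish = [0.0] * m
    for j in seq:
        for k in range(m):
            # Job j starts on machine k after the previous job frees machine k
            # and after job j itself finishes on machine k-1.
            start = max(finish[k], finish[k - 1] if k else 0.0)
            finish[k] = start + times[j][k]
    return finish[-1]

def anneal(times, temp=10.0, cooling=0.995, steps=2000, seed=0):
    """Swap-neighbourhood simulated annealing over job permutations."""
    rng = random.Random(seed)
    cur = list(range(len(times)))
    cur_cost = makespan(cur, times)
    best, best_cost = cur[:], cur_cost
    for _ in range(steps):
        i, j = rng.sample(range(len(cur)), 2)
        cand = cur[:]
        cand[i], cand[j] = cand[j], cand[i]   # random swap move
        cost = makespan(cand, times)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if cost <= cur_cost or rng.random() < math.exp((cur_cost - cost) / temp):
            cur, cur_cost = cand, cost
            if cost < best_cost:
                best, best_cost = cand[:], cost
        temp *= cooling
    return best, best_cost

if __name__ == "__main__":
    times = [(3, 6), (5, 2), (1, 2), (6, 6)]   # 4 hypothetical jobs, 2 machines
    seq, cmax = anneal(times)
    print(seq, cmax)
```

The geometric cooling schedule gradually shifts the search from exploration (accepting uphill swaps) to pure descent, which is the behavior the paper benchmarks against its genetic algorithm.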
A WORKSPACE SIMULATION FOR TAL TR-2 ARTICULATED ROBOT IAEME Publication
This paper discusses simulation. Simulation is a means of optimizing system performance: an unobtrusive scientific method of enquiry involving experiments with a model rather than with the portion of reality that the model represents. Simulation is essentially the generation of system performance data, and it is often used to identify the better of two alternatives.
A FAULT DIAGNOSIS METHOD BASED ON SEMI-SUPERVISED FUZZY C-MEANS...IJCI JOURNAL
Machine learning approaches are generally adopted in many fields including data mining, image processing, intelligent fault diagnosis, etc. As a classic unsupervised learning technology, fuzzy C-means cluster analysis plays a vital role in machine-learning-based intelligent fault diagnosis. With the rapid development of science and technology, the monitoring signal data is numerous and keeps growing fast. Only typical fault samples can be obtained and labeled. Thus, how to apply semi-supervised learning technology in fault diagnosis is significant for guaranteeing equipment safety. Accordingly, a novel fault diagnosis method based on semi-supervised fuzzy C-means (SFCM) cluster analysis is proposed. Experimental results on the Iris data set and the steel plates faults data set show that this method is superior to traditional fuzzy C-means clustering analysis.
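The fuzzy C-means update at the heart of SFCM can be sketched in its standard unsupervised form (alternating membership and center updates with fuzzifier m = 2) on 1-D data; the semi-supervised labeling constraints of the paper are omitted:

```python
def fuzzy_c_means_1d(data, centers, iters=50, m=2.0):
    """Standard FCM on 1-D data: alternate membership and center updates."""
    for _ in range(iters):
        # Membership of point x in cluster i: inverse-distance weighting,
        # so every point belongs to every cluster to some degree.
        u = []
        for x in data:
            d = [abs(x - c) or 1e-12 for c in centers]  # avoid divide-by-zero
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(len(centers)))
                      for i in range(len(centers))])
        # Center update: mean weighted by memberships raised to the fuzzifier.
        centers = [sum((u[k][i] ** m) * data[k] for k in range(len(data))) /
                   sum(u[k][i] ** m for k in range(len(data)))
                   for i in range(len(centers))]
    return centers, u

if __name__ == "__main__":
    centers, u = fuzzy_c_means_1d([1.0, 1.1, 0.9, 9.0, 9.1, 8.9], [2.0, 7.0])
    print(sorted(centers))  # centers settle near the two data groups
```

The semi-supervised variant differs by pinning the membership rows of the labeled typical fault samples, which steers the remaining unlabeled signal data toward the known fault classes.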
Matthew Kickenson is applying to the University of Geneva CUI program. He has a high school diploma from Montgomery Blair High School where he studied mathematics, French, sciences, and took courses in Cisco networking and electrical engineering. For work experience, he has been a cook, cashier, and camp counselor. He is proficient in communication, digital skills, and holds CompTIA certifications. He is currently taking an independent online course to earn CompTIA's cloud computing certification.
This document provides an overview of various operations research (OR) models, including: linear programming, network flow programming, integer programming, nonlinear programming, dynamic programming, stochastic programming, combinatorial optimization, stochastic processes, discrete time Markov chains, continuous time Markov chains, queuing, and simulation. It describes the basic components and applications of each model type at a high level.
Modularizing Arcihtectures Using Dendrograms1victor tang
The document describes a 5-step algorithm for modularizing product architectures using dendrograms. The algorithm groups functions into modules based on their inputs and outputs. It defines a distance metric between modules based on the distances between their inputs and outputs. The distances are used to cluster the modules into a hierarchical dendrogram that can identify common modules across products for platform development. The algorithm is applied as an example to modularize four different products.
THE IMPLICATION OF STATISTICAL ANALYSIS AND FEATURE ENGINEERING FOR MODEL BUI...ijcseit
This document discusses various statistical analysis and feature engineering techniques that can be used for model building in machine learning algorithms. It describes how proper feature extraction through techniques like correlation analysis, principal component analysis, recursive feature elimination, and feature importance can help improve the accuracy of machine learning models. The document provides examples of applying different feature selection methods like univariate selection, recursive feature elimination, and principal component analysis on a diabetes dataset. It also explains the mathematics behind principal component analysis and how feature importance is estimated using an extra trees classifier. Overall, the document emphasizes how statistical analysis and feature engineering are important for effective model building in machine learning.
Burr Type III Software Reliability Growth ModelIOSR Journals
This document presents the Burr Type III software reliability growth model based on non-homogeneous Poisson process (NHPP) using time domain data. The maximum likelihood estimation method is used to estimate unknown parameters in the model from ungrouped failure data. Goodness of fit is also calculated to assess how well the mathematical model fits the data. Parameter estimation is performed on real software failure data sets, and analysis of reliability is presented for the given data sets.
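Under the usual NHPP formulation, the mean value function takes the form m(t) = a * F(t), where a is the expected total number of failures and F is, here, the Burr Type III distribution function F(t) = (1 + t^(-c))^(-k). The sketch below assumes that standard form; the parameter values are invented, not the paper's fitted estimates.

```python
def burr3_cdf(t, c, k):
    """Burr Type III distribution function, F(t) = (1 + t^-c)^-k for t > 0."""
    return (1.0 + t ** (-c)) ** (-k) if t > 0 else 0.0

def mean_failures(t, a, c, k):
    """NHPP mean value function m(t) = a * F(t): expected failures by time t."""
    return a * burr3_cdf(t, c, k)
```

The function is increasing in t and saturates at a, matching the intuition that a finite expected number of faults is eventually exposed.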
A Survey on Wireless Network SimulatorsjournalBEEI
The Network simulator helps the developer to create and simulate new models on an arbitrary network by specifying both the behavior of the network nodes and the communication channels. It provides a virtual environment for an assortment of desirable features such as modeling a network based on a specific criteria and analyzing its performance under different scenarios. This saves cost and time required for testing the functionality and the execution of network. This paper has surveyed various Wireless Network Simulators and compared them.
This document presents a methodology for evaluating, ranking, and selecting parameters for an electric discharge machining (EDM) system using a multi-attribute decision making (MADM) approach. Attributes related to the EDM system are identified and classified. A three-stage selection procedure is used that includes normalizing data, determining attribute weights, and calculating closeness coefficients to rank alternatives. Three tools are ranked for a D3 workpiece using TOPSIS and two graphical techniques, with copper ranking highest as the best tool. The methodology provides a systematic way to optimize EDM parameters and validate experimental results.
An efficient simulated annealing algorithm for a No-Wait Two Stage Flexible F...ijait
In this paper, the no-wait two-stage flexible flow shop scheduling problem (FFSSP) is solved using two metaheuristic algorithms. This problem with the minimum-makespan performance measure is NP-hard. The proposed algorithms are simulated annealing and a genetic algorithm. The results are analyzed in terms of the relative percentage deviation of makespan. The performance of the proposed algorithms is studied and compared with that of the MDA algorithm. For this purpose, a number of problems of different sizes are solved, and the results of the studies indicate the more effective algorithm. This is followed by an outline of the study, concluding remarks, and suggestions for further research.
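The relative percentage deviation used in such comparisons is commonly defined as RPD = 100 * (C_alg - C_best) / C_best, where C_best is the best makespan found for the instance by any compared algorithm. A one-line sketch:

```python
def rpd(makespan, best_makespan):
    """Relative percentage deviation of a makespan from the best known value."""
    return 100.0 * (makespan - best_makespan) / best_makespan

# e.g. a heuristic makespan of 110 against a best-known 100 deviates by 10%
```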
A WORKSPACE SIMULATION FOR TAL TR-2 ARTICULATED ROBOT IAEME Publication
This paper discusses simulation. Simulation optimizes system performance: it is an unobtrusive scientific method of enquiry that involves experimenting with a model rather than with the portion of reality that the model represents. Simulation generates system performance data and is often used to identify the better of two alternatives.
A Fault Diagnosis Method Based on Semi-Supervised Fuzzy C-Means...IJCI JOURNAL
Machine learning approaches are generally adopted in many fields including data mining, image processing, intelligent fault diagnosis etc. As a classic unsupervised learning technology, fuzzy C-means cluster analysis plays a vital role in machine learning based intelligent fault diagnosis. With the rapid development of science and technology, the monitoring signal data is numerous and keeps growing fast. Only typical fault samples can be obtained and labeled. Thus, how to apply semi-supervised learning technology in fault diagnosis is significant for guaranteeing the equipment safety. According to this, a novel fault diagnosis method based on semi-supervised fuzzy C-means (SFCM) cluster analysis is proposed. Experimental results on the Iris data set and the steel plates faults data set show that this method is superior to traditional fuzzy C-means cluster analysis.
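A minimal one-dimensional sketch of the semi-supervised fuzzy C-means idea: standard FCM membership and center updates, except that the membership rows of the few labeled samples are clamped to their known classes. The data, the two-cluster setup, and the labeled indices are synthetic illustrations, not the paper's algorithm in full.

```python
def sfcm(xs, labeled, m=2.0, iters=50):
    """Semi-supervised fuzzy C-means on 1-D data with two clusters.

    labeled: {sample_index: cluster_index} for the known (labeled) samples.
    """
    centers = [min(xs), max(xs)]  # two clusters for this sketch
    c = len(centers)
    u = [[1.0 / c] * c for _ in xs]
    for _ in range(iters):
        for i, x in enumerate(xs):
            if i in labeled:
                # clamp labeled samples to their known class
                u[i] = [1.0 if j == labeled[i] else 0.0 for j in range(c)]
                continue
            # standard FCM membership: u_ij = 1 / sum_l (d_j/d_l)^(2/(m-1))
            d = [abs(x - ck) + 1e-12 for ck in centers]
            u[i] = [1.0 / sum((d[j] / d[l]) ** (2.0 / (m - 1.0))
                              for l in range(c)) for j in range(c)]
        # center update: mean of the data weighted by u^m
        for j in range(c):
            w = [u[i][j] ** m for i in range(len(xs))]
            centers[j] = sum(wi * x for wi, x in zip(w, xs)) / sum(w)
    return centers, u

xs = [0.1, 0.2, 0.3, 5.0, 5.1, 5.2]          # two obvious groups
centers, u = sfcm(xs, labeled={0: 0, 3: 1})  # one labeled sample per class
```

With one labeled sample per group, the remaining samples acquire high membership in the cluster of their nearby labeled neighbour.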
Matthew Kickenson is applying to the University of Geneva CUI program. He has a high school diploma from Montgomery Blair High School where he studied mathematics, French, sciences, and took courses in Cisco networking and electrical engineering. For work experience, he has been a cook, cashier, and camp counselor. He is proficient in communication, digital skills, and holds CompTIA certifications. He is currently taking an independent online course to earn CompTIA's cloud computing certification.
This document contains profiles of 14 women listing their names, years of birth, heights, weights, body measurements, and roles as either promotional girls (PG) or models for the company CÔNG TY CỔ PHẦN DỊCH VỤ PG Á ĐÔNG. The profiles provide basic details about each woman such as name, birth year, height, weight, and body measurements. Their roles within the company are also noted as either promotional girls or models.
This document discusses JSR 364, which aims to broaden JCP membership by defining new membership classes and changing existing ones. It is one of four JSRs (JSR 348, 355, 358, 364) that together form the JCP.next initiative to evolve the Java Community Process. The document outlines the new proposed Associate and Partner membership classes for individuals and non-profit groups, respectively, and compares rights of different membership classes. It also provides resources for getting involved in the JCP, such as the Adopt-a-JSR program.
Distance education is an emerging, complementary, and complex field with multiple foundations. It shows great diversity in covering the educational needs of the population through different approaches, such as the separation of teacher and student, independent learning, and technology-mediated communication. It also exhibits shortcomings such as improvisation and lack of coordination.
This work presents the first experimental evaluation of the Floating Content (FC) communication paradigm in a campus/large office setting. By logging information transfer events we have characterized mobility patterns, and we have assessed the performance of services implemented using the FC paradigm. Our results unveil the key relevance of group dynamics in user movements for the FC performance. Surprisingly, in such an environment, our results show that a relatively low user density is enough to guarantee content persistence over time, contrarily to predictions from available models. Based on these experimental findings, we develop a novel simple analytical model that accounts for the peculiarities of the mobility patterns in such a setting, and that can accurately predict the effectiveness of FC for the implementation of services in a campus/large office setting.
A boy has been locked in the attic of a man in disguise. To escape and warn other children about the man, the boy must decipher a coded message that involves counting statues, following a corridor, climbing stairs, and solving a riddle about something that is always open for children but closed on Sundays.
Declan Littrean has over 7 years of experience as the Operations Manager for Papa John's franchise operations in Trinidad and Tobago. The letter provides details of Declan's experience, including project managing the development of 4 new stores and relocation of 2 stores. It also mentions his involvement in employee training and interviews, cash management, understanding financials, and working with quality control systems and key performance indicators. The author of the letter recommends Declan as an excellent candidate for employment and believes he can become a valuable asset to any organization.
Artificial neural networks can achieve high computation rates by employing a massive number of simple processing elements with a high degree of connectivity between them. In this paper an attempt is made to present a constraint satisfaction adaptive neural network (CSANN) to solve the generalized job-shop scheduling problem, and it is shown how to map a difficult constraint-satisfaction job-shop scheduling problem onto a simple neural net, where the number of neural processors equals the number of operations and the number of interconnections grows linearly with the total number of operations. The proposed neural network can be easily constructed and can adjust its connection weights based on the sequence and resource constraints of the job-shop scheduling problem during its processing. Simulation studies have shown that the proposed neural network produces better solutions to the job-shop scheduling problem.
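A highly simplified sketch of the feedback-repair idea behind such a network: whenever a sequence constraint "operation a must finish before operation b starts" is violated, the adjustment pushes the later operation's start time forward until no violation remains. The operations, durations, and precedences below are invented for illustration; the CSANN's actual weight-adjustment dynamics are not reproduced.

```python
def repair(starts, durations, precedences, iters=100):
    """Iteratively remove sequence-constraint violations from start times.

    precedences: list of (a, b) meaning a must complete before b begins.
    """
    for _ in range(iters):
        violated = False
        for a, b in precedences:
            if starts[a] + durations[a] > starts[b]:
                # push the later operation past the end of the earlier one
                starts[b] = starts[a] + durations[a]
                violated = True
        if not violated:
            break
    return starts

starts = repair({"a": 0, "b": 1, "c": 2},          # initial start times
                {"a": 3, "b": 2, "c": 1},          # durations
                [("a", "b"), ("b", "c")])          # precedence chain a->b->c
```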
Planning and Scheduling of a Corrugated Cardboard Manufacturing Process in IoTijtsrd
Automation has revolutionized the traditional product development scheme through advanced design and manufacturing technologies such as computer-aided design, process planning, and scheduling. However, research in this field was still based mostly on experimentation, as most manufacturing companies did not use simulation techniques in implementing their manufacturing planning and scheduling process. To address this problem, software developers have put simulation software tools on the market, such as Enterprise Resource Planning (ERP), Advanced Planning and Scheduling (APS), and Risk-based Planning and Scheduling (RPS) systems. In this paper, a methodology is established for modeling, with a high degree of accuracy for the production floor, the planning and scheduling of a corrugated cardboard manufacturing process through RPS simulation in an Internet of Things (IoT) environment. The RPS model is able to generate a deterministic schedule without randomness, create a risk analysis of the planning and scheduling, and handle the uncertainty. Bruno Kemen | Sarhan M. Musa "Planning and Scheduling of a Corrugated Cardboard Manufacturing Process in IoT" Published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-5 | Issue-1 , December 2020, URL: https://www.ijtsrd.com/papers/ijtsrd37965.pdf Paper URL : https://www.ijtsrd.com/engineering/electrical-engineering/37965/planning-and-scheduling-of-a-corrugated-cardboard-manufacturing-process-in-iot/bruno-kemen
The document presents an experimental design of a constraint satisfaction adaptive neural network (CSANN) for solving generalized job-shop scheduling problems. The CSANN is able to easily map the constraints of a scheduling problem into its architecture and remove any constraint violations during processing through adaptive adjustment of connection weights and biases. Simulation results show that the CSANN produces good solutions, often near-optimal, for various sized job-shop scheduling problems, from small problems with known optimal solutions to larger problems with hundreds of operations where optimal solutions are not known. The CSANN approach provides an effective means for solving complex job-shop scheduling problems in a manner that scales linearly with problem size.
SCHEDULING DIFFERENT CUSTOMER ACTIVITIES WITH SENSING DEVICEijait
Most periodic tasks are assigned to processors using a partition scheduling policy after checking feasibility conditions. A new approach is proposed for scheduling different activities with one periodic task within the system. In this paper, control strategies are identified for allocating different types of tasks (activities) to individual computing elements such as smartphones or microphones. In our simulation model, the aperiodic tasks generated by each periodic task are taken into consideration. Different sets of periodic tasks and aperiodic tasks are scheduled together. This new approach shows that scheduling all the different activities with one periodic task leads to better performance.
PREDICTION OF AVERAGE TOTAL PROJECT DURATION USING ARTIFICIAL NEURAL NETWORKS...IAEME Publication
The prediction of a project's life expectancy is an important issue for entrepreneurs, since it helps them avoid projects overrunning their expiration time. To properly address this issue, a neural-network-based approach, fuzzy logic, and regression methods are used to predict the time needed to complete the targeted project. Before applying the three aforementioned approaches, the modeling and simulation of the activity network are introduced for calculating the total average time of the project. Then the neural network, fuzzy logic, and regression approaches are compared in terms of prediction accuracy. The error generated by the three methods is compared; namely, different types of errors are calculated. Basically, the input variables consist of the probability of success (PS), the coefficient of improvement (Coef_PS) and the coefficient of learning (CofA), while the output variable is the average total duration of the project (DTTm). The predicted mean square error (MSE) values are used to compare the three models. Interestingly, the results show that the optimum prediction model is the fuzzy logic model, with accurate results. It is noteworthy that the application in this paper can be applied to a real case study.
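The comparison criterion quoted above is the mean square error of each model's predictions against the observed project durations; the model with the smallest MSE wins. A minimal sketch with invented numbers:

```python
def mse(predicted, observed):
    """Mean square error between predicted and observed values."""
    return sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed)

# invented durations and model outputs, purely to show the selection step
observed = [10.0, 12.0, 15.0]
errors = {"fuzzy": mse([10.2, 11.8, 15.1], observed),
          "ann": mse([11.0, 13.0, 14.0], observed)}
best_model = min(errors, key=errors.get)  # smallest MSE wins
```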
Bragged Regression Tree Algorithm for Dynamic Distribution and Scheduling of ...Editor IJCATR
In the past few years, grid computing came up as a next-generation computing platform: a combination of heterogeneous computing resources connected by a network across dynamic and geographically separated organizations. It therefore provides the perfect computing environment to solve large-scale computational demands. Demands on grid computing are still increasing day by day due to the rise in the number of complex jobs worldwide, so jobs may take much longer to complete due to poor distribution of batches or groups of jobs to inappropriate CPUs. There is therefore a need to develop an efficient dynamic job scheduling algorithm that assigns jobs to appropriate CPUs dynamically. The main problem dealt with in this paper is how to distribute the jobs when the payload, importance, urgency, flow time, etc. keep changing dynamically as the grid expands or is flooded with job requests from different machines within the grid.
In this paper, we present a scheduling strategy that takes advantage of a decision tree algorithm to make dynamic decisions based on the current scenario and that automatically incorporates factor analysis when considering the distribution of jobs.
Design and Implementation of a Multi-Agent System for the Job Shop Scheduling...CSCJournals
This document proposes an agent-based parallel genetic algorithm for solving the job shop scheduling problem. The approach divides the genetic algorithm population into multiple subpopulations that are evolved independently in parallel on different hosts. Agents are used to create the initial populations in parallel and execute the genetic algorithm operations. The genetic algorithm runs in execution phases where subpopulations evolve independently, and migration phases where the subpopulations exchange solutions. Experimental results showed this parallel approach improves the performance over a non-parallel genetic algorithm.
A case study on Machine scheduling and sequencing using Meta heuristicsIJERA Editor
Modern manufacturing systems are constantly increasing in complexity and becoming more agile in nature. In such systems it has become crucial to check the feasibility of machine scheduling and sequencing, because effective scheduling and sequencing can increase productivity through maximum utilization of available resources. When the number of machines increases, however, traditional scheduling methods (e.g., Johnson's rule) become ineffective. Due to the limitations of exhaustive enumeration for such problems, meta-heuristics have become the preferred choice for solving NP-hard problems because of their multi-solution and strong neighbourhood search capability in a reasonable time.
Job Shop Scheduling Using Mixed Integer ProgrammingIJMERJOURNAL
The document describes four mixed integer programming models for job shop scheduling with different objectives:
1) Minimize makespan without due dates
2) Minimize makespan with due date constraints
3) Minimize total earliness to accomplish just-in-time manufacturing
4) Minimize total lateness to reduce lateness penalties
The models are solved and their computational performance is compared based on parameters like makespan, machine utilization, and time efficiency. The results show that adding due date and earliness/lateness constraints increases makespan and machine idle time, lowering utilization.
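For a tiny instance, the effect of model (1) can be checked by brute force: enumerate every processing order on each machine (the choices a MIP encodes with disjunctive constraints) and keep the smallest feasible makespan. The 2-job, 2-machine instance and its durations below are invented for illustration; a real MIP solver replaces this enumeration.

```python
from itertools import permutations

jobs = {  # job -> ordered route of (machine, duration); invented instance
    "J1": [("M1", 3), ("M2", 2)],
    "J2": [("M2", 4), ("M1", 1)],
}

def makespan(machine_orders):
    """Earliest-start schedule for fixed machine orderings, or inf if cyclic."""
    start = {}
    ops = [(j, i) for j, route in jobs.items() for i in range(len(route))]
    for _ in range(len(ops) + 1):
        changed = False
        for j, i in ops:
            m = jobs[j][i][0]
            ready = 0
            if i > 0:  # routing: previous op of the same job must finish
                ready = start.get((j, i - 1), 0) + jobs[j][i - 1][1]
            k = machine_orders[m].index((j, i))
            if k > 0:  # machine: previous op on the same machine must finish
                qj, qi = machine_orders[m][k - 1]
                ready = max(ready, start.get((qj, qi), 0) + jobs[qj][qi][1])
            if start.get((j, i)) != ready:
                start[(j, i)] = ready
                changed = True
        if not changed:  # start times stabilised: ordering is feasible
            break
    else:
        return float("inf")  # cyclic orderings never stabilise: infeasible
    return max(start[(j, i)] + jobs[j][i][1] for j, i in ops)

on_m1 = [("J1", 0), ("J2", 1)]  # ops routed to machine M1
on_m2 = [("J2", 0), ("J1", 1)]  # ops routed to machine M2
best = min(makespan({"M1": list(p1), "M2": list(p2)})
           for p1 in permutations(on_m1) for p2 in permutations(on_m2))
```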
Nitt paper A fuzzy multi-objective based un-related parallel Machine Schedu...Arudhra N
This paper addresses a multi-objective scheduling problem to minimize makespan, tardiness, load variation, and flow time, subject to secondary resource constraints, for an unrelated parallel machine scheduling problem. A multi-objective evolutionary algorithm called the Fuzzy Non-dominated Sorting Genetic Algorithm (FNSGA-II) is proposed to solve this computationally challenging problem. The performance of FNSGA-II is validated on randomly generated test problems and is found to be reasonably good in terms of quality, computational time, diversity, and spacing metrics. The paper formulates the scheduling problem as a fuzzy mixed-integer non-linear programming model and describes the implementation of FNSGA-II using evolutionary operators such as crossover, mutation, and non-dominated sorting.
Study of model predictive control using ni lab viewIAEME Publication
This document discusses the implementation of model predictive control (MPC) using National Instruments LabVIEW software. It begins with introductions to MPC and LabVIEW. It then covers constructing state space and transfer function models in LabVIEW. Simulation results are presented for MPC applied to first order systems with and without time delay. MPC performance is compared to PID control, showing MPC can handle constraints and optimize process operation while PID cannot. The document concludes MPC simulation using LabVIEW is successful and simulation results are useful for control system design.
Modified heuristic time deviation Technique for job sequencing and Computatio...ijcsit
The document discusses job sequencing and scheduling techniques. It begins by defining job sequencing as the arrangement of tasks to be performed in a specified order. It then discusses several algorithms and methods for solving job sequencing problems, including tabu search, fuzzy TOPSIS, genetic algorithms, Johnson's rule, and mathematical modeling approaches. It also covers classification of jobs, performance metrics, and discusses modified time deviation as a heuristic technique to arrange job sequences to minimize total elapsed time.
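Johnson's rule, mentioned above for the two-machine flow shop, can be sketched directly: jobs whose first-machine time does not exceed their second-machine time go first in increasing order of the first time, and the rest go last in decreasing order of the second time. The processing times below are invented for illustration.

```python
def johnson(times):
    """Johnson's rule for a two-machine flow shop.

    times: {job: (p1, p2)} processing times on machines 1 and 2.
    """
    front = sorted((j for j, (p1, p2) in times.items() if p1 <= p2),
                   key=lambda j: times[j][0])
    back = sorted((j for j, (p1, p2) in times.items() if p1 > p2),
                  key=lambda j: times[j][1], reverse=True)
    return front + back

def flow_shop_makespan(order, times):
    """Makespan of a permutation schedule on two machines."""
    t1 = t2 = 0
    for j in order:
        p1, p2 = times[j]
        t1 += p1                 # machine 1 processes jobs back to back
        t2 = max(t2, t1) + p2    # machine 2 waits for the job and itself
    return t2

times = {"A": (3, 6), "B": (7, 2), "C": (4, 4), "D": (5, 3)}
order = johnson(times)
```

Johnson's rule is provably optimal for this two-machine case, which is why it serves as a baseline before heuristics take over on larger problems.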
This document is a thesis submitted by Vijayananda D Mohire to the Karnataka State Open University in partial fulfillment of the requirements for a Master of Technology degree in Information Technology. It contains two questions and answers on the topics of intelligent control, scheduling, and synaptic dynamics models. The document provides tables of contents and figures to aid navigation and includes signatures from the candidate and internal supervisor.
A GUI-driven prototype for synthesizing self-adaptation decisionjournalBEEI
This document describes a GUI-driven prototype for assessing the synthesis of self-adaptation decisions. The prototype allows users to configure simulation parameters, execute synthesis simulations, and visualize the results. It integrates libraries for model checking, charting, and graph visualization. The prototype is demonstrated on a use case involving application deployment decisions in an autonomic cloud computing environment. The prototype aims to ease experimentation and evaluation of synthesis-driven approaches for self-adaptive systems.
Distributed Feature Selection for Efficient Economic Big Data AnalysisIRJET Journal
The document proposes a new framework for efficiently analyzing large and high-dimensional economic big data. The framework combines methods for economic feature selection and econometric model construction to identify patterns in economic development from vast amounts of economic indicator data. It relies on three key aspects: 1) novel data pre-processing techniques to prepare high-quality economic data, 2) an innovative distributed feature identification solution to locate important economic indicators from multidimensional datasets, and 3) new econometric models to capture patterns of economic development. The framework is demonstrated on economic data collected over 30 years from over 300 towns in Dalian, China.
This document presents a mathematical model for optimizing the scheduling of non-resumable jobs on parallel production machines that are subject to multiple fixed periods of unavailability. The problem is formulated as an integer linear programming model to minimize the makespan. Two algorithms are proposed to solve the problem - the first uses the longest processing time rule to generate an initial solution and upper bound, and the second is a greedy algorithm that iteratively achieves an optimal solution by backtracking. The algorithms are implemented in software and validated using numerical examples and industrial case studies, showing makespan reductions of 49% and 40% compared to actual industry scheduling.
Short Term Load Forecasting Using Bootstrap Aggregating Based Ensemble Artifi...Kashif Mehmood
Short-term load forecasting (STLF), which predicts load from several minutes to a week ahead, plays a vital role in addressing challenges such as optimal generation, economic scheduling, dispatching, and contingency analysis. This paper uses the multi-layer perceptron (MLP) artificial neural network (ANN) technique to perform STLF, but long training time and convergence issues caused by bias, variance, and limited generalization ability prevent this algorithm from accurately predicting future loads. This issue can be resolved by various methods of bootstrap aggregating (bagging), such as disjoint partitions, small bags, replica small bags, and disjoint bags, which help reduce variance and increase the generalization ability of the ANN. Moreover, bagging reduces error in the ANN's learning process. Disjoint partitioning proves to be the most accurate bagging method, and combining the outputs of this method by taking the mean improves the overall performance. This method of combining several predictors, known as an ensemble artificial neural network (EANN), outperforms the plain ANN and the bagging method by further increasing generalization ability and STLF accuracy.
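The disjoint-partition scheme can be sketched with a deliberately simple base learner (a least-squares line fit) standing in for the MLP: each predictor trains on its own disjoint slice of the data, and the ensemble output is the mean of the individual predictions. The toy noiseless load curve is invented for illustration.

```python
def fit_line(points):
    """Ordinary least squares for y = a*x + b over (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

def ensemble_predict(data, k, x):
    """Train one line per disjoint partition; combine predictions by mean."""
    size = len(data) // k
    parts = [data[i * size:(i + 1) * size] for i in range(k)]
    preds = [a * x + b for a, b in (fit_line(p) for p in parts)]
    return sum(preds) / len(preds)

# invented noiseless toy curve y = 2x + 1 standing in for load data
data = [(float(x), 2.0 * x + 1.0) for x in range(8)]
```

On noisy data, the averaging step is what reduces the variance of the individual predictors, which is the point of the bagging methods discussed above.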
IRJET- New Simulation Methodology for Dynamic Simulation Modeling of Construc...IRJET Journal
1) The document proposes a new simulation methodology for modeling construction activities and schedules using Flexsim software to account for risk and resource leveling.
2) It describes developing simulation models of operation theater construction activities to generate risk-based schedules and budget estimates.
3) The methodology is validated by modeling construction of two rooms and comparing results to traditional schedules. It accurately models activity durations and accounts for waiting times between dependent activities.
Similar to A Multi-Objective Based Evolutionary Algorithm and Social Network Analysis Approach for Dynamic Job Shop Scheduling Problems (20)
Applications of artificial Intelligence in Mechanical Engineering.pdfAtif Razi
Historically, mechanical engineering has relied heavily on human expertise and empirical methods to solve complex problems. With the introduction of computer-aided design (CAD) and finite element analysis (FEA), the field took its first steps towards digitization. These tools allowed engineers to simulate and analyze mechanical systems with greater accuracy and efficiency. However, the sheer volume of data generated by modern engineering systems and the increasing complexity of these systems have necessitated more advanced analytical tools, paving the way for AI.
International Journal on Cybernetics & Informatics (IJCI) Vol. 4, No. 2, April 2015
DOI: 10.5121/ijci.2015.4207
A MULTI-OBJECTIVE BASED EVOLUTIONARY
ALGORITHM AND SOCIAL NETWORK ANALYSIS
APPROACH FOR DYNAMIC JOB SHOP SCHEDULING
PROBLEM
V. K. Manupati¹, N. Arudhra², P. Vigneshwar², D. Raja Shekar² and M. Yaswanth²

¹Associate Professor, Department of Mechanical Engineering, KL University, Vijayawada, AP
²Undergraduate Student, Department of Mechanical Engineering, KL University, Vijayawada, AP
ABSTRACT
In this paper, a multi-objective NSGA-II algorithm is proposed for the dynamic job-shop scheduling problem (DJSP) with random job arrivals and machine breakdowns. In the DJSP, deviations from the original schedule are usually inevitable due to various unexpected disruptions. To handle this problem, it is necessary to select appropriate key machines at the beginning of the simulation instead of selecting them at random. Thus, this paper adopts an approach called the social network analysis method to identify the key machines of the addressed DJSP. With the identified key machines, the effectiveness and stability of scheduling, i.e., the makespan and starting time deviations of this computationally complex NP-hard problem, are optimized with the proposed multi-objective hybrid NSGA-II algorithm. Several experimental studies have been conducted, and comparisons with the classical multi-objective NSGA-II algorithm demonstrate the efficiency of the proposed approach. The experimental results illustrate that the proposed method is effective under various shop floor conditions.
KEYWORDS
Social network analysis, NP-hard, NSGA-II algorithm, dynamic job-shop scheduling problem.
1. INTRODUCTION
Modern manufacturing emphasizes short product life cycles and a variety of products at medium production volumes. This can be achieved through current information and communication technologies (ICT), owing to their rapid advancement. The job shop scheduling problem (JSP) attracts considerable attention over other production problems due to its varied applications in the manufacturing industries [1]. The literature establishes that the JSP is among the most difficult combinatorial optimization problems and is well known to be NP-hard [2]. It is necessary to understand the behavior of the manufacturing system under real-time events such as random job arrivals and machine breakdowns. A system with such dynamics is called a dynamic JSP (DJSP), which is essential for handling real-world scheduling problems [3].
In real shop floor conditions, production orders executed after the schedule is released often deviate substantially from the original sequence. Thus, it is necessary to identify key machines at the beginning of the simulation rather than generating them randomly, so that uncertain shop-floor situations such as machine breakdowns and random job arrivals can be accommodated. Regarding key machines, [4] described various types of topologies and how these topologies affect the search space for exploiting the desired solution. In that work, the authors presented descriptive statistics such as the average distance, diameter, and distribution sequence of various topologies and found that this series of statistics directly affects the performance of the topologies. In this paper, we adopt an approach called the social network analysis method (SNAM) to capture the key machines among the available machines. This is achieved in two folds: first, in the DJSP the structure of the network is obtained, in which the interactions between different resources and their influence on the performance of the manufacturing system can be observed, and the flexibility of the self-contained structure of the collaboration networks obtained from the proposed SNAM, together with their descriptive statistics, allows the performance of the system to be analyzed [5]. Second, functional properties such as centrality measures are identified in a much better way [6].
In this article, we conduct social network analysis to identify the key machines and to study how these key machines affect system performance, i.e., the efficiency and effectiveness of the scheduling. As the problem is multi-objective in nature, its objectives must be handled simultaneously. Thus, we propose a multi-objective evolutionary algorithm based on NSGA-II for solving the problem to optimality. The proposed algorithm was found to obtain a lower makespan in less computational time and with smaller starting time deviations. We were thus able to develop and implement an efficient approach for improving the performance of the system.
The remainder of this paper is organized as follows. Section 2 describes a concrete example and its notation. Section 3 develops a mathematical representation of the proposed model and presents the social network analysis based solution approach. Section 4 explains the classical and proposed multi-objective NSGA-II algorithms for solving the problem. The experimentation with different test instances and their results is introduced in Section 5. Section 6 details the results and discussion. The paper concludes with Section 7, which suggests directions for future work.
2. PROBLEM DESCRIPTION
In this section, we consider a dynamic job shop scheduling problem in a stochastic environment with random job arrivals, auxiliary resource constraints, and machine breakdowns. To enhance performance measures of the system such as makespan and starting time deviation, a mathematical model has been developed. Since jobs arrive at the system dynamically over time, their inter-arrival times closely follow an exponential distribution; the machine breakdown and repair times are likewise assumed to be exponentially distributed. Thus, the mean time between failures (MTBF) and the mean time to repair (MTTR) are the two parameters that denote the breakdown level of the shop floor, i.e., the percentage of time the machines are failed. Here, the job arrival rate is considered to follow a Poisson distribution under a certain job release policy. The result is a mixed integer non-linear programming (MINLP) model with multiple objectives. To meet the requirements of real-world production, an evolutionary algorithm combined with social network analysis has been implemented to solve the problem. The above problem makes several assumptions that are worth highlighting.
2.1. Assumptions
1. Each machine can process only one operation of any job at a time.
2. Job’s setup times are sequence dependent.
3. Jobs may have different arrival (ready) times.
4. A machine must not be interrupted while processing a job, except by a machine breakdown.
5. Processing times of jobs are machine dependent.
6. Preemption and machine breakdown are allowed.
7. Processing times and numbers of operable machines are known in advance.
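The stochastic shop-floor events assumed above, Poisson job arrivals and exponentially distributed failure and repair times, can be sketched as a small event generator. This is an illustrative Python sketch, not the simulator used in the paper; the function names are ours, and the MTBF value of 195 is an assumption derived by taking the breakdown level in Table 1 to mean MTTR/(MTBF + MTTR) = 0.025 with MTTR = 5.

```python
import random

def job_arrivals(rate, horizon, rng):
    """Poisson arrival process: exponential inter-arrival times with the given rate."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)   # inter-arrival time ~ Exp(rate)
        if t > horizon:
            return times
        times.append(t)

def breakdown_intervals(mtbf, mttr, horizon, rng):
    """Alternating up/down intervals: run time ~ Exp(1/MTBF), repair ~ Exp(1/MTTR)."""
    t, downs = 0.0, []
    while True:
        t += rng.expovariate(1.0 / mtbf)        # machine runs until the next failure
        if t > horizon:
            return downs
        repair = rng.expovariate(1.0 / mttr)    # repair duration
        downs.append((t, min(t + repair, horizon)))
        t += repair

rng = random.Random(42)                                       # seeded for reproducibility
arrivals = job_arrivals(rate=0.25, horizon=200.0, rng=rng)    # an arrival rate from Table 1
downs = breakdown_intervals(mtbf=195.0, mttr=5.0, horizon=200.0, rng=rng)
```

Varying `rate` over the Table 1 values (0.125 to 1) changes the shop load, while `mtbf`/`mttr` control the breakdown level.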
Table 1. The experimental conditions

Dimension             Characteristic                    Specification
Job shop              Size                              4 to 5 machines
                      Machine breakdown level           0.025
                      MTTR                              5
                      MTBF, MTTR                        Exponential distribution
                      Key machine                       Identified from social network analysis
Jobs                  Random arrival                    Poisson distribution
                      Job arrival rate                  0.125, 0.25, 0.5, 1
                      Number of new jobs                11, 12, 13, 14, 15, 16, 20
                      Job release policy                Immediate
                      Processing time of an operation   U[1, 10]
                      Schedule frequency                4
Performance measures  Makespan; starting time deviations
2.2. Mathematical Model
With the above assumptions and notation, the problem is formulated as the following fuzzy mixed-integer non-linear programming (FMINLP) model.
Objectives:

$$\min Z_1 = \max_{i} C_i \qquad (1)$$

$$\min Z_2 = \frac{1}{N}\sum_{i=1}^{N}\left(ST_i - ST_i^{*}\right)^{2} \qquad (2)$$

Subject to the constraints:

$$\sum_{i=0,\, i\neq j}^{N}\;\sum_{k=1}^{M} X_{ijk} = 1, \quad \forall j \qquad (3)$$

$$\sum_{i=0,\, i\neq j}^{N} X_{ijk} = Y_{jk}, \quad \forall j, k \qquad (4)$$

$$\sum_{j=1,\, j\neq i}^{N} X_{ijk} \le Y_{ik}, \quad \forall i, k \qquad (5)$$

$$c_j + A\,(1 - X_{ijk}) \ge \max\{c_i, s_j\} + s_{jk} + p_{jk}, \quad \forall i, j, k \qquad (6)$$

$$c_j + A\,(1 - z_{ij}) \ge \max\{c_i, s_j\} + \sum_{k=1}^{M} p_{jk}\,Y_{jk}, \quad \forall (i,j)\in S_r,\; i\neq j \qquad (7)$$

$$z_{ij} + z_{ji} = 1, \quad \forall (i,j)\in S_r,\; i\neq j \qquad (8)$$

$$\sum_{k=1}^{M} X_{ijk} \le z_{ij}, \quad \forall (i,j)\in S_r,\; i\neq j \qquad (9)$$

$$C_i \ge 0;\quad T_i \ge 0;\quad X_{ijk},\, Y_{ik},\, z_{ij} \in \{0,1\}, \quad \forall i, j, k \qquad (10)$$
The above objectives, i.e., minimization of the total makespan (Z1) and of the total starting time deviation (Z2), are given by Eqs. (1) and (2), respectively. Eq. (3) assures that each job is assigned to only one position on a machine. Eq. (4) specifies that if job j is allotted to machine k, then it is preceded by another job, including the dummy job 0. Constraint (5) specifies that at most one job can immediately follow the previously allotted job i on machine k. Constraint (6) computes the completion time of job j when it is processed immediately after job i on machine k. Constraints (7) and (8) ensure that if jobs i and j need the same operation, one must finish before the other starts. Constraint (9) gives the relation between X_ijk and z_ij. Finally, Eq. (10) states the non-negativity and integrality constraints.
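For a candidate schedule, the two objectives of Eqs. (1) and (2) reduce to a maximum over completion times and a mean of squared start-time differences, and can be evaluated directly. The helper names and the three-job example below are ours, assumed purely for illustration:

```python
def makespan(completion_times):
    """Z1 of Eq. (1): latest completion time over all jobs."""
    return max(completion_times)

def starting_time_deviation(new_starts, original_starts):
    """Z2 of Eq. (2): mean squared deviation between revised and original start times."""
    n = len(new_starts)
    return sum((s - s0) ** 2 for s, s0 in zip(new_starts, original_starts)) / n

# A hypothetical rescheduled three-job instance:
C = [12, 9, 15]                    # completion times after rescheduling
ST, ST0 = [0, 3, 6], [0, 2, 5]     # revised vs. original start times
```

Here `makespan(C)` gives 15 and `starting_time_deviation(ST, ST0)` gives (0 + 1 + 1)/3 = 2/3, the two values a multi-objective search would trade off.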
3. SOCIAL NETWORK ANALYSIS MODEL
This section describes how manufacturing execution data can be extracted and viewed as a network of nodes. Here, the data have been taken from the literature (Zhou et al. and Shao et al., 2010). Various tests have then been conducted with the SNAM to find different characteristics of the obtained network. In detail, the SNAM is divided into two steps: (a) network modelling, and (b) network analysis, as described in the following sections.
3.1. Network Modelling
A network consists of a set of nodes connected by ties indicating interaction [9]. This section presents how manufacturing system execution data can be represented as networks. The networks are modelled with algorithms in the Ucinet software package, and the obtained results are visualized with the NetDraw software package. The network obtained from the affiliation matrix is called a collaboration network. This collaboration network is more interesting and meaningful than the simple network in terms of its characteristics, size, etc. The above procedure has been repeated for the remaining scenarios to obtain different collaboration networks, which are depicted in Figures 1 and 2.

Figure 1. Random collaborative network
Figure 2. Collaborative network with key machines
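As a sketch of the network-modelling step, two machines can be tied together whenever some job visits both, which is one common way to derive a collaboration network from an affiliation (job x machine) matrix. This is an illustrative construction, not the Ucinet procedure itself; the routings and function name below are invented for the example:

```python
from itertools import combinations

def collaboration_network(routings):
    """Build an undirected machine-collaboration graph:
    two machines are tied if some job visits both of them."""
    adj = {}
    for machines in routings.values():
        for a, b in combinations(sorted(set(machines)), 2):
            adj.setdefault(a, set()).add(b)
            adj.setdefault(b, set()).add(a)
        for m in machines:                 # keep isolated machines as nodes too
            adj.setdefault(m, set())
    return adj

# Hypothetical routings: job -> sequence of machines it visits
routings = {"J1": ["M1", "M2", "M3"], "J2": ["M1", "M3"], "J3": ["M2", "M4"]}
net = collaboration_network(routings)
```

In this toy instance M1 is tied to M2 and M3, while M4 is tied only to M2; the resulting adjacency structure is what the centrality measures of Section 3.2 operate on.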
3.2. Centrality Measures of a Network
Centrality is used to find how central a node is in the network. In this research, we have considered the two most popular centrality measures, degree and betweenness centrality, for the analysis of the network. Degree centrality measures the influence of a node from and to its closest neighbours, with a complexity of O(n) that scales linearly with the number of nodes n in the network. The second measure, betweenness, captures a node's level of control over the information flowing between different nodes in the network [7]. In this paper, we have submitted the input data to Ucinet and obtained the degree and betweenness centrality of each attribute. Table 2 shows the centrality measures for different instances of the particular work systems.
Table 2. Test samples with different centralities

Instance (jobs x machines)   Degree centrality       Betweenness centrality               Identified key machines from social network analysis
11 x 5                       18, 16, 18, 12, 6       8.310, 6.934, 8.310, 2.372, 0.742    Machine 1, Machine 3
12 x 5                       16, 16, 16, 16, 8       7.237, 7.237, 7.237, 7.237, 0.864    Machine 1, Machine 2, Machine 3, Machine 4
13 x 4                       20, 20, 12, 11          15.029, 15.029, 3.096, 4.129         Machine 1, Machine 2
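For illustration, the two centrality measures used above can be computed directly on a small machine network. The sketch below implements Brandes' shortest-path accumulation for unweighted, undirected graphs instead of calling Ucinet; the four-machine graph is invented for the example and is not the paper's data.

```python
from collections import deque

def degree_centrality(adj):
    """Degree centrality: number of direct ties per node."""
    return {v: len(nbrs) for v, nbrs in adj.items()}

def betweenness_centrality(adj):
    """Betweenness via Brandes' shortest-path accumulation (unweighted, undirected)."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        stack, pred = [], {v: [] for v in adj}
        sigma = {v: 0 for v in adj}; sigma[s] = 1     # shortest-path counts
        dist = {v: -1 for v in adj}; dist[s] = 0
        q = deque([s])
        while q:                                       # BFS from source s
            v = q.popleft(); stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1; q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; pred[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                                   # back-propagate pair dependencies
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return {v: c / 2 for v, c in bc.items()}           # undirected: pairs counted twice

# Toy 4-machine network in which M2 bridges all the others
adj = {"M1": {"M2"}, "M2": {"M1", "M3", "M4"}, "M3": {"M2"}, "M4": {"M2"}}
deg = degree_centrality(adj)
bc = betweenness_centrality(adj)
```

In this star-shaped example M2 has degree 3 and betweenness 3 (it lies on the shortest path of all three leaf pairs), so it would be flagged as the key machine, mirroring how the high-centrality machines are picked out in Table 2.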
4. MULTI-OBJECTIVE NSGA-II ALGORITHM
In this research, we apply a multi-objective NSGA-II to generate optimal solutions. Goldberg first suggested the concept of the non-dominated sorting genetic algorithm (NSGA), and Srinivas and Deb were the first to implement it. NSGA differs from the well-known simple genetic algorithm only in the way the selection operator works. Although NSGA has been used to solve a variety of MOPs, its main drawbacks are: (i) the high computational complexity of non-dominated sorting, (ii) the lack of elitism, and (iii) the need to specify a tunable parameter. With the advent of NSGA-II, these drawbacks were resolved. We first apply the classical non-dominated sorting genetic algorithm-II (NSGA-II) [8], which uses Pareto ranking and a crowding distance mechanism for the selection of individuals. Thereafter, the MOEA has been implemented with the key machines identified by SNAM for solving the considered problem. The operators play a crucial role in the algorithm's generation of better quality solutions, so parameters such as the crossover and mutation rates must be carefully tuned. The authors therefore set the operator ranges to 0.65 to 0.95 for crossover and 0.1 to 0.01 for mutation. Maintaining these ranges, with a termination criterion of 100 iterations, we ran the algorithm an average of 10 times per instance. In this manner we improved the system performance over the existing one.
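The two selection mechanisms named above, Pareto (non-dominated) ranking and crowding distance, can be sketched for a two-objective minimisation as follows. This is a generic textbook-style illustration of NSGA-II's sorting step, not the authors' MATLAB code; the objective vectors are invented:

```python
def dominates(a, b):
    """a dominates b (minimisation): no worse in every objective, better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_nondominated_sort(objs):
    """Group solution indices into Pareto fronts (front 0 = non-dominated set)."""
    n = len(objs)
    dominated_by = [[] for _ in range(n)]   # solutions that i dominates
    counts = [0] * n                        # number of solutions dominating i
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(objs[i], objs[j]):
                dominated_by[i].append(j)
            elif dominates(objs[j], objs[i]):
                counts[i] += 1
    fronts = [[i for i in range(n) if counts[i] == 0]]
    while fronts[-1]:                       # peel off successive fronts
        nxt = []
        for i in fronts[-1]:
            for j in dominated_by[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

def crowding_distance(front, objs):
    """Crowding distance within one front; boundary points get infinity."""
    d = {i: 0.0 for i in front}
    for k in range(len(objs[front[0]])):
        order = sorted(front, key=lambda i: objs[i][k])
        d[order[0]] = d[order[-1]] = float("inf")
        span = objs[order[-1]][k] - objs[order[0]][k] or 1.0
        for prev, mid, nxt in zip(order, order[1:], order[2:]):
            d[mid] += (objs[nxt][k] - objs[prev][k]) / span
    return d

# Toy objective vectors, e.g. (makespan, starting time deviation), both minimised
objs = [(1, 5), (2, 4), (3, 3), (4, 4), (5, 5)]
fronts = fast_nondominated_sort(objs)
cd = crowding_distance(fronts[0], objs)
```

Selection then prefers lower front index and, within a front, larger crowding distance, which is how NSGA-II keeps the Pareto front spread out.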
Figure 3. Schematic procedure for processing NSGA-II
5. EXPERIMENTATION
Figure 4 shows the Pareto optimal solutions for the two considered performance measures under the classical NSGA-II and the proposed NSGA-II algorithm. Table 3 reports the comparison results for the different instances. For the 11-job, 5-machine instance, the optimum makespan with the classical NSGA-II algorithm is 28, whereas with the proposed NSGA-II it is 24; similarly, the starting time deviation is 2 for the classical algorithm and 7 for the proposed one. This trend continues, with the results improving for the proposed approach. All the algorithms are coded in MATLAB and the problem is tested on an Intel® Core™2 Duo CPU T7250 @ 2.00 GHz with 1.99 GB of RAM.
Table 3. Comparison of different instances with classical NSGA-II and proposed NSGA-II with consideration of selected key machines

Instance            NSGA-II                                 NSGA-II with key machines
(jobs x machines)   Makespan      Starting time deviation   Makespan      Starting time deviation
                    (schedule     (schedule stability)      (schedule     (schedule stability)
                    efficiency)                             efficiency)
11 x 5              28            2                         24            7
12 x 5              37            1                         32            3
13 x 4              42            2                         36            4
Figure 4. Comparison of classical NSGA-II and proposed NSGA-II with social network analysis
6. CONCLUSIONS AND FUTURE WORK
This paper has elaborated the proposed social network analysis method and the multi-objective NSGA-II evolutionary algorithm for solving the dynamic job shop scheduling problem. To assess the proposed approach and to improve the performance of the manufacturing system, two major performance measures are used: scheduling efficiency, i.e., the makespan, and scheduling stability, i.e., the starting time deviations. Since job shop scheduling is computationally complex and provably NP-hard, it must be handled with an evolutionary algorithm based approach; here, we adopted an NSGA-II algorithm to solve the problem. However, the addressed problem is more complex than the general job shop because uncertainty in job arrivals and machine breakdowns is considered, so the identification of key machines must be taken care of. The proposed social network analysis method helps to identify the key machines dynamically, and the considered performance measures are thereby enhanced. Moreover, the proposed social network based evolutionary algorithm is compared with the classical NSGA-II algorithm. Experimental results show the superiority of the proposed approach. Further work may include hybrid algorithms for more problems, with additional performance measures that affect the system.
REFERENCES

[1] Kataev, Michael Yu, et al. "Enterprise systems in Russia: 1992–2012." Enterprise Information Systems 7.2 (2013): 169-186.
[2] Xia, Weijun, and Zhiming Wu. "An effective hybrid optimization approach for multi-objective flexible job-shop scheduling problems." Computers & Industrial Engineering 48.2 (2005): 409-425.
[3] Ouelhadj, Djamila, and Sanja Petrovic. "A survey of dynamic scheduling in manufacturing systems." Journal of Scheduling 12.4 (2009): 417-431.
[4] Mendes, R., Kennedy, J., and Neves, J. "The fully informed particle swarm: simpler, maybe better." IEEE Transactions on Evolutionary Computation 8.3 (2004): 204-210.
[5] Wasserman, Stanley, and Joseph Galaskiewicz. "Advances in Social Network Analysis: Research in the Social and Behavioral Sciences." Vol. 171. Sage Publications (1994).
[6] Newman, Mark EJ. "Power laws, Pareto distributions and Zipf's law." Contemporary Physics 46.5 (2005): 323-351.
[7] Hildorsson, Fredrik, and Tor Kvernvik. "Method and Arrangement for Supporting Analysis of Social Networks in a Communication Network." U.S. Patent Application 13/498,230.
[8] Srinivas, N., and Deb, K. "Multi-objective optimization using nondominated sorting in genetic algorithms." Evolutionary Computation 2.3 (1995): 221-248.
[9] Clauset, Aaron, Mark EJ Newman, and Cristopher Moore. "Finding community structure in very large networks." Physical Review E 70.6 (2004): 066111.
[10] Shao, Xinyu, et al. "Integration of process planning and scheduling—a modified genetic algorithm-based approach." Computers & Operations Research 36.6 (2009): 2082-2096.