The document discusses a novel history-based weighted voting algorithm for safety-critical systems. It first reviews existing majority and weighted average voting algorithms and their limitations. It then proposes a new algorithm that assigns weights dynamically based on fuzzy logic assessments of module agreement and each module's historical reliability. The algorithm is evaluated experimentally against triple modular redundancy and shown to provide near 100% safety with two error-free modules or better results than existing algorithms with one or multiple errors. It concludes the new approach offers a better compromise between safety and availability for safety-critical applications.
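As a rough illustration of the voting idea, the sketch below weights each module's vote by a reliability score that decays whenever the module disagrees with the voted result. This is not the paper's algorithm: the exponential-smoothing update, the `alpha` parameter, and the data are all invented for illustration.

```python
# Hypothetical sketch of history-based weighted voting: each module's vote is
# weighted by its historical reliability, estimated from past agreement with
# the voted output. The weight-update rule and parameters are illustrative.

def weighted_vote(outputs, weights, tolerance=0.0):
    """Return the candidate output with the largest total weight of agreeing modules."""
    best, best_score = None, -1.0
    for candidate in outputs:
        score = sum(w for o, w in zip(outputs, weights) if abs(o - candidate) <= tolerance)
        if score > best_score:
            best, best_score = candidate, score
    return best

def update_weights(outputs, weights, result, alpha=0.3, tolerance=0.0):
    """Exponentially smooth each module's reliability toward 1 on agreement, 0 on disagreement."""
    return [(1 - alpha) * w + alpha * (1.0 if abs(o - result) <= tolerance else 0.0)
            for o, w in zip(outputs, weights)]

# Three redundant modules; module 2 is faulty in every cycle.
weights = [1.0, 1.0, 1.0]
for cycle in range(10):
    outputs = [42.0, 42.0, 7.0]          # module 2 always disagrees
    result = weighted_vote(outputs, weights)
    weights = update_weights(outputs, weights, result)

print(result, [round(w, 3) for w in weights])
```

After a few cycles the faulty module's weight collapses, so the voter keeps producing the correct value even if a second module later drifts, which is the compromise between safety and availability the abstract refers to.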
The Internet of Things (IoT) deeply affects human life. Systems face both traditional cyber threats and new ones, and there is no complete guard or immunity against the innumerable variants of attack and exploitation. In this paper, an approach based on a polling system is presented for the secure routing of IoT devices. We use a polling system with probabilistic routing, so there is a probability of moving from one queue to another. The probabilistic polling system allows us to prioritize stakeholders' votes.
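A minimal simulation may clarify the probabilistic-routing idea: after serving queue i, the server moves to queue j with probability P[i][j]. The 3-queue transition matrix below is a made-up example, not taken from the paper.

```python
# Minimal sketch of a polling system with probabilistic routing: a single
# server visits queues, and after serving queue i it moves to queue j with
# probability P[i][j]. The routing matrix is an illustrative assumption.
import random

random.seed(0)

P = [[0.1, 0.7, 0.2],   # row i: probabilities of moving from queue i to each queue
     [0.3, 0.1, 0.6],
     [0.5, 0.4, 0.1]]

visits = [0, 0, 0]
queue = 0
for _ in range(10000):
    visits[queue] += 1
    # Probabilistic routing: pick the next queue according to row P[queue].
    queue = random.choices([0, 1, 2], weights=P[queue])[0]

total = sum(visits)
print([round(v / total, 3) for v in visits])
```

Biasing a row of P toward a particular queue is one way such a system could encode the priority of a stakeholder's vote: the more weight a queue receives, the more often it is served.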
An Intrusion Detection Algorithm for AMI (IJCI Journal)
Nowadays, the use of smart metering devices enables energy utilities to manage a wide variety of subscribers, and the management of meter reading, billing, and the disconnection and reconnection of subscribers has become an important issue. The performance of these intelligent systems depends on information transfer over information-technology infrastructure, so data reported from the network must be managed to avoid malicious activities that could affect the quality of service of the system. In this paper, to control the reported data and ensure the veracity of the obtained information, an intrusion detection system based on the support vector machine (SVM) and principal component analysis (PCA) is proposed to recognize and identify intrusions and attacks in the smart grid. The operation of the intrusion detection system is studied for different SVM kernels when SVM and PCA are used simultaneously. To evaluate the algorithm, numerical simulation based on the KDD99 dataset is performed for five different kernels of an intrusion detection system using SVM together with PCA. A comparative analysis of the presented intrusion detection algorithm is also carried out in terms of response time, the rate of increase in network efficiency, the increase in system error, and the difference between using and not using PCA. The results indicate that the correct detection rate and the attack-error detection rate have their best values when PCA is used, and that a radial-type kernel in the SVM algorithm reduces the data-analysis time and enhances the performance of intrusion detection.
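The PCA-then-SVM pipeline described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the data are synthetic two-class samples standing in for KDD99 features, and the kernel list and PCA dimension are assumptions.

```python
# Illustrative sketch: PCA for dimensionality reduction followed by SVM
# classifiers with different kernels, on synthetic data standing in for
# KDD99 traffic features (0 = normal, 1 = attack).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic "normal" vs "attack" traffic: two shifted Gaussian clouds in 20-D.
X = np.vstack([rng.normal(0, 1, (300, 20)), rng.normal(2, 1, (300, 20))])
y = np.array([0] * 300 + [1] * 300)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

scores = {}
for kernel in ["linear", "poly", "rbf", "sigmoid"]:
    model = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel=kernel))
    model.fit(X_tr, y_tr)
    scores[kernel] = model.score(X_te, y_te)

print(scores)
```

On real KDD99 data the symbolic features would first need encoding, and the relative ranking of the kernels is an empirical question, which is exactly what the abstract's comparison addresses.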
Biometric technologies are gaining popularity because they provide more reliable and secure means of authenticating and verifying users. Keystroke dynamics is a kind of behavioral biometrics that uses different methods and techniques to store and analyze a user's own way of typing. This paper presents a user authentication methodology using keystroke dynamics captured through piezo-resistive force sensors. An authentication system was created that checks the total typing time, the typing time between each key pressed, the force of each keystroke, and the average typing force; the system checks the veracity of the user's identity at the moment of registration. A common numeric keypad modified with piezo-resistive sensors, together with a microcontroller, was used as the hardware. The methodology also uses a statistical classifier to evaluate users, a data filter to evaluate samples, and a method for determining each user's individual threshold. The system presented biometric error rates of 7.91% FRR (false rejection rate), 2.32% FAR (false acceptance rate), and 4.72% EER (equal error rate).
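A simple way such a per-user statistical classifier can work is sketched below: enrollment samples yield a mean feature vector and an individual distance threshold, and a login attempt is accepted if its distance to the template falls below that threshold. The feature values (times and forces) and the threshold rule are invented for illustration; the paper's actual classifier and filter are not reproduced.

```python
# Hedged sketch of a per-user threshold classifier for keystroke dynamics.
# Features per sample: [total typing time, mean inter-key time, mean force,
# peak force]. All numbers are made up for illustration.
import math

def mean_vector(samples):
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Enrollment samples collected at registration time.
enrollment = [[820, 95, 2.1, 3.0], [790, 90, 2.3, 3.2], [845, 99, 2.0, 2.9]]
template = mean_vector(enrollment)
# Individual threshold: worst enrollment distance plus a tolerance margin.
threshold = max(distance(s, template) for s in enrollment) * 1.2

def authenticate(sample):
    return distance(sample, template) <= threshold

genuine = authenticate([810, 93, 2.2, 3.1])   # close to the enrolled pattern
impostor = authenticate([400, 40, 5.0, 7.0])  # very different typing rhythm
print(genuine, impostor)
```

Tightening the margin trades FAR against FRR, which is how the individual thresholds mentioned in the abstract tune the error rates per user.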
ARTIFICIAL INTELLIGENCE TECHNIQUES FOR THE MODELING OF A 3G MOBILE PHONE BASE... (ijaia)
The principal objective of this work is to use artificial intelligence techniques to design a predictive model of the performance of a third-generation (3G) mobile phone radio base station, using the analysis of KPIs obtained from a statistical dataset of the daily behaviour of an RBS. Various techniques, such as Decision Trees, Neural Networks, and Random Forest, were used to build these models, allowing faster progress in the deep analysis of large amounts of statistical data and yielding better results. The data describe the behaviour of a 3G radio base station of the Claro operator in Ecuador: KPIs of the station's daily and hourly performance, obtained through the operator's remote monitoring and management tool, Sure call PRS. For this practical case, several models were generated with different artificial intelligence techniques to predict the performance of the 3G radio base station, and after several tests a predictive model that determines the station's performance was created. As a conclusion of this work, it was determined that developing a predictive model based on artificial intelligence techniques is very useful for analysing large amounts of data in order to find or predict complex results more quickly and reliably.
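One of the mentioned techniques, Random Forest, can be sketched on this kind of task as follows. The KPI, its relation to traffic load, and the feature set are entirely synthetic assumptions, since the paper's dataset is proprietary to the operator.

```python
# Illustrative sketch (details assumed, not from the paper): a Random Forest
# regressor trained on synthetic hourly features to predict a performance
# KPI that degrades with traffic load.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
hours = rng.integers(0, 24, 500)
traffic = rng.uniform(0, 100, 500)
# Synthetic KPI: performance drops as traffic load rises, with mild noise.
kpi = 99.0 - 0.05 * traffic + rng.normal(0, 0.2, 500)

X = np.column_stack([hours, traffic])
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, kpi)
pred_low = model.predict([[12, 10.0]])[0]    # light traffic at noon
pred_high = model.predict([[12, 90.0]])[0]   # heavy traffic at noon
print(round(pred_low, 2), round(pred_high, 2))
```

With real daily and hourly KPI exports, the same fit-and-predict loop would let an engineer anticipate performance under forecast load rather than react to it.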
Because it is difficult to diagnose, lung cancer has become the most dangerous cancer seen in human beings, and early diagnosis increases the survival rate. Predicting lung cancer is among the most challenging cancer problems because of the structure of cells in the human body, in which most tissues or cells overlap one another. Nowadays, the use of image processing techniques for disease diagnosis is growing in the medical field, where the time factor plays an important role: detecting cancer in time increases the survival rate of patients. Many radiologists still use MRI only for the assessment of superior sulcus tumors and in cases where invasion of the spinal canal is suspected, yet MRI can detect and stage lung cancer, and the method holds promise for lung malignancies and other diseases.
PREDICTIVE MAINTENANCE AND ENGINEERED PROCESSES IN MECHATRONIC INDUSTRY: AN I... (ijaia)
The paper presents the results of an industrial research project concerning the optimization of a predictive maintenance process, applied to a machine that cuts polyurethane. A company producing cutting machines has been provided with an online control system able to detect the blade status of a machine supplied to a customer producing polyurethane components. A software platform has been developed for real-time monitoring of the blade status and for predicting breakage conditions, adopting a multi-parametric data analysis approach based on the simultaneous use of unsupervised and supervised machine learning algorithms. Specifically, the proposed method adopts a k-Means algorithm to classify two-dimensional risk maps and a Long Short-Term Memory (LSTM) network to predict the alerting levels based on the analysis of the most recent values of some process variables. The analysed algorithms are applied to an experimental dataset.
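The unsupervised stage described above can be sketched as follows: k-Means clusters points on a two-dimensional risk map, and a new sensor reading is classified by its nearest cluster centre. The axes, the two risk regions, and all numbers are illustrative assumptions, not the paper's process variables.

```python
# Hedged sketch of the unsupervised stage: k-Means clustering of points on a
# two-dimensional risk map (hypothetically, vibration amplitude vs. cutting
# force) into low- and high-risk regions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
low_risk = rng.normal([1.0, 1.0], 0.2, (100, 2))    # healthy-blade readings
high_risk = rng.normal([4.0, 4.0], 0.2, (100, 2))   # near-breakage readings
points = np.vstack([low_risk, high_risk])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
# Classify new sensor readings by their nearest cluster centre.
label_risky = km.predict([[3.9, 4.1]])[0]
label_ok = km.predict([[1.1, 0.9]])[0]
print(km.cluster_centers_.round(1), label_risky, label_ok)
```

In the paper's pipeline the LSTM would then take the recent history of such readings and predict the upcoming alerting level, combining the unsupervised map with supervised forecasting.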
For the agriculture sector, detecting and identifying plant diseases at an early stage is extremely important and still very challenging. Machine learning, an application of AI, helps us achieve this purpose effectively. It uses a group of algorithms to analyze and interpret data, learn from it, and make smart decisions based on it. For this project, a dataset containing healthy and diseased plant leaf images is used; image processing is then applied to extract the features of each image. We then model this dataset with different machine learning algorithms such as Random Forest, Support Vector Machine, and Naïve Bayes. The aim is to carry out a comparative study to identify which of these algorithms can predict diseases with the utmost accuracy. We compare factors such as precision, accuracy, error rate, and prediction time across the different machine learning algorithms. From these comparisons, valuable conclusions can be drawn for this project.
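The comparison loop described above can be sketched as follows. The leaf features here are synthetic stand-ins; a real experiment would use colour, texture, and shape features extracted from the images, and would also record precision and error rate per class.

```python
# Hedged sketch of the comparative study: train Random Forest, SVM, and
# Naïve Bayes on the same (here synthetic) leaf-feature data and compare
# accuracy and prediction time.
import time
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (200, 10)), rng.normal(1.5, 1, (200, 10))])
y = np.array([0] * 200 + [1] * 200)   # 0 = healthy leaf, 1 = diseased leaf
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

results = {}
for name, clf in [("RandomForest", RandomForestClassifier(random_state=0)),
                  ("SVM", SVC()),
                  ("NaiveBayes", GaussianNB())]:
    clf.fit(X_tr, y_tr)
    t0 = time.perf_counter()
    acc = clf.score(X_te, y_te)
    results[name] = (round(acc, 3), time.perf_counter() - t0)

print(results)
```

Running all candidates on an identical train/test split is what makes the accuracy and timing numbers directly comparable, which is the point of the study.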
Comparative Study on Machine Learning Algorithms for Network Intrusion Detect... (ijtsrd)
Networks have brought convenience to the world by permitting the versatile transfer of information, but they also expose a high number of vulnerabilities. A Network Intrusion Detection System helps network administrators and systems detect network security violations in their organizations. Identifying unknown and new attacks is one of the leading challenges in intrusion detection system research. Deep learning, a subfield of machine learning, is concerned with algorithms based on the structure and function of the brain, known as artificial neural networks. Improvements in such learning algorithms would increase the capability of an IDS and its detection rate for unknown attacks. Accordingly, we suggest a deep learning approach to implement an enhanced and efficient IDS. Priya N | Ishita Popli, "Comparative Study on Machine Learning Algorithms for Network Intrusion Detection System", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-5, Issue-1, December 2020. URL: https://www.ijtsrd.com/papers/ijtsrd38175.pdf Paper URL: https://www.ijtsrd.com/computer-science/computer-network/38175/comparative-study-on-machine-learning-algorithms-for-network-intrusion-detection-system/priya-n
Multimodal authentication is one of the prime concepts in current real-world applications, and various approaches have been proposed for it. In this paper, an intuitive strategy is proposed as a framework for providing a more secure key in biometric security. First, features are extracted from the chosen biometric patterns through PCA computed via SVD; key components are then extracted using the LU factorization technique and selected at different key sizes, and the selected key components are combined using a convolution kernel method (the Exponential Kronecker Product, eKP) as a Context-Sensitive Exponent Associative Memory model (CSEAM). The verification process is carried out in the same way and is checked using the MSE measure. This model gives a better outcome than SVD factorization [1] used as feature selection. The process is computed for different key sizes and the results are presented.
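The overall key pipeline can be sketched as follows. This is a loose reconstruction under stated assumptions: the matrix sizes are invented, a plain Kronecker product stands in for the paper's eKP/CSEAM combination, and a hand-rolled Doolittle routine is used for the LU step.

```python
# Hedged sketch: SVD-based feature extraction, LU factorization to derive key
# components, combination via a Kronecker product, and MSE-based verification.
import numpy as np

def lu_doolittle(A):
    """Doolittle LU factorization without pivoting (assumes nonzero pivots)."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for i in range(n):
        for j in range(i + 1, n):
            L[j, i] = U[j, i] / U[i, i]
            U[j] -= L[j, i] * U[i]
    return L, U

def derive_key(pattern, k=3):
    """Top-k SVD features, then the L factor of their Gram matrix as key material."""
    U, s, Vt = np.linalg.svd(pattern)
    features = U[:, :k] * s[:k]
    L, _ = lu_doolittle(features.T @ features)   # Gram matrix is positive definite
    return L

rng = np.random.default_rng(0)
pattern_a, pattern_b = rng.random((8, 8)), rng.random((8, 8))

# Combine the two modality keys with a Kronecker product (stand-in for eKP).
key = np.kron(derive_key(pattern_a), derive_key(pattern_b))

# Verification: the same patterns must reproduce the key exactly (MSE = 0).
key_again = np.kron(derive_key(pattern_a), derive_key(pattern_b))
mse = np.mean((key - key_again) ** 2)
print(key.shape, mse)
```

Changing `k` changes the key size, which mirrors the abstract's evaluation of the process at different key sizes.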
Optimization of network traffic anomaly detection using machine learning (IJECEIAES)
In this paper, to optimize the process of detecting cyber-attacks, we propose two main optimization solutions: optimizing the detection method and optimizing the features. Both solutions aim to increase accuracy and reduce the time required for analysis and detection. For the detection method, we recommend the Random Forest supervised classification algorithm; the experimental results in section 4.1 show that using Random Forest for abnormal-behavior detection is well founded, because its results are much better than those of several other detection algorithms on all measures. For feature optimization, we propose dimensionality-reduction techniques such as information gain, principal component analysis, and the correlation coefficient method. The results of the research proposed in our paper demonstrate that optimizing the cyber-attack detection process does not necessarily require advanced algorithms with complex and cumbersome computational requirements; rather, it depends on the monitoring data, on selecting a reasonable feature extraction and optimization algorithm, and on choosing appropriate attack classification and detection algorithms.
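The two optimizations can be sketched together: rank features by absolute correlation with the label (one of the reduction methods named above), keep only the top-ranked ones, then classify with Random Forest. The data, the number of kept features, and the split are synthetic assumptions.

```python
# Hedged sketch: correlation-coefficient feature selection followed by
# Random Forest classification, on synthetic monitoring data where only the
# first three features carry signal.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 600)                                   # 0 = normal, 1 = attack
informative = y[:, None] * 2.0 + rng.normal(0, 1, (600, 3))   # 3 useful features
noise = rng.normal(0, 1, (600, 7))                            # 7 pure-noise features
X = np.hstack([informative, noise])

# Correlation-coefficient selection: keep the 3 most label-correlated features.
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
top = np.argsort(corr)[::-1][:3]

X_tr, X_te, y_tr, y_te = train_test_split(X[:, top], y, test_size=0.3, random_state=0)
acc = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr).score(X_te, y_te)
print(sorted(top.tolist()), round(acc, 3))
```

Discarding the noise columns shrinks both training and prediction time while keeping accuracy, which is the paper's argument that careful feature optimization can matter more than a heavier algorithm.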
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
GROUP FUZZY TOPSIS METHODOLOGY IN COMPUTER SECURITY SOFTWARE SELECTION (ijfls)
In today's interconnected world, the risk of malware is a major concern for users. Antivirus software is a tool to prevent, discover, and eliminate malware such as computer worms, Trojan horses, computer viruses, spyware, and adware. In the competitive IT environment, the availability of many antivirus packages with diverse features makes evaluating them an arguable and complicated issue for users, one that has a significant impact on the efficiency of computer defense systems. The antivirus selection problem can be formulated as a multiple-criteria decision-making problem. This paper proposes an antivirus evaluation model for computer users based on group fuzzy TOPSIS. We study a real-world case of antivirus software and define criteria for the antivirus selection problem. Seven alternatives were selected from among the most popular antivirus products on the market, and seven criteria were determined by the experts. The study is followed by a sensitivity analysis of the results, which also gives valuable insights into the needs of, and solutions for, different users in different conditions.
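The TOPSIS core of such a model can be sketched in its crisp form (the paper's group fuzzy variant replaces the scores below with aggregated triangular fuzzy numbers). The decision matrix, the three benefit criteria, and the weights are invented examples, not the paper's seven alternatives and criteria.

```python
# Hedged sketch of crisp TOPSIS: rank alternatives by relative closeness to
# the ideal solution. Rows: antivirus alternatives; columns: benefit criteria
# (higher is better). All numbers are illustrative.
import numpy as np

D = np.array([[7.0, 9.0, 6.0],
              [8.0, 6.0, 7.0],
              [9.0, 8.0, 9.0]])
w = np.array([0.5, 0.3, 0.2])            # criteria weights

R = D / np.sqrt((D ** 2).sum(axis=0))    # vector-normalize each criterion
V = R * w                                # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)

d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))   # distance to ideal solution
d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))   # distance to anti-ideal
closeness = d_minus / (d_plus + d_minus)
print(closeness.round(3), int(closeness.argmax()))
```

The alternative with the highest closeness coefficient wins; the sensitivity analysis in the paper amounts to re-running this ranking as the weights `w` are perturbed.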
Pattern recognition using context dependent memory model (cdmm) in multimodal... (ijfcstjournal)
Pattern recognition is one of the prime concepts in current technologies in both the private and public sectors. The analysis and recognition of two or more patterns is a complex task due to several factors: considering two or more patterns requires a huge amount of storage space as well as computation. Vector logic provides a very good strategy for the recognition of patterns. This paper proposes pattern recognition in a multimodal authentication system using vector logic, which makes the computational model hard to break while keeping the error rate low. Using PCA, two to three biometric patterns are fused, and keys of various sizes are then extracted using the LU factorization approach. The selected keys are combined using vector logic, which introduces a memory model called the Context Dependent Memory Model (CDMM) as the computational model in the multimodal authentication system, giving very accurate and effective outcomes for both authentication and verification. In the verification step, Mean Square Error (MSE) and Normalized Correlation (NC) are used as metrics to minimize the error rate of the proposed model, and the performance analysis is presented.
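The two verification metrics named in the abstract have standard definitions, sketched below on synthetic patterns (the stored pattern, the noisy genuine re-capture, and the impostor pattern are all made up; the CDMM itself is not reproduced).

```python
# Minimal sketch of the verification metrics: Mean Square Error (MSE) and
# Normalized Correlation (NC) between a stored pattern and a probe.
import numpy as np

def mse(a, b):
    return float(np.mean((a - b) ** 2))

def normalized_correlation(a, b):
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
stored = rng.random((16, 16))
genuine = stored + rng.normal(0, 0.01, (16, 16))   # slightly noisy re-capture
impostor = rng.random((16, 16))

print(round(mse(stored, genuine), 5),
      round(normalized_correlation(stored, genuine), 3),
      round(normalized_correlation(stored, impostor), 3))
```

A genuine probe yields near-zero MSE and NC close to 1, while an unrelated probe scores markedly worse on both, which is what lets thresholds on these metrics drive the accept/reject decision.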
Fuzzy Analytic Hierarchy Based DBMS Selection In Turkish National Identity Ca... (Ferhat Ozgur Catak)
Database Management Systems (DBMS) play an important role in supporting enterprise application development. Selecting the right DBMS is a crucial decision in the software engineering process; it requires optimizing a number of criteria, and evaluating and selecting a DBMS among several candidates tends to be very complex, involving both quantitative and qualitative issues. The wrong choice of DBMS will have a negative effect on the development of an enterprise application: it can turn out to be costly and adversely affect business processes. The following study focuses on the evaluation of a multi-criteria decision problem using fuzzy logic. We demonstrate the methodological considerations regarding group decision making and fuzziness based on the DBMS selection problem, and we develop a new Fuzzy AHP-based decision model, formulated and proposed to select a DBMS easily. In this decision model, the main criteria and their sub-criteria are first determined for the evaluation; these criteria are then weighted by pair-wise comparison, and the DBMS alternatives are finally evaluated by assigning a rating scale.
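The pair-wise comparison weighting step can be sketched in its crisp form (the paper's fuzzy AHP replaces the crisp Saaty scores below with triangular fuzzy numbers). The comparison matrix and the three example criteria are invented.

```python
# Hedged sketch of crisp AHP weighting via the geometric-mean method, with a
# consistency check. The Saaty-scale comparison matrix is illustrative
# (hypothetical criteria: cost, performance, vendor support).
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# Geometric-mean weights, normalized to sum to 1.
g = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = g / g.sum()

# Consistency ratio: lambda_max from A @ w, CI = (lambda_max - n)/(n - 1),
# divided by the random index RI (0.58 for n = 3); CR < 0.1 is acceptable.
lam = float(np.mean((A @ weights) / weights))
ci = (lam - 3) / 2
cr = ci / 0.58
print(weights.round(3), round(cr, 3))
```

In the full model these weights are computed for the main criteria and again for each group of sub-criteria, and the DBMS alternatives are then scored against the weighted hierarchy.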
A decision tree (DT) is termed a good DT when it is small and classifies new data accurately. Pre-processing the input data is one good approach to generating a good DT: when different data pre-processing methods are combined with a DT classifier, they can yield high performance. This paper examines the variation in the accuracy of the ID3 classifier when it is combined with different data pre-processing and feature-selection methods. The performance of the resulting DTs is obtained by comparing original and pre-processed input data, and experimental results are shown using the standard decision tree algorithm ID3 on a dataset.
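At the core of ID3 is the information-gain criterion used to pick split attributes, sketched below on a toy dataset (the weather-style attributes and labels are illustrative, not from the paper's experiments).

```python
# Minimal sketch of ID3's core computation: entropy and information gain,
# the quantities ID3 maximizes when choosing a split attribute.
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    gain = entropy(labels)
    for value in set(row[attr] for row in rows):
        subset = [l for row, l in zip(rows, labels) if row[attr] == value]
        gain -= len(subset) / len(labels) * entropy(subset)
    return gain

rows = [{"outlook": "sunny", "windy": False}, {"outlook": "sunny", "windy": True},
        {"outlook": "rain", "windy": False}, {"outlook": "rain", "windy": True},
        {"outlook": "overcast", "windy": False}]
labels = ["no", "no", "yes", "no", "yes"]

gains = {a: round(information_gain(rows, labels, a), 3) for a in ["outlook", "windy"]}
print(gains)   # ID3 splits on the attribute with the highest gain
```

Pre-processing changes these gain values (by discretizing, cleaning, or removing attributes), which is why it can shift both the size and the accuracy of the tree ID3 builds.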
International Journal of Computational Engineering Research (IJCER) (ijceronline)
International Journal of Computational Engineering Research (IJCER) is an international online journal published monthly in English. The journal publishes original research work that contributes significantly to furthering scientific knowledge in engineering and technology.
PREDICTIVE MAINTENANCE AND ENGINEERED PROCESSES IN MECHATRONIC INDUSTRY: AN I...ijaia
The paper proposes the results of a research industry project concerning predictive maintenance process
optimization, applied to a machine cutting polyurethane. A company producing cutting machines, has been
provided with an online control system able to detect blade status of a machine supplied to a customer
producing polyurethane components. A software platform has been developed for the real time monitoring
of the blade status and for the prediction of the break up conditions adopting a multi-parametric data
analysis approach, based on the simultaneous use of unsupervised and supervised machine learning
algorithms. Specifically, the proposed method adopts a k-Means algorithm to classify bidimensional risk
maps, and a Long Short Term Memory (LSTM) one to predict the alerting levels based on the analysis of
the last values for some process variables. The analysed algorithms are applied to an experimental dataset.
For the agriculture sector, detecting and identifying plant diseases at an early stage is extremely important and
still very challenging. Machine learning is an application of AI that helps us achieve this purpose effectively. It
uses a group of algorithms to analyze and interpret data, learn from it, and using it, smart decisions can be
made. For accomplishing this project, a dataset that contains a set of healthy & diseased plant leaf images are
used then using image processing we extract the features of the image. Then we model this dataset with
different machine learning algorithms like Random Forest, Support Vector Machine, Naïve Bayes etc. The aim is
to hold out a comparative study to spot which of those algorithm can predict diseases with the at most
accuracy. We compare factors like precision, accuracy, error rates as well as prediction time of different
machine learning algorithms. After all these comparison, valuable conclusions can be made for this project.
Comparative Study on Machine Learning Algorithms for Network Intrusion Detect...ijtsrd
Network has brought convenience to the earth by permitting versatile transformation of information, however it conjointly exposes a high range of vulnerabilities. A Network Intrusion Detection System helps network directors and system to view network security violation in their organizations. Characteristic unknown and new attacks are one of the leading challenges in Intrusion Detection System researches. Deep learning that a subfield of machine learning cares with algorithms that are supported the structure and performance of brain known as artificial neural networks. The improvement in such learning algorithms would increase the probability of IDS and the detection rate of unknown attacks. Throughout, we have a tendency to suggest a deep learning approach to implement increased IDS and associate degree economical. Priya N | Ishita Popli "Comparative Study on Machine Learning Algorithms for Network Intrusion Detection System" Published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-5 | Issue-1 , December 2020, URL: https://www.ijtsrd.com/papers/ijtsrd38175.pdf Paper URL : https://www.ijtsrd.com/computer-science/computer-network/38175/comparative-study-on-machine-learning-algorithms-for-network-intrusion-detection-system/priya-n
Multimodal authentication is one of the prime concepts in current applications of real scenario. Various
approaches have been proposed in this aspect. In this paper, an intuitive strategy is proposed as a
framework for providing more secure key in biometric security aspect. Initially the features will be
extracted through PCA by SVD from the chosen biometric patterns, then using LU factorization technique
key components will be extracted, then selected with different key sizes and then combined the selected key
components using convolution kernel method (Exponential Kronecker Product - eKP) as Context-Sensitive
Exponent Associative Memory model (CSEAM). In the similar way, the verification process will be done
and then verified with the measure MSE. This model would give better outcome when compared with SVD
factorization[1] as feature selection. The process will be computed for different key sizes and the results
will be presented.
Optimization of network traffic anomaly detection using machine learning IJECEIAES
In this paper, to optimize the process of detecting cyber-attacks, we choose to propose 2 main optimization solutions: Optimizing the detection method and optimizing features. Both of these two optimization solutions are to ensure the aim is to increase accuracy and reduce the time for analysis and detection. Accordingly, for the detection method, we recommend using the Random Forest supervised classification algorithm. The experimental results in section 4.1 have proven that our proposal that use the Random Forest algorithm for abnormal behavior detection is completely correct because the results of this algorithm are much better than some other detection algorithms on all measures. For the feature optimization solution, we propose to use some data dimensional reduction techniques such as information gain, principal component analysis, and correlation coefficient method. The results of the research proposed in our paper have proven that to optimize the cyberattack detection process, it is not necessary to use advanced algorithms with complex and cumbersome computational requirements, it must depend on the monitoring data for selecting the reasonable feature extraction and optimization algorithm as well as the appropriate attack classification and detection algorithms.
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
GROUP FUZZY TOPSIS METHODOLOGY IN COMPUTER SECURITY SOFTWARE SELECTIONijfls
In today's interconnected world, the risk of malware is a major concern for users. Antivirus software is a tool to prevent, discover, and eliminate malware such as computer worms, trojan horses, computer viruses, spyware and adware. In the competitive IT environment, the availability of many antivirus products with diverse features makes evaluating them a debatable and complicated issue for users, one that has a significant impact on the effectiveness of computer defense systems. The antivirus selection problem can be formulated as a multiple criteria decision making problem. This paper proposes an antivirus evaluation model for computer users based on group fuzzy TOPSIS. We study a real-world case of antivirus software and define criteria for the antivirus selection problem. Seven alternatives were selected from among the most popular antiviruses on the market, and seven criteria were determined by the experts. The study is followed by sensitivity analyses of the results, which also give valuable insights into the needs and solutions for different users in different conditions.
Pattern recognition using context dependent memory model (cdmm) in multimodal...ijfcstjournal
Pattern recognition is one of the prime concepts in current technologies in both the private and public sectors. The analysis and recognition of two or more patterns is a complex task due to several factors: considering two or more patterns requires large storage space as well as substantial computation. Vector logic provides a very good strategy for pattern recognition. This paper proposes pattern recognition in a multimodal authentication system using vector logic, making the computational model hard to break with a low error rate. Using PCA, two to three biometric patterns are fused, and keys of various sizes are then extracted using an LU factorization approach. The selected keys are combined using vector logic, which introduces a memory model called the Context Dependent Memory Model (CDMM) as the computational model in the multimodal authentication system; it gives very accurate and effective outcomes for both authentication and verification. In the verification step, Mean Square Error (MSE) and Normalized Correlation (NC) are used as metrics to minimize the error rate of the proposed model, and the performance analysis is presented.
Fuzzy Analytic Hierarchy Based DBMS Selection In Turkish National Identity Ca...Ferhat Ozgur Catak
Database Management Systems (DBMS) play an important role in supporting enterprise application development. Selecting the right DBMS is a crucial decision in the software engineering process, requiring the optimization of a number of criteria. Evaluating and selecting a DBMS from several candidates tends to be very complex, involving both quantitative and qualitative issues. A wrong DBMS selection will have a negative effect on the development of the enterprise application; it can turn out to be costly and adversely affect business processes. The following study focuses on the evaluation of a multi-criteria decision problem through the use of fuzzy logic. We demonstrate the methodological considerations regarding group decision making and fuzziness based on the DBMS selection problem. We developed a new Fuzzy AHP based decision model, formulated and proposed to select a DBMS easily. In this decision model, the main criteria and their sub-criteria are first determined for the evaluation. These criteria are then weighted by pair-wise comparison, and the DBMS alternatives are evaluated by assigning a rating scale.
A decision tree (DT) is termed a good DT when it is small and can accurately classify newly introduced data. Pre-processing the input data is one good approach for generating a good DT: combining different data pre-processing methods with a DT classifier can yield high performance. This paper examines the accuracy variation of the ID3 classifier when used in combination with different data pre-processing and feature selection methods. The performance of the DTs produced from the original and the pre-processed input data is compared, and experimental results are shown using the standard decision tree algorithm ID3 on a dataset.
International Journal of Computational Engineering Research(IJCER)ijceronline
International Journal of Computational Engineering Research(IJCER) is an intentional online Journal in English monthly publishing journal. This Journal publish original research work that contributes significantly to further the scientific knowledge in engineering and Technology.
A Novel Hybrid Voter Using Genetic Algorithm and Performance HistoryWaqas Tariq
Triple Modular Redundancy (TMR) is generally used to increase the reliability of real-time systems: three similar modules are used in parallel and the final output is arrived at using voting methods. Numerous majority voting techniques have been proposed in the literature; however, their performance is compromised for certain typical sets of module output values. Here we propose a new voting scheme for analog systems that retains the advantages of previously reported schemes while reducing the disadvantages associated with them. The scheme utilizes a genetic algorithm and the previous performance history of the modules to calculate the final output. The scheme has been simulated using MATLAB and the performance of the voter has been compared with that of the fuzzy voter proposed by Shabgahi et al. [4]. The performance of the voter proposed here is better than that of the existing voters.
A robust algorithm based on a failure sensitive matrix for fault diagnosis of...IJMER
International Journal of Modern Engineering Research (IJMER) is Peer reviewed, online Journal. It serves as an international archival forum of scholarly research related to engineering and science education.
International Journal of Modern Engineering Research (IJMER) covers all the fields of engineering and science: Electrical Engineering, Mechanical Engineering, Civil Engineering, Chemical Engineering, Computer Engineering, Agricultural Engineering, Aerospace Engineering, Thermodynamics, Structural Engineering, Control Engineering, Robotics, Mechatronics, Fluid Mechanics, Nanotechnology, Simulators, Web-based Learning, Remote Laboratories, Engineering Design Methods, Education Research, Students' Satisfaction and Motivation, Global Projects, and Assessment…. And many more.
Improving the accuracy of fingerprinting system using multibiometric approachIJERA Editor
Biometric technology is a science used to verify or identify individuals based on physical and/or behavioral traits. Although biometric systems are considered more secure than traditional methods such as passwords or keys, they also have many limitations, such as noisy images or spoof attacks. One solution to overcome these limitations is to apply a multibiometric system. A multibiometric system has a significant effect in improving both the security and the accuracy of the system; it can also alleviate spoof attacks and reduce the failure-to-enroll error. Multi-sample capture is one implementation of multibiometric systems. In this study, a new algorithm is suggested that gives a rejected genuine user a second chance by comparing the provided finger with other samples of the same finger. Multi-sample fingerprinting is used to implement this new algorithm. The algorithm is activated when the match score of the user does not reach the threshold but is close to it; the system then provides another chance to compare the finger with another sample of the same trait. Using a multi-sample biometric system improved the performance of the system by reducing the False Reject Rate (FRR). Applying the original matching algorithm to the presented database produced 3 genuine users and 5 impostors for the same fingerprint, while after implementing the suggested condition, system performance was enhanced, producing 6 genuine users and 2 impostors for the same fingerprint. This work was built and executed based on a previous Matlab code presented by Zhi Li Wu. Thresholds and Receiver Operating Characteristic (ROC) curves were computed before and after implementing the suggested multibiometric algorithm, and the two ROC curves were compared. A final decision and recommendations are provided based on the results obtained from this project.
BEARINGS PROGNOSTIC USING MIXTURE OF GAUSSIANS HIDDEN MARKOV MODEL AND SUPPOR...IJNSA Journal
Prognostics of future health state relies on estimating the Remaining Useful Life (RUL) of physical systems or components based on their current health state. RUL can be estimated using three main approaches: model-based, experience-based and data-driven. This paper deals with a data-driven prognostics method based on transforming the data provided by the sensors into models that are able to characterize the degradation behavior of bearings.
For this purpose, we used the Support Vector Machine (SVM) as a modeling tool. Experiments on the recently published database taken from the PRONOSTIA platform clearly show the superiority of the proposed approach compared to well-established methods in the literature such as Mixture of Gaussians Hidden Markov Models (MoG-HMMs).
Data Mining Approach of Accident Occurrences Identification with Effective M...IJECEIAES
Data mining is used in various domains of research to identify new causes for effects in society across the globe. This article uses data mining for the same reason: to identify accident occurrences in different regions and the most valid reasons for accidents happening around the globe. Data mining and advanced machine learning algorithms are used in this research approach, and the article discusses hyperplanes, classification, pre-processing of the data, and training the machine with sample datasets collected from different regions, comprising both structured and semi-structured data. We dive deep into machine learning and data mining classification algorithms to find or predict something novel about accident occurrences across the globe. We concentrate mainly on two basic and important classification algorithms to narrow the research task: SVM (Support Vector Machine) and the CNB classifier. The discussion covers the WEKA tool for the CNB classifier, bag-of-words identification, word counting and frequency calculation.
Traditionally, in paper-based elections, voters cast their vote for the right candidate by simply putting their ballot in a voting box; at the end of the voting day the votes are counted manually. This process was time consuming as well as error prone. To overcome this drawback, the Electronic Voting Machine (EVM) was introduced, in which voters cast their vote by pressing a voting button on the EVM. The major advantage of the EVM system is that the votes are counted automatically instead of manually. The drawback of the EVM, however, is that the votes may be manipulated and are not secure. To overcome all these drawbacks, research on biometric-based voting systems is ongoing. This paper presents a survey of different voting systems that use fingerprint biometrics through different algorithms and methods.
A Survey On Genetic Algorithm For Intrusion Detection SystemIJARIIE JOURNAL
The Internet has become a part of daily life and an essential tool today, and it is used as an important component of business models. It is therefore very important to maintain a high level of security to ensure safe and trusted communication of information between various organizations.
Intrusion Detection Systems have become a necessary component of computer and network security. Intrusion detection is one of the important security constraints for maintaining the integrity of information. Intrusion detection systems are tools used for the prevention and detection of threats to computer systems. Various approaches applied in the past have been less effective at curbing the menace of intrusion.
In this paper, a survey of applications of genetic algorithms in intrusion detection systems is carried out.
Synthesis of Polyurethane Solution (Castor oil based polyol for polyurethane)IJARIIE JOURNAL
Around 160 million hectares of unused land are available in India. India is the world's largest producer of castor oil, producing over 75% of the world's total supply. There are over a hundred companies in India, small and medium, that are in castor oil production, producing a variety of the basic grades of castor oil. All these factors make it imperative that Indian industry relook at the castor oil sector in order to devise suitable strategies to derive the most benefit from such an attractive confluence of factors. Castor oil is unique owing to its exceptional diversity of application: the oil and its derivatives are used in over 100 different applications in diverse industries such as paints, lubricants, pharmaceuticals, cosmetics, paper, rubber and more. Recent developments have successfully derived polyols from natural oils and synthesized a range of PU products from them. However, making a flexible solution from natural oil polyol is still proving challenging. The goal of this thesis is to understand the potential and the limitations of natural oil as an alternative to petroleum polyol. An initial attempt showed that a flexible solution could be synthesized from castor oil, which produced a rigid solution. Characterization results indicate that the glass transition temperature (Tg) was the predominant factor determining the rigidity of the solution; the high Tg of the solution was attributed to the low number of covalent bonds between cross-linkers.
Automatic signature verification with chain code using weighted distance and ...eSAT Journals
Abstract: Signature forgery can be restricted by either online or offline signature verification techniques. Online verification matches the pre-processed signature dynamically by detecting the motion of the stylus during signing, while offline verification performs a match using a two-dimensional scanned image of the signature. This paper studies the various techniques available for offline signature verification along with their shortcomings.
Keywords: Signature Verification, Weighted Distance, High Pressure Factor, Normalization, Threshold Value
International Journal of Computer Science and Engineering Research and Development (IJCSERD), ISSN 2248-9363 (Print), ISSN 2248-9371 (Online), Volume 3, Number 1, Jan-March (2013)
fuzzy set theory. The voter assigns a fuzzy difference value to each pair of voter inputs based on their numerical distance. A set of fuzzy rules then determines a single fuzzy agreeability value for each individual input, which describes how well it matches the other inputs. The agreeability of each voter input is then defuzzified to give a weighting value for that input, which determines its contribution to the voter output. The weight values are then used in the weighted average algorithm for calculating the voter's final output. The voter is experimentally evaluated from the point of view of safety and availability, and compared with the inexact majority voter in a Triple Modular Redundant structured framework. The impact of changing some fuzzy variables on the performance of the voter is also investigated. We show that the fuzzy voter gives more correct outputs (higher availability) than the inexact majority voter with small and large errors, fewer incorrect outputs (higher safety) than the inexact majority voter in the presence of small errors, and fewer benign outputs than the inexact majority voter.
In this paper, different existing weighted average voting algorithms are surveyed and their merits, demerits and limitations are discussed, based upon which a novel history-based weighted voting algorithm with a soft dynamic threshold is proposed. Experimental results of the novel voting algorithm for a Triple Modular Redundant (TMR) system are compared with those of existing voting algorithms: the novel voter gives almost 100% safety if two of the three modules are error-free, gives better results with one error-free module, and also gives better results under multiple-error conditions with all modules having errors.
Index Terms: Triple Modular Redundancy, Result Amalgamation, Weighted Average
Voters, History Records, Soft Dynamic Threshold, Safety Critical Systems
I. INTRODUCTION
The voter is a critical component in the implementation of N-Modular Redundant systems [13,
14]. Voting can be a hard problem in itself, for at least three reasons: i) floating point
arithmetic is not exact and thus voting on floating point values requires inexact voting, ii) the
output of variants (redundant modules) may be extremely sensitive to small variations in
critical regions, e.g. around threshold values in the specification, and iii) some problems have
multiple correct solutions (e.g. square roots) which may confuse the voter. Different voting
strategies have been introduced to handle these problems; examples are maximum likelihood
voter [15], predictor voters [16], stepwise negotiation voter [17] and word voters [18]. Two
traditional and widely used voting algorithms are the majority and the weighted average
voters. In its general form, an inexact majority voter [19] produces a correct output if the
majority of its inputs match each other; that is, they are within an application-specific interval
of each other. In cases of no majority, the voter generates an exception flag which can be
detected by the system supervisor to drive the system towards a safe state. Efficient
implementations of the majority voter have been addressed in [20-22].
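As a minimal illustration (not the implementation from [19-22]), an inexact majority voter for a three-module system can be sketched as follows. The agreement interval `epsilon` is a hypothetical application-specific parameter, and the raised exception stands in for the voter's exception flag.

```python
# Sketch of an inexact majority voter for a TMR system. Two module outputs
# "agree" when their absolute difference is within the application-specific
# interval epsilon. With no agreeing majority, an exception stands in for the
# flag that lets the supervisor drive the system to a safe state.

def inexact_majority_vote(x1, x2, x3, epsilon):
    def agree(a, b):
        return abs(a - b) <= epsilon

    for a, b in ((x1, x2), (x1, x3), (x2, x3)):
        if agree(a, b):
            return (a + b) / 2.0  # output the mean of the agreeing pair
    raise ValueError("no majority agreement: signal fail-safe state")

# Modules 1 and 2 agree within epsilon; the faulty module 3 is masked.
print(inexact_majority_vote(5.00, 5.02, 9.70, epsilon=0.1))  # 5.01
```

When no pair agrees, the exception lets the supervisor take the system to its fail-safe state rather than emit an arbitrary value.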
The weighted average voter, on the other hand, calculates the weighted mean of its redundant input values. It is useful in applications such as clock synchronization in distributed computer systems, pattern recognition and sensor planes, where a result has to be generated in each voting cycle. The weights can be predetermined or can be adjusted dynamically. The calculated weights wi are then used to compute the voter output, y = Σ wi·xi / Σ wi, where the xi are the voter inputs and y is the voter output.
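The weighted average computation y = Σ wi·xi / Σ wi can be sketched directly; the weights below are illustrative placeholders, whereas a history-based voter would derive them from each module's past reliability.

```python
# Sketch of the weighted average voter: y = sum(w_i * x_i) / sum(w_i).
# The weights here are illustrative; a history-based scheme would update
# them from each module's past reliability.

def weighted_average_vote(inputs, weights):
    total = sum(weights)
    if total == 0:
        raise ValueError("all weights are zero")
    return sum(w * x for w, x in zip(weights, inputs)) / total

# Module 3 disagrees, but its low weight limits its pull on the output.
y = weighted_average_vote([5.0, 5.1, 9.0], [1.0, 1.0, 0.1])
print(round(y, 3))  # 5.238
```

Unlike the majority voter, this always produces an output, which is why it offers high availability at the cost of lower safety.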
The standard majority and weighted average voters are examples of two distinct groups of voting algorithms: one with a high level of safety but a low level of availability, and the other with a low level of safety but a high level of availability. This paper introduces a novel voter whose safety performance is a compromise between the standard inexact majority and weighted average voters, while giving higher availability than both.
Safety-critical systems are systems whose failure may lead to hazards, loss of life or great damage to property. Safety-critical control systems are used in many domains: automotive (drive-by-wire and brake-by-wire systems in cars), medicine (infusion pumps, cancer radiation therapy machines), military and space applications (rocket and satellite launchers), industrial process control, robotics, and consumer electronic appliances. There is a need to increase reliability, availability and safety in all these applications. Faults that occur in these applications may lead to hazardous situations: if a single module or channel is used and it becomes faulty due to some noise, the system may fail and a hazard may occur.
Hence, N-Modular Redundancy or N-Version Programming along with a voting technique is used to mask faults in faulty environments [1][2]. There are different architectural patterns [10] in which redundant modules with a voter are used in safety-critical systems. All the N modules or N versions [3] are designed by different teams to meet the same specifications. All these modules take the same input data, process it and generate results which are passed to the voter. The voter has to mask the fault by isolating or avoiding the faulty module, and the correct value has to be picked by the voter. Different types of voting algorithms [7] are mentioned in the literature. Some voting algorithms, like the majority and plurality voters [4], generate an output if the majority or a required number of voter inputs match; otherwise they generate no output so that the system can be
taken to the fail-safe state. The adaptive majority voting algorithm [9] gives better performance by using history records. But for some safety-critical systems there may not be any fail-safe state. In such systems, the voter has to generate some value as output, for example by amalgamating the outputs of all modules, which is called result amalgamation. Median, average and weighted average voters are examples of voters that amalgamate the voter inputs and generate some value as the voter output. History-based weighted average voters consider the history of the modules, and a highly reliable module is given a high weight. In this research work, instead of a harsh threshold, a soft threshold that can be changed dynamically is used to find the agreeability value of each module output with the remaining module outputs. A harsh threshold results in an agreeability value of either 0 or 1, but the soft threshold method uses the fuzzy Z function to generate an agreeability or closeness value, as shown in Figure 2.
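The soft-threshold idea can be sketched with the standard fuzzy Z-shaped function. The lower and upper thresholds `a` and `b` are assumed parameters here (the proposed voter adjusts them dynamically), and the exact function shape used in the paper may differ.

```python
# Sketch of the soft threshold: instead of a hard 0/1 agreement decision,
# the distance d between two module outputs is mapped to an agreeability
# value in [0, 1] via a fuzzy Z-shaped function.

def z_function(d, a, b):
    """Fuzzy Z membership: 1 (full agreement) below a, 0 (disagreement) above b."""
    if d <= a:
        return 1.0
    if d >= b:
        return 0.0
    mid = (a + b) / 2.0
    if d <= mid:
        return 1.0 - 2.0 * ((d - a) / (b - a)) ** 2
    return 2.0 * ((d - b) / (b - a)) ** 2

# A harsh threshold would call each of these either 0 or 1;
# the soft threshold grades them.
for d in (0.05, 0.5, 0.95):
    print(round(z_function(d, a=0.1, b=1.0), 3))  # 1.0, 0.605, 0.006
```

Distances well inside the agreement region keep full agreeability, while intermediate distances degrade smoothly instead of flipping at a single cut-off.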
This Research Paper is organized as follows:
Section II is the literature survey of the existing voting algorithms.
In Section 3, Implementation of fuzzy voter
International Journal of Computer Science and Engineering Research and Development (IJCSERD), ISSN 2248-9363 (Print), ISSN 2248-9371 (Online), Volume 3, Number 1, Jan-March (2013)
Section 4 presents the improvement in the history-based weighted voting algorithm with soft dynamic threshold.
Section 5 describes the experimental method and the test harness.
Section 6 analyzes the experimental results.
Section 7 gives conclusions and future work.
2. RELATED WORKS
The fuzzy voter designed in this paper is essentially a softened inexact majority voter. This voter needs two thresholds. All the distance values (agreement distance values) for each pair of module outputs that fall below the lower threshold are considered complete agreement cases. The distances above the upper threshold are considered complete disagreement cases. The middle distance values, between the lower and upper thresholds, are processed using a fuzzy approach. In this fuzzy approach three parameters p, q and r are used, which decide the small, medium and large membership values as shown in Figure 1. Rule-based fuzzy inference together with the centroid norm for defuzzification is used in this voter. But the fuzzy parameter values are statically selected, and the performance of the voter varies with these fuzzy parameter values. The static selection of the fuzzy threshold parameter values is a major limitation of this voter: there is a need for automatic, dynamic selection of values for these parameters for any kind of dynamically varying input dataset.
Fig.1: Structure of a three-input fuzzy voting unit
A fuzzy voting approach is given by Blank et al. (2010) for sensor fusion in systems with little system information. In this fuzzy voter design, only fuzzy membership functions are used instead of rule-based fuzzy inference. Scores are assigned to each sensor based upon these fuzzy membership function values, and the fused output value is then calculated as a weighted average using these scores. The computational complexity is reduced compared to rule-based fuzzy inference with the centroid norm for defuzzification. The performance of this membership function based voter is slightly lower than that of the rule-based fuzzy voter, but its computational complexity is far lower. In this voter too, optimal values for the fuzzy parameters are selected statically, based on a trial-and-error method. These fuzzy parameters may not be efficient for the
other data or dynamically changing data. The steps involved in the static fuzzy voting
approach given by Blank et al. (2010) for sensor fusion are given below.
1) Find the Euclidean distance between each pair of module outputs i, j:
dij = |xi - xj|
For each module i = 1 to n
For each module j = 1 to n
DM(i, j) = dij
End
End
2) Form the vector ND from the distance matrix DM by copying the upper-triangle elements, excluding the diagonal elements.
3) Normalize the module agreement distances in the set ND to the 0-1 scale before using them in the fuzzy membership functions.
4) Specific shapes used for the fuzzy membership functions are shown in Figure 2.
Figure 2: Fuzzy membership functions for static fuzzy voter
The concept of three-point fuzziness applies when the boundaries of a fuzzy value and its most probable (most advisable) value are known. Such a fuzzy value can be described by a triangular membership function. With the normalized module agreement distance values, compute the membership vector µx(dij) for the agreement sets, and then express the degree of agreement of the module outputs in a closed linear form. Therefore it is possible to calculate the membership values µset(dij) very efficiently.
5) Module scores are computed instead of making use of a fuzzy rule set for the inference step. The scores for modules i and j, which have dij as the module agreement distance, are updated as follows:
scorei += µhigh(dij) + µmed(dij) - µlow(dij)
scorej += µhigh(dij) + µmed(dij) - µlow(dij)
Initially, all the module scores are initialized to zero. For each normalized distance of a module pair in closed linear form, the scores are updated by accumulating the newly computed score for the corresponding module.
For example, for a TMR system, for each of the normalized distances d12, d23 and d31, the corresponding module scores are updated as given below.
For normalized distance d12,
Score1+= µhigh(d12)+ µmed(d12)- µlow(d12)
Score2+= µhigh(d12)+ µmed(d12)- µlow(d12)
For normalized distance d23,
Score2+= µhigh(d23)+ µmed(d23)- µlow(d23)
Score3+= µhigh(d23)+ µmed(d23)- µlow(d23)
For normalized distance d31,
Score3+= µhigh(d31)+ µmed(d31)- µlow(d31)
Score1+= µhigh(d31)+ µmed(d31)- µlow(d31)
6) If any module score is negative, normalize all the module scores to make all of them
positive.
7) Calculate the voter output as a weighted average of the scores and module output values:
If Σi scorei > 0,
Output = (Σi scorei · xi) / (Σi scorei)
Otherwise
Output = (x1 + x2 + ... + xn)/n
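The steps above can be sketched compactly in Python. The triangular membership shapes, their parameter placements, and the use of an "agreement" variable (1 minus the normalized distance, so that µhigh means high agreement) are illustrative assumptions, not the values used by Blank et al.; only the score-update rule and the weighted-average fallback follow the steps as printed.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def static_fuzzy_vote(outputs):
    """Membership-score fuzzy voter after Blank et al. (2010); the
    low/med/high agreement sets below are illustrative assumptions."""
    n = len(outputs)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    dist = {pr: abs(outputs[pr[0]] - outputs[pr[1]]) for pr in pairs}
    dmax = max(dist.values()) or 1.0       # steps 1-3: normalize to [0, 1]
    scores = [0.0] * n
    for (i, j), dij in dist.items():
        agree = 1.0 - dij / dmax           # 1 = identical, 0 = farthest apart
        mu_low = tri(agree, -0.5, 0.0, 0.5)
        mu_med = tri(agree, 0.2, 0.5, 0.8)
        mu_high = tri(agree, 0.5, 1.0, 1.5)
        s = mu_high + mu_med - mu_low      # step 5: score update as printed
        scores[i] += s
        scores[j] += s
    shift = min(scores)                    # step 6: make all scores non-negative
    if shift < 0:
        scores = [sc - shift for sc in scores]
    total = sum(scores)                    # step 7: weighted average, else mean
    if total > 0:
        return sum(sc * x for sc, x in zip(scores, outputs)) / total
    return sum(outputs) / n
```

With two agreeing modules and one outlier (e.g. 10, 10, 20), the outlier's score is shifted to zero and the voter returns the agreed value.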
Figure 3: The distance matrix
Fig.4: Structure of a five-input fuzzy voting unit
Lorczak's basic standard weighted average voting algorithm (Lorczak WA):
In this voting algorithm [8], weights are calculated based on the distances between the module outputs as given below:
Wi = 1 / (1 + Π(j=1..N, j≠i) d²(xi, xj) / α²)

where d(xi, xj) is the distance between the output values of module i and module j, and α is a scaling factor.
After assigning the weights, the output of the voter is calculated as follows:

Y = Σ(i=1..N) (Wi / s) · xi

where s is the sum of all the weights. In this algorithm, the reliability of the modules in previous voting cycles, called the history, is not considered.
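Lorczak's weighting scheme can be sketched as follows; this is a minimal illustration of the two formulas above, with function and variable names of our own choosing:

```python
def lorczak_wa(outputs, alpha=1.0):
    """Lorczak's standard weighted average voter: a module's weight shrinks
    as the product of its squared distances to the other outputs grows."""
    n = len(outputs)
    weights = []
    for i in range(n):
        prod = 1.0
        for j in range(n):
            if j != i:                      # product over all other modules
                prod *= (outputs[i] - outputs[j]) ** 2 / alpha ** 2
        weights.append(1.0 / (1.0 + prod))  # Wi = 1 / (1 + product)
    s = sum(weights)                        # s: sum of all the weights
    return sum(w * x for w, x in zip(weights, outputs)) / s
```

A module that exactly agrees with any other module makes its product zero and receives the maximum weight 1, while a lone outlier's weight collapses toward zero.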
History-based weighted average voting algorithm

Algorithm for building history records: History records [6] are built based on the reliability of the modules. If a module contributes to the majority consensus of the outputs of all the modules in a particular voting cycle, then a Boolean variable is set to 1; otherwise it is cleared to 0. The cumulative sum of this Boolean variable up to the current voting cycle is calculated, which is the history record of that module. A module with a high cumulative sum is the most reliable module, and one with a low cumulative sum is less reliable.
This history value is normalized by dividing it by the cycle number and is called the state indicator Pi of module i. There are two versions of history-based weighted average voters, called the state indicator based and module elimination based weighted average voting algorithms, as described in reference [6].

In the state indicator based weighted average voting algorithm (HWA1), weights are assigned based on the state indicator value:
Wi = Pi
In the module elimination based weight assignment (HWA2) method, if the state indicator value of a module is less than the average state indicator value of all the modules, then the weight for that module is assigned as zero and the module is eliminated from contributing to the voter output:
Wi = 0 if Pi < Pavg, otherwise Wi = Pi
where Pavg = (P1 + P2 + P3 + ... + PN)/N
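The history bookkeeping and the two weight-assignment variants can be sketched as follows. The majority-consensus test via pairwise agreement counts is one plausible reading of reference [6], not its exact procedure:

```python
def update_history(history, outputs, cycle, threshold):
    """One cycle of history bookkeeping [6]: a module's record grows by 1
    when it agrees (within the threshold) with the majority consensus."""
    counts = [sum(abs(x - y) <= threshold for y in outputs) for x in outputs]
    majority = outputs[counts.index(max(counts))]   # most-agreed output value
    for i, x in enumerate(outputs):
        if abs(x - majority) <= threshold:
            history[i] += 1
    return [h / cycle for h in history]             # state indicators Pi

def hwa_weights(p, eliminate=False):
    """HWA1: Wi = Pi. HWA2 (eliminate=True): Wi = 0 when Pi < Pavg."""
    if not eliminate:
        return list(p)
    p_avg = sum(p) / len(p)
    return [pi if pi >= p_avg else 0.0 for pi in p]
```

Calling `update_history` once per cycle accumulates the Boolean agreement record; the returned state indicators feed either weight-assignment rule directly.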
If we consider Triple Modular Redundancy (TMR), these two versions work well if the same two modules are consistently reliable and the other module generates outputs with some error. But in reality, any module may fail randomly and generate erroneous outputs. The existing history-based weighted average algorithms fail to produce the correct results even though a majority of the modules have generated error-free outputs. This problem occurs because the weights are assigned only based upon history. A module that generates a correct output in the present cycle may be neglected, and zero or low weight may be assigned to it if it has a poor history record. Hence, proper weight is not given to the degree of closeness or agreement of a module with the other module outputs.
Weighted average voter with soft threshold (WA ST):
In this voting algorithm [5], the degree of closeness of each module with the other modules is calculated, and the average agreement value is assigned as the weight for that module. The threshold is made soft by using a roll-off constant that is tunable. But in this algorithm history is not used. This algorithm generates no output, or a benign output, if the weights of all the modules are assigned zero values.
In reference [11], a modified history-based weighted average voting algorithm with soft dynamic threshold is given. In this work, the threshold is calculated based upon the notional correct output of the voter. It is difficult to predict the voter output in advance just to decide the threshold, which is a major limitation of this voter. In reference [12], a neural network based voter is designed, and the network is trained using the feed-forward error back-propagation algorithm. Training the network is a time-consuming process.
3. FUZZY VOTER
The fuzzy voter described herein uses fuzzy logic to generate the weights required for
calculating a weighted average voter output. Fig. 5 shows the basic structure of a three-input
fuzzy voter.
Fig 5: structure of 3 input voter
3.1. Calculating the fuzzy difference of input pairs
The first step in the approach requires the definition of a fuzzy difference variable to describe each pair of inputs to the voter. For each pair xi and xj with numerical distance dij, based on the triangular membership functions shown in Fig. 6, we define a fuzzy difference variable represented by a set of membership grades µA(dij), where A ∈ {small, medium, large}. Where symmetrical sets are used, this requires two parameter values to be specified. Based on the numerical difference between any two inputs, a non-zero membership grade will be assigned to one or two of the fuzzy sets defined for the corresponding fuzzy difference variable. For convenience, triangular fuzzy membership functions are used, which place a ramp function in place of the traditional hard (that is, discontinuous) threshold found in traditional inexact majority voters.
Fig 6: Definition of the difference variable membership functions
Fig. 6 shows two qualitatively different fuzzy difference variables. In the first case, there is a significant region in which two inputs that differ by a non-zero amount are regarded as being in definite agreement; an intermediate region in which the difference is specified using linguistic variables that may be true to a lesser or greater extent (for example, the difference between two inputs may be such that a non-zero membership is awarded to both the small and medium fuzzy difference sets); and a third region which identifies inputs that are in definite disagreement. In the case of the second fuzzy variable, there is no region of definite agreement specified, although there is (as one would expect) a region of definite disagreement.
Fig 7: Definition of the output fuzzy variable membership functions
The final output value y for an m-way fuzzy voter is obtained by weighting each input signal
xi with the calculated weight wi:
y = Σwi . xi / Σwi
3.3. Fuzzy rule set definition
Table 1 shows a rule matrix that summarizes one possible set of fuzzy rules for combining and mapping fuzzy difference values onto a fuzzy agreeability value in a 3-input system.
TABLE 1: Rule matrix for fuzzy input variables

         |            dij
  dik    | small  | medium | large
  small  | vhigh  | med    | high
  medium | med    | low    | vlow
  large  | high   | vlow   | vlow
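One possible realization of this rule matrix is sketched below. The difference-set parameters and the output-set centroids are illustrative assumptions; the Mamdani-style min inference and weighted-centroid defuzzification stand in for the rule-based inference and centroid norm described above.

```python
def tri(x, a, b, c):
    """Triangular membership with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Illustrative difference sets on normalized distances (parameters assumed).
DIFF = {"small": (-0.4, 0.0, 0.4), "medium": (0.1, 0.5, 0.9), "large": (0.6, 1.0, 1.4)}
# Rule matrix from Table 1: (d_ij set, d_ik set) -> agreeability set.
RULES = {("small", "small"): "vhigh", ("small", "medium"): "med", ("small", "large"): "high",
         ("medium", "small"): "med", ("medium", "medium"): "low", ("medium", "large"): "vlow",
         ("large", "small"): "high", ("large", "medium"): "vlow", ("large", "large"): "vlow"}
# Centroids of the output (agreeability) sets, assumed evenly spaced on [0, 1].
CENTROID = {"vlow": 0.1, "low": 0.3, "med": 0.5, "high": 0.7, "vhigh": 0.9}

def weight(dij, dik):
    """Mamdani-style inference: fire each rule with the min of the two
    memberships, then defuzzify with a weighted centroid."""
    num = den = 0.0
    for (a, b), out in RULES.items():
        fire = min(tri(dij, *DIFF[a]), tri(dik, *DIFF[b]))
        num += fire * CENTROID[out]
        den += fire
    return num / den if den else 0.0

def fuzzy_vote(x1, x2, x3):
    """Three-input fuzzy voter: weights from pairwise normalized distances."""
    d12, d23, d31 = abs(x1 - x2), abs(x2 - x3), abs(x3 - x1)
    dmax = max(d12, d23, d31) or 1.0
    d12, d23, d31 = d12 / dmax, d23 / dmax, d31 / dmax
    w1, w2, w3 = weight(d12, d31), weight(d12, d23), weight(d23, d31)
    s = w1 + w2 + w3
    return (w1 * x1 + w2 * x2 + w3 * x3) / s if s else (x1 + x2 + x3) / 3
```

With two close inputs and one outlier, the outlier lands in the (large, large) cell of the rule matrix and receives the vlow agreeability weight, so the output stays near the agreeing pair.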
Fig 8: The rule viewer for the fuzzy voter
4. AN IMPROVEMENT IN HISTORY BASED WEIGHTED VOTING ALGORITHM
FOR SAFETY CRITICAL SYSTEMS
A correctly functioning weighted average voter always generates a weighted mean of its inputs that is identical to, or lies between, the inputs that the majority voter would select as in agreement. It is obvious that the outputs of the inexact majority and weighted average voters are similar for all agreement voting cycles. This observation leads us to introduce a novel voter that is a combination of the majority and weighted average voters: it performs as a majority voter in agreement cases and functions as a weighted average voter in disagreement voting cycles. The voter is less complex and quicker than the weighted average voter, since in the majority of cases it does not perform the relatively time-consuming weighted averaging procedure.
A novel history based weighted average voting algorithm with soft dynamic threshold is
given below:
1. Let x1, x2, …xm be the voter inputs and y its output.
2. The distance between the output of module i and the output of module j is calculated as
dij = |xi - xj|
3. Calculate the closeness index Sij using the following formula:
Sij = 1, if dij <= VT
Sij = 1 - (dij - VT)/(n*VT - VT), if VT < dij <= n*VT
Sij = 0, if dij > n*VT
where n is a variable that can be assigned a value >= 2 to make the threshold soft, dij is the distance between the outputs of modules i and j, and VT is the voting threshold.
4. Calculate the history values using the procedure given in reference [6], but use n*VT as the threshold for agreement while calculating the history records. Find the normalized history value for each module by dividing the history by the cycle number.
5. Calculate the history and closeness product Hi for each module as follows:
Hi = Pi * (Σ(j≠i) Sij) / (N - 1)
where N is the total number of modules and Pi is the normalized history value of module i, Pi = Histi / cycle number.
6. Calculate the normalized history average Pavg:
Pavg = (1/N) * Σ(i=1..N) Pi
7. Calculate the weights for all N modules as follows:
For i = 1 to N
if Hi = 0 AND Pi < Pavg then Wi = 0
otherwise Wi = 2 * Hi
8. If, in the worst case, all the weights are equal to zero, modify the weights as follows:
Wi = Pi², for i = 1 to N
9. Calculate the weighted average using the weights:
Y = (Σ(i=1..N) Wi · xi) / (Σ(i=1..N) Wi)
where Wi is the weight of the i-th module and xi is the output of the i-th module.
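The nine steps above can be collected into a single voting-cycle sketch. The names are ours, and the majority-consensus test used for the step-4 history update (pairwise agreement counts within n*VT) is an illustrative reading of reference [6]:

```python
def novel_hwa_st(outputs, hist, cycle, vt=0.5, n=5):
    """One voting cycle of the proposed history-based weighted average voter
    with soft dynamic threshold (steps 1-9); hist is mutated across cycles."""
    m = len(outputs)

    def s(dij):  # step 3: closeness index with soft threshold n*VT
        if dij <= vt:
            return 1.0
        if dij <= n * vt:
            return 1.0 - (dij - vt) / ((n - 1) * vt)
        return 0.0

    # step 4: update history using n*VT as the agreement threshold
    counts = [sum(abs(x - y) <= n * vt for y in outputs) for x in outputs]
    majority = outputs[counts.index(max(counts))]
    for i, x in enumerate(outputs):
        if abs(x - majority) <= n * vt:
            hist[i] += 1
    p = [h / cycle for h in hist]              # normalized history values Pi
    p_avg = sum(p) / m                         # step 6

    # step 5: history and closeness product for each module
    hcp = [p[i] * sum(s(abs(outputs[i] - outputs[j]))
                      for j in range(m) if j != i) / (m - 1)
           for i in range(m)]

    # step 7: weight assignment
    w = [0.0 if (hcp[i] == 0 and p[i] < p_avg) else 2 * hcp[i] for i in range(m)]
    # step 8: worst-case fallback
    if all(wi == 0 for wi in w):
        w = [pi ** 2 for pi in p]
    # step 9: weighted average output
    total = sum(w)
    return sum(wi * xi for wi, xi in zip(w, outputs)) / total if total else sum(outputs) / m
```

Because the weight couples history with present-cycle closeness, a module that suddenly diverges gets a near-zero weight even if its history is good, which is the behaviour the text argues for.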
5. EXPERIMENTAL METHODOLOGY
5.1. Test Harness: The test harness for experimentation with voting algorithms is shown in Figure 9 below.
The input generator produces one notional correct result in each voting cycle. This sequence of numbers represents the identical correct results expected from the redundant modules. Copies of the notional correct result are presented to each saboteur in every voting cycle. The saboteurs can be programmed to introduce selected module error amplitudes, according to selected random distributions. The symptoms of errors appear to the voter as numerical input values. A comparator is used to check for agreement between the notional correct result and the output of the voter under test in any voting cycle. For simplicity, issues associated with ensuring synchronization of the inputs to the voter and to the saboteurs are ignored.
Figure 9: Experimental Setup to evaluate the Performance of a Voter
A voter threshold (dynamic or fixed), VT, is used to determine the maximum acceptable divergence of the voter inputs in each voting cycle from the notional correct result, and an accuracy threshold, AT, is used in the comparator to determine whether the distance between the notional correct result and the voter output is within acceptable limits. In this framework, the accuracy threshold is chosen equal to the voter threshold in each voting cycle. A voter result whose distance from the notional correct answer is less than the accuracy threshold is taken as a correct output; otherwise it is considered an incorrect output. This is a valid assumption in many real-time systems in which the discontinuity between consecutive correct variant results is small (Bennett, 1994). Hence, the presence of a large discontinuity is indicative of an error and can be detected by acceptance tests. Where the voter cannot reach an agreement between the outputs of the saboteurs, it produces a default value that moves the system toward a fail-safe or fail-stop state. Such a voter output is called a disagreed (benign) result. It is also assumed that all voters perform correctly; this assumption is made because the voting algorithm is usually a simpler program than the modules it monitors.
Cyclic data such as a sine wave is generated using the equation given below:
Input data = 100 + 100 * sin(t)
The sample rate t is taken as 0.1. The generated input data is given to each of the modules, and a random error of uniform distribution in the range [-e, +e] is injected into each of the required modules. The input data generated before injecting the error is considered the notional correct output. The fixed voting threshold and accuracy threshold are taken as 0.5.
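Under these settings, the harness loop can be sketched as follows; the single-saboteur arrangement and the voter passed in as a parameter are illustrative simplifications of the full harness:

```python
import math
import random

def run_harness(voter, cycles=10000, e=5.0, vt=0.5):
    """Test-harness sketch: sinusoidal notional correct data, a uniform
    error in [-e, +e] injected into module 3, correct/incorrect counted."""
    nc = nic = 0
    for k in range(1, cycles + 1):
        t = 0.1 * k                       # sample rate 0.1
        x = 100 + 100 * math.sin(t)       # notional correct result
        outputs = [x, x, x + random.uniform(-e, e)]  # saboteur on module 3
        y = voter(outputs)
        if abs(y - x) <= vt:              # accuracy threshold = voting threshold
            nc += 1
        else:
            nic += 1
    return nc / cycles, 1 - nic / cycles  # availability, safety
```

For example, a plain median voter run through this harness masks the single saboteur in every cycle, giving availability and safety of 1.0.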
For the soft dynamic threshold methods, the voting threshold can be varied dynamically. For the weighted average voter with soft dynamic threshold, the tunable parameter is taken as 5. For the novel algorithm, n = 5 is taken, which is the same as the tunable parameter of the weighted average voter with soft dynamic threshold, so that the results can be compared. Based on the n value the threshold is changed; the closeness index varies with the distance measure as shown in Figure 10 below.
Figure 10: Closeness Index (Sij) versus distance measure (dij) in the novel algorithm
The output generated by the voter is compared with the notional correct output; if the difference is less than the accuracy threshold value, it is considered a correct result, otherwise an incorrect result. Each set of experiments is performed for 10000 runs, and the number of correct results (nc) and the number of incorrect results (nic) are counted. Then the performance of the voter is evaluated using the parameters availability and safety as given below:
Availability = nc / n
Safety = 1 - (nic / n)
Where nc = Number of correct results given by a voter
nic = Number of Incorrect results given by a voter and
n= Total number of runs or voting cycles
Safety (S): Since from a safety viewpoint the smallest number of agreed but incorrect outputs is desirable for a given voter, the safety measure can be defined as S = 1 - (nic / n). Thus S ∈ [0, 1] and ideally S = 1.
Reliability (R): A voter that produces more correct results among its total outputs can be interpreted as a more reliable voter. Reliability is defined as the ratio of correct voter outputs to the number of voting actions: R = nc / n. Thus R ∈ [0, 1] and ideally R = 1.
Within the test harness the following parameters can be adjusted.
• The value of consensus threshold.
• Number of voter inputs. The test harness provides a facility to define 3, 5 and 7 inputs
voters.
• Input data trajectory and sample rate. Different types of input data trajectory can be
selected within the test harness. The frequency of data arrival (sample rate of input data)
is also adjustable.
• The value of accuracy threshold.
• The amplitude of injected errors. The amplitude of injected errors can be expressed as a function of the input signal. If δ is defined as the maximum amplitude of errors during a particular test consisting of n voting cycles, and A is taken as the maximum amplitude of the input data, then δ / A is the maximum error-to-signal ratio (ESR) of that particular test.
• Number of injected errors. One, two or three saboteurs may be programmed to simulate variant results' errors.
• Error persistence time. The experimental harness has the capability to select the error persistence time, the error arrival intervals and the activation period. Errors can be permanent, transient or intermittent.
• Error distribution. A variety of error distributions, including uniform, exponential, normal and Poisson distributions with adjustable parameters, have been defined within the test harness.
6. EXPERIMENTAL RESULTS
Empirical evaluation of the safety performance of the voters is done by running each voter
for 10000 voting cycles.
Fig 11: Safety Comparison with all modules having equal amplitude errors
(Small Errors) for 10000 runs
Figure 11 shows the safety performance for small errors. For both large and small errors, the proposed novel algorithm has better safety performance than the module elimination version of the history-based weighted average voter, Lorczak's weighted average voter and the weighted average voter with soft dynamic threshold. But in this case, the state indicator version of the history-based weighted average voter performs better than all the other voting algorithms, since it considers only history to assign the weights.
The safety of the different voting algorithms with two error-free modules for 10000 runs is compared in the figure below.
Up to 3000 cycles, module 1 and module 2 are error free and module 3 is perturbed with an error in the range [-e, +e]; from cycles 3001 to 7000, modules 1 and 3 are error free and module 2 is perturbed with an error in the range [-e, +e]; from cycles 7001 to 10000, modules 2 and 3 are error free whereas module 1 is perturbed with an error in the range [-e, +e]. The two history-based weighted average versions, the state indicator based version and the module elimination based version, fail to give 100% safety even though two modules are error free. The reason is that much importance is given to the previous reliability history, but in the current voting cycle things may be different. A module which has a good history so far may be perturbed with errors in the current voting cycle; due to its past reliability history it is given a high weight, and the erroneous module contributes heavily to the voter output. This is a major limitation of the two versions of the history-based weighted average voter, which has been overcome in the novel algorithm by taking the history and closeness product (HCP) into consideration while assigning the weights.
The weights assigned and the outputs for the given input values are shown in Table 2 and Table 3 for the module elimination based weighted average voter and the proposed novel history-based weighted average voter with soft dynamic threshold, respectively. In Table 2 and Table 3 the column headings are as follows:
x is the notional correct output; x1, x2, x3 are the outputs generated by module 1, module 2 and module 3 respectively; H1, H2, H3 are the history values of the modules; w1, w2, w3 are the weights assigned to the modules; and HWA O/P is the output produced by the module elimination version of the history-based weighted average voter. NH1, NH2, NH3 are the history values and N_w1, N_w2, N_w3 are the weights assigned to the modules in the novel algorithm, and N_O/P is the output produced by the proposed novel algorithm.
In Table 2 and Table 3, the third module is perturbed with error up to the 20th cycle while the remaining two modules are error free; from there onwards, for the remaining voting cycles, the second module is perturbed with error while the remaining two modules are error free. The results of the module elimination version of the history-based weighted average voter are compared with those of the novel voting algorithm. If the same two modules are consistently error free, the module elimination based version produces the correct results. But practically this is not possible: any module may be inconsistent and fail randomly at runtime. From cycle 21 onwards, the second module is perturbed with errors. But the module elimination version gives importance to the previous history and hence gives a high weight to the erroneous module 2. Due to this high weight, it contributes heavily to the result, and the module elimination version needs some recovery time. The novel algorithm, in contrast, considers both the history and the closeness (consensus with the majority) of each module to assign the weights, and is able to produce correct outputs, as shown in Table 3, whenever any two modules are error free.
TABLE 2: OUTPUTS GENERATED BY HISTORY BASED WEIGHTED AVERAGE VOTER

Cycle no |    x    |   x1    |   x2    |   x3    | H1 | H2 | H3 | w1 | w2 | w3 |  o/p
      16 | 199.749 | 199.749 | 199.749 | 195.921 | 16 | 16 |  1 |  1 |  1 |  0 | 199.749
      17 | 199.957 | 199.957 | 199.957 | 205.465 | 17 | 17 |  1 |  1 |  1 |  0 | 199.957
      18 | 199.166 | 199.166 | 199.166 | 200.042 | 18 | 18 |  1 |  1 |  1 |  0 | 199.166
      19 | 197.385 | 197.385 | 197.385 | 190.56  | 19 | 19 |  2 |  1 |  1 |  0 | 197.385
      20 | 194.63  | 194.63  | 194.63  | 184.68  | 20 | 20 |  2 |  1 |  1 |  0 | 194.63

TABLE 3: OUTPUTS GENERATED BY PROPOSED NOVEL VOTER

Cycle no |    x    |   x1    |   x2    |   x3    | NH1 | NH2 | NH3 |  N_w1  |  N_w2   |  N_w3  |  o/p
      16 | 199.749 | 199.74  | 199.74  | 195.92  |  16 |  16 |   1 |      1 |       1 |      0 | 199.749
      17 | 199.957 | 199.95  | 199.95  | 205.46  |  17 |  17 |   1 |      1 |       1 |      0 | 199.957
      18 | 199.166 | 199.16  | 199.16  | 200.04  |  18 |  18 |   2 | 1.8125 | 1.81249 | 0.1806 | 199.208
      19 | 197.385 | 197.38  | 197.38  | 190.56  |  19 |  19 |   2 |      1 |       1 |      0 | 197.385
      20 | 194.63  | 194.63  | 194.63  | 184.68  |  20 |  20 |   2 |      1 |       1 |      0 | 194.63
      21 | 190.93  | 190.93  | 191.42  | 190.93  |  21 |  21 |   3 |      2 |       2 | 0.2857 | 191.163
      22 | 186.321 | 186.32  | 191.55  | 186.32  |  22 |  21 |   4 |      1 |       0 | 0.1818 | 186.321
7. CONCLUSIONS & FUTURE WORK
In this work, a novel history-based weighted average voter with soft dynamic threshold is designed, and its safety performance is evaluated empirically for 10,000 voting cycles on a Triple Modular Redundant (TMR) system. The reliability history of the modules and the closeness or agreeability of a module output with the other module outputs (majority consensus) in a voting cycle are used to assign the weights to the individual modules, and the final output is
generated by calculating the weighted average of all the module outputs. The novel voting algorithm performs better, giving almost 100% safety when a majority of the modules are error free, which is the much-needed behaviour for fault masking in practical applications. The novel voting algorithm also gives better safety performance in multiple-error scenarios compared to the other history-based weighted average voters.
Majority consensus is established if the majority of the modules generate the same output values, which need not be correct. A majority of the modules may coincidentally generate the same erroneous output, causing a majority consensus that contributes to the final output. This can be overcome using forecasting and prediction algorithms, such as double exponential smoothing and interpolation, to predict the cyclic-pattern data output for the current cycle based on the outputs of the past cycles; this remains future work.
REFERENCES
[1] J.-C. Laprie, "Dependable computing and fault-tolerance: concepts and terminology," in Digest of Papers FTCS-15: IEEE 15th Annu. Int. Symp. Fault-Tolerant Computing Systems, Ann Arbor, MI, June 1985, pp. 2-11.
[2] B. W. Johnson, Design and Analysis of Fault-Tolerant Digital Systems. New York: Addison-Wesley, 1989.
[3] L. Chen and A. Avizienis, "N-version programming: a fault-tolerance approach to reliability of software operation," in Digest of Papers FTCS-8: IEEE 8th Annu. Int. Symp. Fault-Tolerant Computing Systems, Toulouse, France, June 1978, pp. 3-9.
[4] D. M. Blough and G. F. Sullivan, "A comparison of voting strategies for fault-tolerant distributed systems," in Proc. IEEE 9th Symp. Reliable Distributed Systems, Huntsville, Alabama, Oct. 1990, pp. 136-145.
[5] G. Latif-Shabgahi, "A novel algorithm for weighted average voting used in fault-tolerant computing systems," Microprocessors and Microsystems, vol. 28, pp. 357-361, 2004.
[6] G. Latif-Shabgahi, J. M. Bass, and S. Bennett, "History-based weighted average voter: a novel software voting algorithm for fault-tolerant computer systems," in Euromicro Conference on Parallel, Distributed, and Network-Based Processing, 2001, pp. 402-409.
[7] G. Latif-Shabgahi, J. M. Bass, and S. Bennett, "A taxonomy for software voting algorithms used in safety-critical systems," IEEE Trans. Reliability, vol. 53, no. 3, pp. 319-328, Sept. 2004.
[8] P. R. Lorczak, A. K. Caglayan, and D. E. Eckhardt, "A theoretical investigation of generalized voters for redundant systems," in FTCS-19: Digest of Papers, 19th Int. Symp. Fault-Tolerant Computing, Chicago, USA, 1989.
[9] G. Latif-Shabgahi and S. Bennett, "Adaptive majority voter: a novel voting algorithm for real-time fault-tolerant control systems," in 25th Euromicro Conf., vol. 2, 1999, pp. 113-120.
ABOUT AUTHORS
Dr.Panchumarthy Seetha Ramaiah obtained his Ph.D in Computer
Science from Andhra University in 1990. He is presently working as a
professor of Computer Science in the department of Computer Science and
Systems Engineering, Andhra University College of Engineering,
Visakhapatnam- INDIA. His research interests are safety critical systems,
VLSI design, Embedded Systems and Fault Tolerant Systems. Dr.P. Seetha
Ramaiah is the Principal Investigator for several Defence R&D projects
and Department of Science and Technology projects of the Government of
India in the areas of Embedded Systems and robotics. He has published
twelve journal papers, and presented twenty International Conference papers in addition to
twenty one papers at National Conferences in India.
B. Uma Maheswara Rao obtained his M.Tech (CSE) in Computer Science from Andhra University in 2008. He is presently a Research Scholar in Computer Science in the Department of Computer Science and Systems Engineering, Andhra University College of Engineering, Visakhapatnam, INDIA. His research interests are safety-critical systems, VLSI design, embedded systems and fault-tolerant systems. He has published one journal paper.