Min-based qualitative possibilistic networks are an effective tool for the compact representation of decision problems under uncertainty. Exact approaches for computing decisions based on possibilistic networks are limited by the size of the possibility distributions, and generally rely on possibilistic propagation algorithms. An important step in the computation of the decision is the transformation of the DAG (Directed Acyclic Graph) into a secondary structure known as the junction tree (JT); this transformation is known to be costly and represents a difficult problem. In this paper we propose a new approximate approach for computing decisions under uncertainty within possibilistic networks. Computing the optimal optimistic decision no longer goes through the junction-tree construction step; instead, it is performed by calculating the degree of normalization in the moral graph resulting from merging the possibilistic network encoding the agent's knowledge with the one encoding its preferences.
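As a rough illustration of the optimistic criterion behind this abstract, the qualitative optimistic utility of a decision d can be sketched as u*(d) = max over states of min(possibility of the state under d, preference for the state). The distributions below are invented for illustration; this is not the paper's propagation-based computation.

```python
# Toy sketch of the min-based optimistic decision criterion:
# u*(d) = max_w min(pi(w | d), mu(w)), where pi encodes knowledge
# and mu encodes preferences (both qualitative, in [0, 1]).

def optimistic_utility(pi, mu):
    """Qualitative optimistic utility of one decision."""
    return max(min(pi[w], mu[w]) for w in pi)

def best_optimistic_decision(decisions):
    """Pick the decision maximizing the optimistic utility."""
    return max(decisions, key=lambda d: optimistic_utility(*decisions[d]))

# Two hypothetical decisions over states {w1, w2}: (pi, mu) pairs.
decisions = {
    "d1": ({"w1": 1.0, "w2": 0.4}, {"w1": 0.3, "w2": 1.0}),
    "d2": ({"w1": 0.2, "w2": 1.0}, {"w1": 0.3, "w2": 1.0}),
}
print(best_optimistic_decision(decisions))  # d2
```

Here d2 wins because one state is both fully possible and fully preferred, which is exactly the situation the optimistic criterion rewards.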
Binary Classification with Models and Data Density Distribution (Xuan Chen)
This document discusses binary classification using probabilistic labels. It poses the problem of whether models can achieve more accurate predictions using probabilistic labels, which give the probability that a sample belongs to a class, rather than deterministic 0-1 labels. The author conducted experiments on several common models under different data density distributions. The results showed that Gaussian Process Regression generally performed best and achieved error bounds of O(n-2+γ4) when using probabilistic labels under certain data distributions, outperforming existing error bounds for deterministic labels; under some distributions, however, the existing bounds were not exceeded.
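The core idea can be sketched with a minimal Gaussian-process-style regressor fit directly to probabilistic labels p(y=1|x) and thresholded at 0.5 to classify. The data, kernel length scale, and noise level below are made up; this is a stand-in for the paper's models, not its exact setup.

```python
# Sketch: regress on probabilistic labels instead of hard 0/1 labels,
# using a hand-rolled RBF-kernel regressor (GP posterior mean), then
# threshold the predicted probability at 0.5 to classify.
import numpy as np

def rbf(a, b, length=0.5):
    d = a[:, None] - b[None, :]
    return np.exp(-(d ** 2) / (2 * length ** 2))

X = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
p = np.array([0.05, 0.1, 0.3, 0.7, 0.9, 0.95])   # probabilistic labels

K = rbf(X, X) + 1e-4 * np.eye(len(X))            # kernel matrix + noise
alpha = np.linalg.solve(K, p)

def predict_prob(x_new):
    """Posterior-mean estimate of p(y=1 | x_new)."""
    return rbf(np.atleast_1d(x_new), X) @ alpha

print(predict_prob(0.9)[0] > 0.5)   # classified as class 1
```

The regressor sees how confident each label is, rather than collapsing 0.55 and 0.99 to the same hard label, which is the information advantage the abstract refers to.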
Intrusion Detection System for Classification of Attacks with Cross Validation (inventionjournals)
Nowadays, due to the rapidly growing use of the internet, network attack patterns are increasing. Many organizations and institutes use the internet to access or share sensitive information over networks, and protecting that information from unauthorized users and intruders is an important issue. In this paper, we use decision-tree techniques, C4.5 and CART, as classifiers for the classification of attacks. We propose an ensemble model combining C4.5 and Classification and Regression Trees (CART) as a robust classifier. We use the NSL-KDD data set, with both binary and multiclass problems, under 10-fold cross-validation. The proposed ensemble model achieves satisfactory accuracies of 99.67% and 99.53% on the binary-class and multiclass NSL-KDD data sets, respectively.
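A hedged scikit-learn sketch of the same recipe: vote two decision trees, one entropy-based (C4.5-like; scikit-learn does not ship a true C4.5) and one Gini-based (CART-like), scored with 10-fold cross-validation. NSL-KDD is replaced by a synthetic dataset here, so the 99%+ figures are the authors' results, not this toy's.

```python
# Ensemble of two decision trees with 10-fold cross-validation,
# mirroring the C4.5 + CART combination described in the abstract.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for NSL-KDD (binary-class case).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("c45_like", DecisionTreeClassifier(criterion="entropy", random_state=0)),
        ("cart_like", DecisionTreeClassifier(criterion="gini", random_state=0)),
    ],
    voting="soft",   # average the trees' class probabilities
)

scores = cross_val_score(ensemble, X, y, cv=10)
print(f"10-fold accuracy: {scores.mean():.3f}")
```

Soft voting averages the two trees' probability estimates, which is one common way to realize the "combination of C4.5 and CART" the paper describes.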
Evolutionary Algorithmical Approach for VLSI Physical Design - Placement Problem (IDES Editor)
Physical layout automation is very important in the VLSI field. With the advancement of semiconductor technology, VLSI is moving into VDSM (Very Deep Sub-Micrometer), and the scale of random-logic IC circuits is approaching a million gates. Physical design is the process of determining the physical location of active devices and interconnecting them inside the boundary of the VLSI chip. The earliest and most critical stage in VLSI layout design is placement. The background is the rectangle packing problem: given a set of rectangular modules of arbitrary sizes, place them without overlap on a plane within a rectangle of minimum area [1], [5]. The VLSI placement problem is to place the objects in the fixed area of the die without overlap and subject to cost constraints such as wire length and die area. Wire-length and area optimization is the major task in physical design. We first introduce the major technique involved in the algorithm.
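To make the optimization objective concrete, here is a tiny evolutionary-style loop for a one-dimensional placement: cells are ordered in a row and connected pairs should sit close together. The netlist, mutation operator, and iteration count are invented for illustration; real placers work in 2D with far richer cost models.

```python
# Toy mutate-and-select placement: find an ordering of 5 cells that
# minimizes total wire length over a small list of two-pin nets.
import random

random.seed(1)
nets = [(0, 3), (1, 2), (2, 4), (0, 4)]   # pairs of connected cells

def wirelength(order):
    """Sum of slot distances between connected cells."""
    pos = {cell: slot for slot, cell in enumerate(order)}
    return sum(abs(pos[a] - pos[b]) for a, b in nets)

best = list(range(5))                      # initial placement 0,1,2,3,4
for _ in range(200):
    cand = best[:]
    i, j = random.sample(range(5), 2)      # mutation: swap two cells
    cand[i], cand[j] = cand[j], cand[i]
    if wirelength(cand) <= wirelength(best):
        best = cand                        # selection: keep if no worse

print(best, wirelength(best))
```

Each net contributes at least distance 1, so 4 is the floor for this netlist; the loop typically finds it. Genetic algorithms extend this single-candidate loop with a population, crossover, and fitness-proportional selection.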
Solving Scheduling Problems as the Puzzle Games Using Constraint Programming (ijpla)
Constraint programming (CP) is one of the most effective techniques for solving practical operational problems. Its outstanding feature is that a set of constraints affecting the solution of a problem can be imposed without explicitly defining a linear relation among variables, i.e. an equation. Nevertheless, the challenge of paramount importance in using this technique is how to present the operational problem as a solvable Constraint Satisfaction Problem (CSP) model. The problem modelling is problem independent and can be an exhaustive task at the beginning stage of problem solving, particularly when the problem is a real-world practical problem. This paper investigates the application of a simple grid puzzle game in which a player attempts to solve practical scheduling problems. Examination scheduling and logistics fleet scheduling are presented as operational games, whose rules are set up based on operational practice. CP is then applied to solve the defined puzzles, and the results show the success of the proposed method. The benefit of using a grid puzzle as the model is that it amplifies the simplicity of CP in solving practical problems.
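A CSP model of the kind described can be sketched with plain brute-force search: variables (exams), domains (time slots), and constraints (exams sharing students must not clash). The exams, slots, and conflict list below are hypothetical; a real CP solver would add propagation and smarter search.

```python
# Minimal CSP stand-in for the grid-puzzle model: assign each exam a
# slot so that exams with shared students land in different slots.
from itertools import product

exams = ["math", "physics", "chem"]
slots = [1, 2]
conflicts = [("math", "physics"), ("physics", "chem")]  # shared students

def feasible(assign):
    """All conflicting exam pairs must get different slots."""
    return all(assign[a] != assign[b] for a, b in conflicts)

solutions = [
    dict(zip(exams, combo))
    for combo in product(slots, repeat=len(exams))
    if feasible(dict(zip(exams, combo)))
]
print(solutions[0])  # {'math': 1, 'physics': 2, 'chem': 1}
```

The "grid" view is exactly this: exams are rows, slots are columns, and the puzzle's rules are the conflict constraints. Swapping the brute-force loop for a CP solver keeps the model unchanged.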
Fundamentals of the fuzzy logic based generalized theory of decisions (Springer)
The document discusses decision making under imperfect information when information is represented by geometric primitives rather than numbers or fuzzy sets. It introduces the concept of "fuzzy geometry" or "F-geometry" where visual perceptions that cannot be precisely represented are modeled using geometric shapes. Five cases of imperfect information are considered, from numerical information to purely visual perceptions. F-geometry uses primitives like points, lines, and shapes to represent perceptions and enables reasoning about decisions despite imprecise information. Basic primitives, operations, and axioms of F-geometry are defined to formally represent and reason with perceptions.
Lexisearch Approach to Travelling Salesman Problem (IOSR Journals)
The aim of this paper is to introduce Lexisearch; the structure of the search algorithm does not require huge dynamic memory during execution. Mathematical programming is concerned with finding optimal solutions rather than merely obtaining good solutions. Lexisearch derives its name from lexicography. The approach has been used to solve various combinatorial problems efficiently, such as the assignment problem, the travelling salesman problem, and the job scheduling problem. In all of these problems, the lexicographic search was found to be more efficient than branch-and-bound algorithms. The algorithm is deterministic and is always guaranteed to find an optimal solution.
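The guarantee of optimality comes from systematically covering the whole solution space. As a simplified stand-in for Lexisearch (which additionally orders and prunes the search), the sketch below enumerates TSP tours in lexicographic order of city sequences on an invented 4-city distance matrix.

```python
# Exhaustive TSP search over tours enumerated in lexicographic order,
# with city 0 fixed as the start to avoid counting rotations twice.
from itertools import permutations

D = [[0, 2, 9, 10],
     [1, 0, 6, 4],
     [15, 7, 0, 8],
     [6, 3, 12, 0]]

def tour_cost(tour):
    """Cost of visiting the cities in order and returning to the start."""
    return sum(D[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))

best_cost, best_tour = float("inf"), None
for perm in permutations(range(1, 4)):   # lexicographic order of sequences
    tour = [0] + list(perm)
    cost = tour_cost(tour)
    if cost < best_cost:
        best_cost, best_tour = cost, tour

print(best_tour, best_cost)   # [0, 2, 3, 1] 21
```

Lexisearch's advantage over this brute force is that the lexicographic ordering lets whole blocks of tours be skipped once a partial sequence's bound exceeds the best cost found so far.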
This lesson teaches students about the relationship between visual fraction models and equations when dividing fractions. Students will formally connect fraction models to multiplication through the use of multiplicative inverses. They will use fraction strips and tape diagrams to model division problems involving fractions. Students will learn that dividing a fraction by another fraction is the same as multiplying by the inverse or reciprocal of the divisor fraction. The lesson provides examples showing how to set up and solve word problems involving division of fractions using visual models and equations.
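The identity the lesson builds up to, that dividing by a fraction equals multiplying by its reciprocal, is easy to verify exactly with Python's fractions module (the numbers here are just an example):

```python
# Check: (3/4) / (2/5) equals (3/4) * (5/2), the reciprocal of the divisor.
from fractions import Fraction

a, b = Fraction(3, 4), Fraction(2, 5)
print(a / b)                 # 15/8
print(a * Fraction(5, 2))    # 15/8, via the multiplicative inverse
```

Working in exact rational arithmetic mirrors what the fraction strips and tape diagrams show visually: both routes land on the same value, 15/8.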
A Study on Youth Violence and Aggression using DEMATEL with FCM Methods (ijdmtaiir)
The DEMATEL method is thus a good technique for making decisions. In this paper we analyze the risk factors of youth violence and what makes youths more aggressive. Since there are many risk factors of youth violence, relating them to one another is complex, so we construct an FCM and analyze the factors with it. Moreover, the data are unsupervised, obtained from surveys as well as interviews; hence fuzzy methods alone have the capacity to analyze these concepts.
Using Grid Puzzle to Solve Constraint-Based Scheduling Problem (csandit)
Constraint programming (CP) is one of the most effective techniques for solving practical operational problems. Its outstanding feature is that a set of constraints affecting the solution of a problem can be imposed without explicitly defining a linear relation among variables, i.e. an equation. Nevertheless, the challenge of paramount importance in using this technique is how to present the operational problem as a solvable Constraint Satisfaction Problem (CSP) model. The problem modelling is problem independent and can be an exhaustive task at the beginning stage of problem solving, particularly when the problem is a real-world practical problem. This paper investigates the application of a simple grid puzzle game in which a player attempts to solve a practical scheduling problem. Examination scheduling is presented as an operational game, whose rules are set up based on operational practice. CP is then applied to solve the defined puzzle, and the results show the success of the proposed method. The benefit of using a grid puzzle as the model is that it amplifies the simplicity of CP in solving practical problems.
The document outlines a framework for machine learning including: 1) the key components of a learning system including input, knowledge base, learner, and output; 2) different perspectives on machine learning such as optimization, concept formation, and pattern recognition; and 3) different approaches to inductive learning including decision trees, evolutionary algorithms, neural networks, and conceptual clustering. Examples are provided to illustrate different inductive learning systems and how they can generate rules from examples.
Introduction to Machine Learning, Aristotelis Tsirigos (butest)
This document provides an introduction to machine learning, covering several key concepts:
- Machine learning aims to build models from data to make predictions without being explicitly programmed.
- There are different types of learning problems including supervised, unsupervised, and reinforcement learning.
- Popular machine learning algorithms discussed include Bayesian learning, nearest neighbors, decision trees, linear classifiers, and ensembles.
- Proper evaluation of machine learning models is important using techniques like cross-validation.
Sparse representation based classification of MR images of brain for alzheime... (eSAT Publishing House)
IJRET: International Journal of Research in Engineering and Technology is an international peer-reviewed online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together scientists, academicians, field engineers, scholars and students of related fields of Engineering and Technology.
This document contains 54 multiple choice questions assessing knowledge of decision analysis concepts from Quantitative Analysis for Management, 11e by Render. The questions cover topics such as expected monetary value, decision making under risk and uncertainty, decision criteria like maximax and maximin, decision trees, utility theory, and calculating expected value with and without perfect information. Correct answers are provided for each question.
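Two of the recurring calculations in such question sets, expected monetary value (EMV) and the expected value of perfect information (EVPI), can be shown with a small worked example. The payoff table and probability below are illustrative, not taken from the question bank.

```python
# EMV for each alternative, then EVPI = EV with perfect information
# minus the best EMV achievable without it.
payoffs = {                          # alternative -> (good market, bad market)
    "large plant": (200000, -180000),
    "small plant": (100000, -20000),
    "do nothing":  (0, 0),
}
p_good = 0.5

def emv(payoff):
    good, bad = payoff
    return p_good * good + (1 - p_good) * bad

best_emv = max(emv(p) for p in payoffs.values())
ev_with_pi = (p_good * max(p[0] for p in payoffs.values())
              + (1 - p_good) * max(p[1] for p in payoffs.values()))
evpi = ev_with_pi - best_emv

print(best_emv, evpi)   # 40000.0 60000.0
```

With perfect information the decision maker picks the best payoff in each state (200000 if good, 0 if bad), so EVPI caps what any forecast or survey could rationally be worth.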
This document discusses algorithm-independent machine learning techniques. It introduces concepts like bias and variance, which can quantify how well a learning algorithm matches a problem without depending on a specific algorithm. Methods like cross-validation, bootstrapping, and resampling can be used with different algorithms. While no algorithm is inherently superior, such techniques provide guidance on algorithm use and help integrate multiple classifiers.
Strategic information transmission (trojan teaching) in client-consultant relationships.
The document discusses an experiment on "trojan teaching", where an informed party (consultant) communicates less than full information to an uninformed party (client) for their own benefit. The experiment finds that advisors under unpaid conditions more often send "noisy" signals that do not fully reflect their private information. Clients who receive precise signals from advisors strongly follow their advice, while clients receiving noisy advice favor one option. Overall, advisors told the truth 46% of the time and sent deliberately noisy signals 24% of the time.
PREDICTIVE EVALUATION OF THE STOCK PORTFOLIO PERFORMANCE USING FUZZY CMEANS A... (ijfls)
The aim of this paper is to investigate the trend of the return of a portfolio formed randomly or by any specific technique. The approach uses two fuzzy techniques: the fuzzy c-means (FCM) algorithm and the fuzzy transform, where the rules used in the fuzzy transform arise from the application of the FCM algorithm. The results show that the proposed methodology is able to predict the trend of the return of a stock portfolio, as well as the tendency of the market index. Real financial-market data from 2004 until 2007 are used.
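The FCM half of that pipeline reduces to two alternating update rules: memberships from distances to centers, then centers from membership-weighted means. A compact numpy sketch on invented one-dimensional data (the paper pairs this with the fuzzy transform, which is omitted here):

```python
# Fuzzy c-means on 1-D data: u[i, k] is how strongly point k belongs
# to cluster i; m is the fuzziness exponent (m = 2 is the usual choice).
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 0.2, 20), rng.normal(3, 0.2, 20)])
c, m = 2, 2.0
centers = np.array([0.5, 2.5])            # rough initial guesses

for _ in range(20):
    d = np.abs(x[None, :] - centers[:, None]) + 1e-9     # c x n distances
    # u[i, k] = 1 / sum_j (d[i, k] / d[j, k]) ** (2 / (m - 1))
    u = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2 / (m - 1)), axis=1)
    centers = (u ** m @ x) / np.sum(u ** m, axis=1)      # weighted means

print(np.sort(centers))   # converges near the true cluster means 0 and 3
```

Unlike hard k-means, every point keeps a graded membership in both clusters; those membership degrees are what can feed the fuzzy-transform rules the abstract mentions.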
Secondary mathematics, Wednesday August 22 2012 (brearatliff)
Using CSCOPE for Instructional Planning - a PowerPoint outlining how to use the YAG, TEKS Verification Document, IFD, VAD, and EITG for effective lesson planning.
The document discusses various technical challenges in segmentation analysis including incorporating categorical variables, determining the optimal number of clusters, and accounting for differences in ratings scale usage, and provides solutions such as multiple correspondence analysis, mixture modeling, and response calibration. It also addresses how to speed up the segmentation process through automation and formalizing analysis goals and criteria for success.
This document discusses different approaches to theorizing in design research. It outlines several types of theory, from lower-level theories like frameworks and methods to higher-level design theories. The document also discusses how design research can be used to both develop new design theories and modify existing kernel theories through approaches like Action Design Research. Finally, it emphasizes that theorizing is important for advancing design research and notes that the goal should be to develop design principles even if a full design theory is not achieved.
This document proposes combining supervised and unsupervised classification using belief function theory. It presents an approach that treats uncertainty and limits conflicts between clustering and classification. Experimental results on four datasets show improved classification performance after combining the clustering and classification results, with perfect classification for some datasets. The conclusions are that the new approach provides good results for generic data. Future work includes handling missing data and applying the approach to real images.
Intent-Aware Temporal Query Modeling for Keyword Suggestion (Findwise)
This paper presents a data-driven approach for capturing the temporal variations in user search behaviour by modeling the dynamic query relationships using query-log data. The dependence between different queries (in terms of the query words and latent user intent) is represented using hypergraphs which allows us to explore more complex relationships compared to graph-based approaches. This time-varying dependence is modeled using the framework of probabilistic graphical models. The inferred interactions are used for query keyword suggestion - a key task in web information retrieval. Preliminary experiments using query logs collected from internal search engine of a large health care organization yield promising results. In particular, our model is able to capture temporal variations between queries relationships that reflect known trends in disease occurrence. Further, hypergraph-based modeling captures relationships significantly better compared to graph-based approaches.
Comparative study of ksvdd and fsvm for classification of mislabeled data (eSAT Journals)
Abstract: Outlier detection is an important concept in data mining. Outliers are data that differ from normal data. Noise in an application may cause misclassification: data are more likely to be mislabeled in the presence of noise, leading to performance degradation. The proposed work focuses on these issues. Before classification, each data point is given a value representing its affinity towards its class; the data with these likelihood values are then given to a classifier for prediction. The SVDD algorithm is used for classification of data with likelihood values.
Keywords: Confusion Matrix, FSVM, Outlier, Outlier Detection, SVDD
The document describes the Mendel approach for understanding class hierarchies in object-oriented programs. Mendel uses a simple model and metrics to identify interesting classes based on their size, novelty within a hierarchy, and a combination of both. It also analyzes subclassing behaviors to distinguish between classes that mainly extend versus override functionality. The approach is demonstrated on example systems like JHotDraw and Azureus to provide insights into their key classes and inheritance usage.
This document presents a proposed churn prediction model based on data mining techniques. The model consists of six steps: identifying the problem domain, data selection, investigating the data set, classification, clustering, and utilizing the knowledge gained. The authors apply their model to a data set of 5,000 mobile service customers using data mining tools. They train classification models using decision trees, neural networks, and support vector machines. Customers are classified as churners or non-churners. Churners are then clustered into three groups. The results are interpreted to gain insights into customer retention.
The document provides an overview of Truth Maintenance Systems (TMS) in artificial intelligence. It discusses key aspects of TMS including:
1. Enforcing logical relations among beliefs by maintaining and updating relations when assumptions change.
2. Generating explanations for conclusions by using cached inferences to avoid re-deriving inferences.
3. Finding solutions to search problems by representing problems as sets of variables, domains, and constraints.
The document also covers justification-based and assumption-based TMS, and how a TMS interacts with a problem solver to add and retract assumptions, detect contradictions, and perform belief revision.
The document discusses multiple criteria decision making (MCDM) approaches. It introduces several common MCDM methods: the weighted score method, TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) method, and Analytic Hierarchy Process (AHP). It then provides a detailed example of how to apply the weighted score method and TOPSIS method to a problem of selecting the best car based on criteria like style, reliability, fuel economy, and cost.
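The TOPSIS steps for a car-selection problem like the one described can be sketched in a few lines of numpy: normalize and weight the decision matrix, find the ideal and anti-ideal points, and rank by relative closeness. The scores and weights below are made up for illustration; cost is entered as a score where higher is better, so all four criteria are benefit criteria.

```python
# TOPSIS: weighted normalized matrix -> distances to ideal / anti-ideal
# solutions -> closeness coefficient, higher is better.
import numpy as np

# rows: 4 candidate cars; columns: style, reliability, fuel economy, cost
X = np.array([[7.0, 9.0, 9.0, 8.0],
              [8.0, 7.0, 8.0, 7.0],
              [9.0, 6.0, 8.0, 9.0],
              [6.0, 7.0, 8.0, 6.0]])
w = np.array([0.1, 0.4, 0.3, 0.2])          # criteria weights, sum to 1

V = w * X / np.linalg.norm(X, axis=0)       # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)  # best / worst on each criterion
d_pos = np.linalg.norm(V - ideal, axis=1)   # distance to ideal solution
d_neg = np.linalg.norm(V - anti, axis=1)    # distance to anti-ideal
closeness = d_neg / (d_pos + d_neg)

print(closeness.argmax())   # 0: car 1 wins, driven by its reliability score
```

The weighted score method would instead just compute `X @ w`; TOPSIS differs in rewarding alternatives that are simultaneously near the ideal and far from the worst case.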
Multi criteria decision support system on mobile phone selection with ahp and... (Reza Ramezani)
This document proposes using multi-criteria decision making (MCDM) approaches, specifically the Analytic Hierarchy Process (AHP) and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), to help users select a mobile phone. It outlines the evaluation process, which involves identifying important mobile phone selection criteria, calculating criteria weights using AHP, and then using TOPSIS to rank mobile phone alternatives based on how close they are to an ideal solution and how far they are from a negative ideal solution. The document provides examples of building pairwise comparison matrices in AHP and calculating ideal and non-ideal solutions and alternative distances in TOPSIS to demonstrate the selection approach.
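The AHP half of that pipeline derives criteria weights from a pairwise comparison matrix. A common approximation of the principal eigenvector uses row geometric means; the criteria and judgments below are invented for illustration, not taken from the document.

```python
# AHP criteria weights via the geometric-mean approximation of the
# pairwise comparison matrix's principal eigenvector.
import numpy as np

# criteria: price, battery, camera; A[i, j] = how much more important
# criterion i is than criterion j (Saaty's 1-9 scale, A[j, i] = 1/A[i, j])
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

gm = A.prod(axis=1) ** (1 / A.shape[1])   # row geometric means
weights = gm / gm.sum()                   # normalize to sum to 1
print(weights.round(3))                   # roughly [0.64, 0.26, 0.10]
```

These weights would then feed the TOPSIS ranking step as the `w` vector; a full AHP treatment would also compute the consistency ratio to check that the pairwise judgments do not contradict each other.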
A Study on Youth Violence and Aggression using DEMATEL with FCM Methodsijdmtaiir
The DEMATEL method is then a good technique for
making decisions. In this paper we analyzed the risk factors of
youth violence and what makes them more aggressive. Since
there are more risk factors of youth violence, to relate each
other more complex to construct FCM and analyze them.
Moreover the data is an unsupervised one obtained from
survey as well as interviews. Hence fuzzy alone has the
capacity to analyses these concepts.
Using Grid Puzzle to Solve Constraint-Based Scheduling Problemcsandit
Constraint programming (CP) is one of the most effe
ctive techniques for solving practical
operational problems. The outstanding feature of th
e method is a set of constraints affecting a
solution of a problem can be imposed without a need
to explicitly defining a linear relation
among variables, i.e. an equation. Nevertheless, th
e challenge of paramount importance in
using this technique is how to present the operatio
nal problem in a solvable Constraint
Satisfaction Problem (CSP) model. The problem model
ling is problem independent and could be
an exhaustive task at the beginning stage of proble
m solving, particularly when the problem is a
real-world practical problem. This paper investigat
es the application of a simple grid puzzle
game when a player attempts to solve a practical sc
heduling problem. The examination
scheduling is presented as an operational game. The
game‘s rules are set up based on the
operational practice. CP is then applied to solve t
he defined puzzle and the results show the
success of the proposed method. The benefit of usin
g a grid puzzle as the model is that the
method can amplify the simplicity of CP in solving
practical problems.
The document outlines a framework for machine learning including: 1) the key components of a learning system including input, knowledge base, learner, and output; 2) different perspectives on machine learning such as optimization, concept formation, and pattern recognition; and 3) different approaches to inductive learning including decision trees, evolutionary algorithms, neural networks, and conceptual clustering. Examples are provided to illustrate different inductive learning systems and how they can generate rules from examples.
Introduction to Machine Learning Aristotelis Tsirigos butest
This document provides an introduction to machine learning, covering several key concepts:
- Machine learning aims to build models from data to make predictions without being explicitly programmed.
- There are different types of learning problems including supervised, unsupervised, and reinforcement learning.
- Popular machine learning algorithms discussed include Bayesian learning, nearest neighbors, decision trees, linear classifiers, and ensembles.
- Proper evaluation of machine learning models is important using techniques like cross-validation.
Sparse representation based classification of mr images of brain for alzheime...eSAT Publishing House
IJRET : International Journal of Research in Engineering and Technology is an international peer reviewed, online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together Scientists, Academician, Field Engineers, Scholars and Students of related fields of Engineering and Technology
This document contains 54 multiple choice questions assessing knowledge of decision analysis concepts from Quantitative Analysis for Management, 11e by Render. The questions cover topics such as expected monetary value, decision making under risk and uncertainty, decision criteria like maximax and maximin, decision trees, utility theory, and calculating expected value with and without perfect information. Correct answers are provided for each question.
This document discusses algorithm-independent machine learning techniques. It introduces concepts like bias and variance, which can quantify how well a learning algorithm matches a problem without depending on a specific algorithm. Methods like cross-validation, bootstrapping, and resampling can be used with different algorithms. While no algorithm is inherently superior, such techniques provide guidance on algorithm use and help integrate multiple classifiers.
Strategic information transmission (trojan teaching) in client-consultant relationships.
The document discusses an experiment on "trojan teaching", where an informed party (consultant) communicates less than full information to an uninformed party (client) for their own benefit. The experiment finds that advisors under unpaid conditions more often send "noisy" signals that do not fully reflect their private information. Clients who receive precise signals from advisors strongly follow their advice, while clients receiving noisy advice favor one option. Overall, advisors told the truth 46% of the time and sent deliberately noisy signals 24% of the time.
PREDICTIVE EVALUATION OF THE STOCK PORTFOLIO PERFORMANCE USING FUZZY CMEANS A...ijfls
The aim of this paper is to investigate the trend of the return of a portfolio formed randomly or by any specific technique. The approach uses two fuzzy techniques: the fuzzy c-means (FCM) algorithm and the fuzzy transform, where the rules used in the fuzzy transform arise from the application of the FCM algorithm. The results show that the proposed methodology is able to predict the trend of the return of a stock portfolio, as well as the tendency of the market index. Real data of the financial market from 2004 until 2007 are used.
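The fuzzy c-means step can be sketched for one-dimensional data. This is a minimal illustration of the standard FCM update rules, not the paper's implementation, and the sample "returns" below are invented:

```python
def fcm(points, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means: returns (centers, membership matrix)."""
    lo, hi = min(points), max(points)
    # Spread the initial centers evenly over the data range.
    centers = [lo + (hi - lo) * j / (c - 1) for j in range(c)]
    u = [[0.0] * c for _ in points]
    for _ in range(iters):
        # Membership update: closer centers receive higher membership.
        for i, x in enumerate(points):
            for j in range(c):
                dj = abs(x - centers[j]) or 1e-12
                u[i][j] = 1.0 / sum(
                    (dj / (abs(x - centers[k]) or 1e-12)) ** (2.0 / (m - 1.0))
                    for k in range(c))
        # Center update: membership-weighted mean of the data.
        for j in range(c):
            den = sum(u[i][j] ** m for i in range(len(points)))
            centers[j] = sum((u[i][j] ** m) * x for i, x in enumerate(points)) / den
    return centers, u

# Hypothetical values forming two regimes around 1.0 and 5.0.
returns = [0.9, 1.0, 1.1, 4.9, 5.0, 5.1]
centers, u = fcm(returns)
```

Unlike hard clustering, every point keeps a graded membership in each cluster, which is what makes the memberships usable as rule weights in a fuzzy-transform step.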
Secondary mathematics wednesday august 22 2012brearatliff
Using CSCOPE for Instructional Planning - a PowerPoint outlining how to use the YAG, TEKS Verification Document, IFD, VAD, and EITG for effective lesson planning.
The document discusses various technical challenges in segmentation analysis, including incorporating categorical variables, determining the optimal number of clusters, and accounting for differences in rating-scale usage, and it provides solutions such as multiple correspondence analysis, mixture modeling, and response calibration. It also addresses how to speed up the segmentation process through automation and by formalizing analysis goals and criteria for success.
This document discusses different approaches to theorizing in design research. It outlines several types of theory, from lower-level theories like frameworks and methods to higher-level design theories. The document also discusses how design research can be used to both develop new design theories and modify existing kernel theories through approaches like Action Design Research. Finally, it emphasizes that theorizing is important for advancing design research and notes that the goal should be to develop design principles even if a full design theory is not achieved.
This document proposes combining supervised and unsupervised classification using belief function theory. It presents an approach that treats uncertainty and limits conflicts between clustering and classification. Experimental results on four datasets show improved classification performance after combining the clustering and classification results, with perfect classification for some datasets. The conclusions are that the new approach provides good results for generic data. Future work includes handling missing data and applying the approach to real images.
Intent-Aware Temporal Query Modeling for Keyword SuggestionFindwise
This paper presents a data-driven approach for capturing the temporal variations in user search behaviour by modeling the dynamic query relationships using query-log data. The dependence between different queries (in terms of the query words and latent user intent) is represented using hypergraphs which allows us to explore more complex relationships compared to graph-based approaches. This time-varying dependence is modeled using the framework of probabilistic graphical models. The inferred interactions are used for query keyword suggestion - a key task in web information retrieval. Preliminary experiments using query logs collected from internal search engine of a large health care organization yield promising results. In particular, our model is able to capture temporal variations between queries relationships that reflect known trends in disease occurrence. Further, hypergraph-based modeling captures relationships significantly better compared to graph-based approaches.
Comparative study of ksvdd and fsvm for classification of mislabeled dataeSAT Journals
Abstract Outlier detection is an important concept in data mining. Outliers are data that differ from the normal data. Noise in the application may cause misclassification: data are more likely to be mislabeled in the presence of noise, leading to performance degradation. The proposed work focuses on these issues. Before classification, each data point is given a value that represents its degree of belonging to the class. The data with these likelihood values are then given to a classifier to predict the class. The SVDD algorithm is used for classification of data with likelihood values.
Keywords: Confusion Matrix, FSVM, Outlier, Outlier Detection, SVDD
The document describes the Mendel approach for understanding class hierarchies in object-oriented programs. Mendel uses a simple model and metrics to identify interesting classes based on their size, novelty within a hierarchy, and a combination of both. It also analyzes subclassing behaviors to distinguish between classes that mainly extend versus override functionality. The approach is demonstrated on example systems like JHotDraw and Azureus to provide insights into their key classes and inheritance usage.
This document presents a proposed churn prediction model based on data mining techniques. The model consists of six steps: identifying the problem domain, data selection, investigating the data set, classification, clustering, and utilizing the knowledge gained. The authors apply their model to a data set of 5,000 mobile service customers using data mining tools. They train classification models using decision trees, neural networks, and support vector machines. Customers are classified as churners or non-churners. Churners are then clustered into three groups. The results are interpreted to gain insights into customer retention.
The document provides an overview of Truth Maintenance Systems (TMS) in artificial intelligence. It discusses key aspects of TMS including:
1. Enforcing logical relations among beliefs by maintaining and updating relations when assumptions change.
2. Generating explanations for conclusions by using cached inferences to avoid re-deriving inferences.
3. Finding solutions to search problems by representing problems as sets of variables, domains, and constraints.
The document also covers justification-based and assumption-based TMS, and how a TMS interacts with a problem solver to add and retract assumptions, detect contradictions, and perform belief revision.
The document discusses multiple criteria decision making (MCDM) approaches. It introduces several common MCDM methods: the weighted score method, TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) method, and Analytic Hierarchy Process (AHP). It then provides a detailed example of how to apply the weighted score method and TOPSIS method to a problem of selecting the best car based on criteria like style, reliability, fuel economy, and cost.
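The weighted score method described above reduces to multiplying each alternative's criterion scores by the criterion weights and summing. A minimal sketch follows; the car names, scores, and weights are hypothetical, not taken from the document:

```python
def weighted_score(scores, weights):
    """Weighted score of one alternative: sum of weight * criterion score."""
    return sum(w * s for w, s in zip(weights, scores))

# Hypothetical car-selection data: criteria are style, reliability,
# fuel economy, cost (all already scaled so that higher is better).
weights = [0.3, 0.4, 0.2, 0.1]
cars = {
    "Civic":  [7, 9, 9, 8],
    "Saturn": [8, 7, 8, 7],
    "Ford":   [9, 6, 8, 9],
    "Mazda":  [6, 7, 8, 6],
}
ranked = sorted(cars, key=lambda c: weighted_score(cars[c], weights), reverse=True)
best = ranked[0]
```

Note that every criterion must be oriented the same way (higher is better) before weighting, otherwise a cost criterion would reward expensive alternatives.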
Multi criteria decision support system on mobile phone selection with ahp and...Reza Ramezani
This document proposes using multi-criteria decision making (MCDM) approaches, specifically the Analytic Hierarchy Process (AHP) and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), to help users select a mobile phone. It outlines the evaluation process, which involves identifying important mobile phone selection criteria, calculating criteria weights using AHP, and then using TOPSIS to rank mobile phone alternatives based on how close they are to an ideal solution and how far they are from a negative ideal solution. The document provides examples of building pairwise comparison matrices in AHP and calculating ideal and non-ideal solutions and alternative distances in TOPSIS to demonstrate the selection approach.
This document defines key concepts in multi-criteria decision making (MCDM) including criteria, alternatives, and decisions. It provides examples of single-criterion and multiple-criteria decision problems. For multiple-criteria problems, alternatives differ in more than one criterion and criteria are often competing. Formal MCDM analysis is useful when criteria are competing and trade-offs are difficult to evaluate. The document discusses types of MCDM problems and contexts for MCDM including mutually exclusive alternatives, portfolio selection, design, and measurement.
TOPSIS - A multi-criteria decision making approachPresi
This document discusses the TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) method for multi-criteria decision making (MCDM). It defines key terms like alternatives, criteria, weights, and decision matrices. It then outlines the steps in the TOPSIS method, which include standardizing the decision matrix, determining the ideal and negative ideal solutions, calculating the separation from each alternative to the ideal and negative ideal solutions, and selecting the alternative with the shortest distance from the ideal and farthest from the negative ideal solution.
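The TOPSIS steps outlined above can be sketched directly in plain Python. This is a minimal illustration of the standard procedure; the decision-matrix values used below are hypothetical:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS; returns relative-closeness scores.

    matrix  -- rows are alternatives, columns are criteria
    weights -- one weight per criterion
    benefit -- True if the criterion is maximized, False if minimized
    """
    m, n = len(matrix), len(matrix[0])
    # 1. Vector-normalize each column, then apply the weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # 2. Ideal and negative-ideal solutions per criterion.
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    nadir = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    # 3. Euclidean separation of each alternative from both reference points.
    d_pos = [math.dist(row, ideal) for row in v]
    d_neg = [math.dist(row, nadir) for row in v]
    # 4. Relative closeness: 1 is the ideal solution, 0 the negative ideal.
    return [dn / (dp + dn) for dp, dn in zip(d_pos, d_neg)]

# Hypothetical data: style, reliability, fuel economy (benefit) and cost.
closeness = topsis([[7, 9, 9, 8], [8, 7, 8, 7], [9, 6, 8, 9]],
                   weights=[0.3, 0.4, 0.2, 0.1],
                   benefit=[True, True, True, False])
```

The alternative with the largest closeness score is the one simultaneously nearest the ideal and farthest from the negative ideal, which is exactly the selection rule in the final step above.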
This document contains the rules and questions for a trivia quiz game involving 7 rounds with 10 questions each across various categories such as food, history, geography, music, arts and literature, entertainment. The rules specify team size, no outside help, the quizmaster's decisions are final, how scoring and answering other teams' questions will work, and what to do in the event of a tie. It also provides the questions and answer choices for each category of the quiz.
Trainee Recruitment Consultant PresentationNick Williams
The document provides an overview of the roles and responsibilities of a recruitment consultant. It discusses sourcing both clients and candidates through various activities like cold calling, referrals, and networking. It also describes maintaining relationships with clients by providing suitable candidates and meeting hiring needs. The daily tasks involve calling clients and candidates, organizing meetings and placements. The document differentiates between contract and permanent placements and outlines the characteristics of a successful recruiter, including strong work ethic, integrity, and market knowledge.
Public policy consists of government decisions that regulate various aspects of public life. Public policy analysis involves formulating policy alternatives and selecting the best one. Public policy is studied by multiple disciplines and is shaped by factors such as government institutions, interest groups, and a country's political system.
This document is a curriculum vitae for Sandy Van Der Merwe, which includes personal information such as date of birth, address, education history, and employment history. It details that she graduated high school in 2008, is currently studying for a Bachelor of Education degree, and has worked in a variety of roles including as a tutor, teacher, fashion designer, and sales representative. Her employment history spans from 2009 to the present, and she provides references for each role.
This document discusses using mathematical tools like fuzzy sets, soft sets, and fuzzy soft sets to solve decision-making problems involving uncertainty. It provides background on these tools, including definitions of fuzzy sets proposed by Zadeh in 1965, soft sets introduced by Molodtsov in 1999 to model vagueness, and fuzzy soft sets combining fuzzy sets and soft sets. The document then discusses applying these tools to decision making, including using fuzzy sets to represent membership grades for uncertain information, and using soft sets as a parameterization tool to model uncertain data when traditional models fail.
11.selection method by fuzzy set theory and preference matrixAlexander Decker
This document summarizes a research paper that proposes a new fuzzy multi-criteria decision making ranking method. The proposed method combines two existing methods: Shimura's method of using relative preference grades and Blin and Whinston's method of using a preference matrix. The new method aims to overcome limitations of other methods like FAHP and FTOPSIS, which can become less accurate with more alternatives and require extensive calculations. It uses a preference matrix to determine membership grades and then calculates rankings. The paper provides background on fuzzy decision making and reviews existing literature on fuzzy multi-criteria decision making methods and their limitations.
Selection method by fuzzy set theory and preference matrixAlexander Decker
This document discusses methods for fuzzy multi-criteria decision making. It reviews several existing methods including the fuzzy analytic hierarchy process (FAHP), fuzzy technique for order preference by similarity to ideal solution (FTOPSIS), and hybrid methods combining multiple approaches. The document also proposes a new method that uses a preference matrix to calculate membership grades and determine a ranking of alternatives. This aims to develop a reliable ranking method with less complex calculations than some existing techniques.
This document presents a novel (R, S)-norm entropy measure to quantify the degree of fuzziness in intuitionistic fuzzy sets (IFSs). The entropy measure is defined based on (R, S)-norms and is proven to satisfy valid properties of an entropy measure. The entropy measure is then used to propose two decision-making approaches for multi-attribute decision making problems where attribute weights are either partially known or completely unknown. An example is provided to illustrate the decision making process using the novel entropy measure.
This document summarizes an empirical study comparing several supervised machine learning approaches for word sense disambiguation: Naive Bayes, decision tree, decision list, and support vector machine (SVM). The study used a dataset of 15 words annotated with senses from WordNet and Senseval-3. Each approach was implemented and evaluated based on its accuracy in identifying the correct sense of each word. The results showed that the decision list approach achieved the highest overall accuracy of 69.12%, followed by naive Bayes at 58.32%, SVM at 56.11%, and decision tree at 45.14%. Thus, the study concluded that decision list performed best on this dataset for the task of word sense disambiguation.
Comprehensive Survey of Data Classification & Prediction Techniquesijsrd.com
In this paper, we present a literature survey of modern data classification and prediction algorithms. All these algorithms are very important in real-world applications such as heart disease prediction and cancer prediction. Classification of data is a very popular and computationally expensive task. The fundamentals of data classification are also discussed in brief.
Methodological Study Of Opinion Mining And Sentiment Analysis Techniques ijsc
Decision making, at both the individual and organizational level, is always accompanied by a search for others' opinions on the matter. Opinion-rich resources such as reviews, forum discussions, blogs, micro-blogs and Twitter provide a rich anthology of sentiments. This user-generated content can serve as a benefit to the market if its semantic orientations are analyzed. Opinion mining and sentiment analysis formalize the study and interpretation of opinions and sentiments. The digital ecosystem has paved the way for recording huge volumes of opinionated data. This paper is an attempt to review and evaluate the various techniques used for opinion and sentiment analysis.
This document summarizes a student project that analyzed speech data to predict schizophrenia using machine learning algorithms. The students collected speech data from schizophrenic and healthy individuals over two days. They tested logistic regression, naive Bayes, random forest, decision tree, and OneR algorithms on the data. Logistic regression performed best, accurately predicting schizophrenia from emotions data over 80% of the time. The small dataset size was a challenge, and future work could involve implementing support vector machines and obtaining a larger dataset.
The document proposes new similarity measures for Pythagorean fuzzy sets (PFSs) and applies them to decision-making problems. It first reviews existing similarity measures for PFSs that incorporate membership, non-membership, and indeterminacy degrees. It then proposes four new similarity measures (s1-s4) for PFSs based on these three conventional PFS parameters. An example compares the new measures to existing ones, finding that s4 yields the most reasonable results. The document concludes by applying s4 to decision problems in career placement, medical diagnosis, and elections to demonstrate the measure's applicability.
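For illustration, one simple distance-based similarity in this family can be written over the three parameters mentioned above. This is an assumed textbook-style measure, not the paper's s1-s4; the example sets below are invented:

```python
def pfs_indeterminacy(mu, nu):
    """Indeterminacy degree of a Pythagorean fuzzy value: pi^2 = 1 - mu^2 - nu^2."""
    return (1.0 - mu ** 2 - nu ** 2) ** 0.5

def pfs_similarity(A, B):
    """Illustrative similarity between two Pythagorean fuzzy sets.

    A, B -- lists of (membership, non-membership) pairs over the same universe.
    Returns a value in [0, 1]; 1 means the sets are identical.
    """
    total = 0.0
    for (mu_a, nu_a), (mu_b, nu_b) in zip(A, B):
        pi_a = pfs_indeterminacy(mu_a, nu_a)
        pi_b = pfs_indeterminacy(mu_b, nu_b)
        # Compare the squared degrees, since mu^2 + nu^2 + pi^2 = 1 for a PFS.
        total += (abs(mu_a ** 2 - mu_b ** 2)
                  + abs(nu_a ** 2 - nu_b ** 2)
                  + abs(pi_a ** 2 - pi_b ** 2))
    return 1.0 - total / (2.0 * len(A))
```

Because each element's squared degrees sum to 1, the per-element distance is at most 2, which is what keeps the similarity bounded in [0, 1].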
The document discusses using fuzzy soft set approaches combined with Dempster–Shafer theory of evidence and grey relational analysis for decision making under uncertainty. Grey relational analysis is used to obtain the uncertain degrees of parameters. Mass functions are assigned to choices based on the uncertain degrees. Dempster's rule of evidence is used to combine choices into a collective choice. This approach avoids selecting soft set levels and parameter weights while improving reliability and the level of decision making.
Uncertainty classification of expert systems a rough set approachEr. rahul abhishek
In this paper, we discuss the uncertainty classification of Expert Systems using a Rough Set Approach. This is a soft-computing technique with which we classify the types of Expert Systems. An expert system has a unique structure, different from traditional programs. It is divided into two parts: one fixed and independent of the expert system, the inference engine; and one variable, the knowledge base. To run an expert system, the engine reasons about the knowledge base like a human. In the 1980s a third part appeared: a dialog interface to communicate with users. This ability to conduct a conversation with users was later called "conversational". Rough set theory is a technique that deals with uncertainty.
Heuristics for the Maximal Diversity Selection ProblemIJMER
The problem of selecting k items from among a given set of N items such that the 'diversity' among the k items is maximum is a classical problem with applications in many diverse areas such as forming committees, jury selection, product testing, surveys, plant breeding, ecological preservation, capital investment, etc. A suitably defined distance metric is used to determine the diversity. However, this is a hard problem, and the optimal solution is computationally intractable. In this paper we present the experimental evaluation of two approximation algorithms (heuristics) for the maximal diversity selection problem.
On distributed fuzzy decision trees for big datanexgentechnology
This document presents a new algorithm called UDT-CDF for building decision trees to classify uncertain numerical data. It improves on previous algorithms like UDT that were based on probability density functions (PDFs). The key aspects of the new algorithm are:
1. It uses cumulative distribution functions (CDFs) rather than PDFs to represent uncertain numerical attributes, since CDFs provide more complete probability information.
2. It splits data at decision tree nodes based on the CDF, placing data with values covering the split point into both branches weighted by the CDF.
3. Experimental results show the new CDF-based algorithm achieves more accurate classifications and is more computationally efficient than the PDF-based UDT algorithm.
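The CDF-based split in step 2 can be illustrated for uncertain numerical attributes. This is a sketch under the assumption of Gaussian uncertainty (the document does not specify a distribution family), with invented sample values:

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution, computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def split_uncertain(samples, split):
    """Split normally distributed uncertain values at `split`.

    samples -- list of (mu, sigma, label) tuples describing uncertain values
    Returns (left, right): lists of (label, weight) pairs, where weight is
    the probability mass of that sample falling on that side of the split.
    """
    left, right = [], []
    for mu, sigma, label in samples:
        p_left = normal_cdf(split, mu, sigma)
        if p_left > 0.0:
            left.append((label, p_left))      # fractional copy in the left branch
        if p_left < 1.0:
            right.append((label, 1.0 - p_left))  # remainder in the right branch
    return left, right
```

A sample whose distribution straddles the split point thus appears in both branches, weighted by its CDF mass, rather than being forced wholly to one side.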
The document discusses the differences between machine learning (ML), statistical learning, data mining (DM), and automated learning (AL). It argues that while ML and statistical learning developed similar techniques starting in the 1960s, DM emerged in the 1990s from a merging of database research and automated learning. However, industry was much more enthusiastic about adopting DM techniques compared to AL techniques, even though many DM systems are just friendly interfaces of AL systems. The document aims to explain the key differences between DM and AL that led to DM's greater commercial success.
FUZZY ROUGH INFORMATION MEASURES AND THEIR APPLICATIONSijcsity
This document summarizes a research paper that proposes three new information measures for fuzzy rough sets and evaluates their applications. It begins with background on fuzzy rough sets and existing information measures. It then defines a new logarithmic information measure for fuzzy rough values and proves its validity. Another proposed information measure for fuzzy rough sets is described along with an illustration of its application. A weighted information measure for fuzzy rough sets is also discussed. The paper compares the proposed information measures to other existing ones and concludes with references.
FUZZY ROUGH INFORMATION MEASURES AND THEIR APPLICATIONSijcsity
The degree of roughness characterizes the uncertainty contained in a rough set. Rough entropy was defined to measure the roughness of a rough set. Though effective and useful, it was not accurate enough. Some authors use an information measure in place of entropy, for better understanding, to measure the amount of uncertainty contained in a fuzzy rough set. In this paper three new fuzzy rough information measures are proposed and their validity is verified. The application of these proposed information measures in decision-making problems is studied and compared with other existing information measures.
Similar to International Journal of Engineering Research and Development (IJERD) (20)
A Novel Method for Prevention of Bandwidth Distributed Denial of Service AttacksIJERD Editor
Distributed Denial of Service (DDoS) attacks have become a massive threat to the Internet, whose traditional architecture is vulnerable to attacks like DDoS. The attacker first acquires an army of zombies; that army is then instructed by the attacker when to start an attack and on whom it should be carried out. In this paper, the different techniques used to perform DDoS attacks, the tools used to perform them, and countermeasures to detect the attackers and eliminate Bandwidth Distributed Denial of Service (B-DDoS) attacks are reviewed. DDoS attacks are carried out using various flooding techniques.
The main purpose of this paper is to design an architecture that can reduce Bandwidth Distributed Denial of Service attacks and make the victim site or server available for normal users by eliminating the zombie machines. Our primary focus is to discuss how normal machines turn into zombies (bots), how an attack is initiated, the DDoS attack procedure, and how an organization can save its server from becoming a DDoS victim. To present this, we implemented a simulated environment with Cisco switches, routers, a firewall, some virtual machines and some attack tools to display a real DDoS attack. By using time scheduling, resource limiting, system logs, access control lists and a modular policy framework, we stopped the attack and identified the attacker (bot) machines.
Hearing loss is one of the most common human impairments. It is estimated that by the year 2015 more than 700 million people will suffer mild deafness. Most can be helped by hearing aid devices, depending on the severity of their hearing loss. This paper describes the implementation and characterization details of a dual-channel transmitter front end (TFE) for digital hearing aid (DHA) applications that uses novel micro-electromechanical-systems (MEMS) audio transducers and ultra-low power-scalable analog-to-digital converters (ADCs), which enable a very low form factor, energy-efficient implementation for next-generation DHA. The contribution of the design is the implementation of the dual-channel MEMS microphones and power-scalable ADC system.
Influence of tensile behaviour of slab on the structural Behaviour of shear c...IJERD Editor
A composite beam is composed of a steel beam and a slab connected by means of shear connectors, such as studs installed on the top flange of the steel beam, to form a structure behaving monolithically. This study analyzes the effects of the tensile behavior of the slab on the structural behavior of the shear connection, such as slip stiffness and maximum shear force, in composite beams subjected to hogging moment. The results show that the shear studs located in the crack-concentration zones due to large hogging moments sustain significantly smaller shear force and slip stiffness than those in other zones. Moreover, the reduction of the slip stiffness in the shear connection appears to be closely related to the change in the tensile strain of the rebar as the load increases. Further experimental and analytical studies shall be conducted considering variables such as the reinforcement ratio and the arrangement of shear connectors to achieve an efficient design of the shear connection in composite beams subjected to hogging moment.
Gold prospecting using Remote Sensing ‘A case study of Sudan’IJERD Editor
Gold has been extracted from northeast Africa for more than 5000 years, and this may be the first place where the metal was extracted. The Arabian-Nubian Shield (ANS) is an exposure of Precambrian crystalline rocks on the flanks of the Red Sea. The crystalline rocks are mostly Neoproterozoic in age. The ANS spans the nations of Israel, Jordan, Egypt, Saudi Arabia, Sudan, Eritrea, Ethiopia, Yemen, and Somalia. The Arabian-Nubian Shield consists of juvenile continental crust that formed between 900 and 550 Ma, when intra-oceanic arcs welded together along ophiolite-decorated arcs. Primary Au mineralization probably developed in association with the growth of the intra-oceanic arcs and the evolution of the back arc. Multiple episodes of deformation have obscured the primary metallogenic setting, but at least some of the deposits preserve evidence that they originated as sea-floor massive sulphide deposits.
The Red Sea Hills region is a vast span of rugged, harsh and inhospitable terrain with an inimical moon-like landscape; nevertheless, since ancient times it has been famed as an abode of gold and was a major source of wealth for the Pharaohs of ancient Egypt. The Pharaohs' old workings have been periodically rediscovered through time. Recent endeavours by the Geological Research Authority of Sudan led to the discovery of a score of occurrences of gold and massive sulphide mineralization. In the nineties of the previous century the Geological Research Authority of Sudan (GRAS), in cooperation with BRGM, utilized Landsat TM satellite data with the spectral-ratio technique to map possible mineralized zones in the Red Sea Hills of Sudan. The outcome of the study mapped a gossan-type gold mineralization. The band-ratio technique was applied to the Arbaat area and a signature of an alteration zone was detected; such alteration zones are commonly associated with mineralization. A field check confirmed the existence of a stockwork of gold-bearing quartz in the alteration zone. Another type of gold mineralization that was discovered using remote sensing is the gold associated with metachert in the Atmur Desert.
Reducing Corrosion Rate by Welding DesignIJERD Editor
This document summarizes a study on reducing corrosion rates in steel through welding design. The researchers tested different welding groove designs (X, V, 1/2X, 1/2V) and preheating temperatures (400°C, 500°C, 600°C) on ferritic malleable iron samples. Testing found that X and V groove designs with 500°C and 600°C preheating had corrosion rates of 0.5-0.69% weight loss after 14 days, compared to 0.57-0.76% for 400°C preheating. Higher preheating reduced residual stresses which decreased corrosion. Residual stresses were 1.7 MPa for optimal X groove and 600°C
Router 1X3 – RTL Design and VerificationIJERD Editor
Routing is the process of moving a packet of data from source to destination; it enables messages to pass from one computer to another and eventually reach the target machine. A router is a networking device that forwards data packets between computer networks. It is connected to two or more data lines from different networks (as opposed to a network switch, which connects data lines from one single network). This paper mainly emphasizes the study of the router device and its top-level architecture, and how the various sub-modules of the router, i.e. the register, FIFO, FSM and synchronizer, are synthesized, simulated and finally connected to the top module.
Active Power Exchange in Distributed Power-Flow Controller (DPFC) At Third Ha... (IJERD Editor)
This paper presents a component within the flexible ac-transmission system (FACTS) family, called
distributed power-flow controller (DPFC). The DPFC is derived from the unified power-flow controller (UPFC)
with an eliminated common dc link. The DPFC has the same control capabilities as the UPFC, which comprise
the adjustment of the line impedance, the transmission angle, and the bus voltage. The active power exchange
between the shunt and series converters, which is through the common dc link in the UPFC, is now through the
transmission lines at the third-harmonic frequency. The DPFC employs multiple small-size single-phase
converters, which reduces equipment cost, requires no voltage isolation between phases, and increases
redundancy and thereby reliability. The principle and analysis of the DPFC are presented in this paper and the corresponding
simulation results that are carried out on a scaled prototype are also shown.
Mitigation of Voltage Sag/Swell with Fuzzy Control Reduced Rating DVR (IJERD Editor)
Power quality has become an increasingly pivotal issue for industrial electricity consumers in recent
times. Modern industries employ sensitive power electronic equipment, control devices and non-linear loads as
part of automated processes to increase energy efficiency and productivity, and voltage disturbances are the
most common power quality problem as the number of sophisticated and sensitive electronic devices in
industrial systems grows. This paper discusses the design and simulation of a dynamic voltage restorer (DVR)
for improving power quality and reducing the harmonic distortion seen by sensitive loads. Power quality
problems arise from non-standard voltage, current and frequency; voltage sag, swell, flicker and harmonics in
the power system all affect sensitive loads. The compensation capability of a DVR depends primarily on its
maximum voltage injection ability and the amount of stored energy available within the restorer. The device is
connected in series with the distribution feeder at medium voltage. A fuzzy logic controller is used to produce
the gate pulses for the DVR control circuit, and the circuit is simulated using MATLAB/SIMULINK software.
Study on the Fused Deposition Modelling In Additive Manufacturing (IJERD Editor)
The additive manufacturing process, also popularly known as 3-D printing, is a process where a product
is created in a succession of layers. It is based on a novel material-incremental manufacturing philosophy.
Unlike conventional manufacturing processes, where material is removed from a given workpiece to derive the
final shape of a product, 3-D printing builds the product from scratch, obviating the necessity to cut away
material and preventing wastage of raw materials. Commonly used raw materials for the process are ABS
plastic, PLA and nylon; recently the use of gold, bronze and wood has also been implemented. Geometric
complexity imposes essentially no constraint on the process, in that an object of any shape and size can be
manufactured.
Spyware triggering system by particular string value (IJERD Editor)
This computer programme can be used for good or bad purposes, in hacking or in general use. It can
be seen as a next step beyond hacking techniques such as keyloggers and spyware. In this system, once a user or
hacker stores a particular string as input, the software continually compares the typing activity of the user with
that stored string and, if a match occurs, launches the spyware programme.
A Blind Steganalysis on JPEG Gray Level Image Based on Statistical Features a... (IJERD Editor)
This paper presents a blind steganalysis technique to effectively attack the JPEG steganographic
schemes Jsteg, F5, Outguess and DWT-based. The proposed method exploits the correlations between
block-DCT coefficients from the intra-block and inter-block relations, and the statistical moments of
characteristic functions of the test image are selected as features. The features are extracted from the BDCT
JPEG 2-D array. A Support Vector Machine with cross-validation is implemented for the classification. The
proposed scheme gives improved outcomes in attacking these schemes.
Secure Image Transmission for Cloud Storage System Using Hybrid Scheme (IJERD Editor)
Data over the cloud is transferred or transmitted between servers and users. The privacy of that
data is very important, as it contains personal information; if the data is hacked, it can be used to defame a
person socially. Delays also occur during data transmission, e.g. in mobile communication where bandwidth is
low. Hence compression algorithms are proposed for fast and efficient transmission, encryption is used for
security purposes, and blurring provides an additional layer of security. These algorithms are hybridized to
achieve robust and efficient security and transmission over a cloud storage system.
Application of Buckley-Leverett Equation in Modeling the Radius of Invasion i... (IJERD Editor)
A thorough review of the existing literature indicates that the Buckley-Leverett equation is applied to
waterflood practices directly, without any adjustment for real reservoir scenarios, which introduces quite a
number of errors into these analyses. Also, for most waterflood scenarios a radial investigation is more
appropriate than a simplified linear system. This study investigates the adaptation of the Buckley-Leverett
equation to estimate the radius of invasion of the displacing fluid during waterflooding. The model is also
adapted for a microbial flood, and a comparative analysis is conducted for both waterflooding and microbial
flooding. The analysis not only succeeds in determining the radial distance of the leading edge of water during
the flooding process, but also gives a clearer understanding of the applicability of microbes for enhancing oil
production through in-situ generation of bio-products such as biosurfactants, biogenic gases and bio-acids.
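The radial adaptation described in this abstract can be sketched numerically. The snippet below is a minimal illustration, not the authors' model: it assumes Corey-type relative permeabilities and the standard radial frontal-advance relation π·r²·h·φ = q·t·f′w(Swf); all parameter values (viscosities, endpoint saturations, Corey exponents) are hypothetical.

```python
import math

def fractional_flow(sw, mu_w=1.0, mu_o=5.0, swc=0.2, sor=0.2, nw=2.0, no=2.0):
    """Water fractional flow from Corey relative permeabilities (gravity and
    capillary pressure ignored). All parameter defaults are illustrative."""
    s = (sw - swc) / (1.0 - swc - sor)        # normalized water saturation
    krw, kro = s ** nw, (1.0 - s) ** no
    if kro == 0.0:
        return 1.0
    return 1.0 / (1.0 + (kro / krw) * (mu_w / mu_o)) if krw > 0.0 else 0.0

def radial_front(q, t, phi, h, sw_front, dsw=1e-4):
    """Radius of the leading water edge from the radial frontal-advance
    relation pi * r^2 * h * phi = q * t * fw'(Swf)."""
    dfw = (fractional_flow(sw_front + dsw)
           - fractional_flow(sw_front - dsw)) / (2.0 * dsw)
    return math.sqrt(q * t * dfw / (math.pi * phi * h))
```

Because the front radius scales with the square root of the injected volume, doubling the injection time grows the radius by a factor of √2, in contrast to the linear advance of the one-dimensional Buckley-Leverett solution.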
Gesture Gaming on the World Wide Web Using an Ordinary Web Camera (IJERD Editor)
Gesture gaming is a method by which users with a laptop/PC/Xbox play games using natural or
bodily gestures. This paper presents a way of playing free flash games on the internet using an ordinary webcam
with the help of open-source technologies. In the human activity recognition, emphasis is given to pose
estimation and the consistency of the player's pose, estimated with an ordinary web camera at resolutions
ranging from VGA to 20 MP. Our work involved showing the user a 10-second introductory video on how to
play a particular game using gestures and on the various kinds of gestures that can be performed in front of the
system. The initial RGB values for the gesture component are obtained by instructing the user to place the
component in a red box within about 10 seconds after the short video finishes. The system then opens the
concerned game on popular flash game sites such as Miniclip, Games Arcade or GameStop, loads it by clicking
at various places, and brings it to the state where the user need only perform gestures to start playing. At any
point the user can call off the game by hitting the Esc key, and the program releases all controls and returns to
the desktop. The results obtained using an ordinary webcam matched those of the Kinect, and users could relive
the gaming experience of free flash games on the net. Effective in-game advertising could therefore also be
achieved, offering disruptive growth to advertising firms.
Hardware Analysis of Resonant Frequency Converter Using Isolated Circuits And... (IJERD Editor)
The LLC resonant frequency converter is essentially a combination of series and parallel resonant
circuits. The LCC resonant converter has the disadvantage that, although it has two resonant frequencies, the
lower one lies in the ZCS region [5]; for this application the converter cannot be designed to operate at that
resonant frequency. The LLC resonant converter has existed for a very long time, but because its characteristics
were not well understood it was used as a series resonant converter with an essentially passive (resistive) load.
Here it is designed to operate at a switching frequency higher than the resonant frequency of the series resonant
tank of Lr and Cr, where the converter behaves very much like a series resonant converter. The benefit of the
LLC resonant converter is its narrow switching-frequency range under light load [6]. The control circuit plays a
very important role: the 555 timer used here provides a near-ideal square wave, since the control circuit
introduces no slew rate, which keeps the square wave sharp and robust. The dead-band circuit provides a
dedicated dead band of a few microseconds to avoid simultaneous firing of the two IGBT pairs in the brief
interval when one pair switches off and the other switches on. An isolator circuit is associated with every circuit
used, since it acts as a driver, and isolation for each IGBT is provided by a dedicated transformer supply [3].
The IGBTs are fired with the appropriate signals from the preceding boards, and finally a high-frequency
rectifier circuit with a filtering capacitor is used to obtain a clean dc waveform. The basic goal of this analysis is
to observe the waveforms and characteristics of converters with differently positioned passive elements forming
the tank circuits.
Simulated Analysis of Resonant Frequency Converter Using Different Tank Circu... (IJERD Editor)
The LLC resonant frequency converter is essentially a combination of series and parallel resonant
circuits. The LCC resonant converter has the disadvantage that, although it has two resonant frequencies, the
lower one lies in the ZCS region [5]; for this application the converter cannot be designed to operate at that
resonant frequency. The LLC resonant converter has existed for a very long time, but because its characteristics
were not well understood it was used as a series resonant converter with an essentially passive (resistive) load.
Here it is designed to operate at a switching frequency higher than the resonant frequency of the series resonant
tank of Lr and Cr, where the converter behaves very much like a series resonant converter. The benefit of the
LLC resonant converter is its narrow switching-frequency range under light load [6]. The control circuit plays a
very important role: the 555 timer used here provides a near-ideal square wave, since the control circuit
introduces no slew rate, which keeps the square wave sharp and robust. The dead-band circuit provides a
dedicated dead band of a few microseconds to avoid simultaneous firing of the two IGBT pairs in the brief
interval when one pair switches off and the other switches on. An isolator circuit is associated with every circuit
used, since it acts as a driver, and isolation for each IGBT is provided by a dedicated transformer supply [3].
The IGBTs are fired with the appropriate signals from the preceding boards, and finally a high-frequency
rectifier circuit with a filtering capacitor is used to obtain a clean dc waveform. The basic goal of this analysis is
to observe the waveforms and characteristics of converters with differently positioned passive elements forming
the tank circuits. The supporting simulation is carried out with the PSIM 6.0 software tool.
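As a small aside to the tank-circuit discussion, the two characteristic frequencies of an LLC tank follow from the standard relation f = 1/(2π√(LC)): one for the series pair Lr-Cr and a lower one when the magnetizing inductance Lm joins the tank. The helper below is a generic sketch; the component values in the usage note are illustrative, not taken from the paper.

```python
import math

def llc_resonant_frequencies(lr, lm, cr):
    """Two characteristic frequencies of an LLC tank, in Hz.

    f1: series resonance of Lr and Cr (above-resonance operation region).
    f2: lower resonance when the magnetizing inductance Lm joins the tank.
    """
    f1 = 1.0 / (2.0 * math.pi * math.sqrt(lr * cr))
    f2 = 1.0 / (2.0 * math.pi * math.sqrt((lr + lm) * cr))
    return f1, f2
```

For example, with Lr = 100 µH, Lm = 500 µH and Cr = 100 nF, the series resonance f1 is roughly 50 kHz and the lower resonance f2 roughly 21 kHz; operating the switching frequency above f1 keeps the converter in the series-resonant-like region described above.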
An amateur radio operator, also known as a HAM, communicates with other HAMs through radio
waves. Wireless communication in which the Moon is used as a natural satellite is called Moon-bounce or EME
(Earth-Moon-Earth). Long-distance communication (DXing) using Very High Frequency (VHF) amateur HAM
radio used to be difficult; yet even with a modest setup comprising a good transceiver, a power amplifier and a
high-gain antenna with high directivity, VHF DXing is possible. Generally a 2x11 Yagi antenna, together with a
rotor to set the horizontal and vertical angles, is used. Moon-tracking software gives the exact location and
visibility of the Moon at both stations, along with other vital data needed to acquire the real-time position of the
Moon.
“MS-Extractor: An Innovative Approach to Extract Microsatellites on ‘Y’ Chrom... (IJERD Editor)
Simple Sequence Repeats (SSRs), also known as microsatellites, have been extensively used as
molecular markers due to their abundance and high degree of polymorphism. The nucleotide sequences of
polymorphic forms of the same gene should be 99.9% identical, so extracting microsatellites from a gene is
crucial: when microsatellite repeat counts are compared, a large difference can indicate a disorder. The Y
chromosome likely contains 50 to 60 genes that provide instructions for making proteins, and because only
males have the Y chromosome, the genes on this chromosome tend to be involved in male sex determination
and development. Several microsatellite extractors exist, but they fail to extract microsatellites from large data
sets gigabytes or terabytes in size. The proposed tool, “MS-Extractor: An Innovative Approach to Extract
Microsatellites on ‘Y’ Chromosome”, can extract both perfect and imperfect microsatellites from large data sets
of the human ‘Y’ genome. The proposed system uses string matching with a sliding-window approach to locate
microsatellites and extract them.
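The sliding-window string-matching idea can be sketched in a few lines. This is a generic perfect-tandem-repeat finder, not the MS-Extractor implementation; the motif-length range (1-6 bp, the usual SSR definition) and the minimum repeat count are illustrative thresholds.

```python
def find_microsatellites(seq, min_motif=1, max_motif=6, min_repeats=4):
    """Locate perfect tandem repeats (SSRs) with a greedy left-to-right scan.

    Returns a list of (start, motif, repeat_count) tuples; at each position
    the longest qualifying repeat tract is kept and the scan skips past it.
    """
    hits, n, i = [], len(seq), 0
    while i < n:
        best = None
        for m in range(min_motif, max_motif + 1):
            motif = seq[i:i + m]
            if len(motif) < m:          # window would run off the sequence
                break
            k = 1
            while seq[i + k * m:i + (k + 1) * m] == motif:
                k += 1                  # count exact adjacent copies
            if k >= min_repeats and (best is None or k * m > best[2] * len(best[1])):
                best = (i, motif, k)
        if best:
            hits.append(best)
            i = best[0] + best[2] * len(best[1])   # jump past the tract
        else:
            i += 1
    return hits
```

For instance, on the sequence "AAGATATATATATCC" the scan reports a single hit: the dinucleotide motif "AT" repeated five times starting at position 3. An imperfect-repeat extractor, as the abstract describes, would additionally tolerate a bounded number of mismatches inside the tract.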
Importance of Measurements in Smart Grid (IJERD Editor)
Driven by the need for reliable supply, independence from fossil fuels, and the capability to provide
clean energy at a fixed and lower cost, the existing power grid structure is transforming into the Smart Grid, and
the development of a smart energy distribution grid is a current goal of many nations. A Smart Grid should have
new capabilities such as self-healing, high reliability, energy management and real-time pricing. This new era
of the smart future grid will bring major changes to existing technologies at the generation, transmission and
distribution levels. The incorporation of renewable energy resources and distributed generators into the existing
grid will increase the complexity, optimization problems and instability of the system. This will lead to a
paradigm shift in the instrumentation and control requirements of Smart Grids for a high-quality, stable and
reliable electricity supply. Monitoring of the grid system state and stability relies on the availability of reliable
measurement data. This paper discusses the measurement areas that highlight new measurement challenges, the
development of smart meters, and the critical parameters of electric energy to be monitored for improving the
reliability of power systems.
Study of Macro level Properties of SCC using GGBS and Lime stone powder (IJERD Editor)
The document summarizes a study on the use of ground granulated blast furnace slag (GGBS) and limestone powder to replace cement in self-compacting concrete (SCC). Tests were conducted on SCC mixes with 0-50% replacement of cement with GGBS and 0-20% replacement with limestone powder. The results showed that replacing 30% of the cement with GGBS and 15% with limestone powder produced SCC with the highest compressive strength of 46 MPa, while meeting fresh-property requirements. The study concluded that this ternary blend of cement, GGBS and limestone powder can improve SCC properties while reducing costs.
In the realm of cybersecurity, offensive security practices act as a critical shield. By simulating real-world attacks in a controlled environment, these techniques expose vulnerabilities before malicious actors can exploit them. This proactive approach allows manufacturers to identify and fix weaknesses, significantly enhancing system security.
This presentation delves into the development of a system designed to mimic Galileo's Open Service signal using software-defined radio (SDR) technology. We'll begin with a foundational overview of both Global Navigation Satellite Systems (GNSS) and the intricacies of digital signal processing.
The presentation culminates in a live demonstration. We'll showcase the manipulation of Galileo's Open Service pilot signal, simulating an attack on various software and hardware systems. This practical demonstration serves to highlight the potential consequences of unaddressed vulnerabilities, emphasizing the importance of offensive security practices in safeguarding critical infrastructure.
LF Energy Webinar: Carbon Data Specifications: Mechanisms to Improve Data Acc... (DanBrown980551)
This LF Energy webinar took place June 20, 2024. It featured:
-Alex Thornton, LF Energy
-Hallie Cramer, Google
-Daniel Roesler, UtilityAPI
-Henry Richardson, WattTime
In response to the urgency and scale required to effectively address climate change, open source solutions offer significant potential for driving innovation and progress. Currently, there is a growing demand for standardization and interoperability in energy data and modeling. Open source standards and specifications within the energy sector can also alleviate challenges associated with data fragmentation, transparency, and accessibility. At the same time, it is crucial to consider privacy and security concerns throughout the development of open source platforms.
This webinar will delve into the motivations behind establishing LF Energy’s Carbon Data Specification Consortium. It will provide an overview of the draft specifications and the ongoing progress made by the respective working groups.
Three primary specifications will be discussed:
-Discovery and client registration, emphasizing transparent processes and secure and private access
-Customer data, centering around customer tariffs, bills, energy usage, and full consumption disclosure
-Power systems data, focusing on grid data, inclusive of transmission and distribution networks, generation, intergrid power flows, and market settlement data
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-Efficiency (ScyllaDB)
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
Session 1 - Intro to Robotic Process Automation.pdf (UiPathCommunity)
👉 Check out our full 'Africa Series - Automation Student Developers (EN)' page to register for the full program:
https://bit.ly/Automation_Student_Kickstart
In this session, we shall introduce you to the world of automation, the UiPath Platform, and guide you on how to install and setup UiPath Studio on your Windows PC.
📕 Detailed agenda:
What is RPA? Benefits of RPA?
RPA Applications
The UiPath End-to-End Automation Platform
UiPath Studio CE Installation and Setup
💻 Extra training through UiPath Academy:
Introduction to Automation
UiPath Business Automation Platform
Explore automation development with UiPath Studio
👉 Register here for our upcoming Session 2 on June 20: Introduction to UiPath Studio Fundamentals: https://community.uipath.com/events/details/uipath-lagos-presents-session-2-introduction-to-uipath-studio-fundamentals/
"Choosing proper type of scaling", Olena SyrotaFwdays
Imagine an IoT processing system that is already quite mature and production-ready and for which client coverage is growing and scaling and performance aspects are life and death questions. The system has Redis, MongoDB, and stream processing based on ksqldb. In this talk, firstly, we will analyze scaling approaches and then select the proper ones for our system.
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf (Chart Kalyan)
A Mix Chart displays historical data of numbers in a graphical or tabular form. The Kalyan Rajdhani Mix Chart specifically shows the results of a sequence of numbers over different periods.
inQuba Webinar Mastering Customer Journey Management with Dr Graham Hill (LizaNolte)
HERE IS YOUR WEBINAR CONTENT! 'Mastering Customer Journey Management with Dr. Graham Hill'. We hope you find the webinar recording both insightful and enjoyable.
In this webinar, we explored essential aspects of Customer Journey Management and personalization. Here’s a summary of the key insights and topics discussed:
Key Takeaways:
Understanding the Customer Journey: Dr. Hill emphasized the importance of mapping and understanding the complete customer journey to identify touchpoints and opportunities for improvement.
Personalization Strategies: We discussed how to leverage data and insights to create personalized experiences that resonate with customers.
Technology Integration: Insights were shared on how inQuba’s advanced technology can streamline customer interactions and drive operational efficiency.
Essentials of Automations: Exploring Attributes & Automation Parameters (Safe Software)
Building automations in FME Flow can save time, money, and help businesses scale by eliminating data silos and providing data to stakeholders in real-time. One essential component to orchestrating complex automations is the use of attributes & automation parameters (both formerly known as “keys”). In fact, it’s unlikely you’ll ever build an Automation without using these components, but what exactly are they?
Attributes & automation parameters enable the automation author to pass data values from one automation component to the next. During this webinar, our FME Flow Specialists will cover leveraging the three types of these output attributes & parameters in FME Flow: Event, Custom, and Automation. As a bonus, they’ll also be making use of the Split-Merge Block functionality.
You’ll leave this webinar with a better understanding of how to maximize the potential of automations by making use of attributes & automation parameters, with the ultimate goal of setting your enterprise integration workflows up on autopilot.
What is an RPA CoE? Session 1 – CoE Vision (DianaGray10)
In the first session, we will review the organization's vision and how this has an impact on the COE Structure.
Topics covered:
• The role of a steering committee
• How do the organization’s priorities determine CoE Structure?
Speaker:
Chris Bolin, Senior Intelligent Automation Architect Anika Systems
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/temporal-event-neural-networks-a-more-efficient-alternative-to-the-transformer-a-presentation-from-brainchip/
Chris Jones, Director of Product Management at BrainChip , presents the “Temporal Event Neural Networks: A More Efficient Alternative to the Transformer” tutorial at the May 2024 Embedded Vision Summit.
The expansion of AI services necessitates enhanced computational capabilities on edge devices. Temporal Event Neural Networks (TENNs), developed by BrainChip, represent a novel and highly efficient state-space network. TENNs demonstrate exceptional proficiency in handling multi-dimensional streaming data, facilitating advancements in object detection, action recognition, speech enhancement and language model/sequence generation. Through the utilization of polynomial-based continuous convolutions, TENNs streamline models, expedite training processes and significantly diminish memory requirements, achieving notable reductions of up to 50x in parameters and 5,000x in energy consumption compared to prevailing methodologies like transformers.
Integration with BrainChip’s Akida neuromorphic hardware IP further enhances TENNs’ capabilities, enabling the realization of highly capable, portable and passively cooled edge devices. This presentation delves into the technical innovations underlying TENNs, presents real-world benchmarks, and elucidates how this cutting-edge approach is positioned to revolutionize edge AI across diverse applications.
Fueling AI with Great Data with Airbyte Webinar (Zilliz)
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
"Scaling RAG Applications to serve millions of users", Kevin GoedeckeFwdays
How we managed to grow and scale a RAG application from zero to thousands of users in 7 months. Lessons from technical challenges around managing high load for LLMs, RAGs and Vector databases.
Must Know Postgres Extension for DBA and Developer during Migration (Mydbops)
Mydbops Opensource Database Meetup 16
Topic: Must-Know PostgreSQL Extensions for Developers and DBAs During Migration
Speaker: Deepak Mahto, Founder of DataCloudGaze Consulting
Date & Time: 8th June | 10 AM - 1 PM IST
Venue: Bangalore International Centre, Bangalore
Abstract: Discover how PostgreSQL extensions can be your secret weapon! This talk explores how key extensions enhance database capabilities and streamline the migration process for users moving from other relational databases like Oracle.
Key Takeaways:
* Learn about crucial extensions like oracle_fdw, pgtt, and pg_audit that ease migration complexities.
* Gain valuable strategies for implementing these extensions in PostgreSQL to achieve license freedom.
* Discover how these key extensions can empower both developers and DBAs during the migration process.
* Don't miss this chance to gain practical knowledge from an industry expert and stay updated on the latest open-source database trends.
Mydbops Managed Services specializes in taking the pain out of database management while optimizing performance. Since 2015, we have been providing top-notch support and assistance for the top three open-source databases: MySQL, MongoDB, and PostgreSQL.
Our team offers a wide range of services, including assistance, support, consulting, 24/7 operations, and expertise in all relevant technologies. We help organizations improve their database's performance, scalability, efficiency, and availability.
Contact us: info@mydbops.com
Visit: https://www.mydbops.com/
Follow us on LinkedIn: https://in.linkedin.com/company/mydbops
For more details and updates, please follow up the below links.
Meetup Page : https://www.meetup.com/mydbops-databa...
Twitter: https://twitter.com/mydbopsofficial
Blogs: https://www.mydbops.com/blog/
Facebook(Meta): https://www.facebook.com/mydbops/
Your One-Stop Shop for Python Success: Top 10 US Python Development Providers (akankshawande)
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
International Journal of Engineering Research and Development (IJERD)
International Journal of Engineering Research and Development
e-ISSN: 2278-067X, p-ISSN: 2278-800X, www.ijerd.com
Volume 6, Issue 10 (April 2013), PP. 71-76
Advancements in Multi-Criteria Decision Making Based
on Interval-Valued Intuitionistic Fuzzy Set
Yasir Ahmad 1, Sadia Husain 2, Azmat Ali Khan 3
1, 2, 3 Faculty of Computer Science and Information System, Jazan University, K.S.A.
Abstract:- A multi-criteria decision-making (MCDM) problem is the process of finding the best option
among all feasible alternatives, where each alternative can be evaluated according to a number of
criteria or attributes. Since human judgments, including preferences, are often vague, a decision maker
cannot estimate his preference with an exact numerical value, and MCDM problems usually involve
uncertain and imprecise data and information. To deal with such vagueness and imprecision,
Atanassov's intuitionistic fuzzy sets (IFSs) [1] have been found highly useful and have been successfully
applied to multi-criteria decision-making problems. Later, integrating interval-valued fuzzy sets
with IFSs, Atanassov introduced the concept of interval-valued intuitionistic fuzzy sets (IVIFSs) [2].
As a generalization of intuitionistic fuzzy sets, IVIFSs are more flexible in dealing with uncertain
and fuzzy problems, and there are many multi-attribute decision-making situations where IVIFS theory
is more appropriate. In this paper we review the major contributions in the area of multiple-attribute
decision making based on interval-valued intuitionistic fuzzy set (IVIFS) theory.
Keywords:- Intuitionistic Fuzzy sets (IFS), Interval-Valued Intuitionistic Fuzzy sets (IVIFS), Multi-
Criteria Decision Making (MCDM).
I. INTRODUCTION
It has been widely recognized that most decisions made in the real world take place in an environment
in which the goals and constraints, because of their complexity, are not known precisely, and thus, the problem
cannot be exactly defined or precisely represented in a crisp value. Bellman, Zadeh (1970) and Zimmermann
(1978) introduced fuzzy sets into the MCDM field. They cleared the way for a new family of methods to deal
with problems that had been inaccessible to and unsolvable with standard MCDM techniques. Bellman and
Zadeh (1970) introduced the first approach regarding decision making in a fuzzy environment.
The concept of an intuitionistic fuzzy set can be viewed as an alternative approach to define a fuzzy set
in cases where available information is not sufficient for the definition of an imprecise concept by means of a
conventional fuzzy set. In general, the theory of intuitionistic fuzzy sets is the generalization of fuzzy sets.
Therefore, it is expected that intuitionistic fuzzy sets could be used to simulate human decision-making
processes and any activities requiring human expertise and knowledge which are inevitably imprecise or not
totally reliable. IFS theory has been extensively applied to areas such as artificial intelligence, networking, soft
decision making, programming logic and operational research, and one of its most promising roles has emerged
in decision-making problems. In some real-life situations, decision makers may not be able to accurately
express their preferences for alternatives because they do not possess a precise or sufficient level of knowledge
of the problem, or because they are unable to discriminate explicitly the degree to which one alternative is
better than another [3]; in such cases, decision makers may state their preference for alternatives to a certain
degree without being entirely sure about it [4]. It is therefore very suitable to express decision makers'
preference values with intuitionistic fuzzy values rather than exact numerical values or linguistic variables
[5, 6, 7]. To satisfy the needs of decision-making problems with imprecision and uncertainty, many researchers
have concentrated on IFS theory. In 1989 Atanassov introduced interval-valued intuitionistic fuzzy sets, and
many researchers have since shown interest in IVIFS theory and successfully applied it to the field of
multi-criteria decision making. In this paper, we review the contributions in the field of multi-criteria decision
making based on IVIFS theory.
This paper is organized as follows. The definitions of intuitionistic fuzzy sets and interval-valued
intuitionistic fuzzy sets are briefly introduced in Section 2. In Section 3 we discuss the role of IVIFSs in
multi-criteria decision making; finally, the conclusion is given in Section 4.
II. PRELIMINARIES
2.1 Definitions of intuitionistic fuzzy sets
Let a set E be fixed. An IFS A in E is an object of the following form:
A = {⟨x, μ_A(x), ν_A(x)⟩ | x ∈ E},
where the functions μ_A : E → [0, 1] and ν_A : E → [0, 1] determine the degree of membership and
the degree of non-membership of the element x ∈ E, respectively, and for every x ∈ E:
0 ≤ μ_A(x) + ν_A(x) ≤ 1.
When ν_A(x) = 1 − μ_A(x) for all x ∈ E, A is an ordinary fuzzy set.
In addition, for each IFS A in E,
π_A(x) = 1 − μ_A(x) − ν_A(x)
is called the degree of indeterminacy of x to A [3], or the degree of hesitancy of x to A.
In particular, if π_A(x) = 0 for all x ∈ E, then the IFS A reduces to an ordinary fuzzy set.
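As a concrete illustration of these definitions, the hesitancy degree can be computed directly from the membership and non-membership degrees. The following minimal sketch (the function name is ours, introduced only for illustration) also enforces the constraint 0 ≤ μ + ν ≤ 1:

```python
def hesitancy(mu: float, nu: float) -> float:
    """Degree of indeterminacy pi_A(x) = 1 - mu_A(x) - nu_A(x) of an IFS element."""
    if not (0.0 <= mu <= 1.0 and 0.0 <= nu <= 1.0 and mu + nu <= 1.0):
        raise ValueError("IFS requires mu, nu in [0, 1] with mu + nu <= 1")
    return 1.0 - mu - nu

# An element with membership 0.6 and non-membership 0.3 has hesitancy 0.1.
h1 = hesitancy(0.6, 0.3)
# When nu = 1 - mu the hesitancy vanishes and A behaves as an ordinary fuzzy set.
h2 = hesitancy(0.7, 0.3)
```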
2.2 Definitions of Interval-Valued intuitionistic fuzzy sets
Sometimes it is not appropriate to assume that the membership degrees for certain elements of A are
exactly defined; instead, a value range can be given. For such cases, Atanassov and Gargov defined the notion of an
interval-valued intuitionistic fuzzy set (IVIFS) as follows.
Let X = {x1, x2, x3, ..., xn} be an ordinary finite non-empty set and let D[0, 1] be the set of all closed
subintervals of the interval [0, 1]. An interval-valued intuitionistic fuzzy set Ã in X is an expression given by
Ã = {⟨x, μ̃_Ã(x), ν̃_Ã(x)⟩ | x ∈ X},
where
μ̃_Ã : X → D[0, 1], ν̃_Ã : X → D[0, 1],
with the condition sup(μ̃_Ã(x)) + sup(ν̃_Ã(x)) ≤ 1 for every x ∈ X.
In particular, if each of the intervals μ̃_Ã(x) and ν̃_Ã(x) contains exactly one element, i.e., if for every x ∈ X
μ_Ã(x) = inf(μ̃_Ã(x)) = sup(μ̃_Ã(x)),
ν_Ã(x) = inf(ν̃_Ã(x)) = sup(ν̃_Ã(x)),
then the given IVIFS Ã reduces to an ordinary intuitionistic fuzzy set. Based on IVIFSs, Xu
defined the notion of an interval-valued intuitionistic fuzzy number (IVIFN):
Definition: Let Ã = {⟨x, μ̃_Ã(x), ν̃_Ã(x)⟩ | x ∈ X} be an IVIFS. Then the pair (μ̃_Ã(x), ν̃_Ã(x)) is called an
IVIFN.
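These conditions are easy to state in code. The sketch below (our own illustrative helpers, not code from the cited papers) represents an IVIFN as a pair of intervals and checks both the sup-condition and the degenerate case in which it reduces to an ordinary intuitionistic fuzzy value:

```python
from typing import Tuple

Interval = Tuple[float, float]  # closed subinterval [lo, hi] of [0, 1]

def is_valid_ivifn(mu: Interval, nu: Interval) -> bool:
    """Check that both intervals lie inside [0, 1] and that sup(mu) + sup(nu) <= 1."""
    (ml, mh), (nl, nh) = mu, nu
    in_unit = 0.0 <= ml <= mh <= 1.0 and 0.0 <= nl <= nh <= 1.0
    return in_unit and mh + nh <= 1.0

def reduces_to_ifs(mu: Interval, nu: Interval) -> bool:
    """An IVIFN degenerates to an ordinary intuitionistic fuzzy value when
    both intervals contain exactly one element (inf == sup)."""
    return mu[0] == mu[1] and nu[0] == nu[1]

# ([0.4, 0.5], [0.2, 0.3]) is a valid IVIFN, since 0.5 + 0.3 <= 1.
assert is_valid_ivifn((0.4, 0.5), (0.2, 0.3))
# Point intervals reduce to the ordinary intuitionistic fuzzy case.
assert reduces_to_ifs((0.6, 0.6), (0.3, 0.3))
```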
III. ROLE OF IVIFS IN MULTI CRITERIA DECISION MAKING PROBLEMS
MCDM is concerned with structuring and solving decision and planning problems involving multiple
criteria. Typically, no unique optimal solution exists for such problems, so it is necessary to use the
decision maker's preferences to differentiate between solutions. Multiple criteria decision making (MCDM) is
often used for dealing with complex engineering problems. Cengiz Kahraman gave a very useful description of
MCDM in his book on fuzzy multi-criteria decision making [8], in which he divided MCDM problems into
two basic classes: multiple attribute decision making (MADM) and multiple objective decision making
(MODM). MADM refers to making selections among some courses of action in the presence of multiple,
usually conflicting, attributes. MODM problems, by contrast, involve the design of a "best" alternative by
considering the tradeoffs within a set of interacting design constraints; the number of alternatives is
effectively infinite, and the tradeoffs among design criteria are typically described by continuous functions.
MADM approaches can be viewed as alternative methods for combining the information in a
problem's decision matrix with additional information from the decision maker to determine a final
ranking, screening, or selection from among the alternatives; all but the simplest MADM techniques
require such additional information to arrive at a final result. In the MODM approach, contrary to the
MADM approach, the decision alternatives are not given. Instead, MODM provides a mathematical
framework for designing a set of decision alternatives. Each alternative, once identified, is judged by how
closely it satisfies one or more objectives.
Fuzzy set theory has been used to handle fuzzy decision-making problems for a long time, and many
researchers have shown interest in IFS theory and applied it to the field of decision making [9-12].
But after the introduction of IVIFSs, there has been a shift from IFS to IVIFS, and researchers have found
that interval-valued intuitionistic fuzzy sets (IVIFSs) are effective in dealing with the fuzziness and
uncertainty inherent in decision data and multi-attribute decision making (MADM).
In 2006, Chunqiao Tan and Qiang Zhang [13] presented a novel method for multiple attribute decision
making based on interval-valued intuitionistic fuzzy set (IVIFS) theory and the TOPSIS method in fuzzy
environments. In their paper, the concept of interval-valued intuitionistic fuzzy sets is introduced, and the
distance between two interval-valued intuitionistic fuzzy sets is defined. Then, following the idea of the classical
TOPSIS method, a closeness coefficient is defined to determine the ranking order of all alternatives by
calculating the distances to both the interval-valued intuitionistic fuzzy positive-ideal solution and the interval-
valued intuitionistic fuzzy negative-ideal solution. The multi-attribute decision-making process based on IVIFSs
is given in fuzzy environments.
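The closeness-coefficient step can be sketched generically. In the sketch below the distance is a normalized Hamming-style distance over the four interval endpoints of an IVIFN ([a, b], [c, d]); this is an illustrative choice with hypothetical example data, and the exact distance defined in [13] may differ:

```python
def dist(x, y):
    """Normalized Hamming-style distance between two IVIFNs ([a, b], [c, d]).
    Illustrative choice; the distance defined in [13] may differ."""
    (a1, b1), (c1, d1) = x
    (a2, b2), (c2, d2) = y
    return (abs(a1 - a2) + abs(b1 - b2) + abs(c1 - c2) + abs(d1 - d2)) / 4.0

def closeness(alt, pos_ideal, neg_ideal):
    """TOPSIS closeness coefficient: larger means nearer the positive ideal."""
    d_pos = dist(alt, pos_ideal)
    d_neg = dist(alt, neg_ideal)
    return d_neg / (d_pos + d_neg)

# Conventional interval-valued intuitionistic ideals, with hypothetical alternatives.
pos = ((1.0, 1.0), (0.0, 0.0))
neg = ((0.0, 0.0), (1.0, 1.0))
alts = {"a": ((0.6, 0.7), (0.1, 0.2)), "b": ((0.3, 0.4), (0.4, 0.5))}
# Rank alternatives by descending closeness to the positive ideal.
ranking = sorted(alts, key=lambda k: closeness(alts[k], pos, neg), reverse=True)
```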
In 2007, Xu and Chen [14] developed the ordered weighted aggregation operator and the hybrid
aggregation operator for aggregating interval-valued intuitionistic preference information. The interval-valued
intuitionistic judgment matrix and its score matrix and accuracy matrix are defined, and some of their desirable
properties are investigated in detail. The relationships among interval-valued intuitionistic judgment matrices,
intuitionistic judgment matrices and complement judgment matrices are discussed. On the basis of the arithmetic
aggregation operator and the hybrid aggregation operator, an approach to group decision making with interval-
valued intuitionistic judgment matrices is given.
In 2008, many authors contributed; their work is briefly discussed below.
Zhoujing Wang et al. [15] used interval-valued intuitionistic fuzzy matrices and interval-valued
intuitionistic fuzzy numbers, and developed several optimization models to generate optimal attribute
weights, together with the corresponding decision-making methods.
Guiwu Wei and Gang Lan [16] proposed a modified grey relational analysis (GRA) method that follows
the calculation steps of the traditional GRA method to solve interval-valued intuitionistic fuzzy multiple
attribute decision-making problems with known weight information. The degree of grey relation between every
alternative and the positive and negative ideal solutions is calculated. Then, following the idea of GRA,
a relative relational degree is defined to determine the ranking order of all alternatives by
calculating the degree of grey relation to both the positive-ideal solution (PIS) and the negative-ideal solution (NIS)
simultaneously. Wei Yang and Yongfeng Pang [17] also worked on a GRA method with IVIFSs to solve MCDM
problems.
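The core GRA quantities can be sketched as follows. This is a generic GRA computation with hypothetical distance values, not the exact formulas of [16]: each grey relational coefficient compares an alternative's distance to the ideal on one attribute against the global minimum and maximum distances, with the conventional distinguishing coefficient ρ = 0.5.

```python
def grey_coeffs(deltas, rho=0.5):
    """Grey relational coefficients from a matrix of distances to an ideal
    (rows = alternatives, columns = attributes); rho is conventionally 0.5."""
    flat = [d for row in deltas for d in row]
    dmin, dmax = min(flat), max(flat)
    return [[(dmin + rho * dmax) / (d + rho * dmax) for d in row] for row in deltas]

def relational_degree(coeffs, weights):
    """Weighted degree of grey relation of one alternative to the ideal."""
    return sum(w * c for w, c in zip(weights, coeffs))

def relative_degree(g_pos, g_neg):
    """Relative relational degree combining relation to the PIS and the NIS."""
    return g_pos / (g_pos + g_neg)

# Hypothetical distances of two alternatives to the positive ideal on three attributes.
deltas_pos = [[0.1, 0.2, 0.3], [0.4, 0.1, 0.2]]
xi = grey_coeffs(deltas_pos)
degrees = [relational_degree(row, [0.5, 0.3, 0.2]) for row in xi]
```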
Zhuo Xiao and Guiwu Wei [18] used interval-valued intuitionistic fuzzy information to deal with
supplier selection in supply chain management, in a setting where the information about attribute weights is
completely known and the attribute values take the form of interval-valued intuitionistic fuzzy numbers. A
modified TOPSIS analysis method is also proposed.
In 2009, the following contributions were made in this area:
Zeng-Tai Gong and Yan Ma [19] introduced a score function and an accuracy function for ranking
interval-valued intuitionistic fuzzy sets. They then obtain the criteria's optimal weights via a linear
programming model to rank the alternatives and select the best one(s) according to the value of the
weighting function.
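Score and accuracy functions of this kind are typically defined on the four endpoints of an IVIFN ([a, b], [c, d]). The sketch below uses Xu's widely cited definitions as a representative example; the specific functions introduced in [19] may differ in detail:

```python
def score(ivifn):
    """Score s = (a + b - c - d) / 2 of an IVIFN ([a, b], [c, d]); lies in [-1, 1]."""
    (a, b), (c, d) = ivifn
    return (a + b - c - d) / 2.0

def accuracy(ivifn):
    """Accuracy h = (a + b + c + d) / 2; used to break ties between equal scores."""
    (a, b), (c, d) = ivifn
    return (a + b + c + d) / 2.0

def rank(alternatives):
    """Rank alternatives (name -> IVIFN) by descending score, then accuracy."""
    return sorted(alternatives,
                  key=lambda k: (score(alternatives[k]), accuracy(alternatives[k])),
                  reverse=True)

# Hypothetical alternatives: "y" has the same membership interval as "x"
# but less non-membership, so it scores higher and ranks first.
alts = {"x": ((0.4, 0.5), (0.3, 0.4)), "y": ((0.4, 0.5), (0.2, 0.3))}
ordering = rank(alts)
```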
Luo Yongbiao et al. [20] used the weighted correlation coefficient of interval-valued intuitionistic
fuzzy sets (IVIFSs) to identify the best alternative in multicriteria decision-making.
Weibo Lee [21] gave a novel method for ranking interval-valued intuitionistic fuzzy numbers and its
application to decision making. A new score function is presented that takes into account the expectation of
the hesitancy degree of interval-valued intuitionistic fuzzy sets (IVIFSs), and a deviation function is defined
that considers the variance of the hesitancy degree. A new method for ranking interval-valued intuitionistic
fuzzy numbers is then established based on the score function and the deviation function. The proposed
method can overcome cases in which existing measuring functions lead to difficult or wrong decisions among
the alternatives. Later on, some other authors also worked on IVIFN ranking and proposed new
methods [22-23].
Weibo Lee [24] also gave an enhanced multicriteria decision-making method of machine design
schemes under interval-valued intuitionistic fuzzy environment.
Hongjun Wang [25] utilized the interval-valued intuitionistic fuzzy weighted averaging (IIFWA)
operator to aggregate the interval-valued intuitionistic fuzzy information corresponding to each alternative, and
then ranked the alternatives and selected the most desirable one(s) according to the score function and accuracy
function. Finally, an illustrative example about selecting an ERP system is given to verify the developed
approach and to demonstrate its practicality and effectiveness.
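The IIFWA operator is commonly given in a multiplicative form: for IVIFNs ([aᵢ, bᵢ], [cᵢ, dᵢ]) with weights wᵢ summing to 1, the aggregate is ([1 − Π(1 − aᵢ)^wᵢ, 1 − Π(1 − bᵢ)^wᵢ], [Π cᵢ^wᵢ, Π dᵢ^wᵢ]). The sketch below implements this standard form with made-up values; it is an illustration of the operator as commonly defined (following Xu), not code from [25]:

```python
from math import prod

def iifwa(ivifns, weights):
    """Interval-valued intuitionistic fuzzy weighted average of IVIFNs
    ([a, b], [c, d]) with weights summing to 1, in the standard multiplicative form."""
    a = 1.0 - prod((1.0 - x[0][0]) ** w for x, w in zip(ivifns, weights))
    b = 1.0 - prod((1.0 - x[0][1]) ** w for x, w in zip(ivifns, weights))
    c = prod(x[1][0] ** w for x, w in zip(ivifns, weights))
    d = prod(x[1][1] ** w for x, w in zip(ivifns, weights))
    return ((a, b), (c, d))

# Idempotency: aggregating two copies of the same value returns that value.
v = ((0.5, 0.6), (0.2, 0.3))
agg = iifwa([v, v], [0.5, 0.5])
```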
Zhou-Jing Wang et al. [26] found that interval-valued intuitionistic preference relations are a powerful
means of expressing a decision maker's uncertainty and hesitation about preferences over criteria in the process
of multi-criteria decision making. In their paper, they established goal programming models for generating
priority interval weights based on interval-valued intuitionistic preference relations.
In 2010, authors proposed various programming models to solve multi-attribute decision-making (MADM)
problems using IVIFSs. Their work is briefly described below.
Deng-Feng Li [27-28] developed linear- and nonlinear-programming methodologies based on the
technique for order preference by similarity to ideal solution (TOPSIS) to solve MADM problems in which
both the ratings of alternatives on attributes and the weights of attributes are expressed with IVIF sets.
Zhoujing Wang and Jianhui Xu [29] derived several fractional programming models from TOPSIS
to determine the fuzzy relative closeness intervals of alternatives, and developed the corresponding
decision-making method. The feasibility and effectiveness of the proposed method are illustrated with an
investment decision problem. Zhoujing Wang et al. [30] also worked on a linear programming method for
interval-valued intuitionistic fuzzy multi-attribute group decision making (MAGDM).
Yuan Yu et al. [31] proposed an approach that derives a linear program for determining attribute weights.
The weights are subsequently used to synthesize individual IVIFN assessments into an aggregated IVIFN value
for each alternative. In order to rank alternatives based on their aggregated IVIFN values, a novel method is
developed for comparing two IVIFNs by introducing a formula for the possibility degree and the ranking
vector of the possibility degree matrix.
In 2011, many researchers combined IVIFSs with different methods to deal with MADM problems more
efficiently.
YingJun Zhang et al. [32] proposed a new axiomatic definition of entropy on interval-valued
intuitionistic fuzzy sets (IVIFSs) and a method to construct different entropies on IVIFSs. They also developed a
new multi-attribute decision making (MADM) method based on similarity measures using entropy-based
attribute weights to deal with situations where the alternatives on attributes are expressed by IVIFSs and the
attribute weights information is unknown.
Xiong Wei and Li Jinlong [33] proposed a group decision-making model for emergency decision
making based on interval-valued intuitionistic fuzzy sets. By gathering the experts' evaluations in interval-
valued intuitionistic fuzzy matrices, they measure the consistency level of the experts' judgments and present a
way to obtain the expert weights, then rank the contingency plans through an ideal point method based on the
Hamming distance, a new way of handling the interval-valued intuitionistic fuzzy representation of the experts'
opinions. Finally, a case study on fire disaster decision making is conducted to illustrate the feasibility and
effectiveness of the model.
Shyi-Ming Chen and Li-Wei Lee [34] used Karnik-Mendel algorithms to propose the interval-valued
intuitionistic fuzzy weighted average operator. They also proposed a fuzzy ranking method for interval-valued
intuitionistic fuzzy values based on likelihood-based comparison relations between intervals. Using the two
proposed concepts, they introduced a new method for multi-attribute decision making.
Zhao Zhitao and Zhang Yingjun [35] developed a method based on the accuracy function to solve
MADM problems in which both the ratings of alternatives on attributes and the weights of attributes are
expressed with IVIFSs. V. Lakshmana Gomathi Nayagam et al. [36] proposed a new method for ranking
interval-valued intuitionistic fuzzy sets and compared it with other methods, using an illustrative example to
verify the developed approach and to demonstrate its practicality and effectiveness.
Some other researchers, such as ZuBei Ying et al. [37] and Ting-Yu Chen et al. [38], also worked on
multiple attribute group decision making based on interval-valued intuitionistic fuzzy sets.
In 2012, only a few researchers worked in this area; their work is briefly described here:
Jian-qiang Wang et al. [39] analysed in their article the limitations of existing score functions for
intuitionistic fuzzy sets. They introduced a new score function based on the prospect value function and used
this prospect score function to develop an interval-valued intuitionistic fuzzy multi-criteria decision-making
approach. This approach yields a matrix of score function values and a comprehensive evaluation value for
each alternative, and the alternatives are ordered by comparing their projection values onto the positive ideal
solution.
The study by Dejian Yu et al. [40] investigates group decision making under an interval-valued
intuitionistic fuzzy environment in which the attributes and experts are at different priority levels. They first
propose some interval-valued intuitionistic fuzzy aggregation operators, such as the interval-valued intuitionistic
fuzzy prioritized weighted average (IVIFPWA) operator and the interval-valued intuitionistic fuzzy prioritized
weighted geometric (IVIFPWG) operator. These operators can capture the prioritization phenomenon among
the aggregated arguments. An approach to multi-criteria group decision making under an interval-valued
intuitionistic fuzzy environment, based on the proposed operators, is also given.
IV. CONCLUSION
Decision makers face many problems involving incomplete and vague information, since the
characteristics of such problems often require this kind of information. Fuzzy approaches are therefore
suitable when the modelling of human knowledge is necessary and when human evaluations are needed.
Among the several generalizations of fuzzy set theory proposed for various purposes, the notions introduced
by Atanassov in defining intuitionistic fuzzy sets and by Atanassov and Gargov in defining interval-valued
intuitionistic fuzzy sets (IVIFSs) are particularly interesting and useful in modelling real-life problems.
IVIFSs play a vital role in decision making, data analysis, artificial intelligence and socio-economic systems.
Considering this an interesting area, many researchers have worked on decision-making problems based on
interval-valued intuitionistic fuzzy sets. In this paper we have discussed the major papers published in this
area by different authors and given a brief introduction to the approaches and methods they propose.
REFERENCES
[1]. K.T. Atanassov, Intuitionistic fuzzy sets [J]. Fuzzy Sets and Systems,1986,20: 87-96.
[2]. K.T. Atanassov, G.Gargov, Interval valued intuitionistic fuzzy sets [J],Fuzzy Sets and Systems. 1989,
31, 343-349.
[3]. E. Herrera-Viedma, F. Chiclana, F. Herrera, S. Alonso, A group decision-making model with
incomplete fuzzy preference relations based on additive consistency, IEEE Transactions on Systems,
Man and Cybernetics-Part B, in press.
[4]. G. Deschrijver, E.E. Kerre, on the composition of intuitionistic fuzzy relations, Fuzzy Sets and Systems
136 (2003) 333–361.
[5]. F. Herrera, L. Martinez, P.J. Sanchez, Managing non-homogeneous information in group decision
making, European Journal of Operational Research 166 (2005) 115–132.
[6]. E. Szmidt, J. Kacprzyk, Group decision making under intuitionistic fuzzy preference relations, in:
Proceedings of 7th IPMU Conference, Paris 1998, pp. 172–178.
[7]. E. Szmidt, J. Kacprzyk, Using intuitionistic fuzzy sets in group decision making, Control and
Cybernetics 31 (2002) 1037–1053.
[8]. Cengiz Kahraman, "Fuzzy Multi-Criteria Decision Making: Theory and Applications with Recent
Developments" Series: Springer Optimization and Its Applications, August 19, 2008.
[9]. E. Szmidt, J. Kacprzyk, Intuitionistic fuzzy sets in group decision making, NIFS 2 (1) (1996) 15–32.
[10]. E. Szmidt, J. Kacprzyk, Remarks on some applications of intuitionistic fuzzy sets in decision making,
NIFS 2 (3) (1996)22–31.
[11]. E. Szmidt, J. Kacprzyk, Group decision making via intuitionistic fuzzy sets, FUBEST‟96, Sofia,
Bulgaria, October 9–11, 1996, pp. 107–112.
[12]. E. Szmidt, J. Kacprzyk, Intuitionistic fuzzy sets for more realistic group decision making, International
Conference on Transition to Advanced Market Institutions and economies, Warsaw, June 18–21, 1997,
pp. 430–433.
[13]. C. Tan, Q. Zhang, Fuzzy Multiple Attribute Decision Making Based on Interval Valued Intuitionistic
Fuzzy Sets, Systems, Man and Cybernetics, 2006. Volume 2, 2006 , 1404- 1407.
[14]. Ze. Xu, J. Chen, Approach to Group Decision Making Based on Interval-Valued Intuitionistic
Judgment Matrices, Systems Engineering - Theory & Practice, Volume 27, Issue 4, April 2007,
Pages 126-133.
[15]. Z. Wang; W. Wang; Lie, K.W., Multi-attribute decision making models and methods under interval-
valued intuitionistic fuzzy environment, Control and Decision Conference, (2008) 2420- 2425.
[16]. G. Wei; G. Lan „Grey Relational Analysis Method for Interval-Valued Intuitionistic Fuzzy Multiple
Attribute Decision Making‟, Fuzzy Systems and Knowledge Discovery, 2008.Volume:1 2008 , 291-
295.
[17]. W. Yang; Y. Pang, A new MCDM method based on GRA in interval-valued intuitionistic fuzzy
setting, Computer Science and Information Technology (ICCSIT), Volume 4 , 2010 , Page(s): 299-
302.
[18]. Z. Xiao; G. We „Application Interval-Valued Intuitionistic Fuzzy Set to Select Supplier‟, Fuzzy
Systems and Knowledge Discovery Volume: 3, 2008, 351- 355.
[19]. Zeng-Tai Gong; Y. Ma, Multicriteria fuzzy decision making method under interval-valued intuitionistic
fuzzy environment, Machine Learning and Cybernetics, 2009 Volume 2, 2009 ,728-731.
[20]. L. Yongbiao; Y. Jun; M. Xiaowen, „Multicriteria fuzzy decision-making method based on weighted
correlation coefficients under interval-valued intuitionistic fuzzy environment‟, Computer-Aided
Industrial Design & Conceptual Design, 2009. 2057- 2060
[21]. W. Lee „A Novel Method for Ranking Interval-Valued Intuitionistic Fuzzy Numbers and Its
Application to Decision Making‟, Intelligent Human-Machine Systems and Cybernetics, 2009 , 282-
285.
[22]. V. L. G. Nayagam, G. Sivaraman „Ranking of interval-valued intuitionistic fuzzy sets‟ , Applied Soft
Computing, Volume 11, Issue 4, June 2011, Pages 3368-3372
[23]. Shyi-Ming Chen; Ming-Wey Yang; Churn-Jung Liau „A new method for multicriteria fuzzy decision
making based on ranking interval-valued intuitionistic fuzzy values‟ Machine Learning and
Cybernetics (ICMLC), 2011, Page(s): 154 - 159
[24]. W. Lee, An enhanced multicriteria decision-making method of machine design schemes under interval-
valued intuitionistic fuzzy environment, Computer-Aided Industrial Design & Conceptual Design,
2009, 721-725.
[25]. H. Wang, „Model for Selecting an ERP system based on IIFWA operator under interval-valued
intuitionistic fuzzy environment‟, Industrial Mechatronics and Automation, 2009, 390- 393.
[26]. Z. Wang; W. Wang; Li, K.W „A goal programming method for generating priority weights based on
interval-valued intuitionistic preference relations‟, Machine Learning and Cybernetics, 2009 , Page(s):
1309 - 1314
[27]. D. Li, „TOPSIS-Based Nonlinear-Programming Methodology for Multiattribute Decision Making With
Interval-Valued Intuitionistic Fuzzy Sets‟ Fuzzy Systems,Volume:18, Issue: 2, 2010 ,299- 311
[28]. D. Li ,Linear programming method for MADM with interval-valued intuitionistic fuzzy sets ,Expert
Systems with Applications, Volume 37, Issue 8, August 2010, Pages 5939-5945
[29]. Z. Wang, J. Xu ,A fractional programming method for interval-valued intuitionistic fuzzy multi-
attribute decision making, Control and Decision Conference (CCDC), 2010, 636- 641.
[30]. Z. Wang; L. Wang; Li, K.W. , A linear programming method for interval-valued intuitionistic fuzzy
multi attribute group decision making Control and Decision Conference (CCDC), 2011, Page(s): 3833-
3838.
[31]. Y. Yu; Li Yi-jun; J. Wei; Z. Wang, A method based on possibility degree for interval-valued
intuitionistic fuzzy decision making, Management Science and Engineering (ICMSE), 2010 , 110- 115.
[32]. Y. Zhang; P. Ma; X. Su; C. Zhang, „Entropy on interval-valued intuitionistic fuzzy sets and its
application in multi-attribute decision making‟ Information Fusion (FUSION), 2011 , Page(s): 1- 7.
[33]. X. Wei and Li Jinlong, „An emergency group decision-making model based on interval-valued
intuitionistic fuzzy set‟, Artificial Intelligence, Management Science and Electronic Commerce
(AIMSEC), 2011 , Page(s): 4931 - 4934.
[34]. S. Chen; L. Lee, „A new method for multiattribute decision making using interval-valued intuitionistic
fuzzy values‟, Machine Learning and Cybernetics (ICMLC), 2011 , 148- 153.
[35]. Z. Zhitao; Z. Yingjun, „Multiple attribute decision making method in the frame of interval-valued
intuitionistic fuzzy sets‟ Fuzzy Systems and Knowledge Discovery (FSKD)Volume:1 2011 ,192- 196
[36]. V. L. G. Nayagam, S. Muralikrishnan, G. Sivaraman, „Multi-criteria decision-making method based on
interval-valued intuitionistic fuzzy sets „Expert Systems with Applications, Volume 38, Issue 3, March
2011, Pages 1464-1467
[37]. Z. Ying; F. Ye; J. Li; Z. Hong, Multiple attribute group decision making with incomplete information
based on interval-valued intuitionistic fuzzy sets theory‟, Multimedia Technology (ICMT), 2011 ,
Page(s): 433- 436
[38]. T. Chen, H. Wang, Y. Lu, „A multicriteria group decision-making approach based on interval-valued
intuitionistic fuzzy sets‟, Expert Systems with Applications, Volume 38, Issue 6, June 2011, Pages
7647-7658.
[39]. J. Wang, K. Li, H. Zhang, „Interval-valued intuitionistic fuzzy multi-criteria decision-making approach
based on prospect score, Knowledge-Based Systems, Volume 27, March 2012, Pages 119-125.
[40]. D. Yu, Y. Wu, T. Lu, „Interval-valued intuitionistic fuzzy prioritized operators and their application in
group decision making‟, Knowledge-Based Systems, Volume 30, June 2012, Pages 57-66