The fuzzy soft set approach plays a crucial role in decision making when combined with the Dempster–Shafer theory of evidence. First, the uncertainty degrees of the several parameters are obtained via grey relational analysis, which is applied to calculate the grey mean relational degree. Secondly, mass functions for the different independent choices over the several parameters are assigned according to these uncertainty degrees. Lastly, Dempster's rule of evidence combination is utilized to aggregate the individual choices into a collective choice. This soft-computing-based method has been applied to a decision-making problem.
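The final aggregation step can be sketched in code. The following is a minimal, illustrative implementation of Dempster's rule of combination for two mass functions over a small frame of discernment; the choice labels and mass values are hypothetical, not taken from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass)
    by Dempster's rule: m(A) is proportional to the sum of
    m1(B)*m2(C) over all B, C with B intersect C = A, normalized
    by 1 - K, where K is the total conflicting (empty-set) mass."""
    combined = {}
    conflict = 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("Total conflict: evidence cannot be combined")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Two independent bodies of evidence over hypothetical choices c1, c2
m1 = {frozenset({"c1"}): 0.6, frozenset({"c1", "c2"}): 0.4}
m2 = {frozenset({"c1"}): 0.5, frozenset({"c2"}): 0.3,
      frozenset({"c1", "c2"}): 0.2}
m12 = dempster_combine(m1, m2)
```

After combination the masses again sum to 1, and the choice best supported by both pieces of evidence (here c1) receives the largest mass.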
1. Fuzzy neural networks
2. The fuzzy neural network (FNN) is an architecture that combines a
standard MLP network with fuzzy logic in one system. The FNN
consists of multiple layers of neurons with partial feed-forward
connections. The input layer of neurons represents the input
variables as crisp values. These values are fed to the condition
layer, which performs fuzzification using triangular membership
functions. The output from the condition layer is propagated to
the rule layer. The rule layer matches, in its structure and function,
a hidden layer of a standard MLP network. The difference is
that in an FNN each node in the rule layer represents one fuzzy rule.
The semantic sense of the activation of a rule-layer node is that it
symbolizes the degree to which the input data matches the
antecedent part of the associated rule. Outputs from the rule
layer are fed to the action-component layer. As for the rule layer,
the operation and the construction of the action layer are identical
to the standard hidden layers of MLP networks.
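The forward pass described above can be sketched as follows. This is an illustrative example only: the two inputs, the triangular set parameters, and the use of the min t-norm for rule matching are assumptions, not details from the slides.

```python
def triangular(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Input layer: crisp input values
x1, x2 = 0.3, 0.7

# Condition layer: fuzzify each crisp input with triangular sets
# (LOW peaks at 0, HIGH peaks at 1; illustrative parameters)
mu = {
    ("x1", "LOW"):  triangular(x1, -1.0, 0.0, 1.0),
    ("x1", "HIGH"): triangular(x1,  0.0, 1.0, 2.0),
    ("x2", "LOW"):  triangular(x2, -1.0, 0.0, 1.0),
    ("x2", "HIGH"): triangular(x2,  0.0, 1.0, 2.0),
}

# Rule layer: one node per fuzzy rule; the activation is the degree
# to which the inputs match the rule's antecedents (min t-norm here)
rules = [
    (("x1", "LOW"),  ("x2", "LOW")),
    (("x1", "LOW"),  ("x2", "HIGH")),
    (("x1", "HIGH"), ("x2", "LOW")),
    (("x1", "HIGH"), ("x2", "HIGH")),
]
activations = [min(mu[a] for a in antecedents) for antecedents in rules]
```

With x1 = 0.3 and x2 = 0.7, the rule "x1 is LOW and x2 is HIGH" fires most strongly, because both of its antecedent memberships are high.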
3. Main Points:
Fuzzy neural networks
PNNs
Match degree criterion, minimum believable level
4. Our fuzzy neural networks are related to
PNNs. Let there be K classes and let x be any
feature vector from the population of interest to be
classified. The Class k example feature vectors
are denoted x(q(k)) for q(k) = 1,...,Q(k).
Fuzzy logic uses truth values between 0 and 1,
so the output values f1(x) and f2(x) are the fuzzy
truths that the input vector belongs to Class 1
and Class 2, respectively. That is, the outputs are
the values of fuzzy set membership functions
whose functional values are the fuzzy truths of
membership in each of the classes.
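A minimal sketch of this PNN-style computation follows, assuming (as PNNs usually do) that each class's fuzzy truth is a sum of Gaussian kernels centred on that class's example vectors; the example data and spread parameter are hypothetical.

```python
import math

def fuzzy_truths(x, examples_by_class, sigma=1.0):
    """For each class k, sum Gaussian kernels centred on its example
    feature vectors x(q(k)); the result plays the role of the fuzzy
    truth f_k(x) that x belongs to Class k. Note the sum is not
    normalized, so it can exceed 1."""
    truths = []
    for examples in examples_by_class:
        total = 0.0
        for ex in examples:
            d2 = sum((xi - ei) ** 2 for xi, ei in zip(x, ex))
            total += math.exp(-d2 / (2 * sigma ** 2))
        truths.append(total)
    return truths

# Hypothetical two-class problem in a 2-D feature space
class1 = [[0.0, 0.0], [0.2, 0.1]]
class2 = [[1.0, 1.0], [0.9, 1.1]]
f = fuzzy_truths([0.1, 0.0], [class1, class2], sigma=0.5)
```

The query point lies near the Class 1 examples, so f[0] dominates; note that the summed value can exceed 1, which is exactly the drawback the alternative scheme below addresses.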
5. The feature vector x belongs to the class with the maximum fuzzy
truth value. When there is a clear winner, x belongs to a
single class, but otherwise it may belong to more than one class
with given relative fuzzy truths. No training of weights is
required. The feature vectors can be trimmed here and the
spread parameters tuned for efficiency. An alternative way is
to find the values of all the Gaussians, f1(x),...,fM(x), but not sum
the ones for each class. Instead, we compute the values of the
multiple Gaussians for each class and take the class maximum
value as the output from the output node. Since we can have multiple
fuzzy set memberships, the input feature vector may belong to
Class 1 with a fuzzy value of 0.62 and also belong to
Class 3 with a fuzzy value of 0.57. If the data are good and not
noisy, one fuzzy set will be a clear winner. This alternative method
of defining the fuzzy truths for each of the K classes guarantees
that the fuzzy truth of membership in any class is between 0 and
1, whereas the first method above could permit the fuzzy
truth to exceed 1.
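The alternative scheme, taking the per-class maximum instead of the sum, can be sketched like this (same hypothetical data and kernel assumptions as before):

```python
import math

def fuzzy_truths_max(x, examples_by_class, sigma=1.0):
    """Alternative scheme: take the maximum Gaussian value within
    each class rather than the sum. Since each Gaussian is at most 1,
    every fuzzy truth is guaranteed to lie in [0, 1]."""
    truths = []
    for examples in examples_by_class:
        best = 0.0
        for ex in examples:
            d2 = sum((xi - ei) ** 2 for xi, ei in zip(x, ex))
            best = max(best, math.exp(-d2 / (2 * sigma ** 2)))
        truths.append(best)
    return truths

# Same hypothetical two-class data as before
class1 = [[0.0, 0.0], [0.2, 0.1]]
class2 = [[1.0, 1.0], [0.9, 1.1]]
g = fuzzy_truths_max([0.1, 0.0], [class1, class2], sigma=0.5)
```

Each output is now a genuine fuzzy membership value in [0, 1]; x can still belong to several classes at once, and the class with the largest value is the winner.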
6. There are many types of constructs for fuzzy NNs.
The one described here uses fuzzy set membership
functions for each feature. For example, the
range of values of each of the N features serves
as the domain for 3 fuzzy set membership functions
covering 3 separate ranges of values. Given the value x of
a single feature from an input vector, we put it
through the 3 fuzzy set membership functions,
called LOW, MEDIUM, and HIGH, for that feature. A
value x1 then yields the fuzzy truth of
membership in the LOW fuzzy set, fLOW(x1), and the
fuzzy truth of membership in
MEDIUM, fMEDIUM(x1).
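A concrete sketch of the three overlapping sets for one feature, assuming (for illustration only) a feature range of [0, 10] and triangular membership functions:

```python
def triangular(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Three overlapping fuzzy sets covering a feature range of [0, 10]
# (hypothetical breakpoints; a real system would fit these to the data)
def f_low(x):    return triangular(x, -5.0, 0.0, 5.0)
def f_medium(x): return triangular(x, 0.0, 5.0, 10.0)
def f_high(x):   return triangular(x, 5.0, 10.0, 15.0)

x1 = 3.0
memberships = {"LOW": f_low(x1), "MEDIUM": f_medium(x1), "HIGH": f_high(x1)}
```

A value of 3.0 is partly LOW and partly MEDIUM, with zero membership in HIGH, matching the overlapping-ranges idea in the text.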
7. Similarly, we do this for each of the other
features. We then have a set of fuzzy
membership truths, which are the values at the hidden
nodes. These are compared to the fuzzy truths of the
feature in the fuzzy set for each class. There are
many possibilities for doing this. One way is to
let each feature vote for the class whose fuzzy set
membership is closest, with the majority vote the
winner in determining the class. The
advantage of this is that noise on a
small number of features does not decide the
winner. Other schemes are used and many
more are possible.
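The voting scheme above can be sketched as follows; the per-class prototype membership values and the five-feature example are hypothetical, chosen to show that one noisy feature cannot outvote the rest.

```python
def vote_classify(feature_memberships, class_prototypes):
    """Each feature votes for the class whose prototype membership is
    closest to the observed membership; the majority vote wins.
    feature_memberships: per-feature fuzzy truths for the input x.
    class_prototypes: dict mapping class label -> per-feature
    prototype fuzzy truths for that class."""
    votes = {}
    for i, mu in enumerate(feature_memberships):
        winner = min(class_prototypes,
                     key=lambda k: abs(class_prototypes[k][i] - mu))
        votes[winner] = votes.get(winner, 0) + 1
    return max(votes, key=votes.get)

# Hypothetical: 5 features, 2 classes; the last feature is noisy,
# but the four clean features carry the majority vote
x_mu = [0.9, 0.8, 0.85, 0.9, 0.1]
protos = {"A": [0.9, 0.9, 0.9, 0.9, 0.9],
          "B": [0.1, 0.1, 0.1, 0.1, 0.1]}
result = vote_classify(x_mu, protos)
```

The noisy fifth feature votes for class B, but the other four features vote for class A, so A wins: exactly the robustness-to-noise property the slide claims.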