1/72
ROUGH SET AND FUZZY ROUGH
SETS IN DECISION MAKING
Dr. A.Tamilarasi
Professor/MCA
Kongu Engineering College
Perundurai, Erode 638052
TAMILNADU
2/72
Outline
 Introduction
 Rough sets
 Basic Notations
 Fuzzy sets and Fuzzy Logic
 Fuzzy Rough sets
 Rough sets in Decision Making
3/72
Introduction - Rough sets
One goal of Knowledge Discovery is to extract meaningful
knowledge.
Rough set theory was introduced by Z. Pawlak (1982) as
a mathematical tool for data analysis.
Rough sets have many applications in the fields of
Knowledge Discovery, feature selection, banking, etc.
Rough sets have been introduced as a tool to deal with
uncertain knowledge in Artificial Intelligence applications.
4/72
Equivalence Relation and Equivalence class
 A relation on a set X is subset of X×X.
Let X be a set and let x, y, and z be elements of X. An
equivalence relation R on X is a relation on X such that:
Reflexive Property: xRx for all x in X.
Symmetric Property: if xRy, then yRx.
Transitive Property: if xRy and yRz, then xRz.
 Let R be an equivalence relation on a set X. For each a ∈ X,
we define the equivalence class of a, denoted by [a], to be
the set [a] = {x ∈ X : x R a}. The equivalence classes form a
partition of X. This partition, the set of equivalence classes,
is sometimes called the quotient set of X by R and is denoted
by X / R; in rough set theory the equivalence classes are also
called elementary sets.
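The quotient set can be sketched in a few lines of Python when the relation is induced by a key function (x R y iff key(x) = key(y)); the names `quotient_set` and `key` are illustrative, not from the slides.

```python
# Sketch: computing the equivalence classes (the quotient set X/R) when the
# equivalence relation R is induced by a key function, i.e. x R y iff key(x) == key(y).
from collections import defaultdict

def quotient_set(X, key):
    """Partition X into equivalence classes of the relation x R y <=> key(x) == key(y)."""
    classes = defaultdict(set)
    for x in X:
        classes[key(x)].add(x)
    return list(classes.values())

# Example: integers modulo 3, so x R y iff x mod 3 == y mod 3.
print(quotient_set(range(6), key=lambda x: x % 3))
# Three classes: {0, 3}, {1, 4}, {2, 5}
```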
5/72
Rough Sets Theory
Let T = (U, A, C, D,), be a Decision system data, where:
U is a non-empty, finite set called the universe, A is a
non-empty finite set of attributes, C and D are subsets of
A, Conditional and Decision attributes subsets
respectively.  : U C  A information function
for a  A, Va is called the value set of a ,
The elements of U are objects, cases, states,
observations.
The Attributes are interpreted as features, variables,
characteristics conditions, etc.
aVUa :
6/72
Information table: Example 1
Let U = {x1, x2, x3, x4, x5, x6} be the universe. C = {a1, a2,
a3, a4} is the set of condition attributes, with V1 = {good, medium,
bad}, V2 = {good, bad}, V3 = {good, bad}, V4 = {good, bad}.
 The information function gives, e.g., a1(x1) = good, and so on.
Student a1 a2 a3 a4
x1 good good bad good
x2 Medium bad bad bad
x3 Medium bad bad good
x4 bad bad bad bad
x5 Medium good good bad
x6 good bad good good
7/72
 If, in the set of attributes A, condition attributes C = {a1,
a2, a3} and decision attribute D = {a4} are distinguished, the
data table can be seen as a decision table.
 In order to explain the evaluations of the decision
attribute in terms of the evaluations of the condition
attributes, one can represent the data table as a set of
decision rules. Such a representation gives, for example, the
following rule:
 If the level in Mathematics is good and the level in
Physics is good and the level in Literature is bad, then
the student is good.
8/72
information table – Example 2
H M T F
p1 No Yes High Yes
p2 Yes No High Yes
p3 Yes Yes V. High Yes
p4 No Yes Normal No
p5 Yes No High No
p6 No Yes V. High Yes
Columns of the table are labelled by the attributes Headache (H), Muscle-
pain (M), Temperature (T) and Flu (F), and rows by the patients (objects) p1,
p2, p3, p4, p5, p6. Each row of the table can be seen as information about a
specific patient. For example, patient p2 is characterized by the attribute-value
set {(Headache, yes), (Muscle-pain, no), (Temperature, high), (Flu, yes)}.
9/72
R-lower approximation & R-upper
approximation
Let X ⊆ U and R ⊆ C, where R is a subset of the condition features. The R-
lower approximation of X is the set of all elements of U which
can be classified with certainty as elements of X:
R↓X = ∪{ Y ∈ U/R : Y ⊆ X }
 The R-lower approximation of X is a subset of X. The R-upper
approximation of X is the set of all elements of U such that their class
intersects X:
R↑X = ∪{ Y ∈ U/R : Y ∩ X ≠ ∅ }
X is a subset of its R-upper approximation. The R-upper approximation
contains all objects which can possibly be classified as belonging to X.
The R-boundary set of X is defined as:
BNR(X) = R↑X − R↓X
10/72
Information system - Example

Object  P1  P2  P3  P4  P5
O1      1   2   0   1   1
O2      1   2   0   1   1
O3      2   0   0   1   0
O4      0   0   1   2   1
O5      2   1   0   2   1
O6      0   0   1   2   2
O7      2   0   0   1   0
O8      0   1   2   2   1
O9      2   1   0   2   2
O10     2   0   0   1   0
4.11.2016
When the full set of attributes P = {P1, P2,
P3, P4, P5} is considered, we have the
following seven equivalence classes:
{O1, O2}, {O3, O7, O10}, {O4}, {O5}, {O6},
{O8}, {O9}.
It is apparent that different attribute subset
selections will in general lead to different
indiscernibility classes. For example, if the
attribute subset P = {P1} alone is selected, we
obtain the following three classes:
{O1, O2},
{O3, O5, O7, O9, O10},
{O4, O6, O8}
11/72
 Consider the target set X = {O1, O2, O3, O4}, and let the attribute
subset P = {P1, P2, P3, P4, P5} be the full available set of features. It
will be noted that the set X cannot be expressed exactly, because in
[x]P the objects {O3, O7, O10} are indiscernible. Thus, there is no
way to represent any set X which includes O3 but excludes the objects
O7 and O10.
However, the target set X can be approximated using only the
information contained within P by constructing the P-lower and P-
upper approximations of X:
The P-lower approximation, or positive region, is the union of all
equivalence classes in [x]P which are contained by (i.e., are subsets
of) the target set; in the example, P↓X = {O1, O2} ∪ {O4}, the union of
the two equivalence classes in [x]P which are contained in the target
set. The lower approximation is the complete set of objects in U/P
positively (i.e., unambiguously) classified as belonging to the target set X.
12/72
Upper approximation and negative region
 The P-upper approximation is the union of all equivalence classes in
[x]P which have non-empty intersection with the target set; in the
example, P↑X = {O1, O2} ∪ {O4} ∪ {O3, O7, O10}, the union of the
three equivalence classes in [x]P that have non-empty intersection
with the target set.
 The upper approximation is the complete set of objects in U/P
that cannot be positively (i.e., unambiguously) classified as
belonging to the complement (X̄) of the target set X.
 In other words, the upper approximation is the complete set of
objects that are possibly members of the target set X.
 The set U − P↑X therefore represents the negative region, containing
the set of objects that can be definitely ruled out as members of the
target set.
13/72
Indiscernibility Relation
The indiscernibility relation IND(P) is an equivalence
relation.
Let a ∈ A and P ⊆ A. The indiscernibility relation IND(P) is
defined as follows:
IND(P) = {(x, y) ∈ U × U : for all a ∈ P, a(x) = a(y)}
The indiscernibility relation defines a partition of U. For
P ⊆ A, U/IND(P) denotes the family of all equivalence
classes of the relation IND(P), called elementary sets.
Two other families of equivalence classes, U/IND(C) and U/IND(D),
called condition and decision equivalence classes
respectively, can also be defined.
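A sketch of IND(P) in Python, using the O1-O10 table from the earlier example slide; attributes are addressed here by 0-based index, an assumption of this sketch:

```python
# Sketch: the partition U/IND(P), grouping objects with equal attribute
# vectors on P. The data is the O1-O10 information system from the slides.
from collections import defaultdict

table = {
    'O1': (1, 2, 0, 1, 1), 'O2': (1, 2, 0, 1, 1), 'O3': (2, 0, 0, 1, 0),
    'O4': (0, 0, 1, 2, 1), 'O5': (2, 1, 0, 2, 1), 'O6': (0, 0, 1, 2, 2),
    'O7': (2, 0, 0, 1, 0), 'O8': (0, 1, 2, 2, 1), 'O9': (2, 1, 0, 2, 2),
    'O10': (2, 0, 0, 1, 0),
}

def ind(table, attrs):
    """Equivalence classes of IND(P): x ~ y iff a(x) == a(y) for all a in P."""
    classes = defaultdict(set)
    for obj, row in table.items():
        classes[tuple(row[a] for a in attrs)].add(obj)
    return sorted(classes.values(), key=min)

print(ind(table, [0, 1, 2, 3, 4]))   # the seven classes from the slide
print(ind(table, [0]))               # the three classes for P = {P1}
```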
14/72
Representation of the approximation sets
If R↓X = R↑X, then X is R-definable (the boundary set is empty).
If R↓X ≠ R↑X, then X is rough with respect to R.
The accuracy of the approximation is
αR(X) = Card(R↓X) / Card(R↑X)
15/72
Example
Let us consider U = {x1, x2, x3, x4, x5, x6, x7, x8} and the equivalence
relation R with the equivalence classes
X1 = {x1, x3, x5}, X2 = {x2, x4} and X3 = {x6, x7, x8}, which form a partition.
Let the classification C = {Y1, Y2, Y3} be such that
Y1 = {x1, x2, x4}, Y2 = {x3, x5, x8}, Y3 = {x6, x7}.
Only Y1 has a non-empty lower approximation:
R↓Y1 = X2 = {x2, x4}
16/72
Let us illustrate the above definitions with Example 2.
 Consider the concept "flu", i.e., the set X = {p1, p2, p3, p6}, and
the set of attributes B = {Headache, Muscle-pain,
Temperature}. Hence
 B↓X = {p1, p3, p6} and B↑X = {p1, p2, p3, p5, p6}. For this case
we get αB("flu") = 3/5. It means that the concept "flu" can be
characterized only partially by the symptoms Headache,
Muscle-pain and Temperature. Taking only one symptom, B =
{Headache}, we get
 B↓X = ∅, B↑X = U and αB("flu") = 0, which means that the concept
"flu" cannot be characterized in terms of the attribute Headache
only, i.e., this attribute is not characteristic for flu whatsoever.
However, taking the attribute B = {Temperature} we get B↓X = {p3,
p6}, B↑X = {p1, p2, p3, p5, p6} and αB(X) = 2/5, which means that
the single symptom Temperature is less characteristic for flu
than the whole set of symptoms, but still characterizes flu
partially.
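The flu example can be checked with a short sketch that builds the B-indiscernibility classes and the two approximations (the helper name `approximations` is illustrative):

```python
# Sketch verifying the flu example: B-lower/upper approximations and the
# accuracy alpha_B computed from the patient table of Example 2.
from collections import defaultdict

patients = {
    'p1': ('No', 'Yes', 'High'),   'p2': ('Yes', 'No', 'High'),
    'p3': ('Yes', 'Yes', 'VHigh'), 'p4': ('No', 'Yes', 'Normal'),
    'p5': ('Yes', 'No', 'High'),   'p6': ('No', 'Yes', 'VHigh'),
}
flu = {'p1', 'p2', 'p3', 'p6'}   # the concept X = "flu"

def approximations(table, attrs, X):
    classes = defaultdict(set)
    for obj, row in table.items():
        classes[tuple(row[a] for a in attrs)].add(obj)
    lower = set().union(*([Y for Y in classes.values() if Y <= X] or [set()]))
    upper = set().union(*([Y for Y in classes.values() if Y & X] or [set()]))
    return lower, upper

lo, up = approximations(patients, [0, 1, 2], flu)   # B = {H, M, T}
print(lo, up, len(lo) / len(up))    # accuracy 3/5

lo_t, up_t = approximations(patients, [2], flu)     # B = {Temperature}
print(len(lo_t) / len(up_t))        # accuracy 2/5
```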
17/72
Positive region and Reduct
Positive region
POSR(D), the positive region of the classification
CLASST(D), is equal to the union of the lower approximations
of all decision classes.
Reducts are defined as minimal subsets of condition
attributes which preserve the positive region defined by the
set of all condition attributes, i.e.
a subset R is a relative reduct iff
1. R ⊆ C and POSR(D) = POSC(D);
2. for every proper subset R' ⊂ R, condition 1 does not hold.
18/72
Dependency coefficient
The dependency coefficient is a measure of association. The dependency
coefficient between the condition attributes A and a decision attribute d is
defined by the formula:
γ(A, d) = Card(POSA(d)) / Card(U)
where Card denotes the cardinality of a set.
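A sketch of the dependency coefficient, computing the positive region as the union of the C-lower approximations of the decision classes; the four-object mini table below is hypothetical, invented for illustration:

```python
# Sketch of gamma(C, D) = |POS_C(D)| / |U|: the positive region is the union
# of the C-lower approximations of all decision classes.
from collections import defaultdict

def partition(table, attrs):
    classes = defaultdict(set)
    for obj, row in table.items():
        classes[tuple(row[a] for a in attrs)].add(obj)
    return list(classes.values())

def dependency(table, cond_attrs, dec_attrs):
    cond = partition(table, cond_attrs)
    pos = set()
    for D in partition(table, dec_attrs):   # each decision class
        for Y in cond:                      # C-lower approximation of D
            if Y <= D:
                pos |= Y
    return len(pos) / len(table)

# Hypothetical mini decision table: the last column is the decision attribute.
t = {'u1': (0, 'yes'), 'u2': (0, 'yes'), 'u3': (1, 'no'), 'u4': (1, 'yes')}
print(dependency(t, [0], [1]))   # u1, u2 are consistent; u3, u4 conflict -> 0.5
```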
19/72
Rough sets for Decision Making
 INFORMATION SYSTEM FOR ACCIDENT DATASET
 Let B = {A1, A2, A3, A4, A5, A6, A7, A8, A9, A10, A11, A12, A13, A14,
A15} be the set of 15 accidents. The set of Condition attributes of
Information System C = {Drunk Driving, Distracted Driving, Over
Speed, Night Driving, Health Issue / Stress, Tire Blowouts, Brake
Failure Accidents}.
 The set of Decision attribute of information system D = {Accident
Occurs, No Accident}
 Decision Parameter (Accident Occurs) =
(Number of positive condition attributes) / (Number of objects)
 Decision Parameter (No Accident) =
(Number of negative condition attributes) / (Number of objects)
20/72
Decision Table
        Drunk    Distracted  Over   Night    Health Issues  Tire     Brake    Decision
        Driving  Driving     Speed  Driving  / Stress       Blowout  Failure
A1      Y        Y           Y      Y        N              N        Y        Y
A2      N        N           N      Y        N              N        Y        N
A3      ..
..
A15     Y        Y           N      Y        Y              N        Y        Y
21/72
 IND ({Distracted Driving}) = {{A1, A3, A5, A6, A7, A8, A9, A11, A15},
{A2, A4, A10, A12, A13, A14}}
 IND ({Over Speed}) = {{A1, A4, A5, A7, A8, A9, A10, A11, A12, A14,
A15}, {A2, A3, A6, A13}}
 IND ({Night Driving}) = {{A1, A2, A6, A14, A15}, {A3, A4, A5, A7, A8,
A9, A10, A11, A12, A13}}
 Quality coefficients of the upper and lower approximations can be
calculated:
 αB(B↑X) = 10/15, for areas with various attributes that have the possibility
to meet an accident.
 αB(B↑X′) = 7/15, for areas with various attributes that have the
possibility of no accident (X′ denoting the no-accident class).
22/72
 αB(B↓X) = 8/15, for areas with various attributes that certainly meet
an accident; that is, about 53% of areas certainly meet an accident.
 αB(B↓X′) = 5/15, for areas with various attributes that certainly do not
meet an accident; that is, approximately 33% of areas certainly do not
meet an accident.
 Dependency of the Accident Dataset
 In the accident dataset we have 8 elements in the lower approximation of
the areas with various attributes that meet an accident and 5
elements in the lower approximation of the areas with various
attributes that do not meet an accident, so the total number of elements
in the lower approximations is 13. The dependency coefficient is then
 γ(C, D) = 13/15 ≈ 0.87, so D depends partially (with a degree k ≈ 0.87)
on C.
23/72
 The rules generated by a reduct are called 'reduct rules', and
decisions based on these rules are generally more precise and
accurate.
 The first step towards reduct-rule generation is the removal
of redundancy.
 The next step towards the removal of redundancy, or reduction,
is to analyse each condition attribute one by one,
independently, together with the decision attribute.
 Finally we get : Rule 1
 If (Drunk Driving = Yes) and (Over Speed = Yes) and (Tire
Blowout = Yes) , Then Accident Possibility = Yes
 Rule 2
 If (Drunk Driving = No) and (Over Speed = No) and (Tire
Blowout = Yes) , Then Accident Possibility = No
24/72
Fuzzy sets
 A classical set is defined by crisp boundaries
 A fuzzy set is prescribed by vague or ambiguous
properties; hence its boundaries are not sharply
specified. Membership of an element of
the universe in such a set is therefore measured by a function that
attempts to describe this vagueness and ambiguity.
 Definition: If U is a collection of objects denoted
generically by x, then a fuzzy set A in U is defined as a
set of ordered pairs:
A = {(x, μA(x)) : x ∈ U}, where μA : U → [0, 1]
is the membership function of A.
25/72
Membership Function
 Membership function (MF): a function that specifies the
degree to which a given input belongs to a set.
 Degree of membership: the output of a membership
function; this value is always limited to between 0 and 1.
Also known as a membership value or membership
grade.
 Membership functions are used in the fuzzification and
defuzzification steps of an FLS (fuzzy logic system) to
map the non-fuzzy input values to fuzzy linguistic terms
and vice versa.
26/72
27/72
Fuzzy Sets
We give things a degree of membership between 0 and 1 in
several sets (to a combined total of 1).
We then label these sets using human terms.
[Figure: membership functions for Young, Middle Aged and Old over Age,
with degree of membership on the vertical axis (0 to 1) and a crossover
near age 50; e.g. a 38-year-old is 10% Young + 90% Middle Aged.]
28/72
When the universe of discourse, X, is discrete and finite, a fuzzy
set A∼ is commonly written as A∼ = Σi μA(xi)/xi.
When the universe, X, is continuous and infinite, the fuzzy set A∼
is written as A∼ = ∫ μA(x)/x.
In this standard notation the Σ and ∫ signs denote the collection of
(membership grade / element) pairs, not arithmetic summation or integration;
μA is the membership function of the fuzzy set A∼.
29/72
Consider fuzzy sets A, B and C on the universe X.
For a given element x of the universe, the following function-theoretic
operations for the set-theoretic operations of union, intersection, and
complement are defined for A, B, and C on X (the standard fuzzy operations):
Union: μA∪B(x) = max(μA(x), μB(x))
Intersection: μA∩B(x) = min(μA(x), μB(x))
Complement: μĀ(x) = 1 − μA(x)
30/72
[Figures: the union of fuzzy sets A and B, the intersection of fuzzy sets
A and B, and the complements of fuzzy sets A and B.]
31/72
A = {a1, a2}, B = {b1, b2}, C = {c1, c2}
A × B × C =
{(a1, b1, c1), (a1, b1, c2), (a1, b2, c1), (a1, b2, c2), (a2, b1, c1), (a2, b1, c2),
(a2, b2, c1), (a2, b2, c2)}
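The same product can be generated with Python's `itertools.product`:

```python
# The triple Cartesian product above, reproduced with itertools.product.
from itertools import product

A, B, C = ['a1', 'a2'], ['b1', 'b2'], ['c1', 'c2']
AxBxC = list(product(A, B, C))
print(len(AxBxC))   # 8 tuples
print(AxBxC[0])     # ('a1', 'b1', 'c1')
```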
32/72
Crisp relation
 A relation among crisp sets A1, A2, ..., An is a subset of the Cartesian
product A1 × A2 × ... × An. It is denoted by R.
 The crisp relation R is defined by its membership function:
μR(x1, x2, ..., xn) = 1 iff (x1, x2, ..., xn) ∈ R, and 0 otherwise,
where x1 ∈ A1, x2 ∈ A2, ..., xn ∈ An.
33/72
Fuzzy Relation
A fuzzy relation is a fuzzy set defined on the Cartesian
product of crisp sets A1, A2, ..., An, where tuples (x1, x2, ..., xn)
may have varying degrees of membership within the relation.
The membership grade indicates the strength of the relation
present between the elements of the tuple.
A fuzzy relation R has membership function
μR : A1 × A2 × ... × An → [0, 1], so that
R = {((x1, x2, ..., xn), μR(x1, x2, ..., xn)) | μR(x1, x2, ..., xn) ≥ 0,
x1 ∈ A1, x2 ∈ A2, ..., xn ∈ An}
34/72
Example
[Figures: a crisp relation and a fuzzy relation.]
35/72
Example
36/72
Fuzzy Equivalence Classes
 A fuzzy similarity relation S is a fuzzy relation for which
reflexivity (μS(x, x) = 1), symmetry (μS(x, y) = μS(y, x)) and transitivity
(μS(x, z) ≥ μS(x, y) ∧ μS(y, z)) hold.
 Using the fuzzy similarity relation, the fuzzy equivalence class [x]S for
objects close to x can be defined:
 μ[x]S(y) = μS(x, y)
 The following axioms should hold for a fuzzy equivalence class F:
 ∃x, μF(x) = 1 (μF is normalised)
 μF(x) ∧ μS(x, y) ≤ μF(y)
 μF(x) ∧ μF(y) ≤ μS(x, y)
37/72
Max-Min Composition
R: a fuzzy relation defined on X and Y.
S: a fuzzy relation defined on Y and Z.
R∘S: the composition of R and S, a fuzzy relation defined on X and Z,
with membership function
μR∘S(x, z) = max_y min(μR(x, y), μS(y, z)) = ∨y (μR(x, y) ∧ μS(y, z))
38/72
Example
 X = {x1, x2}, Y = {y1, y2} and Z = {z1, z2, z3}.
 Consider the following fuzzy relations:

        y1   y2              z1   z2   z3
R = x1  0.7  0.5    S = y1   0.9  0.6  0.5
    x2  0.8  0.4        y2   0.1  0.7  0.5

Using max-min composition, with T = R∘S:
μT(x1, z1) = ∨y (μR(x1, y) ∧ μS(y, z1)) = max(min(0.7, 0.9), min(0.5, 0.1)) = 0.7
and similarly for the remaining entries (note that
μT(x2, z3) = max(min(0.8, 0.5), min(0.4, 0.5)) = 0.5):

        z1   z2   z3
T = x1  0.7  0.6  0.5
    x2  0.8  0.6  0.5
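The composition can be checked with a small sketch over plain nested lists; note that entry (x2, z3) evaluates to max(min(0.8, 0.5), min(0.4, 0.5)) = 0.5:

```python
# Sketch of max-min composition for relation matrices stored as nested lists
# (R is |X| x |Y|, S is |Y| x |Z|).
def max_min(R, S):
    return [[max(min(R[i][k], S[k][j]) for k in range(len(S)))
             for j in range(len(S[0]))]
            for i in range(len(R))]

R = [[0.7, 0.5],
     [0.8, 0.4]]
S = [[0.9, 0.6, 0.5],
     [0.1, 0.7, 0.5]]
print(max_min(R, S))   # [[0.7, 0.6, 0.5], [0.8, 0.6, 0.5]]
```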
39/72
T-Norm
 Any t-norm operator, denoted t(x, y), must satisfy the following
axioms. T-norms map from [0,1] × [0,1] → [0,1]:
 T.1 T(0, 0) = 0 (boundary)
 T.2 T(a, b) = T(b, a) (commutative)
 T.3 T(a, 1) = a (neutral element)
 T.4 T(T(a, b), c) = T(a, T(b, c)) (associative)
 T.5 T(c, d) ≤ T(a, b) if c ≤ a and d ≤ b (monotonic)
Every t-norm is bounded above by the minimum:
T(μA(x), μB(x)) ≤ min(μA(x), μB(x)). Common examples include:
• the algebraic product TP(x, y) = x · y;
• the bold intersection (also called the Lukasiewicz t-norm)
TL(x, y) = max{0, x + y − 1}.
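The t-norms named above can be sketched and checked against the axioms:

```python
# Sketch of three common t-norms; each has 1 as neutral element, is
# commutative, associative, monotonic, and bounded above by min.
def t_min(a, b):          # standard intersection
    return min(a, b)

def t_product(a, b):      # algebraic product
    return a * b

def t_lukasiewicz(a, b):  # bold intersection (Lukasiewicz t-norm)
    return max(0.0, a + b - 1.0)

a, b = 0.6, 0.7
for t in (t_min, t_product, t_lukasiewicz):
    assert abs(t(a, 1.0) - a) < 1e-12          # neutral element
    assert abs(t(a, b) - t(b, a)) < 1e-12      # commutativity
    assert t(a, b) <= min(a, b) + 1e-12        # bounded above by min
    print(t.__name__, t(a, b))   # 0.6, 0.42, 0.3 (up to float rounding)
```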
40/72
Fuzzy intersection and Union
 AB = T(A(x), B(x)) where T is T-norm operator. There are some
possible T-Norm operators.
 Minimum: min(a,b)=a ٨ b
 Algebraic product: ab
 Bounded product: 0 ٧ (a+b-1)
41/72
Membership function design
Automatic or Adaptive
- Neural Networks
- Genetic Algorithms
- Inductive reasoning
- Gradient search
 Always use parameterizable membership functions; do not define a
membership function point by point.
 Triangular and trapezoidal membership functions are sufficient for
most practical applications!
42/72
Triangular MF
A triangular membership function is specified by three parameters {a,
b, c}:
triangle(x; a, b, c) = 0 if x ≤ a;
= (x − a)/(b − a) if a ≤ x ≤ b;
= (c − x)/(c − b) if b ≤ x ≤ c;
= 0 if c ≤ x.
Gaussian MF
A Gaussian membership function is specified by its centre c and width σ:
gaussian(x; c, σ) = exp(−(x − c)²/(2σ²)).
43/72
Trapezoidal MF
 trapezoid(x; a, b, c, d) = 0 if x ≤ a;
 = (x − a)/(b − a) if a ≤ x ≤ b;
 = 1 if b ≤ x ≤ c;
 = (d − x)/(d − c) if c ≤ x ≤ d;
 = 0 if d ≤ x.
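Both membership functions can be sketched directly from the piecewise definitions:

```python
# Sketch of the triangular and trapezoidal membership functions.
def triangle(x, a, b, c):
    """Triangular MF with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def trapezoid(x, a, b, c, d):
    """Trapezoidal MF with feet at a and d and plateau on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    if x <= c:
        return 1.0
    return (d - x) / (d - c)

print(triangle(5, 0, 5, 10))      # 1.0 at the peak
print(trapezoid(2, 0, 4, 6, 10))  # 0.5 on the rising edge
```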
44/72
Types of Membership Functions
[Figures: a trapezoidal MF <a, b, c, d>, singleton MFs at (a, 1) and (b, 0.5),
and a triangular MF <a, b, d>, each plotted as μ(x) over x.]
45/72
Operators on Fuzzy Sets
Union:
μA∪B(x) = max{μA(x), μB(x)} (standard union)
μA∪B(x) = min{1, μA(x) + μB(x)} (bounded sum)
Intersection:
μA∩B(x) = min{μA(x), μB(x)} (standard intersection)
μA∩B(x) = μA(x) · μB(x) (algebraic product)
46/72
Fuzzy Logic
 Fuzzy logic can be defined as a superset of conventional
(Boolean) logic that has been extended to handle the
concept of partial truth - truth values between “completely
true” and “completely false”
 Linguistic Variables: Variables used in fuzzy systems to
express qualities such as height, which can take values
such as “tall”, “short” or “very tall”.
 Each linguistic variable may be assigned one or more
linguistic values, which are in turn connected to a
numeric value through the mechanism of membership
functions.
 Example: if temperature is cold, then pressure is low
47/72
Linguistic Variables
A linguistic variable combines several fuzzy sets.
Linguistic variable: temperature
Linguistic terms (fuzzy sets): {cold, warm, hot}
[Figure: membership functions μ(x) for cold, warm and hot over
temperature x [°C], with crossovers around 20 and 60.]
48/72
How the models work
Inputs are converted to degrees of membership of fuzzy sets.
Fuzzy rules are applied to get new sets of members.
These sets are then converted back to real numbers.
Crisp data → Fuzzifier → membership, e.g. 90% hot, 10% cold
Fuzzy rules: IF 90% hot THEN 80% open; IF 10% cold THEN 20% closed
Fuzzy output set: 80% open, 20% closed → Defuzzifier → Crisp data
49/72
Fuzzy Rules
 A fuzzy system operates on a collection of IF-THEN statements.
 A fuzzy rule can be defined as a conditional statement of the form:
A → B: IF x is A THEN y is B
where x and y are linguistic variables, and A and B are linguistic values
determined by fuzzy sets on the universes of discourse X and Y,
respectively.
Antecedent of Fuzzy Rules
If the annual income of a person is High, then the person is rich.
If the annual income of a person is High AND the amount requested is
NOT huge, then recommend approving the loan
50/72
Fuzzy Rules
 causal dependencies can be expressed in form of if-then-
rules
 general form:
if <antecedent> then <consequence>
 example:
if temperature is cold and oil is cheap
then heating is high
Here temperature, oil and heating are the linguistic variables, while
cold, cheap and high are their linguistic values/terms (fuzzy sets).
51/72
Implicator
 If p is a proposition of the form "x is A" and q is a proposition of the
form "y is B", then we define the fuzzy implication A → B as a fuzzy
relation, where A and B are fuzzy sets.
 It is clear that (A → B)(u, v) should be defined pointwise,
i.e., (A → B)(u, v) depends only on A(u) and B(v):
 (A → B)(u, v) = I(A(u), B(v)) = A(u) → B(v)
52/72
Consequent of Fuzzy Rules
 Fuzzy consequent: If … then y is A
 P: A → B ≡ If x is A then y is B
 Functional consequent:
If x1 ∈ A1 AND x2 ∈ A2 AND … xn ∈ An then y = a0 + Σi ai xi
 Designing antecedent memberships: two conditions should be satisfied.
Each membership function overlaps only with the closest neighbouring
membership functions, and for any possible input datum its membership
grades in all relevant fuzzy sets should sum to 1 (or nearly so).
 Counterexample: μA1(10) + μA2(10) + μA3(10) = 0.25 + 0.25 + 0 = 0.5 ≠ 1
[Figure: three membership functions A1, A2, A3 over 0 ≤ x ≤ 30.]
53/72
Firing Fuzzy Rules
[Figures: membership functions for "tall men" (degree of membership over
height, 160-200 cm) and "heavy men" (degree of membership over weight,
70-120 kg).]
These fuzzy sets provide the basis for a weight estimation model.
The model is based on a relationship between a man's height and
his weight:
IF height is tall THEN weight is heavy
54/72
 The value of the output, or the truth membership grade of the rule consequent,
can be estimated directly from the corresponding truth membership grade in
the antecedent.
55/72
Generalized Modus Ponens
 Single rule with single antecedent
 Single rule with multiple antecedents
Single rule with single antecedent:
Rule: if x is A then y is B
Fact: x is A'
Conclusion: y is B'
56/72
Single Rule with Multiple Antecedents
Rule: if x is A and y is B then z is C
Fact: x is A' and y is B'
Conclusion: z is C'
57/72
Multiple Antecedent
IF x is A AND y is B THEN z is C
IF x is A OR y is B THEN z is C
Use the union (OR) or intersection (AND) operations to calculate a
membership value for the whole antecedent:
AND: μC(z) = min(μA(x), μB(y))
OR: μC(z) = max(μA(x), μB(y))
Example:
IF project duration is long AND project staffing is large AND
project funding is inadequate THEN risk is high
IF service is excellent OR food is delicious THEN tip is generous
58/72
Multiple Consequents
IF x is A THEN y is B AND z is C. Each consequent is
affected equally by the membership in the antecedent
class.
E.g., IF x is tall THEN x is heavy AND x has large feet:
μTall(x) = 0.7 ⇒ (μHeavy(x) = 0.7 ∧ μLargeFeet(x) = 0.7)
59/72
Multiple Rules with Multiple Antecedents
Rule 1: if x is A1 and y is B1 then z is C1
Rule 2: if x is A2 and y is B2 then z is C2
Fact: x is A' and y is B'
Conclusion: z is C'
[Figure: graphical interpretation; the memberships of A' in A1, A2 and of
B' in B1, B2 clip the consequents C1 and C2, which are then combined.]
60/72
Fuzzy Inference Systems
 Fuzzy rule-based systems, fuzzy models, and fuzzy expert systems are also
known as fuzzy inference systems. The key unit of a fuzzy logic system is
the FIS, whose primary task is decision-making. An FIS uses
"IF...THEN" rules, along with the connectives "OR" and "AND", for making
the necessary decision rules.
 The input to an FIS may be fuzzy or crisp, but the output from an FIS is
always a fuzzy set. When an FIS is used as a controller, it is necessary to
have a crisp output. Hence, there should be a defuzzification unit alongside
the FIS for converting fuzzy variables into crisp variables.
61/72
Mamdani Fuzzy Inference
 Fuzzification of the input variables:
determine membership values.
 Rule evaluation:
based on membership values of (composite) antecedents.
 Aggregation of the rule outputs:
unify all membership values for the output from all rules.
 Defuzzification.
62/72
Sugeno style inference
 The main steps of the fuzzy inference process,
 1. fuzzifying the inputs and
 2. applying the fuzzy operator,
 are exactly the same as in a Mamdani FIS.
 The main difference is that the output membership function in Sugeno-type
inference is either linear or a constant.
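A minimal sketch of a zero-order Sugeno step, assuming each rule contributes a constant output weighted by its firing strength (the rule data below are invented for illustration):

```python
# Sketch of zero-order Sugeno inference: rule firing strengths weight
# constant consequents, and the crisp output is their weighted average.
def sugeno(rules):
    """rules: list of (firing_strength, constant_output) pairs."""
    num = sum(w * z for w, z in rules)
    den = sum(w for w, _ in rules)
    return num / den

# e.g. two rules fired with strengths 0.2 and 0.8, with outputs 10 and 50:
print(sugeno([(0.2, 10.0), (0.8, 50.0)]))   # 42.0
```

Because the consequents are already crisp, this weighted average replaces the separate aggregation and defuzzification steps of a Mamdani FIS.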
63/72
Two-input, one-output example
 Rule: 1
IF funding is adequate OR staffing is small THEN risk is low
 Rule: 2
IF funding is marginal AND staffing is large THEN risk is normal
 Rule: 3
IF funding is inadequate THEN risk is high
64/72
Inference steps
 Step 1: Fuzzification. The first step is to take the crisp inputs (let funding
and staffing be x1 and y1) and determine the degree to which these inputs
belong to each of the appropriate fuzzy sets.
 Step 2: Rule Evaluation. Take the fuzzified inputs,
μ(x ∈ A1) = 0.5; μ(x ∈ A2) = 0.2; μ(y ∈ B1) = 0.1 and μ(y ∈ B2) = 0.7,
and apply them to the antecedents of the fuzzy rules. If a given fuzzy rule has
multiple antecedents, the fuzzy operator (AND or OR) is used to obtain a single
number that represents the result of the antecedent evaluation. This number
(the truth value) is then applied to the consequent membership function.
To evaluate the disjunction of rule antecedents, we use the OR fuzzy operation,
typically the fuzzy union: μA∪B(x) = max[μA(x), μB(x)].
Similarly, in order to evaluate the conjunction of rule antecedents, we apply the
AND fuzzy operation, intersection: μA∩B(x) = min[μA(x), μB(x)].
65/72
Mamdani Rule Evaluation
66/72
The result of the antecedent evaluation can now be applied to the
membership function of the consequent.
There are two methods available: clipping and scaling.
 Clipping: cuts off the top of the membership function wherever it is
higher than the matching degree.
 Scaling: scales down the membership function in proportion to the
matching degree.
[Figure: the clipping and scaling methods.]
67/72
Step 3 : Aggregation of Rule outputs
 Aggregation is the process of unification of the outputs of all
rules.
We take the membership functions of all rule consequents previously
clipped or scaled and combine them into a single fuzzy set. The input
of the aggregation process is the list of clipped or scaled consequent
membership functions, and the output is one fuzzy set for each
output variable
68/72
Defuzzification Methods
 Max-membership principle: choose z* such that
μC(z*) ≥ μC(z) for all z ∈ Z.
 Centroid principle:
z* = ∫ μC(z) z dz / ∫ μC(z) dz
69/72
Fuzzy Rough sets
Fuzzy sets and rough sets address two important, and mutually orthogonal,
characteristics of imperfect data and knowledge: while the former allow
objects to belong to a set or relation to a given degree, the latter provide
approximations of concepts in the presence of incomplete information.
Within the hybrid theory, Pawlak's well-known framework for the construction
of lower and upper approximations of a concept C given incomplete
information (a subset A of a given universe X containing examples of C, and
an equivalence relation R in X that models "indiscernibility" or
"indistinguishability") has been extended in two ways:
1. The set A may be generalized to a fuzzy set in X, allowing objects to
belong to a concept (i.e., meet its characteristics) to varying degrees.
2. Rather than modelling elements' indistinguishability, we may assess their
similarity (objects are similar to a certain degree), represented by a fuzzy
relation R. As a result, objects are categorized into classes, or granules,
with "soft" boundaries based on their similarity to one another.
70/72
Fuzzy Rough sets
 Define the lower and upper approximations of a fuzzy set A in X as the
fuzzy sets R↓A and R↑A in X, constructed by means of an implicator
I, a t-norm T and a fuzzy T-equivalence relation R in X:
R↓A(y) = inf x∈X I(R(x, y), A(x))
R↑A(y) = sup x∈X T(R(x, y), A(x))
for all y in X. The pair (R↓A, R↑A) is called a fuzzy rough set (in (X, R)).
The two values can be interpreted as the degree of inclusion of Ry in A
and the degree of overlap of Ry and A, respectively.
Let R be a fuzzy relation in X and A a fuzzy set in X.
The tight, loose and (usual) lower approximations of A are defined as
 1. R↓↓A(y) = inf z∈X I(Rz(y), inf x∈X I(Rz(x), A(x)))
 2. R↑↓A(y) = sup z∈X T(Rz(y), inf x∈X I(Rz(x), A(x)))
 3. R↓A(y) = inf x∈X I(Ry(x), A(x))
71/72
 The tight, loose and (usual) upper approximations of
A are defined as
 (a) R↓↑A(y) = inf z∈X I(Rz(y), sup x∈X T(Rz(x), A(x)))
 (b) R↑↑A(y) = sup z∈X T(Rz(y), sup x∈X T(Rz(x), A(x)))
 (c) R↑A(y) = sup x∈X T(Ry(x), A(x))
 for all y in X.
 Consider the fuzzy T-equivalence relation R on X = {a, b} given by

R    a    b
a    1.0  0.2
b    0.2  1.0

 and the fuzzy set A in X defined by A(a) = 1 and
 A(b) = 0.8. Then R↑A(a) = 1 and R↑A(b) = 0.8,
 hence (R↓↑A)(a) = min(max(0, 1), max(0.8, 0.8)) = 0.8.
 Since (R↓↑A)(a) < A(a), it is clear that A ⊆ R↓↑A need not hold.
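The worked example can be verified with a sketch that fixes T = min and the Kleene-Dienes implicator I(x, y) = max(1 − x, y); the slides leave I and T open, so this particular choice is an assumption:

```python
# Sketch verifying the two-element example, assuming T = min and the
# Kleene-Dienes implicator I(x, y) = max(1 - x, y).
def I(x, y):  # Kleene-Dienes implicator
    return max(1.0 - x, y)

def T(x, y):  # minimum t-norm
    return min(x, y)

X = ['a', 'b']
R = {('a', 'a'): 1.0, ('a', 'b'): 0.2, ('b', 'a'): 0.2, ('b', 'b'): 1.0}
A = {'a': 1.0, 'b': 0.8}

def upper(y):        # R-up A(y) = sup_x T(R(x, y), A(x))
    return max(T(R[(x, y)], A[x]) for x in X)

def tight_upper(y):  # R-down-up A(y) = inf_z I(R(z, y), R-up A(z))
    return min(I(R[(z, y)], upper(z)) for z in X)

print(upper('a'), upper('b'))   # 1.0 0.8
print(tight_upper('a'))         # min(max(0, 1), max(0.8, 0.8)) = 0.8
```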
72/72
THANK YOU
Improving circuit miniaturization and its efficiency using Rough Set Theory( ...
 
Soft Lattice in Approximation Space
Soft Lattice in Approximation SpaceSoft Lattice in Approximation Space
Soft Lattice in Approximation Space
 
Chapter 3
Chapter 3Chapter 3
Chapter 3
 
Calculus of One Variable
Calculus of One VariableCalculus of One Variable
Calculus of One Variable
 
Chapter 5
Chapter 5Chapter 5
Chapter 5
 
Discrete Sets
Discrete  SetsDiscrete  Sets
Discrete Sets
 
Per6 basis2_NUMBER SYSTEMS
Per6 basis2_NUMBER SYSTEMSPer6 basis2_NUMBER SYSTEMS
Per6 basis2_NUMBER SYSTEMS
 
Set theory- Introduction, symbols with its meaning
Set theory- Introduction, symbols with its meaningSet theory- Introduction, symbols with its meaning
Set theory- Introduction, symbols with its meaning
 
Discrete mathematics Ch1 sets Theory_Dr.Khaled.Bakro د. خالد بكرو
Discrete mathematics Ch1 sets Theory_Dr.Khaled.Bakro د. خالد بكروDiscrete mathematics Ch1 sets Theory_Dr.Khaled.Bakro د. خالد بكرو
Discrete mathematics Ch1 sets Theory_Dr.Khaled.Bakro د. خالد بكرو
 
A Comparative Study of Two-Sample t-Test Under Fuzzy Environments Using Trape...
A Comparative Study of Two-Sample t-Test Under Fuzzy Environments Using Trape...A Comparative Study of Two-Sample t-Test Under Fuzzy Environments Using Trape...
A Comparative Study of Two-Sample t-Test Under Fuzzy Environments Using Trape...
 
Relation and function_xii
Relation and function_xiiRelation and function_xii
Relation and function_xii
 
Mean, variable , moment characteris function.ppt
Mean, variable , moment characteris function.pptMean, variable , moment characteris function.ppt
Mean, variable , moment characteris function.ppt
 
Sets functions-sequences-exercises
Sets functions-sequences-exercisesSets functions-sequences-exercises
Sets functions-sequences-exercises
 
FUZZY LOGIC
FUZZY LOGICFUZZY LOGIC
FUZZY LOGIC
 
Ch1 sets and_logic(1)
Ch1 sets and_logic(1)Ch1 sets and_logic(1)
Ch1 sets and_logic(1)
 
Fuzzy logic andits Applications
Fuzzy logic andits ApplicationsFuzzy logic andits Applications
Fuzzy logic andits Applications
 
Pertemuan 5_Relation Matriks_01 (17)
Pertemuan 5_Relation Matriks_01 (17)Pertemuan 5_Relation Matriks_01 (17)
Pertemuan 5_Relation Matriks_01 (17)
 

Recently uploaded

Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
Meghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media ComponentMeghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media ComponentInMediaRes1
 
Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaVirag Sontakke
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptxVS Mahajan Coaching Centre
 
Biting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfBiting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfadityarao40181
 
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Celine George
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,Virag Sontakke
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxmanuelaromero2013
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTiammrhaywood
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementmkooblal
 
Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxEyham Joco
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17Celine George
 
Capitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptxCapitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptxCapitolTechU
 
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...M56BOOKSTORE PRODUCT/SERVICE
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Celine George
 

Recently uploaded (20)

Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
Meghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media ComponentMeghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media Component
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of India
 
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions  for the students and aspirants of Chemistry12th.pptxOrganic Name Reactions  for the students and aspirants of Chemistry12th.pptx
Organic Name Reactions for the students and aspirants of Chemistry12th.pptx
 
Biting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdfBiting mechanism of poisonous snakes.pdf
Biting mechanism of poisonous snakes.pdf
 
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
Incoming and Outgoing Shipments in 1 STEP Using Odoo 17
 
OS-operating systems- ch04 (Threads) ...
OS-operating systems- ch04 (Threads) ...OS-operating systems- ch04 (Threads) ...
OS-operating systems- ch04 (Threads) ...
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 
भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,भारत-रोम व्यापार.pptx, Indo-Roman Trade,
भारत-रोम व्यापार.pptx, Indo-Roman Trade,
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptx
 
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPTECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
ECONOMIC CONTEXT - LONG FORM TV DRAMA - PPT
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of management
 
9953330565 Low Rate Call Girls In Rohini Delhi NCR
9953330565 Low Rate Call Girls In Rohini  Delhi NCR9953330565 Low Rate Call Girls In Rohini  Delhi NCR
9953330565 Low Rate Call Girls In Rohini Delhi NCR
 
Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptx
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17
 
Capitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptxCapitol Tech U Doctoral Presentation - April 2024.pptx
Capitol Tech U Doctoral Presentation - April 2024.pptx
 
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17
 

Rough sets and fuzzy rough sets in Decision Making

a3, a4 } the conditional features set. V1 = {good, Medium, bad}, V2 = {good, bad}, V3 = {good, bad}, V4 = {good, bad}. The information function gives ρ(x1, a1) = good, and so on.

Student  a1      a2    a3    a4
x1       good    good  bad   good
x2       Medium  bad   bad   bad
x3       Medium  bad   bad   good
x4       bad     bad   bad   bad
x5       Medium  good  good  bad
x6       good    bad   good  good
  • 7. 7/72  If, in the set of attributes A, the condition attributes C = {a1, a2, a3} and the decision attribute D = {a4} are distinguished, the data table can be seen as a decision table.  In order to explain the evaluations of the decision attribute in terms of the evaluations of the condition attributes, one can represent the data table as a set of decision rules. Such a representation gives, for example, the following rule:  If the level in Mathematics is good and the level in Physics is good and the level in Literature is bad, then the student is good.
  • 8. 8/72 information table – Example 2

        H    M    T        F
  p1    No   Yes  High     Yes
  p2    Yes  No   High     Yes
  p3    Yes  Yes  V. High  Yes
  p4    No   Yes  Normal   No
  p5    Yes  No   High     No
  p6    No   Yes  V. High  Yes

Columns of the table are labeled by the attributes Headache (H), Muscle-pain (M), Temperature (T) and Flu (F), and rows by the patients (objects) p1, p2, p3, p4, p5, p6. Each row of the table can be seen as information about a specific patient. For example, p2 is described by the attribute-value set {(Headache, yes), (Muscle-pain, no), (Temperature, high), (Flu, yes)}.
  • 9. 9/72 R-lower approximation & R-upper approximation Let X ⊆ U and R ⊆ C, where R is a subset of the conditional features. The R-lower approximation of X is the set of all elements of U which can with certainty be classified as elements of X: R↓X = ∪{Y ∈ U/R : Y ⊆ X}. The R-lower approximation of X is a subset of X. The R-upper approximation of X is the set of all elements of U which can possibly be classified as belonging to X: R↑X = ∪{Y ∈ U/R : Y ∩ X ≠ ∅}. X is a subset of its R-upper approximation. The R-boundary of X is defined as: BN_R(X) = R↑X − R↓X.
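As a minimal sketch of the three definitions above (the function names are illustrative, not from the slides), the approximations can be computed directly from the partition U/R:

```python
def lower_approx(partition, X):
    # R-lower approximation: union of equivalence classes fully contained in X
    return {x for Y in partition if Y <= X for x in Y}

def upper_approx(partition, X):
    # R-upper approximation: union of classes with non-empty intersection with X
    return {x for Y in partition if Y & X for x in Y}

def boundary(partition, X):
    # R-boundary: elements neither certainly in X nor certainly outside X
    return upper_approx(partition, X) - lower_approx(partition, X)

# Toy partition U/R = {{1,3,5}, {2,4}, {6,7,8}}, target set X = {1,2,4}
partition = [{1, 3, 5}, {2, 4}, {6, 7, 8}]
X = {1, 2, 4}
print(sorted(lower_approx(partition, X)))  # [2, 4]
print(sorted(upper_approx(partition, X)))  # [1, 2, 3, 4, 5]
```

The lower approximation keeps only the class {2, 4}, since {1, 3, 5} contains element 5, which is outside X.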
  • 10. 10/72 Information system – Example (4.11.2016)

        P1  P2  P3  P4  P5
  O1    1   2   0   1   1
  O2    1   2   0   1   1
  O3    2   0   0   1   0
  O4    0   0   1   2   1
  O5    2   1   0   2   1
  O6    0   0   1   2   2
  O7    2   0   0   1   0
  O8    0   1   2   2   1
  O9    2   1   0   2   2
  O10   2   0   0   1   0

When the full set of attributes P = {P1, P2, P3, P4, P5} is considered, we obtain the following seven equivalence classes: {O1, O2}, {O3, O7, O10}, {O4}, {O5}, {O6}, {O8}, {O9}. Different attribute subset selections will in general lead to different indiscernibility classes. For example, if the attribute subset P = {P1} alone is selected, we obtain the classes {O1, O2}, {O3, O5, O7, O9, O10}, {O4, O6, O8}.
  • 11. 11/72  Consider the target set X = {O1, O2, O3, O4}, and let the attribute subset P = {P1, P2, P3, P4, P5}, the full available set of features. Note that the set X cannot be expressed exactly, because in [x]_P the objects {O3, O7, O10} are indiscernible: there is no way to represent any set X which includes O3 but excludes the objects O7 and O10. However, the target set X can be approximated using only the information contained within P by constructing the P-lower and P-upper approximations of X. The P-lower approximation, or positive region, is the union of all equivalence classes in [x]_P which are contained by (i.e., are subsets of) the target set; in the example, P↓X = {O1, O2} ∪ {O4}, the union of the two equivalence classes in [x]_P which are contained in the target set. The lower approximation is the complete set of objects in U/P positively (i.e., unambiguously) classified as belonging to the target set X.
  • 12. 12/72 Upper approximation and negative region  The P-upper approximation is the union of all equivalence classes in [x]_P which have non-empty intersection with the target set; in the example, P↑X = {O1, O2} ∪ {O4} ∪ {O3, O7, O10}, the union of the three equivalence classes in [x]_P that have non-empty intersection with the target set.  The upper approximation is the complete set of objects in U/P that cannot be positively (i.e., unambiguously) classified as belonging to the complement (X̄) of the target set X.  In other words, the upper approximation is the complete set of objects that are possibly members of the target set X.  The set U − P↑X therefore represents the negative region, containing the set of objects that can be definitely ruled out as members of the target set.
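The example above can be reproduced by grouping the objects of the table on slide 10 into indiscernibility classes and approximating the target set (`ind_classes` is an illustrative helper name, not from the slides):

```python
from collections import defaultdict

# Rows of the information system from slide 10 (O1..O10 over P1..P5)
table = {
    "O1": (1, 2, 0, 1, 1), "O2": (1, 2, 0, 1, 1), "O3": (2, 0, 0, 1, 0),
    "O4": (0, 0, 1, 2, 1), "O5": (2, 1, 0, 2, 1), "O6": (0, 0, 1, 2, 2),
    "O7": (2, 0, 0, 1, 0), "O8": (0, 1, 2, 2, 1), "O9": (2, 1, 0, 2, 2),
    "O10": (2, 0, 0, 1, 0),
}

def ind_classes(table, attrs):
    # Group objects that agree on every attribute index in attrs
    groups = defaultdict(set)
    for obj, row in table.items():
        groups[tuple(row[i] for i in attrs)].add(obj)
    return list(groups.values())

P = [0, 1, 2, 3, 4]                 # full attribute set P1..P5
X = {"O1", "O2", "O3", "O4"}        # target set
classes = ind_classes(table, P)
lower = {o for Y in classes if Y <= X for o in Y}
upper = {o for Y in classes if Y & X for o in Y}
print(sorted(lower))  # ['O1', 'O2', 'O4']
print(sorted(upper))  # ['O1', 'O10', 'O2', 'O3', 'O4', 'O7']
```

With the full attribute set this yields the seven classes of slide 10; restricting to {P1} alone collapses them to three.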
  • 13. 13/72 Indiscernibility Relation The indiscernibility relation IND(P) is an equivalence relation. Let a ∈ A and P ⊆ A; the indiscernibility relation IND(P) is defined as follows: IND(P) = {(x, y) ∈ U × U : for all a ∈ P, a(x) = a(y)}. The indiscernibility relation defines a partition of U. For P ⊆ A, U/IND(P) denotes the family of all equivalence classes of the relation IND(P), called elementary sets. Two other families of equivalence classes, U/IND(C) and U/IND(D), called condition and decision equivalence classes respectively, can also be defined.
  • 14. 14/72 Representation of the approximation sets If R↓X = R↑X, then X is R-definable (the boundary set is empty). If R↓X ≠ R↑X, then X is rough with respect to R. The accuracy of the approximation is α_R(X) = Card(R↓X) / Card(R↑X).
  • 15. 15/72 Example Let us consider U = {x1, x2, x3, x4, x5, x6, x7, x8} and the equivalence relation R with the equivalence classes X1 = {x1, x3, x5}, X2 = {x2, x4} and X3 = {x6, x7, x8}, which form a partition. Let the classification C = {Y1, Y2, Y3} be such that Y1 = {x1, x2, x4}, Y2 = {x3, x5, x8}, Y3 = {x6, x7}. Only Y1 has a non-empty lower approximation, namely R↓Y1 = X2.
  • 16. 16/72 Let us illustrate the above definitions using Example 2.  Consider the concept "flu", i.e., the set X = {p1, p2, p3, p6}, and the set of attributes B = {Headache, Muscle-pain, Temperature}. Hence B↓X = {p1, p3, p6} and B↑X = {p1, p2, p3, p5, p6}. For this case we get αB("flu") = 3/5. This means that the concept "flu" can be characterized partially by employing the symptoms Headache, Muscle-pain and Temperature. Taking only one symptom, B = {Headache}, we get B↓X = ∅, B↑X = U and αB("flu") = 0, which means that the concept "flu" cannot be characterized in terms of the attribute Headache alone; this attribute is not characteristic for flu whatsoever. However, taking the attribute B = {Temperature} we get B↓X = {p3, p6}, B↑X = {p1, p2, p3, p5, p6} and αB(X) = 2/5, which means that the single symptom Temperature is less characteristic for flu than the whole set of symptoms, but still characterizes flu partially.
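The flu accuracies above can be checked with a small sketch over the table of Example 2 (the `accuracy` helper is an illustrative name, not from the slides):

```python
from collections import defaultdict
from fractions import Fraction

# Patient table from Example 2: (Headache, Muscle-pain, Temperature, Flu)
rows = {
    "p1": ("No", "Yes", "High", "Yes"),    "p2": ("Yes", "No", "High", "Yes"),
    "p3": ("Yes", "Yes", "V.High", "Yes"), "p4": ("No", "Yes", "Normal", "No"),
    "p5": ("Yes", "No", "High", "No"),     "p6": ("No", "Yes", "V.High", "Yes"),
}

def accuracy(attr_indices, X):
    # alpha_B(X) = Card(B-lower(X)) / Card(B-upper(X))
    groups = defaultdict(set)
    for p, r in rows.items():
        groups[tuple(r[i] for i in attr_indices)].add(p)
    lower = {p for Y in groups.values() if Y <= X for p in Y}
    upper = {p for Y in groups.values() if Y & X for p in Y}
    return Fraction(len(lower), len(upper))

flu = {"p1", "p2", "p3", "p6"}
print(accuracy([0, 1, 2], flu))  # 3/5  (all three symptoms)
print(accuracy([2], flu))        # 2/5  (Temperature only)
print(accuracy([0], flu))        # 0    (Headache only)
```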
  • 17. 17/72 Positive region and Reduct The positive region POS_R(d) of the classification CLASS_T(d) is equal to the union of the lower approximations of all decision classes. Reducts are defined as minimal subsets of condition attributes which preserve the positive region defined by the set of all condition attributes; i.e., a subset R is a relative reduct iff: 1. R ⊆ C and POS_R(D) = POS_C(D); 2. for every proper subset R' ⊂ R, condition 1 is not true.
  • 18. 18/72 Dependency coefficient This is a measure of association. The dependency coefficient between condition attributes A and a decision attribute d is defined by the formula: γ(A, d) = Card(POS_A(d)) / Card(U), where Card denotes the cardinality of a set.
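A sketch of this formula, reusing the flu table of Example 2 with Flu (column 3) as the decision attribute (the `dependency` helper is an illustrative name):

```python
from collections import defaultdict
from fractions import Fraction

# Patient table from Example 2: (Headache, Muscle-pain, Temperature, Flu)
rows = {
    "p1": ("No", "Yes", "High", "Yes"),    "p2": ("Yes", "No", "High", "Yes"),
    "p3": ("Yes", "Yes", "V.High", "Yes"), "p4": ("No", "Yes", "Normal", "No"),
    "p5": ("Yes", "No", "High", "No"),     "p6": ("No", "Yes", "V.High", "Yes"),
}

def dependency(cond, dec):
    # gamma(A, d) = Card(POS_A(d)) / Card(U), where POS is the union of the
    # lower approximations of all decision classes
    groups = defaultdict(set)
    decision_classes = defaultdict(set)
    for p, r in rows.items():
        groups[tuple(r[i] for i in cond)].add(p)
        decision_classes[r[dec]].add(p)
    pos = {p for X in decision_classes.values()
           for Y in groups.values() if Y <= X for p in Y}
    return Fraction(len(pos), len(rows))

print(dependency([0, 1, 2], 3))  # 2/3
```

Here POS = {p1, p3, p6} ∪ {p4}, since the class {p2, p5} mixes Flu = Yes and Flu = No, so Flu depends on the symptoms only to degree 4/6.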
  • 19. 19/72 Rough sets for Decision Making  INFORMATION SYSTEM FOR ACCIDENT DATASET  Let B = {A1, A2, A3, A4, A5, A6, A7, A8, A9, A10, A11, A12, A13, A14, A15} be the set of 15 accidents. The set of condition attributes of the information system is C = {Drunk Driving, Distracted Driving, Over Speed, Night Driving, Health Issue / Stress, Tire Blowouts, Brake Failure Accidents}.  The set of decision attributes of the information system is D = {Accident Occurs, No Accident}.  Decision parameter (Accident Occurs) = number of positive condition attributes / number of objects.  Decision parameter (No Accident) = number of negative condition attributes / number of objects.
  • 21. 21/72  IND({Distracted Driving}) = {{A1, A3, A5, A6, A7, A8, A9, A11, A15}, {A2, A4, A10, A12, A13, A14}}  IND({Over Speed}) = {{A1, A4, A5, A7, A8, A9, A10, A11, A12, A14, A15}, {A2, A3, A6, A13}}  IND({Night Driving}) = {{A1, A2, A6, A14, A15}, {A3, A4, A5, A7, A8, A9, A10, A11, A12, A13}}  Quality coefficients of the upper and lower approximations can be calculated:  the quality of the upper approximation of the "accident" concept is 10/15, for areas with various attributes that have the possibility of meeting an accident;  the quality of the upper approximation of the "no accident" concept is 7/15, for areas with various attributes that have the possibility of no accident.
  • 22. 22/72  The quality of the lower approximation of the "accident" concept is 8/15, for areas with various attributes that certainly meet an accident; i.e., 53% of areas certainly meet an accident.  The quality of the lower approximation of the "no accident" concept is 5/15, for areas with various attributes that certainly do not meet an accident; i.e., approximately 33% of areas certainly do not meet an accident.  Dependency of the Accident Dataset  In the accident dataset, the lower approximation for areas that meet an accident has 8 elements and the lower approximation for areas that do not meet an accident has 5 elements, so the positive region contains 13 elements and the dependency coefficient is γ(C, D) = 13/15 = 0.86. D therefore depends partially (with a degree k = 0.86) on C.
  • 23. 23/72  The rules generated from a reduct are called "reduct rules", and decisions based on these rules are generally more precise and accurate.  The first step towards reduct-rule generation is the removal of redundancy.  The next step is to analyze each condition attribute one by one, independently, against the decision attribute.  Finally we get: Rule 1  If (Drunk Driving = Yes) and (Over Speed = Yes) and (Tire Blowout = Yes), then Accident Possibility = Yes  Rule 2  If (Drunk Driving = No) and (Over Speed = No) and (Tire Blowout = Yes), then Accident Possibility = No
  • 24. 24/72 Fuzzy sets  A classical set is defined by crisp boundaries.  A fuzzy set is prescribed by vague or ambiguous properties; hence its boundaries are ambiguously specified. Because the boundaries of a fuzzy set are vague and ambiguous, membership of an element of the universe in the set is measured by a function that attempts to describe this vagueness and ambiguity.  Definition: If U is a collection of objects denoted generically by x, then a fuzzy set A in U is defined as a set of ordered pairs: A = {(x, μA(x)) : x ∈ U}, where μA: U → [0, 1].
  • 25. 25/72 Membership Function  Membership function (MF): a function that specifies the degree to which a given input belongs to a set.  Degree of membership: the output of a membership function; this value is always limited to between 0 and 1. Also known as a membership value or membership grade.  Membership functions are used in the fuzzification and defuzzification steps of a fuzzy logic system (FLS) to map the non-fuzzy input values to fuzzy linguistic terms and vice versa.
  • 27. 27/72 Fuzzy Sets We give things a degree of membership between 0 and 1 in several sets (to a combined total of 1). We then label these sets using human terms. [Figure: membership functions for Young, Middle Aged and Old over an Age axis from 0 to 90, crossing at degree 0.5 near age 50.] For example, a 38-year-old is 10% Young + 90% Middle Aged.
  • 28. 28/72 When the universe of discourse, X, is discrete and finite, a fuzzy set A is commonly written as A = {Σᵢ μA(xᵢ)/xᵢ}. When the universe, X, is continuous and infinite, the fuzzy set A is written as A = {∫ μA(x)/x}. In both notations, μA is the membership function for the fuzzy set A, and the "division" and "summation" signs denote association and collection, not arithmetic.
  • 29. 29/72 Three fuzzy sets A, B, and C on the universe X. For a given element x of the universe, the following function-theoretic operations for the set-theoretic operations of union, intersection, and complement are defined for A, B, and C on X: the standard fuzzy operations.
  • 30. 30/72 [Figures: union of fuzzy sets A and B; intersection of fuzzy sets A and B; complement of fuzzy sets A and B.]
  • 31. 31/72 A = {a1, a2}, B = {b1, b2}, C = {c1, c2}. A × B × C = {(a1,b1,c1), (a1,b1,c2), (a1,b2,c1), (a1,b2,c2), (a2,b1,c1), (a2,b1,c2), (a2,b2,c1), (a2,b2,c2)}
  • 32. 32/72 Crisp relation  A relation among crisp sets A1, A2, ..., An is a subset of the Cartesian product: R ⊆ A1 × A2 × ... × An.  The membership function defines the crisp relation R: μR(x1, x2, ..., xn) = 1 iff (x1, x2, ..., xn) ∈ R, and 0 otherwise, where x1 ∈ A1, x2 ∈ A2, ..., xn ∈ An.
  • 33. 33/72 Fuzzy Relation  A fuzzy relation is a fuzzy set defined on the Cartesian product of crisp sets A1, A2, ..., An, where tuples (x1, x2, ..., xn) may have varying degrees of membership within the relation. The membership grade indicates the strength of the relation present between the elements of the tuple. The fuzzy relation R has membership function μR: A1 × A2 × ... × An → [0, 1], and R = {((x1, x2, ..., xn), μR(x1, x2, ..., xn)) | μR(x1, x2, ..., xn) > 0, x1 ∈ A1, x2 ∈ A2, ..., xn ∈ An}.
  • 36. 36/72 Fuzzy Equivalence Classes  For a fuzzy similarity relation S, reflexivity (μS(x, x) = 1), symmetry (μS(x, y) = μS(y, x)) and transitivity (μS(x, z) ≥ μS(x, y) ∧ μS(y, z)) hold.  Using the fuzzy similarity relation, the fuzzy equivalence class [x]S for objects close to x can be defined:  μ[x]S(y) = μS(x, y)  The following axioms should hold for a fuzzy equivalence class F:  ∃x, μF(x) = 1 (μF is normalised)  μF(x) ∧ μS(x, y) ≤ μF(y)  μF(x) ∧ μF(y) ≤ μS(x, y)
  • 37. 37/72 Max-Min Composition  Let R be a fuzzy relation defined on X and Y, and S a fuzzy relation defined on Y and Z. Then R∘S, the composition of R and S, is a fuzzy relation defined on X and Z: μR∘S(x, z) = max_y min(μR(x, y), μS(y, z)) = ∨_y (μR(x, y) ∧ μS(y, z)).
  • 38. 38/72 Example  X = {x1, x2}, Y = {y1, y2} and Z = {z1, z2, z3}.  Consider the following fuzzy relations:

            y1   y2                     z1   z2   z3
  R = x1   0.7  0.5        S = y1     0.9  0.6  0.5
      x2   0.8  0.4            y2     0.1  0.7  0.5

Using max-min composition, T(x1, z1) = ∨_{y ∈ Y} (μR(x1, y) ∧ μS(y, z1)) = max(min(0.7, 0.9), min(0.5, 0.1)) = 0.7, and similarly for the remaining entries (e.g., T(x2, z3) = max(min(0.8, 0.5), min(0.4, 0.5)) = 0.5):

            z1   z2   z3
  T = x1   0.7  0.6  0.5
      x2   0.8  0.6  0.5
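The whole composition can be computed entry by entry with a short sketch (`max_min_compose` is an illustrative name, not from the slides):

```python
def max_min_compose(R, S):
    # T = R o S, with mu_T(x, z) = max over y of min(mu_R(x, y), mu_S(y, z));
    # relations are given as nested lists (rows of membership grades)
    return [[max(min(R[i][k], S[k][j]) for k in range(len(S)))
             for j in range(len(S[0]))]
            for i in range(len(R))]

R = [[0.7, 0.5],
     [0.8, 0.4]]
S = [[0.9, 0.6, 0.5],
     [0.1, 0.7, 0.5]]
print(max_min_compose(R, S))  # [[0.7, 0.6, 0.5], [0.8, 0.6, 0.5]]
```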
  • 39. 39/72 T-Norm  Any t-norm operator, denoted t(x, y), must satisfy five axioms. T-norms map from [0,1] × [0,1] → [0,1]:  T.1 T(0, 0) = 0  T.2 T(a, b) = T(b, a) (commutative)  T.3 T(a, 1) = a (neutral element)  T.4 T(T(a, b), c) = T(a, T(b, c)) (associative)  T.5 T(c, d) ≤ T(a, b) if c ≤ a and d ≤ b (monotonic)  Consequently, every t-norm satisfies T(μA(x), μB(x)) ≤ min(μA(x), μB(x)). Examples: the algebraic product TP(x, y) = x · y; the bold intersection (also called the Lukasiewicz t-norm) TL(x, y) = max{0, x + y − 1}.
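A quick sketch of the three t-norms named above and of the min bound (function names are illustrative):

```python
def t_min(a, b):
    # the minimum t-norm
    return min(a, b)

def t_product(a, b):
    # the algebraic product TP(x, y) = x * y
    return a * b

def t_lukasiewicz(a, b):
    # the bold intersection / Lukasiewicz t-norm TL(x, y) = max{0, x + y - 1}
    return max(0.0, a + b - 1)

a, b = 0.6, 0.8
# Every t-norm is bounded above by min:
assert t_product(a, b) <= t_min(a, b)
assert t_lukasiewicz(a, b) <= t_min(a, b)
```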
  • 40. 40/72 Fuzzy intersection and Union  μA∩B(x) = T(μA(x), μB(x)), where T is a t-norm operator. Some possible t-norm operators:  Minimum: min(a, b) = a ∧ b  Algebraic product: ab  Bounded product: 0 ∨ (a + b − 1)
  • 41. 41/72 Membership function design Automatic or adaptive approaches: Neural Networks, Genetic Algorithms, Inductive reasoning, Gradient search.  Always use parameterizable membership functions; do not define a membership function point by point.  Triangular and trapezoidal membership functions are sufficient for most practical applications!
  • 42. 42/72 Triangular MF, Gaussian MF A triangular membership function is specified by three parameters {a, b, c}: Triangle(x; a, b, c) = 0 if x ≤ a; = (x − a)/(b − a) if a ≤ x ≤ b; = (c − x)/(c − b) if b ≤ x ≤ c; = 0 if c ≤ x.
  • 43. 43/72 Trapezoidal MF  Trapezoid(x; a, b, c, d) = 0 if x ≤ a;  = (x − a)/(b − a) if a ≤ x ≤ b;  = 1 if b ≤ x ≤ c;  = (d − x)/(d − c) if c ≤ x ≤ d;  = 0 if d ≤ x.
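The two piecewise definitions above translate directly into code (a minimal sketch; the function names are illustrative):

```python
def triangle(x, a, b, c):
    # Triangular MF: feet at a and c, peak of 1 at b
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def trapezoid(x, a, b, c, d):
    # Trapezoidal MF: feet at a and d, plateau of 1 on [b, c]
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

print(triangle(5, 0, 10, 20))        # 0.5
print(triangle(10, 0, 10, 20))       # 1.0
print(trapezoid(25, 0, 10, 20, 30))  # 0.5
```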
  • 44. 44/72 Types of Membership Functions [Figures: a trapezoid MF with parameters <a, b, c, d>; singleton MFs at (a, 1) and (b, 0.5); a triangular MF with parameters <a, b, d>.]
  • 45. 45/72 Operators on Fuzzy Sets Intersection: μA∩B(x) = min{μA(x), μB(x)}, or the algebraic product μA∩B(x) = μA(x) · μB(x). Union: μA∪B(x) = max{μA(x), μB(x)}, or the bounded sum μA∪B(x) = min{1, μA(x) + μB(x)}.
  • 46. 46/72 Fuzzy Logic  Fuzzy logic can be defined as a superset of conventional (Boolean) logic that has been extended to handle the concept of partial truth - truth values between “completely true” and “completely false”  Linguistic Variables: Variables used in fuzzy systems to express qualities such as height, which can take values such as “tall”, “short” or “very tall”.  Each linguistic variable may be assigned one or more linguistic values, which are in turn connected to a numeric value through the mechanism of membership functions.  Example: if temperature is cold, then pressure is low
  • 47. 47/72 Linguistic Variables A linguistic variable combines several fuzzy sets. Linguistic variable: temperature. Linguistic terms (fuzzy sets): {cold, warm, hot}. [Figure: membership functions for cold, warm and hot over temperature x in °C, with crossover points near 20 and 60.]
  • 48. 48/72 How the models work Inputs are converted to degrees of membership of fuzzy sets. Fuzzy rules are applied to obtain new fuzzy sets. These sets are then converted back to real numbers. Crisp data → Fuzzifier → membership (90% hot, 10% cold) → Fuzzy rules (IF 90% hot THEN 80% open; IF 10% cold THEN 20% closed) → Fuzzy output set (80% open, 20% closed) → Defuzzifier → Crisp data.
  • 49. 49/72 Fuzzy Rules  A fuzzy system operates on a collection of IF-THEN statements.  A fuzzy rule can be defined as a conditional statement of the form: A → B: IF x is A THEN y is B, where x and y are linguistic variables, and A and B are linguistic values determined by fuzzy sets on the universes of discourse X and Y, respectively. Antecedents of Fuzzy Rules: If the annual income of a person is High, then the person is rich. If the annual income of a person is High AND the amount requested is NOT huge, then recommend approving the loan.
  • 50. 50/72 Fuzzy Rules  Causal dependencies can be expressed in the form of if-then rules.  General form: if <antecedent> then <consequence>  Example: if temperature is cold and oil is cheap then heating is high. Here temperature and oil are linguistic variables; cold, cheap and high are linguistic values/terms (fuzzy sets).
  • 51. 51/72 Implicator  If p is a proposition of the form "x is A" and q is a proposition of the form "y is B", then we define the fuzzy implication A → B as a fuzzy relation, where A and B are fuzzy sets.  It is clear that (A → B)(u, v) should be defined pointwise, i.e., (A → B)(u, v) depends only on A(u) and B(v):  (A → B)(u, v) = I(A(u), B(v)) = A(u) → B(v)
  • 52. 52/72 Consequent of Fuzzy Rules  Fuzzy Consequent: If … then y is A  P: A → B  If x is A then y is B  Functional Consequent: If x1 is A1 AND x2 is A2 AND … xn is An then y = a0 + Σ ai xi  Designing Antecedent Memberships: two conditions should be satisfied  Each membership function overlaps only with the closest neighboring membership functions  For any possible input, its membership grades in all relevant fuzzy sets should sum to 1 (or nearly)  Ex: μA1(10) + μA2(10) + μA3(10) = 0.25 + 0.25 + 0 = 0.5 (figure: three overlapping membership functions A1, A2, A3 over x = 0..30, with grades 1 and 0.5 marked)
  • 53. 53/72 Firing Fuzzy Rules (figure: membership functions for “tall men” over height 160–200 cm and “heavy men” over weight 70–120 kg, degree of membership 0.0–1.0) These fuzzy sets provide the basis for a weight estimation model. The model is based on a relationship between a man’s height and his weight: IF height is tall THEN weight is heavy
  • 54. 54/72  The value of the output or a truth membership grade of the rule consequent can be estimated directly from a corresponding truth membership grade in the antecedent.
  • 55. 55/72 Generalized Modus Ponens  Single rule with single antecedent  Single rule with multiple antecedents Single-antecedent schema — Rule: if x is A then y is B; Fact: x is A’; Conclusion: y is B’
  • 56. 56/72 Single Rule with Multiple Antecedents Rule: if x is A and y is B then z is C; Fact: x is A and y is B; Conclusion: z is C
  • 57. 57/72 Multiple Antecedents IF x is A AND y is B THEN z is C IF x is A OR y is B THEN z is C Use the union (OR) or intersection (AND) operation to calculate a membership value for the whole antecedent. AND: μC(z) = min(μA(x), μB(y)) OR: μC(z) = max(μA(x), μB(y)) Example: IF project duration is long AND project staffing is large AND project funding is inadequate THEN risk is high IF service is excellent OR food is delicious THEN tip is generous
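The min/max evaluation of composite antecedents above is easy to sketch directly; the membership degrees plugged in below are illustrative values, not taken from the slides:

```python
# Sketch of evaluating composite antecedents with min (AND) and max (OR),
# as on the slide. Membership degrees are illustrative assumptions.

def and_antecedent(*degrees):
    return min(degrees)

def or_antecedent(*degrees):
    return max(degrees)

# IF duration is long AND staffing is large AND funding is inadequate
long_, large, inadequate = 0.6, 0.8, 0.3
risk_high = and_antecedent(long_, large, inadequate)   # min -> 0.3

# IF service is excellent OR food is delicious
excellent, delicious = 0.4, 0.9
tip_generous = or_antecedent(excellent, delicious)     # max -> 0.9

print(risk_high, tip_generous)
```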
  • 58. 58/72 Multiple Consequents IF x is A THEN y is B AND z is C. Each consequent is affected equally by the membership degree of the antecedent. E.g., IF x is tall THEN x is heavy AND x has large feet: μTall(x) = 0.7 ⇒ (μHeavy(x) = 0.7 and μLargefeet(x) = 0.7)
  • 59. 59/72 Multiple Rules with Multiple Antecedents Rule 1: if x is A1 and y is B1 then z is C1 Rule 2: if x is A2 and y is B2 then z is C2 Fact: x is A’ and y is B’ Conclusion: z is C’ (figure: graphical evaluation of both rules, matching A’ and B’ against A1, B1 and A2, B2 and combining the clipped consequents C1, C2)
  • 60. 60/72 Fuzzy Inference Systems  Fuzzy rule based systems, fuzzy models, and fuzzy expert systems are also known as fuzzy inference systems. The key unit of a fuzzy logic system is the FIS. The primary work of this system is decision-making. An FIS uses “IF...THEN” rules along with the connectives “OR” and “AND” for making the necessary decision rules.  The input to an FIS may be fuzzy or crisp, but the output from an FIS is always a fuzzy set. When an FIS is used as a controller, it is necessary to have crisp output. Hence, there should be a defuzzification unit for converting fuzzy variables into crisp variables alongside the FIS
  • 61. 61/72 Mamdani Fuzzy Inference  1. Fuzzification of the input variables: determine membership values  2. Rule evaluation: based on membership values of (composite) antecedents  3. Aggregation of the rule outputs: unify all membership values for the output from all rules  4. Defuzzification
  • 62. 62/72 Sugeno-style inference  The first steps of the fuzzy inference process  1. Fuzzifying the inputs  2. Applying the fuzzy operator  are exactly the same as in a Mamdani FIS  The main difference is that the output membership function in Sugeno’s type is either linear or a constant
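With constant consequents (zero-order Sugeno), the crisp output is simply the firing-strength-weighted average of the rule constants, so no separate defuzzification step is needed. A minimal sketch, with illustrative firing strengths and risk constants:

```python
# Sketch of zero-order Sugeno inference: each rule's consequent is a
# constant k_i, and the crisp output is the weighted average of the k_i
# by the rules' firing strengths w_i. Numbers are illustrative assumptions.

def sugeno(rules):
    """rules: list of (firing_strength w_i, consequent constant k_i)."""
    num = sum(w * k for w, k in rules)
    den = sum(w for w, _ in rules)
    return num / den

# Three fired rules with constant risk consequents 20, 50 and 80
rules = [(0.5, 20), (0.2, 50), (0.1, 80)]
print(sugeno(rules))  # (10 + 10 + 8) / 0.8 = 35.0
```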
  • 63. 63/72 Two-input, one-output example  Rule: 1 IF funding is adequate OR staffing is small THEN risk is low  Rule: 2 IF funding is marginal AND staffing is large THEN risk is normal  Rule: 3 IF funding is inadequate THEN risk is high
  • 64. 64/72 Inference steps  Step 1: Fuzzification: The first step is to take the crisp inputs (let funding and staffing be x1 and y1) and determine the degree to which these inputs belong to each of the appropriate fuzzy sets.  Step 2: Rule Evaluation: take the fuzzified inputs, e.g. μ(x ∈ A1) = 0.5, μ(x ∈ A2) = 0.2, μ(y ∈ B1) = 0.1 and μ(y ∈ B2) = 0.7, and apply them to the antecedents of the fuzzy rules. If a given fuzzy rule has multiple antecedents, the fuzzy operator (AND or OR) is used to obtain a single number that represents the result of the antecedent evaluation. This number (the truth value) is then applied to the consequent membership function. To evaluate the disjunction of rule antecedents, we use the OR fuzzy operation, typically the fuzzy union: μA∪B(x) = max[μA(x), μB(x)]. Similarly, to evaluate the conjunction of rule antecedents, we apply the AND fuzzy operation, the intersection: μA∩B(x) = min[μA(x), μB(x)]
  • 66. 66/72 The result of the antecedent evaluation can now be applied to the membership function of the consequent. There are two methods available: clipping and scaling  Clipping: cuts off the top of the consequent membership function above the matching degree.  Scaling: scales down the membership function in proportion to the matching degree  (figure: clipping and scaling methods)
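Clipping and scaling differ only in how the firing strength is applied to the consequent membership function, which a short sketch makes concrete. The triangular set and the 0.4 firing strength below are illustrative assumptions:

```python
# Sketch of clipping vs. scaling a consequent membership function mu
# by a rule's firing strength. The triangular set and the strength 0.4
# are illustrative assumptions.

def clip(mu, strength):
    """Clipping: cap the membership function at the firing strength."""
    return lambda z: min(mu(z), strength)

def scale(mu, strength):
    """Scaling: multiply the membership function by the firing strength."""
    return lambda z: mu(z) * strength

tri = lambda z: max(0.0, 1 - abs(z - 5) / 5)  # triangular set centred at 5

clipped = clip(tri, 0.4)
scaled = scale(tri, 0.4)
print(clipped(5), scaled(5))  # both 0.4 at the peak
print(clipped(7), scaled(7))  # 0.4 vs ~0.24: clipping flattens, scaling keeps the shape
```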
  • 67. 67/72 Step 3 : Aggregation of Rule outputs  Aggregation is the process of unification of the outputs of all rules. We take the membership functions of all rule consequents previously clipped or scaled and combine them into a single fuzzy set. The input of the aggregation process is the list of clipped or scaled consequent membership functions, and the output is one fuzzy set for each output variable
  • 68. 68/72 Defuzzification Methods  Max-membership principle: choose z* with μC(z*) ≥ μC(z) for all z ∈ Z  Centroid principle: z* = ∫ μC(z)·z dz / ∫ μC(z) dz
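The centroid principle is usually computed numerically on a discrete grid over the output universe. A minimal sketch, assuming an illustrative triangular aggregated output set:

```python
# Numerical sketch of centroid defuzzification,
#   z* = integral(mu(z) * z dz) / integral(mu(z) dz),
# approximated by sums on a uniform grid. The aggregated output set
# below is an illustrative symmetric triangle, so its centroid is 50.

def centroid(mu, z_min, z_max, n=1000):
    step = (z_max - z_min) / n
    zs = [z_min + i * step for i in range(n + 1)]
    num = sum(mu(z) * z for z in zs)
    den = sum(mu(z) for z in zs)
    return num / den

tri = lambda z: max(0.0, 1 - abs(z - 50) / 20)  # triangle centred at 50
print(round(centroid(tri, 0, 100), 2))  # 50.0
```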
  • 69. 69/72 Fuzzy Rough sets Fuzzy sets and rough sets address two important, and mutually orthogonal, characteristics of imperfect data and knowledge: while the former allow that objects belong to a set or relation to a given degree, the latter provide approximations of concepts in the presence of incomplete information. Within the hybrid theory, Pawlak’s well-known framework for constructing lower and upper approximations of a concept C given incomplete information (a subset A of a given universe X containing examples of C, and an equivalence relation R on X that models “indiscernibility” or “indistinguishability”) has been extended in two ways: 1. The set A may be generalized to a fuzzy set in X, allowing that objects can belong to a concept (i.e., meet its characteristics) to varying degrees. 2. Rather than modeling elements’ indistinguishability, we may assess their similarity (objects are similar to a certain degree), represented by a fuzzy relation R. As a result, objects are categorized into classes, or granules, with “soft” boundaries based on their similarity to one another.
  • 70. 70/72 Fuzzy Rough sets  Define the lower and upper approximation of a fuzzy set A in X as the fuzzy sets R↓A and R↑A in X, constructed by means of an implicator I, a t-norm T and a fuzzy T-equivalence relation R in X: R↓A(y) = inf x∈X I(R(x, y), A(x)) R↑A(y) = sup x∈X T(R(x, y), A(x)) for all y in X. The pair (R↓A, R↑A) is called a fuzzy rough set (in (X, R)); the two values are interpreted as the degree of inclusion of Ry in A and the degree of overlap of Ry and A, respectively. Let R be a fuzzy relation in X and A a fuzzy set in X. The tight, loose and (usual) lower approximations of A are defined as  1. R↓↓A(y) = inf z∈X I(Rz(y), inf x∈X I(Rz(x), A(x)))  2. R↑↓A(y) = sup z∈X T(Rz(y), inf x∈X I(Rz(x), A(x)))  3. R↓A(y) = inf x∈X I(Ry(x), A(x))
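On a finite universe the infima and suprema become min and max, so R↓A and R↑A are directly computable. A sketch using the min t-norm and the Kleene-Dienes implicator I(a, b) = max(1 − a, b) — common choices, but an assumption, since the slides leave T and I open:

```python
# Sketch of the lower/upper approximations R_down(A) and R_up(A) on a
# finite universe, with T = min and the Kleene-Dienes implicator
# I(a, b) = max(1 - a, b). These choices of T and I are assumptions.

X = ["a", "b"]
R = {("a", "a"): 1.0, ("a", "b"): 0.2,
     ("b", "a"): 0.2, ("b", "b"): 1.0}   # fuzzy T-equivalence relation
A = {"a": 1.0, "b": 0.8}                 # fuzzy set to approximate

T = min                                  # t-norm
I = lambda a, b: max(1 - a, b)           # implicator

def lower(y):  # R_down(A)(y) = inf_x I(R(x, y), A(x))
    return min(I(R[(x, y)], A[x]) for x in X)

def upper(y):  # R_up(A)(y) = sup_x T(R(x, y), A(x))
    return max(T(R[(x, y)], A[x]) for x in X)

print({y: lower(y) for y in X})
print({y: upper(y) for y in X})  # {'a': 1.0, 'b': 0.8}
```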
  • 71. 71/72  The tight, loose and (usual) upper approximations of A are defined as  (a) R↓↑A(y) = inf z∈X I(Rz(y), sup x∈X T(Rz(x), A(x)))  (b) R↑↑A(y) = sup z∈X T(Rz(y), sup x∈X T(Rz(x), A(x)))  (c) R↑A(y) = sup x∈X T(Ry(x), A(x))  for all y in X  Consider the fuzzy T-equivalence relation R on X = {a, b} given by R(a, a) = R(b, b) = 1.0 and R(a, b) = R(b, a) = 0.2, and the fuzzy set A in X defined by A(a) = 1 and A(b) = 0.8. Then R↑A(a) = 1 and R↑A(b) = 0.8, hence (R↓↑A)(a) = min(max(0, 1), max(0.8, 0.8)) = 0.8.  It is clear that R↓↑A ⊆ A in this example.
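The worked example can be checked numerically. The min t-norm and the Kleene-Dienes implicator I(a, b) = max(1 − a, b) are assumptions here, chosen because they are consistent with the slide's arithmetic max(0, 1) and max(0.8, 0.8):

```python
# Checking the slide's example: X = {a, b}, R(a, b) = R(b, a) = 0.2,
# A(a) = 1, A(b) = 0.8, with T = min and the Kleene-Dienes implicator
# I(a, b) = max(1 - a, b) (assumed; consistent with the slide's numbers).

X = ["a", "b"]
R = {("a", "a"): 1.0, ("a", "b"): 0.2,
     ("b", "a"): 0.2, ("b", "b"): 1.0}
A = {"a": 1.0, "b": 0.8}
T = min
I = lambda a, b: max(1 - a, b)

def up(y):       # R_up(A)(y) = sup_x T(R(y, x), A(x))
    return max(T(R[(y, x)], A[x]) for x in X)

def down_up(y):  # tight upper: R_down_up(A)(y) = inf_z I(R(z, y), R_up(A)(z))
    return min(I(R[(z, y)], up(z)) for z in X)

print(up("a"), up("b"))  # 1.0 0.8, as on the slide
print(down_up("a"))      # min(max(0, 1), max(0.8, 0.8)) = 0.8
```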