ORIGINAL ARTICLE
An incremental approach to attribute reduction of dynamic
set-valued information systems
Guangming Lang • Qingguo Li • Tian Yang
Received: 3 July 2013 / Accepted: 15 December 2013 / Published online: 1 January 2014
© Springer-Verlag Berlin Heidelberg 2013
Abstract Set-valued information systems are important
generalizations of single-valued information systems. In
this paper, three relations are proposed for attribute
reduction of set-valued information systems. Then, we
convert a large-scale set-valued information system into a
smaller relation information system. An incremental algo-
rithm is designed to compress dynamic set-valued infor-
mation systems. Concretely, we mainly address the
compression updating from three aspects: variations of
attribute set, immigration and emigration of objects and
alterations of attribute values. Finally, several illustrative
examples are employed to demonstrate that attribute
reduction of dynamic set-valued information systems is simplified significantly by our proposed approaches.
Keywords Rough sets · Attribute reduction · Homomorphism · Set-valued information system · Dynamic set-valued information system
1 Introduction
Rough set theory proposed by Pawlak [40] is a powerful
mathematical tool to deal with vagueness and uncertainty
of information. However, the condition of an equivalence relation is so restrictive that it limits the applications of rough sets in practice. By combining rough sets with fuzzy sets [1–6, 17, 23, 24, 36, 37, 39, 48, 49], probability theory [32, 33, 44, 45, 53–55, 62], topology [16, 18, 42, 43, 50, 52, 56, 59], matroid theory [47] and other theories [29, 38], rough set theory has
been successfully applied to various areas such as knowl-
edge discovery, data mining and pattern recognition.
Set-valued information systems, as generalized models of single-valued information systems and a representation of incomplete information, have attracted a great deal of attention [8–15, 20, 22, 25, 31, 41, 51, 52, 57]. For
example, Guan et al. [22] initially introduced set-valued
information systems and investigated their basic properties.
Chen et al. [8, 9] studied attribute reduction of set-valued
information systems based on tolerance relations and var-
iable tolerance relations. Liu et al. [31] discussed attribute
reduction of set-valued information systems on the basis of
maximal variable precision tolerance classes. Zhang et al.
[57] introduced matrix approaches for approximations of
concepts in set-valued information systems with dynamic
attribute variation. In practice, the tolerance relation dis-
cerns objects on the basis of whether there are common
attribute values or not, and it neglects some differences
between objects. For instance, there are three objects
x, y and z in an incomplete information system, and their
attribute values a(x), a(y) and a(z) are 1, 0 and * with respect to attribute a, respectively, where * stands for the
lost value. Since the lost value is considered to be similar to
any value in the domain of the corresponding attribute,
x and y are in the tolerance class for z with respect to a. But
G. Lang · Q. Li (✉)
College of Mathematics and Econometrics, Hunan University, Changsha 410082, Hunan, People's Republic of China
e-mail: liqingguoli@aliyun.com
G. Lang
e-mail: langguangming1984@126.com
T. Yang
College of Science, Central South University of Forestry
and Technology, Changsha 410082, Hunan,
People’s Republic of China
e-mail: math-yangtian@126.com
Int. J. Mach. Learn. & Cyber. (2014) 5:775–788
DOI 10.1007/s13042-013-0225-x
a(x) ≠ a(y). In other words, it may happen that two
objects are in the same tolerance class with respect to an
attribute, but there are no common attribute values with
respect to the attribute. Therefore, it is important to present
new relations for set-valued information systems.
In recent years, homomorphisms [19, 21, 27, 30, 46, 60,
61] have been considered as an important approach for
attribute reduction of information systems. For instance,
Grzymala-Busse [21] initially introduced seven kinds of
homomorphisms of knowledge representation systems and
investigated their basic properties in detail. Afterwards,
scholars discussed the relationship between information
systems by means of different homomorphisms [19, 27, 46,
61]. But few attempts have been made to compress
set-valued information systems under the condition of
homomorphisms. Furthermore, there have been many
papers concerning dynamic information systems [7, 26, 28,
34, 35, 58]. For example, Chen et al. [7] brought forward a
dynamic maintenance approach for approximations in
coarsening and refining attribute values. Li et al. [26]
investigated incremental updating approximations in
dominance-based rough sets approach under the variation
of attribute set. Li et al. [28] introduced the characteristic
relation approach for dynamic attribute generalization. Liu
et al. [34, 35] proposed an incremental approach for
inducing knowledge from dynamic information systems.
To deal with numerical data, Zhang et al. [58] presented a
new dynamic method for incrementally updating approxi-
mations of a concept under neighborhood rough sets. In
practice, set-valued information systems also vary with
time due to dynamic characteristics of data collection, and
the non-incremental approach to compressing dynamic set-
valued information systems is often very costly or even
intractable. Therefore, it is essential to apply an incre-
mental updating scheme to maintain the compression
dynamically and avoid unnecessary computations.
The purpose of this paper is to further study set-valued
information systems. First, we present three new relations
and two types of discernibility matrixes for set-valued
information systems. We also investigate their basic
properties. Second, a large-scale set-valued information
system is compressed into a relatively smaller relation
information system by using the proposed relations and
information system homomorphisms. Third, we design an
incremental algorithm of compressing dynamic set-valued
information systems. Particularly, we mainly address the
compression updating from three aspects: variations of
attribute set, immigration and emigration of objects and
alterations of attribute values. The computational com-
plexity of attribute reduction of dynamic set-valued infor-
mation systems can be reduced greatly.
The rest of this paper is organized as follows. Section 2
briefly reviews the basic concepts of set-valued
information systems and consistent functions. In Sect. 3,
we put forward three relations and two types of discern-
ibility matrixes for set-valued information systems. Section
4 is devoted to compressing set-valued information systems
for attribute reduction. In Sect. 5, we compress dynamic
set-valued information systems by using an incremental
algorithm. We conclude the paper and set further research
directions in Sect. 6.
2 Preliminaries
In this section, we briefly review some concepts about set-
valued information systems and relation information sys-
tems. In addition, an example is employed to illustrate set-
valued information systems.
Definition 2.1 [22] Suppose S = (U, A, V, f) is a set-valued information system (denoted as SIS), where U = {x1, x2, ..., xn} is a non-empty finite set of objects, A = {a1, a2, ..., am} is a non-empty finite set of attributes, V is the set of attribute values, and f: U × A → 2^V is a set-valued mapping.
Single-valued information systems are regarded as special cases of set-valued information systems. There are many semantic interpretations for set-valued information systems; we summarize two of them as follows:
Type 1: For x ∈ U and a ∈ A, f(x, a) is interpreted conjunctively. For example, if a is the attribute "speaking language", then f(x, a) = {German, French, Polish} can be viewed as: x speaks German, French and Polish, i.e., x can speak three languages.
Type 2: For x ∈ U and a ∈ A, f(x, a) is interpreted disjunctively. For instance, if a is the attribute "speaking language", then f(x, a) = {German, French, Polish} can be regarded as: x speaks German, French or Polish, and x can speak only one of them.
For set-valued information systems, Guan et al. and
Chen et al. presented concepts of tolerance relation and
variable precision tolerance relation, respectively.
Definition 2.2 [22] Let S = (U, A, V, f) be a set-valued information system, a ∈ A, and B ⊆ A. Then the tolerance relations R_a and R_B are defined as follows:
R_a = {(xi, xj) | f(xi, a) ∩ f(xj, a) ≠ ∅, xi, xj ∈ U},
R_B = {(xi, xj) | ∀b ∈ B, f(xi, b) ∩ f(xj, b) ≠ ∅, xi, xj ∈ U}.
In other words, (x, y) ∈ R_B is viewed as: x and y are indiscernible with respect to B, and R_B(x) is the tolerance class of x with respect to B.
Definition 2.3 [8] Let S = (U, A, V, f) be a set-valued information system, a_l ∈ A, B ⊆ A, xi, xj ∈ U, c^l_ij = |f(xi, a_l) ∩ f(xj, a_l)| / |f(xi, a_l) ∪ f(xj, a_l)|, and α ∈ (0, 1]. Then the relations R^α_{a_l} and R^α_B are defined as follows:
R^α_{a_l} = {(xi, xj) ∈ U × U | c^l_ij ≥ α},
R^α_B = {(xi, xj) ∈ U × U | ∀a_l ∈ B, c^l_ij ≥ α}.
The following example shows that there are some issues
related to the tolerance relation and variable precision
tolerance relation.
Example 2.4 Table 1 depicts a set-valued information system. In the sense of Definition 2.2, R_{a1}(x2) = {x1, x2, x3, x4, x5, x6}. Obviously, we have that {(x1, x2), (x3, x2)} ⊆ R_{a1}. But |f(x1, a1) ∩ f(x2, a1)| = 1 and |f(x2, a1) ∩ f(x3, a1)| = 2. Furthermore, we obtain that {(x1, x4), (x6, x4)} ⊆ R_{a1}. But f(x1, a1) ∩ f(x4, a1) = {0} and f(x6, a1) ∩ f(x4, a1) = {1}. Although there are some differences between objects which are in the same tolerance class, R_{a1} cannot discern them.
By Definition 2.3, we have that {(x1, x4), (x2, x3), (x4, x6), (x5, x6)} ⊆ R^{0.5}_{a1}. Furthermore, we obtain that f(x1, a1) ∩ f(x4, a1) = {0}, f(x2, a1) ∩ f(x3, a1) = {1, 2}, f(x4, a1) ∩ f(x6, a1) = {1} and f(x5, a1) ∩ f(x6, a1) = {1}. It is obvious that {1, 2} ≠ {1} and {1} ≠ {0}. But we cannot capture this difference in terms of Definition 2.3.
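To make Definitions 2.2 and 2.3 concrete, the following minimal Python sketch (ours, not part of the paper; the dictionary-of-sets encoding of Table 1 and the function names are our own choices) computes tolerance classes and variable precision tolerance classes and reproduces the memberships discussed in Example 2.4.

```python
# A minimal sketch (not from the paper): the tolerance relation of Definition 2.2 and
# the variable precision tolerance relation of Definition 2.3, checked against Table 1.

TABLE_1 = {
    "x1": {"a1": {0},       "a2": {0},    "a3": {1, 2}, "a4": {1, 2}},
    "x2": {"a1": {0, 1, 2}, "a2": {1, 2}, "a3": {1, 2}, "a4": {0, 1, 2}},
    "x3": {"a1": {1, 2},    "a2": {1},    "a3": {1},    "a4": {1, 2}},
    "x4": {"a1": {0, 1},    "a2": {0, 2}, "a3": {1, 2}, "a4": {1, 2}},
    "x5": {"a1": {1, 2},    "a2": {1, 2}, "a3": {1, 2}, "a4": {1}},
    "x6": {"a1": {1},       "a2": {1},    "a3": {0, 1}, "a4": {0, 1}},
}

def tolerance_class(S, x, B):
    """R_B(x): objects sharing at least one value with x on every attribute of B."""
    return {y for y in S if all(S[x][a] & S[y][a] for a in B)}

def vp_tolerance_class(S, x, B, alpha):
    """R^alpha_B(x): objects whose overlap ratio c with x is at least alpha on every attribute of B."""
    return {y for y in S
            if all(len(S[x][a] & S[y][a]) / len(S[x][a] | S[y][a]) >= alpha for a in B)}

print(sorted(tolerance_class(TABLE_1, "x2", ["a1"])))          # all six objects, as in Example 2.4
print(sorted(vp_tolerance_class(TABLE_1, "x1", ["a1"], 0.5)))  # ['x1', 'x4'], since |{0}|/|{0,1}| = 0.5
```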
Wang et al. presented a concept of consistent functions
for attribute reduction of relation information systems.
Definition 2.5 [46] Let U1 and U2 be two universes, f a mapping from U1 to U2, R a relation on U1 (i.e., a mapping from U1 × U1 to {0, 1}), and [x]_f = {y ∈ U1 | f(x) = f(y)}. For any x, y ∈ U1, if R(u, v) = R(s, t) for any two pairs (u, v), (s, t) ∈ [x]_f × [y]_f, then f is said to be consistent with respect to R.
If a consistent function is a surjection, then it is a homomorphism between relation information systems. Under the condition of a homomorphism, a large-scale information system can be compressed into a smaller one, and it has been proved that attribute reduction of the original system and of the image system are equivalent to each other. Therefore, consistent functions provide an approach to compressing relation information systems.
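As a rough illustration of Definition 2.5 (a sketch under our own encoding, not the authors' code: R is a set of ordered pairs over U1 and f is a dictionary), the following fragment checks whether a mapping is consistent with respect to a relation.

```python
# A sketch (ours) of Definition 2.5: f is consistent with respect to R when R is
# constant on every product block [x]_f x [y]_f.

from itertools import product

def is_consistent(U1, R, f):
    blocks = {}
    for x in U1:                              # group U1 by the image under f
        blocks.setdefault(f[x], []).append(x)
    for bx, by in product(blocks.values(), repeat=2):
        values = {(u, v) in R for u, v in product(bx, by)}
        if len(values) > 1:                   # R takes both values on [x]_f x [y]_f
            return False
    return True
```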
3 Three relations for set-valued information systems
In this section, we propose three relations to address the
problems illustrated in Example 2.4 and present two types
of discernibility matrixes for set-valued information
systems.
Definition 3.1 Let (U, A, V, f) be a set-valued information system, a ∈ A, and B ⊆ A. Then the relations R^>_(a,h) and R^>_(B,H_B) are defined as follows:
R^>_(a,h) = {(x, y) | |f(x, a) ∩ f(y, a)| > h, x, y ∈ U},
R^>_(B,H_B) = {(x, y) | |f(x, ai) ∩ f(y, ai)| > hi, x, y ∈ U, ai ∈ B},
where |·| denotes the cardinality of a set, H_B = (h1, h2, ..., hm) and 0 ≤ hi ≤ |V_{ai}|.
The physical meaning of (x, y) ∈ R^>_(a,h) is that there exist at least h + 1 common values between x and y with respect to a. In terms of Definition 3.1, we obtain that R_a = R^>_(a,0), R_B = R^>_(B,(0,0,...,0)) and R^>_(B,H_B) = ∩_{ai∈B} R^>_(ai,hi). For convenience of representation, we denote R^>_(B,H_B)(x) = [x]^>_(B,H_B) = {y | (x, y) ∈ R^>_(B,H_B)}. Furthermore, K = (k1, k2, ..., km) ≤ H_B if and only if ki ≤ hi for 1 ≤ i ≤ m. If {R^>_(a,h)(x) | x ∈ U} is a covering of U, then R^>_(a,h) is called a >h-relation. In general, R^>_(a,h) and R^>_(B,H_B) are symmetric and intransitive; moreover, R^>_(a,h) and R^>_(B,H_B) are not necessarily reflexive if h > 0 and H_B ≥ (1, 1, ..., 1), respectively. For example, considering Table 1, we obtain that R^>_(a1,h)(x1) = ∅ for any h ≥ 1.
Example 3.2 (Continuation of Example 2.4) Let a1, 0, 1 and 2 denote language, German, French and Polish, respectively. Then we have f(x2, a1) = {German, French, Polish} and f(x3, a1) = {French, Polish}. Consequently, we obtain that (x2, x3) ∈ R^>_(a1,1). In other words, x2 and x3 speak at least two common languages.
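A small sketch of Definition 3.1 (ours; it reuses the TABLE_1 dictionary from the sketch after Example 2.4) shows how the >h-relations refine the tolerance relation: with h = 1, the class of x2 keeps only the objects sharing at least two values of a1, which is exactly the situation of Example 3.2.

```python
# A sketch (ours) of Definition 3.1: (x, y) is in R^>_(a,h) when x and y share more
# than h values on a; R^>_(B,H_B) intersects these relations over the attributes of B.

def gt_class(S, x, a, h):
    """R^>_(a,h)(x) = { y : |f(x,a) ∩ f(y,a)| > h }."""
    return {y for y in S if len(S[x][a] & S[y][a]) > h}

def gt_class_B(S, x, B, H):
    """R^>_(B,H_B)(x) for attributes B = [a_1, ...] and thresholds H = [h_1, ...]."""
    return {y for y in S if all(len(S[x][a] & S[y][a]) > h for a, h in zip(B, H))}

print(sorted(gt_class(TABLE_1, "x2", "a1", 0)))   # the tolerance class R_{a1}(x2)
print(sorted(gt_class(TABLE_1, "x2", "a1", 1)))   # ['x2', 'x3', 'x4', 'x5']: at least two shared values
```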
Proposition 3.3 Let (U, A, V, f) be a set-valued information system, and B, C ⊆ A. Then we have: (1) if H_B ≤ H_C ≤ H_A, then R^>_(A,H_A) ⊆ R^>_(C,H_C) ⊆ R^>_(B,H_B); (2) if H_B ≤ H_C ≤ H_A, then [x]^>_(A,H_A) ⊆ [x]^>_(C,H_C) ⊆ [x]^>_(B,H_B).
Definition 3.4 Let S = (U, A, V, f) be a set-valued information system, R^>_A = {R^>_(a1,h1), R^>_(a2,h2), ..., R^>_(am,hm)}, and each R^>_(ai,hi) a >hi-relation. Then (U, R^>_A) is called an induced >-relation information system of S.

Table 1 A set-valued information system
U    a1         a2       a3       a4
x1   {0}        {0}      {1, 2}   {1, 2}
x2   {0, 1, 2}  {1, 2}   {1, 2}   {0, 1, 2}
x3   {1, 2}     {1}      {1}      {1, 2}
x4   {0, 1}     {0, 2}   {1, 2}   {1, 2}
x5   {1, 2}     {1, 2}   {1, 2}   {1}
x6   {1}        {1}      {0, 1}   {0, 1}
Example 3.5 Considering Table 1, we obtain an induced >-relation information system (U, R^>_A), where R^>_A = {Ri | 1 ≤ i ≤ 4} and
R^>_(a1,0)(x1) = {x1, x2, x4};
R^>_(a1,0)(x2) = R^>_(a1,0)(x4) = {x1, x2, x3, x4, x5, x6};
R^>_(a1,0)(x3) = R^>_(a1,0)(x5) = R^>_(a1,0)(x6) = {x2, x3, x4, x5, x6};
R^>_(a2,0)(x1) = {x1, x4};
R^>_(a2,0)(x2) = R^>_(a2,0)(x5) = {x2, x3, x4, x5, x6};
R^>_(a2,0)(x3) = R^>_(a2,0)(x6) = {x2, x3, x5, x6};
R^>_(a2,0)(x4) = {x1, x2, x4, x5};
R^>_(a3,0)(x1) = R^>_(a3,0)(x2) = R^>_(a3,0)(x3) = R^>_(a3,0)(x4) = R^>_(a3,0)(x5) = R^>_(a3,0)(x6) = {x1, x2, x3, x4, x5, x6};
R^>_(a4,0)(x1) = R^>_(a4,0)(x2) = R^>_(a4,0)(x3) = R^>_(a4,0)(x4) = R^>_(a4,0)(x5) = R^>_(a4,0)(x6) = {x1, x2, x3, x4, x5, x6}.
Definition 3.6 Let (U, R^>_A) be an induced >-relation information system of S = (U, A, V, f), and P ⊆ A. If ∩R^>_P = ∩R^>_A and ∩R^>_{P*} ≠ ∩R^>_A for any R^>_{P*} ⊊ R^>_P, then R^>_P is called a reduct of (U, R^>_A).
By Definition 3.6, a reduct is a minimal subset of relations preserving ∩R^>_A. In Example 3.5, we can get a reduct {R2} for (U, R^>_A).
In the sense of Definition 3.1, we propose a discern-
ibility matrix for set-valued information systems.
Definition 3.7 Let S = (U, A, V, f) be a set-valued information system. Then its discernibility matrix M_A = (M(x, y)) is a |U| × |U| matrix whose element M(x, y) is defined by
M(x, y) = {a ∈ A | (x, y) ∉ R^>_(a,h_a), x, y ∈ U},
where R^>_(a,h_a) is a >h_a-relation.
That is, the physical meaning of M(x, y) is that objects x and y can be distinguished by any element of M(x, y) in S. If M(x, y) ≠ ∅, then objects x and y can be discerned. It is sufficient to consider only the lower triangle or the upper triangle of the matrix, since the discernibility matrix M_A is symmetric.
Definition 3.8 Let S = (U, A, V, f) be a set-valued information system, and M_A = (M(x, y)) the discernibility matrix of S. Then D = ∧_{(x,y)∈U²} ∨M(x, y) is called the discernibility function of S.
The expression ∨M(x, y) denotes the disjunction of all attributes in M(x, y), and the expression ∧{∨M(x, y)} stands for the conjunction of all ∨M(x, y). In addition, ∧B is a prime implicant of the discernibility function D if and only if B is a reduct of S.
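For small systems, Definitions 3.7 and 3.8 can be evaluated directly. The sketch below (ours; it continues the TABLE_1 dictionary from the earlier sketch, and the brute-force subset search merely stands in for computing the prime implicants of the discernibility function) builds M(x, y) from the >h_a-relations and returns a smallest attribute subset hitting every non-empty entry.

```python
# A sketch (ours) of Definitions 3.7 and 3.8: build the discernibility matrix from the
# >h_a-relations, then search for a smallest attribute subset that intersects every
# non-empty entry.

from itertools import combinations

def discernibility_matrix(S, attrs, h):
    """M(x, y) = attributes a with |f(x,a) ∩ f(y,a)| <= h[a], i.e. (x, y) not in R^>_(a,h_a)."""
    objs = list(S)
    return {(x, y): {a for a in attrs if len(S[x][a] & S[y][a]) <= h[a]}
            for i, x in enumerate(objs) for y in objs[i + 1:]}

def reduct_from_matrix(M, attrs):
    entries = [e for e in M.values() if e]            # empty entries impose no constraint
    for k in range(1, len(attrs) + 1):
        for B in combinations(attrs, k):
            if all(set(B) & e for e in entries):      # B intersects every non-empty M(x, y)
                return set(B)
    return set(attrs)

attrs = ["a1", "a2", "a3", "a4"]
M = discernibility_matrix(TABLE_1, attrs, {a: 0 for a in attrs})
print(reduct_from_matrix(M, attrs))                   # {'a2'}, matching the reduct {R2} above
```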
Definition 3.9 Let S = (U, A ∪ {d}, V, f) be a set-valued decision information system (denoted as SDIS), where d is a decision attribute. Then its discernibility matrix M_d = (M_d(x, y)) is defined as a |U| × |U| matrix, where
M_d(x, y) = ∅ if d(x) = d(y), and M_d(x, y) = {a ∈ A | (x, y) ∉ R^>_(a,h_a), x, y ∈ U} otherwise.
In other words, the physical meaning of M_d(x, y) is that objects x and y can be distinguished by any element of M_d(x, y) in S. If M_d(x, y) ≠ ∅, then objects x and y can be discerned. It is sufficient to consider only the lower triangle or the upper triangle of M_d, since it is symmetric.
Definition 3.10 Let S = (U, A ∪ {d}, V, f) be a SDIS, and M_d = (M_d(x, y)) the discernibility matrix of S. Then D_d = ∧_{(x,y)∈U²} ∨M_d(x, y) is called the discernibility function of S.
The expression ∨M_d(x, y) denotes the disjunction of all attributes in M_d(x, y), and the expression ∧{∨M_d(x, y)} stands for the conjunction of all ∨M_d(x, y). In addition, ∧B is a prime implicant of the discernibility function D_d if and only if B is a reduct of S.
Definition 3.11 Let (U, A, V, f) be a set-valued information system, a ∈ A, and B ⊆ A. Then the relations R_(a,h) and R_(B,H_B) are defined as
R_(a,h) = {(x, y) | |f(x, a) ∩ f(y, a)| = h, x, y ∈ U},
R_(B,H_B) = {(x, y) | |f(x, ai) ∩ f(y, ai)| = hi, x, y ∈ U, ai ∈ B}.
The physical meaning of (x, y) ∈ R_(a,h) is that there exist exactly h common values between x and y with respect to a. In the sense of Definition 3.11, R_(a,h) and R_(B,H_B) are special cases of Definition 3.1, but the condition of the relations presented in Definition 3.11 is stricter than that of Definition 3.1. They can be applied in practice with respect to different requests. Meanwhile, we have that R^>_(a,h) = ∪_{j>h} R_(a,j) and R^>_(B,H_B) = ∪_{K>H_B} R_(B,K). For simplicity, we denote R_(B,H_B)(x) = [x]_(B,H_B) = {y | (x, y) ∈ R_(B,H_B)}. For example, we get that (x2, x3) ∈ R_(a1,2) in Example 3.2; in other words, x2 and x3 speak exactly two common languages.
Property 3.12 Let (U, A, V, f) be a set-valued information system, and B, C ⊆ A. Then we have:
(1) if H_B ≤ H_C ≤ H_A, then R_(A,H_A) ⊆ R_(C,H_C) ⊆ R_(B,H_B);
(2) if H_B ≤ H_C ≤ H_A, then [x]_(A,H_A) ⊆ [x]_(C,H_C) ⊆ [x]_(B,H_B).
Definition 3.13 Let (U, A, V, f) be a set-valued information system, a ∈ A, B ⊆ A, and P ⊆ V_a. Then the relations R_(a,P) and R_(B,P) are defined as
R_(a,P) = {(x, y) | f(x, a) ∩ f(y, a) = P, x, y ∈ U},
R_(B,P) = {(x, y) | f(x, ai) ∩ f(y, ai) = Pi, x, y ∈ U, ai ∈ B},
where, for the latter relation, P = (P1, P2, ..., Pm), and Pi ⊆ V_{ai} if ai ∈ B (respectively, Pi = ∅ if ai ∉ B).
The physical meaning of (x, y) ∈ R_(a,P) is that the set of common values between x and y with respect to a is exactly P. In the sense of Definition 3.13, we obtain that (x2, x3) ∈ R_(a1,P) in Example 3.2, where P = {French, Polish}; in other words, x2 and x3 both speak French and Polish. The condition of the relations presented in Definition 3.13 is stricter than those of Definitions 3.1 and 3.11. The proposed relations can be applied in practical situations with respect to different requests. For simplicity, we do not present discernibility matrixes based on Definitions 3.11 and 3.13 in this section.
By Definitions 3.11 and 3.13, we observe that R_(a,h) = ∪{R_(a,P) | P ∈ 2^{V_a}, |P| = h}. Furthermore, R_(a,P) and R_(B,P) are symmetric and intransitive. According to Definitions 3.1, 3.11 and 3.13, we obtain
R^>_(a,h) = ∪_{i>h} R_(a,i) = ∪_{i>h} ∪ {R_(a,P) | |P| = i, P ∈ 2^{V_a}}
and
R^>_(B,H_B) = ∩_{a∈B} {∪_{i>h_a} R_(a,i)} = ∩_{a∈B} {∪_{i>h_a} ∪ {R_(a,P) | |P| = i, P ∈ 2^{V_a}}},
where h_a denotes the threshold associated with a in H_B.
In the sense of Definitions 3.1, 3.11 and 3.13, we obtain different granularities of the universe. Moreover, the granularity decreases from Definition 3.1 to Definition 3.11 to Definition 3.13: among the proposed relations, Definition 3.1 yields the coarsest granularity and Definition 3.13 the finest. In practice, we take the relation corresponding to the request at hand.
4 Attribute reduction of SIS and SDIS
under homomorphisms
In practical situations, it is time-consuming to conduct
attribute reduction of the large-scale set-valued information
systems. To solve this issue, we propose a transforming
algorithm (from an original SIS/SDIS to a relation SIS/SDIS)
for attribute reduction of SIS and SDIS in this section.
4.1 Attribute reduction of SIS by using
homomorphisms
In this subsection, after deriving an induced >-relation information system of a SIS, we convert it into a smaller one under the condition of homomorphisms. Several examples are employed to illustrate that the computational complexity of computing attribute reducts is reduced greatly by means of homomorphisms.
Definition 4.1 Let (U1, R^>_A) be an induced >-relation information system of S = (U1, A, V, f), and R^>_(ai,hi) ∈ R^>_A. Then U1/R^>_(ai,hi) = {[x]_{R^>_(ai,hi)} | x ∈ U1} is called a partition based on R^>_(ai,hi), where [x]_{R^>_(ai,hi)} = {y | R^>_(ai,hi)(x) = R^>_(ai,hi)(y), y ∈ U1} for x ∈ U1.
We can also discuss set-valued information systems in the sense of Definitions 3.11 and 3.13. For convenience, we only consider hi = 0 and denote R^>_(ai,hi) by Ri in this section.
In the following, we employ Table 2 to show the partition based on each relation for (U1, R^>_A), where P_{ixj} stands for the block containing xj based on Ri. It is easy to see that P_{Axj} = ∩_{1≤i≤m} P_{ixj}, where P_{Axj} denotes the block containing xj based on R^>_A.
We present an algorithm for compressing set-valued information systems.
Algorithm 4.2 Let S = (U1, A, V, f) be a set-valued information system, where U1 = {x1, ..., xn} and A = {a1, ..., am}.
Step 1. Input the set-valued information system S = (U1, A, V, f) and obtain an induced >-relation information system (U1, R^>_A), where R^>_A = {R1, R2, ..., Rm};
Step 2. Compute U1/Ri (1 ≤ i ≤ m) and obtain U1/R^>_A = {Ci | 1 ≤ i ≤ N};
Step 3. Obtain (U2, g(R^>_A)) by defining g(x) = yi for any x ∈ Ci, where U2 = {g(xi) | xi ∈ U1} and g(R^>_A) = {g(R1), g(R2), ..., g(Rm)};
Step 4. Get a reduct {g(Ri1), g(Ri2), ..., g(Rik)} of (U2, {g(R1), g(R2), ..., g(Rm)});
Step 5. Obtain a reduct {Ri1, Ri2, ..., Rik} of (U1, R^>_A) and output the results.
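The following Python sketch (ours; the dictionary encoding of Table 3 and the helper names are our own choices, and only the case hi = 0 is handled) implements Steps 1-3 of Algorithm 4.2 and reproduces the four classes and the mapping g of Example 4.3 below.

```python
# A sketch (ours) of Steps 1-3 of Algorithm 4.2 for h_i = 0: compute U1/R_i for every
# attribute, intersect the blocks containing each object to obtain U1/R^>_A, and let
# the block index define the homomorphism g.

def partition_by_relation(S, a, h=0):
    """U1/R_i: objects grouped by identical neighbourhoods R^>_(a,h)(x) (Definition 4.1)."""
    neigh = {x: frozenset(y for y in S if len(S[x][a] & S[y][a]) > h) for x in S}
    blocks = {}
    for x, n in neigh.items():
        blocks.setdefault(n, set()).add(x)
    return list(blocks.values())

def compress(S, attrs, h=None):
    h = h or {a: 0 for a in attrs}
    block_of = {}                                     # block of x in U1/R^>_A = intersection of P_{ix}
    for x in S:
        blocks = [frozenset(b) for a in attrs
                  for b in partition_by_relation(S, a, h[a]) if x in b]
        block_of[x] = frozenset.intersection(*blocks)
    classes = sorted(set(block_of.values()), key=sorted)
    g = {x: "y" + str(classes.index(block_of[x]) + 1) for x in S}   # the homomorphism g
    return g, classes

TABLE_3 = {
    "x1": {"a1": {0},       "a2": {0},       "a3": {1, 2}, "a4": {1, 2}},
    "x2": {"a1": {0, 1, 2}, "a2": {0, 1, 2}, "a3": {1, 2}, "a4": {0, 1, 2}},
    "x3": {"a1": {1, 2},    "a2": {0, 1},    "a3": {1, 2}, "a4": {1, 2}},
    "x4": {"a1": {0, 1},    "a2": {0, 2},    "a3": {1, 2}, "a4": {1}},
    "x5": {"a1": {1, 2},    "a2": {1, 2},    "a3": {1, 2}, "a4": {1}},
    "x6": {"a1": {1},       "a2": {1, 2},    "a3": {0, 1}, "a4": {0, 1}},
    "x7": {"a1": {0},       "a2": {0},       "a3": {1, 2}, "a4": {1, 2}},
    "x8": {"a1": {1},       "a2": {1, 2},    "a3": {0, 1}, "a4": {0, 1}},
}

g, classes = compress(TABLE_3, ["a1", "a2", "a3", "a4"])
print(classes)   # [{x1,x7}, {x2,x4}, {x3}, {x5,x6,x8}] and g as in Example 4.3
```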
The mapping g presented in Algorithm 4.2 is a homomorphism from (U1, R^>_A) to (U2, g(R^>_A)). Attribute reducts of (U1, R^>_A) and (U2, g(R^>_A)) are equivalent to each other under the condition of g. The computational complexity of constructing g is m·O(n²) + O((m-1)·n²). Furthermore, by transforming a large-scale set-valued information system S = (U1, A, V, f) into a relation information system (U1, R^>_A), if there is a homomorphism between this relation information system and another relation information system (U2, R^>*_A), we can obtain attribute reducts of S from the reducts of (U2, R^>*_A).
Remark In Example 3.1 of [46], Wang et al. also obtained U1/R^>_A. Here, however, we obtain U1/R^>_A by computing U1/Ri for every Ri ∈ R^>_A in Algorithm 4.2. Using the proposed approach, we compress dynamic set-valued information systems in Sect. 5.
The process of compressing set-valued information
systems with Algorithm 4.2 is illustrated by the following
example.
Example 4.3 Table 3 depicts a set-valued information system S1 = (U1, A, V, f). According to Definition 3.1 and Example 3.5, we obtain (U1, R^>_A) with R^>_A = {R1, R2, R3, R4}, where
R1(x1) = R1(x7) = {x1, x2, x4, x7};
R1(x2) = R1(x4) = {x1, x2, x3, x4, x5, x6, x7, x8};
R1(x3) = R1(x5) = R1(x6) = R1(x8) = {x2, x3, x4, x5, x6, x8};
R2(x1) = R2(x7) = {x1, x2, x3, x4, x7};
R2(x2) = R2(x3) = R2(x4) = {x1, x2, x3, x4, x5, x6, x7, x8};
R2(x5) = R2(x6) = R2(x8) = {x2, x3, x4, x5, x6, x8};
R3(x1) = R3(x2) = R3(x3) = R3(x4) = R3(x5) = R3(x6) = R3(x7) = R3(x8) = {x1, x2, x3, x4, x5, x6, x7, x8};
R4(x1) = R4(x2) = R4(x3) = R4(x4) = R4(x5) = R4(x6) = R4(x7) = R4(x8) = {x1, x2, x3, x4, x5, x6, x7, x8}.
By Definition 4.1, we derive U1/R1, U1/R2, U1/R3 and U1/R4 shown in Table 4 and get U1/R^>_A = {{x1, x7}, {x2, x4}, {x3}, {x5, x6, x8}}. Then we define a mapping g: U1 → U2 as follows:
g(x1) = g(x7) = y1, g(x2) = g(x4) = y2, g(x3) = y3, g(x5) = g(x6) = g(x8) = y4.
Consequently, we derive (U2, g(R^>_A)), where U2 = {y1, y2, y3, y4}, g(R^>_A) = {g(R1), g(R2), g(R3), g(R4)}, and
g(R1)(y1) = {y1, y2}; g(R1)(y2) = {y1, y2, y3, y4}; g(R1)(y3) = g(R1)(y4) = {y2, y3, y4};
g(R2)(y1) = {y1, y2, y3}; g(R2)(y2) = g(R2)(y3) = {y1, y2, y3, y4}; g(R2)(y4) = {y2, y3, y4};
g(R3)(y1) = g(R3)(y2) = g(R3)(y3) = g(R3)(y4) = {y1, y2, y3, y4};
g(R4)(y1) = g(R4)(y2) = g(R4)(y3) = g(R4)(y4) = {y1, y2, y3, y4}.
Table 2 The partitions based on Ri (1 ≤ i ≤ m) and R^>_A, respectively
U1    R1       R2       ...   Rm       R^>_A
x1    P_{1x1}  P_{2x1}  ...   P_{mx1}  P_{Ax1}
x2    P_{1x2}  P_{2x2}  ...   P_{mx2}  P_{Ax2}
...   ...      ...      ...   ...      ...
xn    P_{1xn}  P_{2xn}  ...   P_{mxn}  P_{Axn}
Table 3 A set-valued information system
U1 a1 a2 a3 a4
x1 {0} {0} {1, 2} {1, 2}
x2 {0, 1, 2} {0, 1, 2} {1, 2} {0, 1, 2}
x3 {1, 2} {0, 1} {1, 2} {1, 2}
x4 {0, 1} {0, 2} {1, 2} {1}
x5 {1, 2} {1, 2} {1, 2} {1}
x6 {1} {1, 2} {0, 1} {0, 1}
x7 {0} {0} {1, 2} {1, 2}
x8 {1} {1, 2} {0, 1} {0, 1}
Table 4 The partitions based on R1, R2, R3, R4 and R^>_A, respectively
U1   R1   R2   R3   R4   R^>_A
x1 {x1, x7} {x1, x7} U1 U1 {x1, x7}
x2 {x2, x4} {x2, x3, x4} U1 U1 {x2, x4}
x3 {x3, x5, x6, x8} {x2, x3, x4} U1 U1 {x3}
x4 {x2, x4} {x2, x3, x4} U1 U1 {x2, x4}
x5 {x3, x5, x6, x8} {x5, x6, x8} U1 U1 {x5, x6, x8}
x6 {x3, x5, x6, x8} {x5, x6, x8} U1 U1 {x5, x6, x8}
x7 {x1, x7} {x1, x7} U1 U1 {x1, x7}
x8 {x3, x5, x6, x8} {x5, x6, x8} U1 U1 {x5, x6, x8}
Afterwards, we obtain the following results: (1) g is a homomorphism from (U1, R^>_A) to (U2, g(R^>_A)); (2) g(R2), g(R3) and g(R4) are superfluous in g(R^>_A) if and only if R2, R3 and R4 are superfluous in R^>_A; (3) {g(R1)} is a reduct of g(R^>_A) if and only if {R1} is a reduct of R^>_A.
In Example 4.3, we see that the size of (U2, g(R^>_A)) is smaller than that of (U1, R^>_A), and their attribute reducts are equivalent to each other under the condition of homomorphisms.
We employ an example to illustrate that the computational complexity of computing attribute reducts is greatly reduced by means of homomorphisms, from the viewpoint of the discernibility matrix.
Example 4.4 (Continuation of Example 4.3) By Definition 3.7, we obtain the discernibility matrixes D1 and D2 for (U1, R^>_A) and (U2, g(R^>_A)), respectively. Writing only the strictly lower triangle (rows x2–x8 against columns x1–x7, and rows y2–y4 against columns y1–y3):

D1:
x2:  ∅
x3:  {a1}      ∅
x4:  ∅         ∅   ∅
x5:  {a1, a2}  ∅   ∅     ∅
x6:  {a1, a2}  ∅   ∅     ∅   ∅
x7:  ∅         ∅   {a1}  ∅   {a1, a2}  {a1, a2}
x8:  {a1, a2}  ∅   ∅     ∅   ∅         ∅         {a1, a2}

D2:
y2:  ∅
y3:  {a1}      ∅
y4:  {a1, a2}  ∅   ∅

In Example 4.4, we observe that the size of D1 is larger than that of D2, and {a1} is the reduct of both (U1, R^>_A) and (U2, g(R^>_A)). The computational complexity of computing D2 is considerably lower than that of computing D1.
4.2 Attribute reduction of SDIS under the condition
of homomorphisms
In this subsection, we study attribute reduction of SDIS
under the condition of homomorphisms.
Example 4.5 (Continuation of Example 4.4) Let Table 5 be a SDIS. Then we have
R_d(x1) = R_d(x2) = R_d(x4) = R_d(x7) = {x1, x2, x4, x7};
R_d(x3) = R_d(x5) = R_d(x6) = R_d(x8) = {x3, x5, x6, x8}.
Thus, we get U1/R_d = {{x1, x2, x4, x7}, {x3, x5, x6, x8}} and define a mapping g: U1 → U2 as follows:
g(x1) = g(x7) = y1, g(x2) = g(x4) = y2, g(x3) = y3, g(x5) = g(x6) = g(x8) = y4.
Consequently, we derive (U2, g(R^>_{A∪{d}})), where U2 = {y1, y2, y3, y4}, g(R^>_{A∪{d}}) = {g(R1), g(R2), g(R3), g(R4), g(R_d)}, and
g(R1)(y1) = {y1, y2}; g(R1)(y2) = {y1, y2, y3, y4}; g(R1)(y3) = g(R1)(y4) = {y2, y3, y4};
g(R2)(y1) = {y1, y2, y3}; g(R2)(y2) = g(R2)(y3) = {y1, y2, y3, y4}; g(R2)(y4) = {y2, y3, y4};
g(R3)(y1) = g(R3)(y2) = g(R3)(y3) = g(R3)(y4) = {y1, y2, y3, y4};
g(R4)(y1) = g(R4)(y2) = g(R4)(y3) = g(R4)(y4) = {y1, y2, y3, y4};
g(R_d)(y1) = g(R_d)(y2) = {y1, y2}; g(R_d)(y3) = g(R_d)(y4) = {y3, y4}.
Afterwards, we obtain the following results: (1) g is a homomorphism from (U1, R^>_{A∪{d}}) to (U2, g(R^>_{A∪{d}})); (2) g(R2), g(R3) and g(R4) are superfluous in g(R^>_{A∪{d}}) if and only if R2, R3 and R4 are superfluous in R^>_{A∪{d}}; (3) {g(R1)} is a reduct of g(R^>_{A∪{d}}) if and only if {R1} is a reduct of R^>_{A∪{d}}.
In Example 4.5, we see that the size of (U2, g(R^>_{A∪{d}})) is smaller than that of (U1, R^>_{A∪{d}}), and their attribute reducts are equivalent to each other under the condition of homomorphisms. Furthermore, by transforming a large-scale SDIS S = (U1, A ∪ {d}, V, f) into a relation information system (U1, R^>_{A∪{d}}), if there is a homomorphism between this relation information system and another relation information system (U2, R^>*_{A∪{d}}), we can obtain attribute reducts of S = (U1, A ∪ {d}, V, f) from the reducts of (U2, R^>*_{A∪{d}}).
We employ a further example to show that the computational complexity of computing attribute reducts is greatly reduced by means of homomorphisms, again from the viewpoint of the discernibility matrix.
Example 4.6 (Continuation of Example 4.5) By Definition 3.9, we obtain the discernibility matrixes D3 and D4 for (U1, R^>_{A∪{d}}) and (U2, g(R^>_{A∪{d}})), respectively.
Table 5 A set-valued information system with a decision attribute
U1 a1 a2 a3 a4 d
x1 {0} {0} {1, 2} {1, 2} 0
x2 {0, 1, 2} {0, 1, 2} {1, 2} {0, 1, 2} 0
x3 {1, 2} {0, 1} {1, 2} {1, 2} 1
x4 {0, 1} {0, 2} {1, 2} {1} 0
x5 {1, 2} {1, 2} {1, 2} {1} 1
x6 {1} {1, 2} {0, 1} {0, 1} 1
x7 {0} {0} {1, 2} {1, 2} 0
x8 {1} {1, 2} {0, 1} {0, 1} 1
Writing only the strictly lower triangle, as for D1 and D2:

D3:
x2:  ∅
x3:  {a1}      ∅
x4:  ∅         ∅   ∅
x5:  {a1, a2}  ∅   ∅     ∅
x6:  {a1, a2}  ∅   ∅     ∅   ∅
x7:  ∅         ∅   {a1}  ∅   {a1, a2}  {a1, a2}
x8:  {a1, a2}  ∅   ∅     ∅   ∅         ∅         {a1, a2}

D4:
y2:  ∅
y3:  {a1}      ∅
y4:  {a1, a2}  ∅   ∅
Notice that the size of D3 is larger than that of D4, and {a1} is the reduct of both (U1, R^>_{A∪{d}}) and (U2, g(R^>_{A∪{d}})). The computational complexity of computing D4 is considerably lower than that of computing D3.
There are n objects in the original set-valued information system. If we transform it into a relation information system, then the computational complexity of obtaining a reduct by using a discernibility matrix is O(m·n²). If we further compress the relation information system into one with N objects, then the computational complexity is O(m·N²).
In practice, it may be difficult to construct reducts of large-scale set-valued information systems directly. We therefore convert such a system into a relation information system and compress the latter into a relatively smaller one under the condition of homomorphisms. By conducting attribute reduction of the smaller relation information system, we obtain reducts of the set-valued information system. Moreover, if there is a homomorphism between an induced >-relation information system and another relation information system, we can obtain the reducts of either one by conducting attribute reduction of the other.
5 Compressing dynamic set-valued information
systems
As illustrated by Examples 4.4 and 4.6, the computational complexity of attribute reduction for large-scale set-valued information systems can be reduced greatly by using homomorphisms, and most of the time is spent on constructing the homomorphisms between relation information systems. However, to the best of our knowledge, no work so far has addressed constructing homomorphisms for dynamic set-valued information systems. In this section, we construct homomorphisms for dynamic set-valued information systems from three aspects: variations of the attribute set, immigration and emigration of objects, and alterations of attribute values.
5.1 Variations of attribute set
In this subsection, we show that how to compress a
dynamic set-valued information system when adding and
deleting attributes.
Suppose that we have obtained Table 2 by compressing
set-valued information system S1 = (U1, A, V1, f1). Now
we get S2 = (U1, A[ P, V2, f2) by adding an attribute set
P into A, where A  P = ; and P = {am?1, am?2, . . ., ak}.
There are three steps to compress S2 by utilizing Algorithm
4.2 as follows.
Step 1: Derive U1/Ri by inducing Ri (m ? 1 B i B k).
Step 2: Get Table 6 by adding U1/
Ri (m ? 1 B i B k) into Table 2 and derive U1=R[
A[P:
Step 3: Obtain S3 ¼ ðgðU1Þ; gðR[
A[PÞÞ by defining the
homomorphism g based on U1=R[
A[P:
By using the Algorithm 4.2, the computational com-
plexity ðk À mÞ Ã Oðn2
Þ þ Oððk À 1Þ Ã n2
Þ of constructing
g is reduced without computing U1/Ri (1 B i B m). But we
need to compute them without Table 2.
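A minimal sketch of the three steps above (ours; it reuses partition_by_relation from the sketch after Algorithm 4.2 and assumes the blocks of U1/R^>_A have been cached, playing the role of Table 2):

```python
# A sketch (ours) of the incremental step for added attributes: the cached blocks of
# U1/R^>_A are refined by U1/R_i for the new attributes only, instead of recomputing
# every partition from scratch.

def refine(old_blocks, new_blocks):
    """Intersect two partitions of the same universe, dropping empty intersections."""
    return [b & c for b in old_blocks for c in new_blocks if b & c]

def add_attributes(S, cached_blocks, new_attrs, h=0):
    blocks = [set(b) for b in cached_blocks]
    for a in new_attrs:                                             # Step 1: U1/R_i, i = m+1..k
        blocks = refine(blocks, partition_by_relation(S, a, h))     # Step 2: refine the cached table
    g = {x: "y" + str(i + 1) for i, b in enumerate(blocks) for x in b}   # Step 3: define g
    return blocks, g

# Example 5.1: the cached blocks {{x1,x7},{x2,x4},{x3},{x5,x6,x8}} refined by
# U1/R5 = {{x1,...,x7},{x8}} give {{x1,x7},{x2,x4},{x3},{x5,x6},{x8}}.
```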
Example 5.1 We obtain Table 7 by adding a5 to Table 3. By Definition 4.1, we first get U1/R5 = {{x1, x2, x3, x4, x5, x6, x7}, {x8}}. Then we obtain Table 8 and derive U1/R^>_{A∪{a5}} = {{x1, x7}, {x2, x4}, {x3}, {x5, x6}, {x8}}. Afterwards, we define the mapping g: U1 → U2 as follows:
g(x1) = g(x7) = y1, g(x2) = g(x4) = y2, g(x3) = y3, g(x5) = g(x6) = y4, g(x8) = y5,
where U2 = {y1, y2, y3, y4, y5}. Consequently, we obtain a relation information system (U2, g(R^>_{A∪{a5}})).
Table 6 The partitions based on Ri (1 ≤ i ≤ k) and R^>_{A∪P}, respectively
U1    R1       R2       ...   Rk       R^>_{A∪P}
x1    P_{1x1}  P_{2x1}  ...   P_{kx1}  P_{(A∪P)x1}
x2    P_{1x2}  P_{2x2}  ...   P_{kx2}  P_{(A∪P)x2}
...   ...      ...      ...   ...      ...
xn    P_{1xn}  P_{2xn}  ...   P_{kxn}  P_{(A∪P)xn}
For simplicity, we do not list the relation information system in this subsection.
Subsequently, we show the process of compressing the dynamic set-valued information system S2 = (U1, A - {a_l}, V2, f2), where a_l ∈ A. There are two steps to compress S2. First, we get U1/R^>_{A-{a_l}} by using U1/Ri (i ≠ l) shown in Table 2 and define g as in Example 4.3. Second, we obtain S3 = (g(U1), g(R^>_{A-{a_l}})). By using Algorithm 4.2 in this way, the computational complexity of constructing g is reduced to O((m-2)·n²), without computing U1/Ri (1 ≤ i ≤ l-1, l+1 ≤ i ≤ m); without Table 2, they would need to be computed as well. Furthermore, we can compress S2 when deleting an attribute set with the same approach.
Example 5.2 By deleting a1 from the set-valued information system S1 shown in Table 3, we obtain the information system S2 shown in Table 9. To compress S2 based on the compression of S1, we get Table 10 by deleting U1/R1, which is based on a1. Then, we obtain U1/R^>_{A-{a1}} = {{x1, x7}, {x2, x3, x4}, {x5, x6, x8}} and define the mapping g: U1 → U2 as follows:
g(x1) = g(x7) = y1, g(x2) = g(x3) = g(x4) = y2, g(x5) = g(x6) = g(x8) = y3,
where U2 = {y1, y2, y3}. Consequently, the set-valued information system (U1, A - {a1}, V, f1) can be compressed into a smaller relation information system (U2, {g(R2), g(R3), g(R4)}). For clarity, we do not list all the relations in this subsection.
In Example 5.2, we compress a dynamic set-valued information system when deleting an attribute. The same approach can be applied when deleting an attribute set.
5.2 Immigration and emigration of objects
In this subsection, we introduce an equivalence relation for
set-valued information systems and show a process of
compressing dynamic set-valued information systems in
terms of object variation.
Definition 5.3 Let S1 = (U1, A, V, f1) be a set-valued information system. Then an equivalence relation T_A is defined as follows:
T_A = {(x, y) | ∀a ∈ A, f(x, a) = f(y, a), x, y ∈ U1}.
The physical meaning of (x, y) ∈ T_A is that x and y have the same attribute values with respect to every a ∈ A. For convenience, we denote [x]^1_A = {y | (x, y) ∈ T_A, x, y ∈ U1}. We derive U1/A = {[x]^1_A | x ∈ U1} = {C1, C2, ..., CN} and obtain S2 = (U2, A, V, f2) by defining g1(x) = yk for any x ∈ Ck, where U2 = {yk | 1 ≤ k ≤ N} and f2(yk, a) = f1(x, a) for a ∈ A and x ∈ g1⁻¹(yk).
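A minimal sketch (ours) of the compression under T_A: objects with identical set-valued rows are merged into one representative, which is exactly how S2 is obtained from S1 in Example 5.4 below.

```python
# A sketch (ours) of Definition 5.3 and the construction of S2 and g1.

def compress_by_TA(S):
    key_to_name, g, S2 = {}, {}, {}
    for x in sorted(S):                                   # deterministic order x1, x2, ...
        key = tuple(sorted((a, frozenset(v)) for a, v in S[x].items()))
        if key not in key_to_name:
            y = "y" + str(len(key_to_name) + 1)
            key_to_name[key] = y
            S2[y] = {a: set(v) for a, v in S[x].items()}  # f2(y, a) = f1(x, a)
        g[x] = key_to_name[key]
    return S2, g

# For Table 11 this yields the three-object system S2 of Table 12, with
# g1(x1) = g1(x2) = y1, g1(x3) = g1(x4) = y2, g1(x5) = g1(x6) = y3.
```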
Now, we obtain S4 = (U1 ∪ U3, A, V, f1 ∪ f3) by adding S3 = (U3, A, V, f3) into S1. To compress S4 by utilizing S2, we first obtain S5 = (U5, A, V, f5) by compressing S3 in the same way as S1. Then, we compress S6 = S2 ∪ S5 in the same way as S1 and get S7, which is the same as the compression of S1 ∪ S3. The process of compressing dynamic set-valued information systems is illustrated below.
S4 = S1 ∪ S3:   S1 ↓ S2 and S3 ↓ S5,  then  S6 = S2 ∪ S5 ↓ S7,
where ↓ denotes the process of compressing a set-valued information system.
Table 7 A set-valued information system by adding a5 into Table 3
U1 a1 a2 a3 a4 a5
x1 {0} {0} {1, 2} {1, 2} {1, 2}
x2 {0, 1, 2} {0, 1, 2} {1, 2} {0, 1, 2} {0, 2}
x3 {1, 2} {0, 1} {1, 2} {1, 2} {1, 2}
x4 {0, 1} {0, 2} {1, 2} {1} {2}
x5 {1, 2} {1, 2} {1, 2} {1} {2}
x6 {1} {1, 2} {0, 1} {0, 1} {0, 1, 2}
x7 {0} {0} {1, 2} {1, 2} {0, 2}
x8 {1} {1, 2} {0, 1} {0, 1} {3}
Table 8 The partitions based on R1, R2, R3, R4, R5 and R^>_{A∪{a5}}, respectively
U1   R1   R2   R3   R4   R5   R^>_{A∪{a5}}
x1 {x1, x7} {x1, x7} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x1, x7}
x2 {x2, x4} {x2, x3, x4} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x2, x4}
x3 {x3, x5, x6, x8} {x2, x3, x4} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x3}
x4 {x2, x4} {x2, x3, x4} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x2, x4}
x5 {x3, x5, x6, x8} {x5, x6, x8} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x5,x6}
x6 {x3, x5, x6, x8} {x5, x6, x8} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x5,x6}
x7 {x1, x7} {x1, x7} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x1, x7}
x8 {x3, x5, x6, x8} {x5, x6, x8} U1 U1 {x8} {x8}
The computational complexity of constructing the homomorphism is m·O(|U3|²) + m·O(|U2 ∪ U5|²) + O((m-1)·|U2 ∪ U5|²) with the incremental algorithm, whereas it is m·O(|U1 ∪ U3|²) + O((m-1)·|U1 ∪ U3|²) without S2.
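The object-addition step can be sketched as follows (ours; it reuses compress_by_TA from the previous sketch, and the renaming of the objects of S5 serves only to keep the union disjoint):

```python
# A sketch (ours) of the incremental step for added objects: compress the new block S3
# on its own, take the union with the old compression S2, and compress the much smaller
# union; up to renaming, the result equals the compression of S1 ∪ S3.

def add_objects(S2, S3):
    S5, _ = compress_by_TA(S3)                                   # compress the new objects
    S6 = {**S2, **{"z_" + y: row for y, row in S5.items()}}      # S6 = S2 ∪ S5
    S7, _ = compress_by_TA(S6)
    return S7
```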
Example 5.4 Let Table 11 be a set-valued information system S1 = (U1, A, V, f1). By Definition 5.3, we obtain that U1/A = {{x1, x2}, {x3, x4}, {x5, x6}}. Then, we define g1 and f2 as follows:
g1(x1) = g1(x2) = y1, g1(x3) = g1(x4) = y2, g1(x5) = g1(x6) = y3, and f2(yi, a) = f1(x, a) for a ∈ A and x ∈ g1⁻¹(yi).
Consequently, we compress S1 into S2 = (U2, A, V, f2) shown in Table 12, where U2 = {g1(x) | x ∈ U1}.
The following example illustrates how to update the compression when adding an object set.
Example 5.5 By adding S3 shown in Table 13 to S1, we obtain S4 = S1 ∪ S3 shown in Table 14. To compress S4, we first compress S3 to S5 = (U5, A, V, f5) shown in Table 15, as in Example 5.4. Then we compress S6 = S2 ∪ S5 shown in Table 16 and obtain S7 = (U7, A, V, f7) shown in Table 17. Afterwards, we can continue to compress S7 as in Example 4.3 of Sect. 4.
Below we compress the dynamic set-valued information system when deleting an object set. Suppose S1 = (U1, A, V, f1) is a set-valued information system that has been compressed to S2 = (U2, A, V, f2) under the condition of g1. We obtain S4 = (U4, A, V, f4) by deleting S3 = (U3, A, V, f3), where U3 ⊆ U1 and U4 = U1 - U3. There are three steps to compress S4 = (U4, A, V, f4) based on S2. Firstly, we obtain U1/A = {[x]^1_A | x ∈ U1} and U3/A = {[x]^3_A | x ∈ U3} by Definition 5.3. It is obvious that [x]^3_A ⊆ [x]^1_A for any x ∈ U3. Subsequently, we cancel g1(x) in U2 if [x]^3_A = [x]^1_A and keep g1(x) in U2 if [x]^3_A ≠ [x]^1_A. Finally, we obtain S5 = (U5, A, V, f5) and compress it as in Example 4.3. The computational complexity of constructing the homomorphism is m·O(|U5|²) + O((m-1)·|U5|²) with the incremental algorithm, whereas it is m·O(|U1 - U3|²) + O((m-1)·|U1 - U3|²) without S2.
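A sketch of the deletion step (ours; S1, S2, g1 and U3 are assumed to be available in the dictionary encoding used above): an image object is cancelled exactly when the whole block [x]^1_A is removed.

```python
# A sketch (ours) of the cancel/keep rule: g1(x) is cancelled only when every object
# carrying x's row is deleted, i.e. [x]^3_A = [x]^1_A; otherwise it is kept.

def delete_objects(S1, S2, g1, U3):
    removed = set()
    for x in U3:
        block_all = {z for z in S1 if S1[z] == S1[x]}   # [x]^1_A
        block_del = {z for z in U3 if S1[z] == S1[x]}   # [x]^3_A
        if block_del == block_all:
            removed.add(g1[x])                          # cancel g1(x) in U2
    return {y: row for y, row in S2.items() if y not in removed}
```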
Example 5.6 We take S4 and S7 shown in Example 5.5 as the original set-valued information system S1 and the compressed information system S2, respectively.
Table 9 An updated set-valued information system
U1 a2 a3 a4
x1 {0} {1, 2} {1, 2}
x2 {0, 1, 2} {1, 2} {0, 1, 2}
x3 {0, 1} {1, 2} {1, 2}
x4 {0, 2} {1, 2} {1}
x5 {1, 2} {1, 2} {1}
x6 {1, 2} {0, 1} {0, 1}
x7 {0} {1, 2} {1, 2}
x8 {1, 2} {0, 1} {0, 1}
Table 10 The partitions based on R2, R3, R4 and R^>_{A-{a1}}, respectively
U1   R2   R3   R4   R^>_{A-{a1}}
x1 {x1, x7} U1 U1 {x1, x7}
x2 {x2, x3, x4} U1 U1 {x2, x3, x4}
x3 {x2, x3, x4} U1 U1 {x2, x3, x4}
x4 {x2, x3, x4} U1 U1 {x2, x3, x4}
x5 {x5, x6, x8} U1 U1 {x5, x6, x8}
x6 {x5, x6, x8} U1 U1 {x5, x6, x8}
x7 {x1, x7} U1 U1 {x1, x7}
x8 {x5, x6, x8} U1 U1 {x5, x6, x8}
Table 11 The set-valued information system S1
U1 a1 a2 a3
x1 {0, 1} {0, 2} {1, 2}
x2 {0, 1} {0, 2} {1, 2}
x3 {0, 1} {1} {0, 1}
x4 {0, 1} {1} {0, 1}
x5 {1, 2} {1} {1, 2}
x6 {1, 2} {1} {1, 2}
Table 12 The compressed set-valued information system S2 of S1
U2 a1 a2 a3
y1 {0, 1} {0, 2} {1, 2}
y2 {0, 1} {1} {0, 1}
y3 {1, 2} {1} {1, 2}
Table 13 The set-valued information system S3
U3 a1 a2 a3
x7 {1, 2} {0, 2} {0, 1}
x8 {1, 2} {0, 2} {0, 1}
x9 {0, 1} {1} {0, 1}
x10 {0, 1} {1} {0, 1}
By deleting S3 = (U3, A, V, f3) shown in Table 18, we obtain the set-valued information system S4 shown in Table 19. To compress S4, we first get U1/A = {{x1, x2}, {x3, x4, x9, x10}, {x5, x6}, {x7, x8}} and U3/A = {{x1, x2}, {x3}}. Obviously, [x1]^1_A = [x2]^1_A = {x1, x2} = [x1]^3_A = [x2]^3_A, and [x3]^3_A = {x3} ⊊ {x3, x4, x9, x10} = [x3]^1_A. Then we cancel z1 and keep {z2, z3, z4} in Table 17. Afterwards, we obtain the compressed set-valued information system S5 shown in Table 20. We can continue to compress S5 as in Example 4.3 of Sect. 4.
5.3 Alterations of attribute values
In this subsection, we show a process of compressing
dynamic set-valued information systems in terms of attri-
bute value variation.
Suppose S1 = (U1, A, V1, f1) is a set-valued information system, and we get S2 = (U1, A, V2, f2) when revising f(xj, ai), where xj ∈ U1 and ai ∈ A. By utilizing Algorithm 4.2, there are three steps to compress S2, as follows.
Step 1: Derive U1/R*_i by inducing the >-relation R*_i (the relation based on the attribute ai after revising f(xj, ai) is denoted by R*_i).
Step 2: Get U1/R^>*_A based on U1/R*_i and U1/Rl (1 ≤ l ≤ m, l ≠ i).
Step 3: Obtain S3 = (g(U1), g(R^>*_A)) by defining g based on U1/R^>*_A.
The computational complexity of constructing g is O(n²) + O((m-1)·n²) with the incremental algorithm, whereas it is m·O(n²) + O((m-1)·n²) without S2. Similarly, we can compress the set-valued information system when there are more alterations of attribute values.
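A sketch of the three steps above (ours; it reuses partition_by_relation and refine from the earlier sketches and assumes the per-attribute partitions have been cached): only the partition of the revised attribute is recomputed.

```python
# A sketch (ours) of the update for a revised value f(x_j, a_i): recompute U1/R*_i for
# the affected attribute only, then intersect it with the cached partitions.

def revise_value(S, cached_partitions, x_j, a_i, new_value, h=0):
    S[x_j][a_i] = set(new_value)                                # alter the attribute value
    cached_partitions[a_i] = partition_by_relation(S, a_i, h)   # Step 1: U1/R*_i only
    blocks = [set(S)]                                           # Steps 2-3: rebuild U1/R*^>_A
    for part in cached_partitions.values():
        blocks = refine(blocks, part)
    return blocks
```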
Example 5.7 (Continuation of Example 4.3) Considering Table 3, we revise f(x8, a1) = {1} to f(x8, a1) = {0} and obtain Table 21. In Example 4.3, we have that
Table 14 The set-valued information system S4 = S1 ∪ S3
U4 = U1 ∪ U3   a1   a2   a3
x1 {0, 1} {0, 2} {1, 2}
x2 {0, 1} {0, 2} {1, 2}
x3 {0, 1} {1} {0, 1}
x4 {0, 1} {1} {0, 1}
x5 {1, 2} {1} {1, 2}
x6 {1, 2} {1} {1, 2}
x7 {1, 2} {0, 2} {0, 1}
x8 {1, 2} {0, 2} {0, 1}
x9 {0, 1} {1} {0, 1}
x10 {0, 1} {1} {0, 1}
Table 15 The set-valued information system S5
U5 a1 a2 a3
y4 {1, 2} {0, 2} {0, 1}
y5 {0, 1} {1} {0, 1}
Table 16 The set-valued information system S6 = S2 ∪ S5
U6 = U2 ∪ U5   a1   a2   a3
y1 {0, 1} {0, 2} {1, 2}
y2 {0, 1} {1} {0, 1}
y3 {1, 2} {1} {1, 2}
y4 {1, 2} {0, 2} {0, 1}
y5 {0, 1} {1} {0, 1}
Table 17 The set-valued information system S7
U7 a1 a2 a3
z1 {0, 1} {0, 2} {1, 2}
z2 {0, 1} {1} {0, 1}
z3 {1, 2} {1} {1, 2}
z4 {1, 2} {0, 2} {0, 1}
Table 18 The set-valued information system S3
U3 a1 a2 a3
x1 {0, 1} {0, 2} {1, 2}
x2 {0, 1} {0, 2} {1, 2}
x3 {0, 1} {1} {0, 1}
Table 19 The set-valued information system S4
U4 = U1 - U3 a1 a2 a3
x4 {0, 1} {1} {0, 1}
x5 {1, 2} {1} {1, 2}
x6 {1, 2} {1} {1, 2}
x7 {1, 2} {0, 2} {0, 1}
x8 {1, 2} {0, 2} {0, 1}
x9 {0, 1} {1} {0, 1}
x10 {0, 1} {1} {0, 1}
Table 20 The set-valued information system S5
U5 a1 a2 a3
z2 {0, 1} {1} {0, 1}
z3 {1, 2} {1} {1, 2}
z4 {1, 2} {0, 2} {0, 1}
R1(x1) = R1(x7) = {x1, x2, x4, x7};
R1(x2) = R1(x4) = {x1, x2, x3, x4, x5, x6, x7, x8};
R1(x3) = R1(x5) = R1(x6) = R1(x8) = {x2, x3, x4, x5, x6, x8}.
Then we obtain
R*1(x1) = R*1(x7) = R*1(x8) = {x1, x2, x4, x7, x8};
R*1(x2) = R*1(x4) = {x1, x2, x3, x4, x5, x6, x7, x8};
R*1(x3) = R*1(x5) = R*1(x6) = {x2, x3, x4, x5, x6}.
Consequently, we get U1/R*1 = {{x1, x7, x8}, {x2, x4}, {x3, x5, x6}}. Based on U1/R*1, U1/R2, U1/R3 and U1/R4, we can construct a homomorphism and compress the original set-valued information system shown in Table 21 into a smaller one. For simplicity, we do not show the process of constructing the homomorphism.
At the end of this section, we summarize the computational complexities of compressing a dynamic set-valued information system and of attribute reduction of a dynamic set-valued information system in Tables 22 and 23, respectively. In Tables 22 and 23, AA denotes adding the attribute set {a_{m+1}, a_{m+2}, ..., a_k}; DA stands for deleting the attribute set {a_l}; AO indicates adding the object set U3; DO refers to deleting the object set U3; AAV is the alteration of an attribute value.
6 Conclusions
In practical situations, it is difficult to perform attribute reduction on large-scale set-valued information systems and dynamic set-valued information systems. In this paper, we have introduced three relations for addressing the issues of set-valued information systems. Moreover, we have proposed an incremental algorithm for attribute reduction of set-valued information systems and studied its basic properties. We have also conducted attribute reduction of set-valued decision information systems and illustrated the process of compressing a set-valued information system with several examples. Afterwards, we have compressed
Table 21 A set-valued information system
U1 a1 a2 a3 a4
x1 {0} {0} {1, 2} {1, 2}
x2 {0, 1, 2} {0, 1, 2} {1, 2} {0, 1, 2}
x3 {1, 2} {0, 1} {1, 2} {1, 2}
x4 {0, 1} {0, 2} {1, 2} {1}
x5 {1, 2} {1, 2} {1, 2} {1}
x6 {1} {1, 2} {0, 1} {0, 1}
x7 {0} {0} {1, 2} {1, 2}
x8 {0} {1, 2} {0, 1} {0, 1}
Table 22 The computational complexity of compressing a dynamic set-valued information system
Alteration   Incremental algorithm                                      Non-incremental algorithm
AA           (k-m)·O(|U1|²) + O((k-1)·|U1|²)                            k·O(|U1|²) + O((k-1)·|U1|²)
DA           O((m-2)·|U1|²)                                             (m-1)·O(|U1|²) + O((m-2)·|U1|²)
AO           m·O(|U3|²) + m·O(|U2 ∪ U5|²) + O((m-1)·|U2 ∪ U5|²)         m·O(|U1 ∪ U3|²) + O((m-1)·|U1 ∪ U3|²)
DO           m·O(|U5|²) + O((m-1)·|U5|²)                                m·O(|U1 - U3|²) + O((m-1)·|U1 - U3|²)
AAV          O(|U1|²) + O((m-1)·|U1|²)                                  m·O(|U1|²) + O((m-1)·|U1|²)
Table 23 The computational complexity of attribute reduction for a dynamic set-valued information system
Alteration   Incremental algorithm                                                      Non-incremental algorithm
AA           (k-m)·O(|U1|²) + O((k-1)·|U1|²) + O(k·|U2|²)                               k·O(|U1|²) + O((k-1)·|U1|²) + O(k·|U2|²)
DA           O((m-2)·|U1|²) + O((m-1)·|U2|²)                                            (m-1)·O(|U1|²) + O((m-2)·|U1|²) + O((m-1)·|U2|²)
AO           m·O(|U3|²) + m·O(|U2 ∪ U5|²) + O((m-1)·|U2 ∪ U5|²) + O(m·|U7|²)            m·O(|U1 ∪ U3|²) + O((m-1)·|U1 ∪ U3|²) + O(m·|U7|²)
DO           m·O(|U5|²) + O((m-1)·|U5|²) + O(m·|U5|²)                                   m·O(|U1 - U3|²) + O((m-1)·|U1 - U3|²) + O(m·|U5|²)
AAV          O(|U1|²) + O((m-1)·|U1|²) + O(m·|U1|²)                                     m·O(|U1|²) + O((m-1)·|U1|²) + O(m·|U1|²)
dynamic set-valued information systems by using an
incremental algorithm.
There are still some interesting problems that need to be discussed further. For example, we will focus on compressing fuzzy set-valued information systems and dynamic fuzzy set-valued information systems. We will also investigate the compression of interval-valued information systems, fuzzy interval-valued information systems, dynamic interval-valued information systems and dynamic fuzzy interval-valued information systems.
Acknowledgments We would like to thank the anonymous reviewers very much for their professional comments and valuable suggestions. This work is supported by the National Natural Science Foundation of China (Nos. 11071061, 11371130) and the National Basic Research Program of China (Nos. 2010CB334706, 2011CB311808).
References
1. Banerjee M, Pal SK (1996) Roughness of a fuzzy set. Inf Sci
93(3-4):235–246
2. Bhatt RB, Gopal M (2005) On the compact computational
domain of fuzzy-rough sets. Pattern Recognit Lett
26(11):1632–1640
3. Biswas R (1994) On rough sets and fuzzy rough sets. Bull Pol
Acad Sci Math 42:345–349
4. Bobillo F, Straccia U (2012) Generalized fuzzy rough description
logics. Inf Sci 189:43–62
5. Capotorti A, Barbanera E (2012) Credit scoring analysis using a
fuzzy probabilistic rough set model. Comput Stat Data Anal
56(4):981–994
6. Chakrabarty K, Biswas R, Nanda S (2000) Fuzziness in rough
sets. Fuzzy Sets Syst 110:247–251
7. Chen HM, Li TR, Qiao SJ, Ruan D (2010) A rough set based
dynamic maintenance approach for approximations in coarsening
and refining attribute values. Int J Intell Syst 25(10):1005–1026
8. Chen ZC, Qin KY (2008) Attribute reduction of set-valued information system based on variable precision tolerance relation. Comput Eng Appl 44:27–29
9. Chen ZC, Qin KY (2009) Attribute reduction of set-valued
information system based on tolerance relation. Fuzzy Syst Math
23(1):150–154
10. Dai JH (2013) Rough set approach to incomplete numerical data.
Inf Sci. doi:10.1016/j.ins.2013.04.023.
11. Dai JH, Tian HW (2013) Entropy measures and granularity
measures for set-valued information systems. Inf Sci 240:72–82
12. Dai JH, Tian HW (2013) Fuzzy rough set model for set-valued
data. Fuzzy Sets Syst 229:54–68
13. Dai JH, Wang WT, Tian HW, Liu L (2013) Attribute selection
based on a new conditional entropy for incomplete decision
systems. Knowl-Based Syst 39:207–213
14. Dai JH, Xu Q (2012) Approximations and uncertainty measures
in incomplete information systems. Inf Sci 198:62–80
15. Dai JH, Xu Q (2013) Attribute selection based on information
gain ratio in fuzzy rough set theory with application to tumor
classification. Appl Soft Comput 13(1):211–221
16. Diker M, Uğur AA (2012) Textures and covering based rough sets. Inf Sci 184(1):44–63
17. Dubois D, Prade H (1990) Rough fuzzy sets and fuzzy rough sets.
Int J General Syst 17:191–209
18. Feng T, Zhang SP, Mi JS, Feng Q (2011) Reductions of a fuzzy
covering decision system. Int J Model Identif Control
13(3):225–233
19. Gong ZT, Xiao ZY (2010) Communicating between information
systems based on including degrees. Int J General Syst
39(2):189–206
20. Grzymala-Busse JW (2010) Rough set and CART approaches to
mining incomplete data. In: 2010 international conference of soft
computing and pattern recognition (SoCPaR), Paris, pp 214–219
21. Grzymala-Busse JW, Sedelow Jr. WA (1988) On rough sets and
information system homomorphism. Bull Pol Acad Sci Tech Sci
36(3):233–239
22. Guan YY, Wang HK (2006) Set-valued information systems. Inf
Sci 176(17):2507–2525
23. Huang B, Li HX, Wei DK (2012) Dominance-based rough set
model in intuitionistic fuzzy information systems. Knowl-Based
Syst 28: 115–123
24. Jensen R, Shen Q (2004) Semantics-preserving dimensionality
reduction: rough and fuzzy-rough-based approaches. IEEE Trans
Knowl Data Eng 16(12):1457–1471
25. Leung Y, Li DY (2003) Maximal consistent block technique for
rule acquisition in incomplete information systems. Inf Sci
153:85–106
26. Li SY, Li TR, Liu D (2013) Incremental updating approximations
in dominance-based rough sets approach under the variation of
the attribute set. Knowl Based Syst 40:17–26
27. Li DY, Ma YC (2000) Invariant characters of information sys-
tems under some homomorphisms. Inf Sci 129(1–4):211–220
28. Li TR, Ruan D, Geert W, Song J, Xu Y (2007) A rough sets based
characteristic relation approach for dynamic attribute general-
ization in data mining. Knowl Based Syst 20(5):485–494
29. Liu CH, Miao DQ, Zhang N (2012) Graded rough set model
based on two universes and its properties. Knowl Based Syst
33:65–72
30. Liu GL (2010) Rough set theory based on two universal sets and
its applications. Knowl Based Syst 23(2):110–115
31. Liu PH, Chen ZC, Qin KY (2009) Attribute reduction of set-
valued information systems based on maximal variable precision
tolerance classes. J Sichuan Normal Univ 32(5):576–580
32. Liu D, Li TR, Liang DC (2013) Incorporating logistic regression
to decision-theoretic rough sets for classifications. Int J Approx
Reason. doi:10.1016/j.ijar.2013.02.013
33. Liu D, Li TR, Ruan D (2011) Probabilistic model criteria with
decision-theoretic rough sets. Inf Sci 181:3709–3722
34. Liu D, Li TR, Ruan D, Zhang JB (2011) Incremental learning
optimization on knowledge discovery in dynamic business
intelligent systems. J Global Optim 51(2):325–344
35. Liu D, Li TR, Ruan D, Zou WL (2009) An incremental approach
for inducing knowledge from dynamic information systems. Fund
Inf 94(2):245–260
36. Morsi NN, Yakout MM (1998) Axiomatics for fuzzy rough sets.
Fuzzy Sets Syst 100(1–3):327–342
37. Meng D, Zhang XH, Qin KY (2011) Soft rough fuzzy sets and
soft fuzzy rough sets. Comput Math Appl 62:4635–4645
38. Miao DQ, Gao C, Zhang N, Zhang ZF (2011) Diverse reduct
subspaces based co-training for partially labeled data. Int J
Approx Reason 52:1103–1117
39. Nanda S, Majumdar S (1992) Fuzzy rough sets. Fuzzy Sets Syst
45(2):157–160
40. Pawlak Z (1982) Rough sets. Int J Comput Inform Sci
11(5):341–356
41. Qian YH, Dang CY, Liang JY, Tang DW (2009) Set-valued
ordered information systems. Inf Sci 179:2809–2832
42. Qin KY, Pei Z (2005) On the topological properties of fuzzy
rough sets. Fuzzy Sets Syst 151:601–613
Int. J. Mach. Learn.  Cyber. (2014) 5:775–788 787
123
43. Qin KY, Yang JL, Pei Z (2008) Generalized rough sets based on
reflexive and transitive relations. Inf Sci 178:4138–4141
44. Skowron A (1990) The rough set theory and evidence theory.
Fund Inf 13:245–262
45. Slezak D, Ziarko W (2005) The investigation of the Bayesian
rough set model. Int J Approx Reason 40(1–2):81–91
46. Wang CZ, Wu CX, Chen DG (2008) A systematic study on
attribute reduction with rough sets based on general binary
relations. Inf Sci 178(9):2237–2261
47. Wang SP, Zhu QX, Zhu W, Min F (2012) Matroidal structure of
rough sets and its characterization to attribute reduction. Knowl
Based Syst 36:155–161
48. Wang XZ, Tsang E, Zhao SY, Chen DG, Yeung D (2007)
Learning fuzzy rules from fuzzy examples based on rough set
techniques. Inf Sci 177(20):4493–4514
49. Wang XZ, Zhai JH, Lu SX (2008) Induction of multiple fuzzy
decision trees based on rough set technique. Inf Sci
178(16):3188–3202
50. Yang T, Li QG (2010) Reduction about approximation spaces of
covering generalized rough sets. Int J Approx Reason
51(3):335–345
51. Yang XB, Song XN, Chen ZH, Yang JY (2012) On multigran-
ulation rough sets in incomplete information system. Int J Mach
Learn Cybern 3:223–232
52. Yang XB, Zhang M, Dou HL (2011) Neighborhood systems-
based rough sets in incomplete information system. Knowl Based
Syst 24(6):858–867
53. Yao YY (2003) Probabilistic approaches to rough sets. Expert
Syst 20(5):287–297
54. Yao YY (2010) Three-way decisions with probabilistic rough
sets. Inf Sci 180(3):341–353
55. Yao YY, Zhao Y (2008) Attribute reduction in decision-theoretic
rough set models. Inf Sci 178(17):3356–3373
56. Zakowski W (1983) Approximations in the space (u, π). Demonstratio Math 16:761–769
57. Zhang JB, Li TR, Ruan D, Liu D (2012) Rough sets based matrix
approaches with dynamic attribute variation in set-valued infor-
mation systems. Int J Approx Reason 53(4):620–635
58. Zhang JB, Li TR, Ruan D, Liu D (2012) Neighborhood rough sets
for dynamic data mining. Int J Intell Syst 27:317-342
59. Zhu W (2007) Topological approaches to covering rough sets. Inf
Sci 177(6):1499–1508
60. Zhu P (2011) Covering rough sets based on neighborhoods: an approach without using neighborhoods. Int J Approx Reason 52(3):461–472
61. Zhu P, Wen QY (2010) Some improved results on communica-
tion between information systems. Inf Sci 180(18):3521–3531
62. Ziarko W (2008) Probabilistic approach to rough sets. Int J
Approx Reason 49(2):272–284

  • 2. a(x) = a(y). In other words, it may happen that two objects are in the same tolerance class with respect to an attribute, but there are no common attribute values with respect to the attribute. Therefore, it is important to present new relations for set-valued information systems. In recent years, homomorphisms [19, 21, 27, 30, 46, 60, 61] have been considered as an important approach for attribute reduction of information systems. For instance, Grzymala-Busse [21] initially introduced seven kinds of homomorphisms of knowledge representation systems and investigated their basic properties in detail. Afterwards, scholars discussed the relationship between information systems by means of different homomorphisms [19, 27, 46, 60, 61]. But few attempts have been made on compressing set-valued information systems under the condition of homomorphisms. Furthermore, there have been many papers concerning dynamic information systems [7, 26, 28, 34, 35, 58]. For example, Chen et al. [7] brought forward a dynamic maintenance approach for approximations in coarsening and refining attribute values. Li et al. [26] investigated incremental updating approximations in dominance-based rough sets approach under the variation of attribute set. Li et al. [28] introduced the characteristic relation approach for dynamic attribute generalization. Liu et al. [34, 35] proposed an incremental approach for inducing knowledge from dynamic information systems. To deal with numerical data, Zhang et al. [58] presented a new dynamic method for incrementally updating approxi- mations of a concept under neighborhood rough sets. In practice, set-valued information systems also vary with time due to dynamic characteristics of data collection, and the non-incremental approach to compressing dynamic set- valued information systems is often very costly or even intractable. Therefore, it is essential to apply an incre- mental updating scheme to maintain the compression dynamically and avoid unnecessary computations. The purpose of this paper is to further study set-valued information systems. First, we present three new relations and two types of discernibility matrixes for set-valued information systems. We also investigate their basic properties. Second, a large-scale set-valued information system is compressed into a relatively smaller relation information system by using the proposed relations and information system homomorphisms. Third, we design an incremental algorithm of compressing dynamic set-valued information systems. Particularly, we mainly address the compression updating from three aspects: variations of attribute set, immigration and emigration of objects and alterations of attribute values. The computational com- plexity of attribute reduction of dynamic set-valued infor- mation systems can be reduced greatly. The rest of this paper is organized as follows. Section 2 briefly reviews the basic concepts of set-valued information systems and consistent functions. In Sect. 3, we put forward three relations and two types of discern- ibility matrixes for set-valued information systems. Section 4 is devoted to compressing set-valued information systems for attribute reduction. In Sect. 5, we compress dynamic set-valued information systems by using an incremental algorithm. We conclude the paper and set further research directions in Sect. 6. 2 Preliminaries In this section, we briefly review some concepts about set- valued information systems and relation information sys- tems. 
In addition, an example is employed to illustrate set- valued information systems. Definition 2.1 [22] Suppose S = (U, A, V, f) is a set- valued information system (denoted as SIS), where U = {x1, x2, . . ., xn} is a non-empty finite set of objects, A = {a1, a2, . . ., am} is a non-empty finite set of attri- butes,V is the set of attribute values, and f is a mapping from U 9 A to V, where f:U9A?2V is a set-valued mapping. Single-valued information systems are regarded as special cases of set-valued information systems. There are many semantic interpretations for set-valued infor- mation systems, we summarize two types of them as follows: Type 1: For x 2 U; a 2 A; fðx; aÞ is interpreted con- junctively. For example, if a is the attribute ‘‘speaking language’’, then f(x, a) = {German, French, Polish} can be viewed as: x speaks German, French and Polish, and x can speak three languages. Type 2: For x 2 U; a 2 A; fðx; aÞ is interpreted dis- junctively. For instance, if a is the attribute ‘‘speaking language’’, then f(x, a) = {German, French, Polish} can be regarded as: x speaks German, French or Polish, and x can speak only one of them. For set-valued information systems, Guan et al. and Chen et al. presented concepts of tolerance relation and variable precision tolerance relation, respectively. Definition 2.2 [22] Let S = (U, A, V, f) be a set-valued information system, a 2 A; and B A. Then the tolerance relations Ra and RB are defined as follows: Ra ¼ fðxi; xjÞjfðxi; aÞ fðxj; aÞ 6¼ ;; xi; xj 2 Ug; RB ¼ fðxi; xjÞj8b 2 B; fðxi; bÞ fðxj; bÞ 6¼ ;; xi; xj 2 Ug: In other words, ðx; yÞ 2 RB is viewed as x and y are indiscernible with respect to B, and RB(x) is seen as the tolerance class for x with respect to B. 776 Int. J. Mach. Learn. Cyber. (2014) 5:775–788 123
  • 3. Definition 2.3 [8] Let S = (U, A, V, f) be a set-valued information system, al 2 A; B A; xi; xj 2 U; cl ij ¼ jfðxi; alÞ fðxj; alÞj=jfðxi; alÞ [ fðxj; alÞj; and a 2 ð0; 1Š. Then the relations Ra al and RB a are defined as follows: Ra al ¼fðxi; xjÞ 2 U  Ujcl ij ! ag; Ra B ¼fðxi; xjÞ 2 U  Uj8al 2 B; cl ij ! ag: The following example shows that there are some issues related to the tolerance relation and variable precision tolerance relation. Example 2.4 Table 1 depicts a set-valued information system. In the sense of Definition 2.2, Ra1 ðx2Þ ¼ fx1; x2; x3; x4; x5; x6g. Obviously, we have that fðx1; x2Þ; ðx3; x2Þg Ra1 . But |f(x1, a1) f(x2, a1)| = 1 and |f(x2, a1) f(x3, a1)| = 2. Furthermore, we obtain that fðx1; x4Þ; ðx6; x4Þg Ra1 . But f(x1, a1) f(x4, a1) = {0} and f(x6, a1) f(x4, a1) = {1}. Although there are some differences between objects which are in the same toler- ance class, Ra1 cannot discern them. By Definition 2.3, we have that fðx1; x4Þ; ðx2; x3Þ; ðx4; x6Þ; ðx5; x6Þg R0:5 a1 . Furthermore, we obtain that f(x1, a1) f(x4, a1) = {0}, f(x2, a1) f(x3, a1) = {1, 2}, f(x4, a1) f(x6, a1) = {1} and f(x5, a1) f(x6, a1) = {1}. It is obvious that {1,2} = {1} and {1} = {0}. But we cannot get this difference in terms of Definition 2.3. Wang et al. presented a concept of consistent functions for attribute reduction of relation information systems. Definition 2.5 [46] Let U1 and U2 be two universes, f a mapping from U1 to U2, the relation R a mapping from U 9 U to {0, 1}, and ½xŠf ¼ fy 2 U1jfðxÞ ¼ f ðyÞg. For any x; y 2 U1; if R(u, v) = R(s, t) for any two pairs ðu; vÞ; ðs; tÞ 2 ½xŠf  ½yŠf ; then f is said to be consistent with respect to R. If the consistent function is a surjection, then it is a homomorphism between relation information systems. We compress a large-scale information system into a smaller one under the condition of a homomorphism. It has been proved that attribute reduction of the original system and image system are equivalent to each other. Therefore, the consistent functions provide an approach to compressing relation information systems. 3 Three relations for set-valued information systems In this section, we propose three relations to address the problems illustrated in Example 2.4 and present two types of discernibility matrixes for set-valued information systems. Definition3.1 Let (U, A, V, f) be a set-valued informa- tion system, a 2 A; and B A. Then the relations R(a, h) [ and R[ ðB;HBÞ are defined as follows: R[ ða;hÞ ¼ fðx; yÞjjfðx; aÞ fðy; aÞj [ h; x; y 2 Ug; R[ ðB;HBÞ ¼ fðx; yÞjjfðx; aiÞ fðy; aiÞj [ hi; x; y 2 U; ai 2 Bg; where j Á j denotes the cardinality of a set, HB = (h1, h2, . . ., hm) and 0 hi jVai j: The physical meaning of ðx; yÞ 2 R[ ða;hÞ is that there exist at least h ? 1 common values between x and y with respect to a. In terms of Definition 3.1, we obtain that Ra = R(a, 0) [ , R(B,(0,0,. . .,0)) [ = RB and R[ ðB;HBÞ ¼ T ai2B R[ ðai;hiÞ. For the convenient representation, we denote R[ ðB;HBÞðxÞ ¼ ½xŠ[ ðB;HBÞ ¼ fyjðx; yÞ 2 R[ ðB;HBÞg. Furthermore, K = (k1, k2, . . ., km) B HB if and only if ki B hi for 1 B i B m. If fR[ ða;hÞðxÞjx 2 Ug is a covering of U, then R(a, h) [ is called a [ h -relation. In general, R(a, h) [ and R[ ðB;HBÞ are symmetric and intransitive, R(a,h) [ and R[ ðB;HBÞ are not reflexive necessarily if h [ 0 and HB = (1, 1, . . ., 1), respectively. For example, we obtain that R[ ða1;hÞðx1Þ ¼ ; by considering Table 1. 
Example 3.2 (Continuation of Example 2.4) Let a1, 0, 1 and 2 denote language, German, French and Polish, respectively, then we have that fðx2; a1Þ ¼ fGerman, French; Polishg and fðx3; a1Þ ¼ fFrench, Polishg. Consequently, we obtain that ðx2; x3Þ 2 R[ ða1;1Þ. In other words, x2 and x3 speak at least two common languages. Proposition 3.3 Let (U, A, V, f) be a set-valued infor- mation system, and B; C A. Then we have (1) if HB B HC B HA, then R[ ðA;HAÞ R[ ðC;HCÞ R[ ðB;HBÞ; (2) if HB B HC B HA, then ½xŠ[ ðA;HAÞ ½xŠ[ ðC;HCÞ ½xŠ[ ðB;HBÞ: Definition 3.4 Let S = (U, A, V, f) be a set-valued information system, R[ A ¼ fR[ ða1;h1Þ; R[ ða2;h2Þ; . . .; R[ ðam;hmÞg; Table 1 A set-valued information system U a1 a2 a3 a4 x1 {0} {0} {1, 2} {1, 2} x2 {0, 1, 2} {1, 2} {1, 2} {0, 1, 2} x3 {1, 2} {1} {1} {1, 2} x4 {0, 1} {0, 2} {1, 2} {1, 2} x5 {1, 2} {1, 2} {1, 2} {1} x6 {1} {1} {0, 1} {0, 1} Int. J. Mach. Learn. Cyber. (2014) 5:775–788 777 123
  • 4. and R[ ðai;hiÞ a [ hi -relation. Then ðU; R[ A Þ is called an induced [ -relation information system of S. Example 3.5 Considering Table 1, we obtain an induced [ -relation information system ðU; R[ A Þ, where R[ A ¼ fRij1 i 4g and R[ ða1;0Þðx1Þ ¼ fx1; x2; x4g; R[ ða1;0Þðx2Þ ¼ R[ ða1;0Þðx4Þ ¼ fx1; x2; x3; x4; x5; x6g; R[ ða1;0Þðx3Þ ¼ R[ ða1;0Þðx5Þ ¼ R[ ða1;0Þðx6Þ ¼ fx2; x3; x4; x5; x6g; R[ ða2;0Þðx1Þ ¼ fx1; x4g; R[ ða2;0Þðx2Þ ¼ R[ ða2;0Þðx5Þ ¼ fx2; x3; x4; x5; x6g; R[ ða2;0Þðx3Þ ¼ R[ ða2;0Þðx6Þ ¼ fx2; x3; x5; x6g; R[ ða2;0Þðx4Þ ¼ fx1; x2; x4; x5g; R[ ða3;0Þðx1Þ ¼ R[ ða3;0Þðx2Þ ¼ R[ ða3;0Þðx3Þ ¼ R[ ða3;0Þðx4Þ ¼ R[ ða3;0Þðx5Þ ¼ R[ ða3;0Þðx6Þ ¼ fx1; x2; x3; x4; x5; x6g; R[ ða4;0Þðx1Þ ¼ R[ ða4;0Þðx2Þ ¼ R[ ða4;0Þðx3Þ ¼ R[ ða4;0Þðx4Þ ¼ R[ ða4;0Þðx5Þ ¼ R[ ða4;0Þðx6Þ ¼ fx1; x2; x3; x4; x5; x6g: Definition 3.6 Let ðU; R[ A Þ be an induced [ -relation information system of S = (U, A, V, f), and P A. If T R[ P ¼ T R[ A and T R[ PÃ 6¼ T R[ A for any R[ PÃ $R[ P ; then R[ P is called a reduct of ðU; R[ A Þ: By Definition 3.6, the reduct is the minimal subset preserving R[ A : In Example 3.5, we can get a reduct {R2} for ðU; R[ A Þ: In the sense of Definition 3.1, we propose a discern- ibility matrix for set-valued information systems. Definition 3.7 Let S = (U, A, V, f) be a set-valued information system. Then its discernibility matrix MA = (M(x, y)) is a |U| 9 |U| matrix, the element M(x, y) is defined by Mðx; yÞ ¼ fa 2 Ajðx; yÞ 62 R[ ða;haÞ; x; y 2 Ug; where R[ ða;haÞ is a [ ha -relation. Thatis,thephysicalmeaningofM(x, y)isthatobjectsx and y can be distinguished by any element of M(x, y) in S. If M(x, y) = ;, then objects x and y can be discerned. It is suf- ficient to consider only the lower triangle or the upper triangle of the matrix since the discernibility matrix M is symmetric. Definition 3.8 Let S = (U, A, V, f) be a set-valued information system, and M = (M(x, y)) the discernibility matrix of S. Then M ¼ V ðx;yÞ2U2 W Mðx; yÞ is called a dis- cernibility function of S. The expression W Mðx; yÞ denotes the disjunction of all attributes in M(x, y), and the expression V f W Mðx; yÞg stands for the conjunction of all W Mðx; yÞ. In addition, V B is a prime implicant of the discernibility function D if and only if B is a reduct of S. Definition 3.9 Let S = (U, A[ {d}, V, f) be a set-valued decision information system(denoted as SDIS), where d is a decision attribute. Then its discernibility matrix Md = (Md(x, y)) is defined as a |U| 9 |U| matrix, where Mdðx;yÞ ¼ ;; dðxÞ ¼ dðyÞ; fa 2 Ajðx;yÞ 62 R[ ða;haÞ;x;y 2 Ug; otherwise: In other words, the physical meaning of Md(x, y) is that objects x and y can be distinguished by any element of Md(x, y) in S. If Md(x, y) = [, then objects x and y can be discerned. It is sufficient to consider only the lower triangle or the upper triangle of Md since it is symmetric. Definition 3.10 Let S = (U, A[ {d}, V, f) be a SDIS, and Md = (Md(x, y)) the discernibility matrix of S. Then Md ¼ V ðx;yÞ2U2 W Mdðx; yÞ is called a discernibility function of S. The expression W Mdðx; yÞ denotes the disjunction of all attributes in Md(x, y), and the expression V f W Mdðx; yÞg stands for the conjunction of all W Mdðx; yÞ. In addition, V B is a prime implicant of the discernibility function Dd if and only if B is a reduct of S. Definition 3.11 Let (U, A, V, f) be a set-valued infor- mation system, a 2 A; and B A. 
Then the relations R(a,h) and RðB;HBÞ are defined as Rða;hÞ ¼ fðx; yÞjjfðx; aÞ fðy; aÞj ¼ h; x; y 2 Ug; RðB;HBÞ ¼ fðx; yÞjjfðx; aiÞ fðy; aiÞj ¼ hi; x; y 2 U; ai 2 Bg: The physical meaning of ðx; yÞ 2 Rða;hÞ is that there exist h common values between x and y with respect to a. In the sense of Definition 3.11, R(a,h) and RðB;HBÞ are special cases of Definition 3.1. But the condition of relations presented in Definition 3.11 is stricter than Definition 3.1. They can be applied in practice with respect to different requests. Meanwhile, we have that R[ ða;hÞ ¼ S j [ h Rða;jÞ and R[ ðB;HBÞ ¼ S K [ HB RðB;KÞ. For simplicity, we note that RðB;HBÞðxÞ ¼ ½xŠðB;HBÞ ¼ fyjðx; yÞ 2 RðB;HBÞg. For example, we get that ðx2; x3Þ 2 Rða1;2Þ in Example 3.2. In other words, x2 and x3 speak two common languages. Property 3.12 Let (U, A, V, f) be a set-valued informa- tion system, and B; C A. Then we have (1) if HB B HC B HA, then RðA;HAÞ RðC;HCÞ RðB;HBÞ; (2) if HB B HC B HA, then ½xŠðA;HAÞ ½xŠðC;HCÞ ½xŠðB;HBÞ: 778 Int. J. Mach. Learn. Cyber. (2014) 5:775–788 123
  • 5. Definition 3.13 Let (U, A, V, f) be a set-valued infor- mation system, a 2 A; B A; and P Va. Then the rela- tions R(a,P) and RðB;PÞ are defined as Rða;PÞ ¼ fðx; yÞjfðx; aÞ fðy; aÞ ¼ P; x; y 2 Ug; RðB;PÞ ¼ fðx; yÞjfðx; aiÞ fðy; aiÞ ¼ Pi; x; y 2 U; ai 2 Bg; where P ¼ ðP1; P2; . . .; PmÞ; and Pi is defined as Pi Vai ðrespectively; Pi ¼ ;Þ if ai 2 B ðrespectively;ai 62 BÞ: The physical meaning of ðx; yÞ 2 Rða;PÞ is that the common value between x and y with respect to a is P. In the sense of Definition 3.13, we obtain that ðx2; x3Þ 2 Rða1;PÞ in Example 3.2, where P ¼ fFrench, Polishg. In other words, x2 and x3 speak French and Polish. The con- dition of relations presented in Definition 3.13 is stricter than Definitions 3.1 and 3.11. The proposed relations can be applied in practical situations with respect to different requests. For simplicity, we do not present discernibility matrixes based on Definitions 3.11 and 3.13 in this section. By Definitions 3.11 and 3.13, we observe that Rða;hÞ ¼ S fRða;PÞjP 2 2A ; jPj ¼ hg: Furthermore, R(a,P) and RðB;PÞ are symmetric and intransitive. According to Definitions 3.1, 3.11 and 3.13, we obtain R[ ða;hÞ ¼ [ i [ h Rða;iÞ ¼ [ i ! h [ fRða;PÞjjPj ¼ i; P 2 2Va g and R[ ðB;IÞ ¼ a2B f [ i [ h Rða;iÞg ¼ a2B f [ i [ h [ fRða;PÞjjPj ¼ i; P 2 2Va gg: In the sense of Definitions 3.1, 3.11 and 3.13, we obtain different granularities of the universe. Furthermore, we learn that Definition 3.1, Definition 3.11 and Definition 3.13 is the ordering of decreasing granularity of the uni- verse. In other words, ranking the granularities obtained by using Definitions 3.1, 3.11 and 3.13, Definition 3.1 is No.1, Definition 3.11 is No.2, Definition 3.13 is No.3. For example, we get the coarsest granularity by using Defini- tion 3.1 among the proposed relations. In practice, we take the corresponding relation with respect to the request. 4 Attribute reduction of SIS and SDIS under homomorphisms In practical situations, it is time-consuming to conduct attribute reduction of the large-scale set-valued information systems. To solve this issue, we propose a transforming algorithm (from an original SIS/SDIS to a relation SIS/SDIS) for attribute reduction of SIS and SDIS in this section. 4.1 Attribute reduction of SIS by using homomorphisms In this subsection, after deriving an induced C -relation information system of SIS, we convert it into a smaller one under the condition of homomorphisms. Several examples are employed to illustrate that the computational com- plexity of computing attribute reducts is reduced greatly by means of homomorphisms. Definition 4.1 Let ðU1; R[ A Þ be an induced [ -relation information system of S = (U1, A, V, f), and R[ ðai;hiÞ 2 R[ A . Then U1=R[ ðai;hiÞ ¼ f½xŠR[ ðai;hiÞ jx 2 U1g is called a partition based on R[ ðai;hiÞ; where ½xŠR[ ðai;hiÞ ¼ fyjR[ ðai;hiÞðxÞ ¼ R[ ðai;hiÞðyÞ; y 2 U1g for x 2 U1: We can discuss set-valued information systems in the sense of Definitions 3.11 and 3.13. For convenience, we only consider hi = 0 and denote R[ ðai;hiÞ as Ri in this section. Following, we employ Table 2 to show the partition based on each relation for ðU1; R[ A Þ; where Pixj stands for the block containing xj based on Ri. It is easy to see that PAxj ¼ T 1 i m Pixj ; where PAxj denotes the block con- taining xj based on R[ A : We present an algorithm of compressing set-valued information systems. Algorithm 4.2 Let S = (U1, A, V, f) be a set-valued information system, where U1 = {x1, . . ., xn} and A = {a1, . . ., am}. Step 1. 
Input the set-valued information system S = (U1, A, V, f) and obtain an induced [ -relation information system ðU1; R[ A Þ, where R[ A ¼ fR1; R2; . . .; Rmg; Step 2. Compute U1/Ri (1 B i B m) and obtain U1=R[ A ¼ fCij1 i Ng; Step 3. Obtain ðU2; gðR[ A ÞÞ by defining g(x) = yi for any x 2 Ci; where U2 ¼ fgðxiÞjxi 2 U1g and gðR[ A Þ={g(R1), g(R2), . . ., g(Rm)}; Step 4. Get a reduct {g(Ri1), g(Ri2), . . ., g(Rik)} of (U2, {g(R1), g(R2), . . ., g(Rm)}); Step 5. Obtain a reduct {Ri1, Ri2, . . ., Rik} of ðU1; R! A Þ and output the results. The mapping g presented in Algorithm 4.2 is a homomorphism from ðU1; R[ A Þ to ðU2; gðR[ A ÞÞ. Attribute reducts of ðU1; R[ A Þ and ðU2; gðR[ A ÞÞ are equivalent to each other under the condition of g. The computational complexity of constructing g is m * O(n2 ) ? O((m - 1) * n2 ). Furthermore, by transforming a large-scale set-valued information system S = (U1, A, V, f) into a relation Int. J. Mach. Learn. Cyber. (2014) 5:775–788 779 123
  • 6. information system ðU1; R[ A Þ; if there is a homomorphism between the relation information system and another relation information system ðU2; R[ Ã A Þ, we can get attri- bute reducts of S by the reducts of ðU2; R[ Ã A Þ. Remark In Example 3.1, Wang et al. [46] also obtained U1=R[ A . But we get U1=R[ A by computing U1/Ri for any Ri 2 R[ A in Algorithm 4.2. By using the proposed approach, we compress dynamic set-valued information systems in Sect. 5. The process of compressing set-valued information systems with Algorithm 4.2 is illustrated by the following example. Example 4.3 Table 3 depicts a set-valued information system S1 = (U1, A, V, f). According to Definition 3.1 and Example 3.5, we obtain ðU1; R[ A Þ; and R[ A ¼ fR1; R2; R3; R4g; where R1ðx1Þ ¼ R1ðx7Þ ¼ fx1; x2; x4; x7g; R1ðx2Þ ¼ R1ðx4Þ ¼ fx1; x2; x3; x4; x5; x6; x7; x8g; R1ðx3Þ ¼ R1ðx5Þ ¼ R1ðx6Þ ¼ R1ðx8Þ ¼ fx2; x3; x4; x5; x6; x8g; R2ðx1Þ ¼ R1ðx7Þ ¼ fx1; x2; x3; x4; x7g; R2ðx2Þ ¼ R2ðx3Þ ¼ R2ðx4Þ ¼ fx1; x2; x3; x4; x5; x6; x7; x8g; R2ðx5Þ ¼ R2ðx6Þ ¼ R2ðx8Þ ¼ fx2; x3; x4; x5; x6; x8g; R3ðx1Þ ¼ R3ðx2Þ ¼ R3ðx3Þ ¼ R3ðx4Þ ¼ R3ðx5Þ ¼ R3ðx6Þ ¼ R3ðx7Þ ¼ R3ðx8Þ ¼ fx1; x2; x3; x4; x5; x6; x7; x8g; R4ðx1Þ ¼ R4ðx2Þ ¼ R4ðx3Þ ¼ R4ðx4Þ ¼ R4ðx5Þ ¼ R4ðx6Þ ¼ R4ðx7Þ ¼ R4ðx8Þ ¼ fx1; x2; x3; x4; x5; x6; x7; x8g: By Definition 4.1, we derive U1/R1, U1/R2, U1/R3 and U1/R4 shown in Table 4 and get U1=R[ A ¼ ffx1; x7g; fx2; x4g; fx3g; fx5; x6; x8gg. Then we define a mapping g : U1 À! U2 as follows: gðx1Þ ¼ gðx7Þ ¼ y1; gðx2Þ ¼ gðx4Þ ¼ y2; gðx3Þ ¼ y3; gðx5Þ ¼ gðx6Þ ¼ gðx8Þ ¼ y4: Consequently, we derive ðU2; gðR[ A ÞÞ; where U2 ¼ fy1; y2; y3; y4g; gðR[ A Þ ¼ fgðR1Þ; gðR2Þ; gðR3Þ; gðR4Þg; and gðR1Þðy1Þ ¼ fy1; y2g; gðR1Þðy2Þ ¼ fy1; y2; y3; y4g; gðR1Þðy3Þ ¼ gðR1Þðy4Þ ¼ fy2; y3; y4g; gðR2Þðy1Þ ¼ fy1; y2; y3g; gðR2Þðy2Þ ¼ gðR2Þðy3Þ ¼ fy1; y2; y3; y4g; gðR2Þðy4Þ ¼ fy2; y3; y4g; gðR3Þðy1Þ ¼ gðR3Þðy2Þ ¼ gðR3Þðy3Þ ¼ gðR3Þðy4Þ ¼ fy1; y2; y3; y4g; gðR4Þðy1Þ ¼ gðR4Þðy2Þ ¼ gðR4Þðy3Þ ¼ gðR4Þðy4Þ ¼ fy1; y2; y3; y4g: Table 2 The partitions based on Ri (1 B i B m) and R[ A , respectively U1 R1 R2 . . . Rm R[ A x1 P1x1 P2x1 . . . Pmx1 PAx1 x2 P1x2 P2x2 . . . Pmx2 PAx2 . . . . . . . . . . . . . . . . . . . . . . . . xn P1xn P2xn . . . Pmxn PAxn Table 3 A set-valued information system U1 a1 a2 a3 a4 x1 {0} {0} {1, 2} {1, 2} x2 {0, 1, 2} {0, 1, 2} {1, 2} {0, 1, 2} x3 {1, 2} {0, 1} {1, 2} {1, 2} x4 {0, 1} {0, 2} {1, 2} {1} x5 {1, 2} {1, 2} {1, 2} {1} x6 {1} {1, 2} {0, 1} {0, 1} x7 {0} {0} {1, 2} {1, 2} x8 {1} {1, 2} {0, 1} {0, 1} Table 4 The partitions based on R1, R2, R3, R4 and R[ A , respectively U1 R1 R2 R3 R4 R[ A x1 {x1, x7} {x1, x7} U1 U1 {x1, x7} x2 {x2, x4} {x2, x3, x4} U1 U1 {x2, x4} x3 {x3, x5, x6, x8} {x2, x3, x4} U1 U1 {x3} x4 {x2, x4} {x2, x3, x4} U1 U1 {x2, x4} x5 {x3, x5, x6, x8} {x5, x6, x8} U1 U1 {x5, x6, x8} x6 {x3, x5, x6, x8} {x5, x6, x8} U1 U1 {x5, x6, x8} x7 {x1, x7} {x1, x7} U1 U1 {x1, x7} x8 {x3, x5, x6, x8} {x5, x6, x8} U1 U1 {x5, x6, x8} 780 Int. J. Mach. Learn. Cyber. (2014) 5:775–788 123
  • 7. Afterwards, we obtain the following results: (1) g is a homomorphism from ðU1; R[ A Þ to ðU2; gðR[ A ÞÞ; (2) g(R2), g(R3) and g(R4) are superfluous in gðR! A Þ if and only if R2, R3 and R4 are superfluous in R[ A ; (3) {g(R1)} is a reduct of gðR[ A Þ if and only if {R1} is a reduct of R[ A : In Example 4.3, we see that the size of ðU2; gðR[ A ÞÞ is smaller than ðU1; R[ A Þ; and their attribute reducts are equivalent to each other under the condition of homomorphisms. We employ an example to illustrate that the computa- tional complexity of computing attribute reducts is reduced greatly by means of homomorphisms from the view of discernibility matrix. Example 4.4 (Continuation of Example 4.3) By Defini- tion 3.7, we obtain discernibility matrixes D1 and D2 for ðU1; R[ A Þ and ðU2; gðR[ A ÞÞ, respectively. D1 ¼ ; fa1g ; ; ; ; fa1; a2g ; ; ; fa1; a2g ; ; ; ; ; ; fa1g ; fa1; a2g fa1; a2g fa1; a2g ; ; ; ; ; fa1; a2g 2 6 6 6 6 6 6 6 6 4 3 7 7 7 7 7 7 7 7 5 ; D2 ¼ ; fa1g ; fa1; a2g ; ; 2 4 3 5: In Example 4.4, we observe that the size of D1 is larger than D2, and {a1} is the reduct of ðU1; R[ A Þ and ðU2; gðR[ A ÞÞ. The computational complexity of computing D2 is relatively lower than computing D1. 4.2 Attribute reduction of SDIS under the condition of homomorphisms In this subsection, we study attribute reduction of SDIS under the condition of homomorphisms. Example 4.5 (Continuation of Example 4.4). Let Table 5 be a SDIS. Then, we have Rdðx1Þ ¼ Rdðx2Þ ¼ Rdðx4Þ ¼ Rdðx7Þ ¼ fx1; x2; x4; x7g; Rdðx3Þ ¼ Rdðx5Þ ¼ Rdðx6Þ ¼ Rdðx8Þ ¼ fx3; x5; x6; x8g: Thus, we get U1/Rd = {{x1, x2, x4, x7}, {x3, x5, x6, x8}} and define a mapping g : U1 À! U2 as follows: gðx1Þ ¼ gðx7Þ ¼ y1; gðx2Þ ¼ gðx4Þ ¼ y2; gðx3Þ ¼ y3; gðx5Þ ¼ gðx6Þ ¼ gðx8Þ ¼ y4: Consequently, we derive ðU2; gðR[ A ÞÞ; where U2 ¼ fy1; y2; y3; y4g; gðR[ A Þ ¼ fgðR1Þ; gðR2Þ; gðR3Þ; gðR4Þg; and gðR1Þðy1Þ ¼ fy1; y2g; gðR1Þðy2Þ ¼ fy1; y2; y3; y4g; gðR1Þðy3Þ ¼ gðR1Þðy4Þ ¼ fy2; y3; y4g; gðR2Þðy1Þ ¼ fy1; y2; y3g; gðR2Þðy2Þ ¼ gðR2Þðy3Þ ¼ fy1; y2; y3; y4g; gðR2Þðy4Þ ¼ fy2; y3; y4g; gðR3Þðy1Þ ¼ gðR3Þðy2Þ ¼ gðR3Þðy3Þ ¼ gðR3Þðy4Þ ¼ fy1; y2; y3; y4g; gðR4Þðy1Þ ¼ gðR4Þðy2Þ ¼ gðR4Þðy3Þ ¼ gðR4Þðy4Þ ¼ fy1; y2; y3; y4g; gðRdÞðy1Þ ¼ gðRdÞðy2Þ ¼ fy1; y2g; gðRdÞðy3Þ ¼ gðRdÞðy4Þ ¼ fy3; y4g: Afterwards, we obtain the following results: (1) g is a homomorphism from ðU1; R[ A[fdgÞ to ðU2; gðR[ A[fdgÞÞ; (2) g(R2), g(R3) and g(R4) are superfluous in gðR[ A[fdgÞ if and only if R2,R3 and R4 are superfluous in R[ A[fdg; (3) {g(R1)} is a reduct of gðR[ A[fdgÞ if and only if {R1} is a reduct of R[ A[fdg: In Example 4.5, we see that the size of ðU2; gðR[ A[fdgÞÞ is smaller than ðU1; R[ A[fdgÞ; and their attribute reducts are equivalent to each other under the condition of homo- morphisms. Furthermore, by transforming a large-scale SDIS S = (U1, A[ {d}, V, f) into a relation information system ðU1; R[ A[fdgÞ; if there is a homomorphism between the relation information system and another relation information system ðU2; R[ Ã A[fdgÞ; we can get attribute re- ducts of S = (U1, A[ {d}, V, f) by the reducts of ðU2; R[ Ã A[fdgÞ: We employ an example to show that the computational complexity of computing attribute reducts is reduced greatly by means of homomorphisms from the view of discernibility matrix. Example 4.6 (Continuation of Example 4.5). By Defini- tion 3.9, we obtain discernibility matrixes D3 and D4 for ðU1; R[ A[fdgÞ and ðU2; gðR[ A[fdgÞÞ, respectively. 
Table 5 A set-valued information system with a decision attribute U1 a1 a2 a3 a4 d x1 {0} {0} {1, 2} {1, 2} 0 x2 {0, 1, 2} {0, 1, 2} {1, 2} {0, 1, 2} 0 x3 {1, 2} {0, 1} {1, 2} {1, 2} 1 x4 {0, 1} {0, 2} {1, 2} {1} 0 x5 {1, 2} {1, 2} {1, 2} {1} 1 x6 {1} {1, 2} {0, 1} {0, 1} 1 x7 {0} {0} {1, 2} {1, 2} 0 x8 {1} {1, 2} {0, 1} {0, 1} 1 Int. J. Mach. Learn. Cyber. (2014) 5:775–788 781 123
  • 8. D3 ¼ ; fa1g ; ; ; ; fa1; a2g ; ; ; fa1; a2g ; ; ; ; ; ; fa1g ; fa1; a2g fa1; a2g fa1; a2g ; ; ; ; ; fa1; a2g 2 6 6 6 6 6 6 6 6 4 3 7 7 7 7 7 7 7 7 5 ; D4 ¼ ; fa1g ; fa1; a2g ; ; 2 4 3 5: Notice that the size of D3 is larger than D4, and {a1} is the reduct of ðU1; R[ A[fdgÞ and ðU2; gðR[ A[fdgÞÞ. The computational complexity of computing D4 is relatively lower than computing D3. There are n objects in the original set-valued informa- tion system. If we transform it into a relation information system, then computational complexity of getting a reduct by using a discernibility matrix is Oðm à n2 Þ. If we com- press the relation information system into one with N objects, then the computational complexity is Oðm à N2 Þ: In practice, it may be difficult to construct reducts of large-scale set-valued information systems. So we convert it into a relation information system and compress the relation information system into a relatively smaller one under the condition of homomorphisms. By conducting attribute reduction of the smaller relation information system, we obtain reducts of the set-valued information system. Moreover, if there is a homomorphism between an induced [ -relation information system and a relation information system, we can obtain reducts of the others by conducting attribute reduction of one relation information system. 5 Compressing dynamic set-valued information systems Illustrated by Examples 4.4 and 4.6, the computational complexity of attribute reduction for large-scale set-valued information systems can be reduced greatly by using homomorphisms, and most of time is spent on constructing the homomorphisms between relation information systems. However, we have not seen the work on constructing the homomorphisms for dynamic set-valued information sys- tems so far. In this section, we mainly construct homomor- phisms from three aspects: variations of attribute set, immigration and emigration of objects and alterations of attribute values for dynamic set-valued information systems. 5.1 Variations of attribute set In this subsection, we show that how to compress a dynamic set-valued information system when adding and deleting attributes. Suppose that we have obtained Table 2 by compressing set-valued information system S1 = (U1, A, V1, f1). Now we get S2 = (U1, A[ P, V2, f2) by adding an attribute set P into A, where A P = ; and P = {am?1, am?2, . . ., ak}. There are three steps to compress S2 by utilizing Algorithm 4.2 as follows. Step 1: Derive U1/Ri by inducing Ri (m ? 1 B i B k). Step 2: Get Table 6 by adding U1/ Ri (m ? 1 B i B k) into Table 2 and derive U1=R[ A[P: Step 3: Obtain S3 ¼ ðgðU1Þ; gðR[ A[PÞÞ by defining the homomorphism g based on U1=R[ A[P: By using the Algorithm 4.2, the computational com- plexity ðk À mÞ Ã Oðn2 Þ þ Oððk À 1Þ Ã n2 Þ of constructing g is reduced without computing U1/Ri (1 B i B m). But we need to compute them without Table 2. Example 5.1 We obtain Table 7 by adding a5 into Table 3. By Definition 4.1, we first get U1/ R5 = {{x1, x2, x3, x4, x5, x6, x7}, {x8}}. Then we obtain Table 8 and derive U1=R[ A[fa5g ¼ ffx1; x7g; fx2; x4g; fx3g; fx5; x6g; fx8gg. Afterwards, we define the mapping g : U1 À! U2 as follows: gðx1Þ ¼ gðx7Þ ¼ y1; gðx2Þ ¼ gðx4Þ ¼ y2; gðx3Þ ¼ y3; gðx5Þ ¼ gðx6Þ ¼ y4; gðx8Þ ¼ y5; where U2 = {y1, y2, y3, y4, y5}. Consequently, we obtain a relation information system ðU2; gðR[ A[fa5gÞÞ. For Table 6 The partitions based on Ri (1 B i B k) and R[ A[P, respectively U1 R1 R2 . . . Rk R[ A[P x1 P1x1 P2x1 . . . 
Pkx1 PðA[PÞx1 x2 P1x2 P2x2 . . . Pkx2 PðA[PÞx2 . . . . . . . . . . . . . . . . . . . . . . . . xn P1xn P2xn . . . Pkxn PðA[PÞxn 782 Int. J. Mach. Learn. Cyber. (2014) 5:775–788 123
  • 9. simplicity, we do not list the relation information system in this subsection. Subsequently, we show the process of compressing the dynamic set-valued information system S2 = (U1, A - {al}, V2, f2), where al 2 A. There are two steps to com- press S2. We firstly get U=R[ ðAÀfalgÞ by using U1/ Ri (i = l) shown in Table 2 and define g as Example 4.3. Secondly, we obtain S3 ¼ ðgðU1Þ; gðR[ ðAÀfalgÞÞÞ. By using the Algorithm 4.2, the computational complexity Oððm À 2Þ Ã n2 Þ of constructing g is reduced without computing U1/ Ri (1 B i B l - 1,l ? 1 B i B m). But we need to com- pute them without Table 2. Furthermore, we can compress S2 when deleting an attribute set with the same approach. Example 5.2 By deleting a1 in a set-valued information system S1 shown in Table 3, we obtain information system S2 shown in Table 9. To compress S2 based on the com- pression of S1, we get Table 10 by deleting U1/R1 based on a1. Then, we obtain U1=R[ ðAÀfa1gÞ ¼ ffx1; x7g; fx2; x3; x4g; fx5; x6; x8gg and define the mapping g : U1 À! U2 as follows: gðx1Þ ¼ gðx7Þ ¼ y1; gðx2Þ ¼ gðx3Þ ¼ gðx4Þ ¼ y2; gðx5Þ ¼ gðx6Þ ¼ gðx8Þ ¼ y3; where U2 = {y1, y2, y3}. Consequently, the set-valued information system (U1, A - {a1}, V, f1) can be compressed into a smaller relation information system (U2, {g(R2), g(R3), g(R4)}). To express clearly, we do not list all the relations in this subsection. In Example 5.2, we compress a dynamic set-valued information system when deleting an attribute. The same approach can be applied to the set-valued information system when deleting an attribute set. 5.2 Immigration and emigration of objects In this subsection, we introduce an equivalence relation for set-valued information systems and show a process of compressing dynamic set-valued information systems in terms of object variation. Definition 5.3 Let S1 = (U1, A, V, f1) be a set-valued information system. Then an equivalence relation TA is defined as follows: TA ¼ fðx; yÞj8a 2 A; fðx; aÞ ¼ fðy; aÞ; x; y 2 U1g: The physical meaning of ðx; yÞ 2 TA is that there exist the same attribute values for x and y with respect to any a 2 A. For convenience, we denote ½xŠ1 A ¼ fyjðx; yÞ 2 TA; x; y 2 U1g. We derive U1=A ¼ f½xŠ1 Ajx 2 U1g ¼ fC1; C2; . . .; CNg and obtain S2 = (U2, A, V, f2) by defining g1(x) = yk for any x 2 Ck, where U2 = {yk|1 B k B N}, f2(yk, a) = f1(x, a) for a 2 A; and x 2 gÀ1 1 ðykÞ: Now, we obtain S4 = (U1[ U3, A, V, f1[ f2) by adding S3 = (U3, A, V, f3) into S1. To compress S4 by utilizing S2, we firstly obtain S5 = (U5, A, V, f5) by compressing S3 as S1. Then, we compress S2 [ S5 as S1 and get S7 which is the same as the compression of S1 [ S3. To express clearly, the process of compressing dynamic set-valued information systems is illustrated below. S1#S2 S3#S5 ' S6 ¼ S2 [ S5#S7S4 ¼ S1 [ S3 S1; S3 where # (respectively, ) denotes the process of compressing set-valued information systems. 
The Table 7 A set-valued information system by adding a5 into Table 3 U1 a1 a2 a3 a4 a5 x1 {0} {0} {1, 2} {1, 2} {1, 2} x2 {0, 1, 2} {0, 1, 2} {1, 2} {0, 1, 2} {0, 2} x3 {1, 2} {0, 1} {1, 2} {1, 2} {1, 2} x4 {0, 1} {0, 2} {1, 2} {1} {2} x5 {1, 2} {1, 2} {1, 2} {1} {2} x6 {1} {1, 2} {0, 1} {0, 1} {0, 1, 2} x7 {0} {0} {1, 2} {1, 2} {0, 2} x8 {1} {1, 2} {0, 1} {0, 1} {3} Table 8 The partitions based on R1, R2, R3, R4, R5 and R[ A[fa5g, respectively U1 R1 R2 R3 R4 R5 R[ A[fa5g x1 {x1, x7} {x1, x7} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x1, x7} x2 {x2, x4} {x2, x3, x4} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x2, x4} x3 {x3, x5, x6, x8} {x2, x3, x4} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x3} x4 {x2, x4} {x2, x3, x4} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x2, x4} x5 {x3, x5, x6, x8} {x5, x6, x8} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x5,x6} x6 {x3, x5, x6, x8} {x5, x6, x8} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x5,x6} x7 {x1, x7} {x1, x7} U1 U1 {x1, x2, x3, x4, x5, x6, x7} {x1, x7} x8 {x3, x5, x6, x8} {x5, x6, x8} U1 U1 {x8} {x8} Int. J. Mach. Learn. Cyber. (2014) 5:775–788 783 123
  • 10. computational complexity of constructing the homomor- phism is m à OðjU3j2 Þ þ m à OðjU2 [ U5j2 Þþ Oððm À 1Þ Ã jU2 [ U5j2 Þ with the incremental algorithm. But the com- putational complexity is m à OðjU1 [ U3j2 Þ þ Oððm À 1Þ Ã jU1 [ U3j2 Þ without S2. Example 5.4 Let Table 11 be a set-valued information system S1 = {U1, A, V, f1}. By Definition 5.3, we obtain that U1/A = {{x1, x2}, {x3, x4}, {x5, x6}}. Then, we define g1 and f2 as follows: g1ðx1Þ ¼ g1ðx2Þ ¼ y1; g1ðx3Þ ¼ g1ðx4Þ ¼ y2; g1ðx5Þ ¼ g1ðx6Þ ¼ y3; f2ðyi; aiÞ ¼ f1ðx; aiÞ; where x 2 gÀ1 1 ðyiÞ. Consequently, we compress S1 into S2 = (U2, A, V, f2) shown in Table 12, where U2 ¼ fgðxÞjx 2 U1g: The following example is employed to illustrate how to update the compression when adding an object set. Example 5.5 By adding S3 shown in Table 13 into S1, we obtain S4 = S1[ S3 shown in Table 14. To compress S4, we compress S3 to S5 = (U5, A, V, f5) shown in Table 15 as Example 5.4. Then we compress S6 = S2[ S5 shown in Table 16 and obtain S7 = {U7, A, V, f7} shown in Table 17. Afterwards, we can continue to compress S7 as Example 4.3 in Sect. 4. Below we compress the dynamic set-valued information system when deleting an object set. Suppose S1 = (U1, A, V, f1) is a set-valued information system, we compress S1 to S2 = (U2, A, V, f2) under the condition of g1. We obtain S4 = (U4, A, V, f4) by deleting S3 = (U3, A, V, f3), where U3 U1 and U4 = U1 - U3. There are three steps to compress S4 = (U4, A, V, f4) based on S2. Firstly, we obtain that U1=A ¼ f½xŠ1 Ajx 2 U1g and U3=A ¼ f½xŠ3 Ajx 2 U3g by Definition 5.3. It is obvious that ½xŠ3 A ½xŠ1 A for any x 2 U3. Subsequently, we cancel g1(x) in U2 if [x]A 3 = [x]A 1 and keep g1(x) in U2 if [x]A 3 = [x]A 1 . Finally, we obtain S5 = (U5, A, V, f5) and compress it as Example 4.3. The computational complexity of constructing the homomorphism is m à OðjU5j2 Þ þ Oððm À 1Þ Ã jU5j2 Þ with the incremental algorithm. But the computational complexity is m à OðjU1 À U3j2 Þ þ Oððm À 1Þ Ã jU1 À U3j2 Þ without S2. Example 5.6 We take S4 and S7 shown in Example 5.5 as the original set-valued information system S1 and the compressed information system S2, respectively. By Table 9 An updated set-valued information system U1 a2 a3 a4 x1 {0} {1, 2} {1, 2} x2 {0, 1, 2} {1, 2} {0, 1, 2} x3 {0, 1} {1, 2} {1, 2} x4 {0, 2} {1, 2} {1} x5 {1, 2} {1, 2} {1} x6 {1, 2} {0, 1} {0, 1} x7 {0} {1, 2} {1, 2} x8 {1, 2} {0, 1} {0, 1} Table 10 The partitions based on R2, R3, R4 and R[ ðAÀfa1gÞ, respectively U1 R2 R3 R4 R[ ðAÀfa1gÞ x1 {x1, x7} U1 U1 {x1, x7} x2 {x2, x3, x4} U1 U1 {x2, x3, x4} x3 {x2, x3, x4} U1 U1 {x2, x3, x4} x4 {x2, x3, x4} U1 U1 {x2, x3, x4} x5 {x5, x6, x8} U1 U1 {x5, x6, x8} x6 {x5, x6, x8} U1 U1 {x5, x6, x8} x7 {x1, x7} U1 U1 {x1, x7} x8 {x5, x6, x8} U1 U1 {x5, x6, x8} Table 11 The set-valued information system S1 U1 a1 a2 a3 x1 {0, 1} {0, 2} {1, 2} x2 {0, 1} {0, 2} {1, 2} x3 {0, 1} {1} {0, 1} x4 {0, 1} {1} {0, 1} x5 {1, 2} {1} {1, 2} x6 {1, 2} {1} {1, 2} Table 12 The compressed set-valued information system S2 of S1 U2 a1 a2 a3 y1 {0, 1} {0, 2} {1, 2} y2 {0, 1} {1} {0, 1} y3 {1, 2} {1} {1, 2} Table 13 The set-valued information system S3 U3 a1 a2 a3 x7 {1, 2} {0, 2} {0, 1} x8 {1, 2} {0, 2} {0, 1} x9 {0, 1} {1} {0, 1} x10 {0, 1} {1} {0, 1} 784 Int. J. Mach. Learn. Cyber. (2014) 5:775–788 123
  • 11. deleting S3 = (U3, A, V, f) shown in Table 18, we obtain the set-valued information system S4 shown in Table 19. To compress S4, we first get that U1/A = {{x1, x2}, {x3, x4, x9, x10}, {x5, x6}, {x7, x8}} and U3/ A = {{x1, x2}, {x3}}. Obviously, [x1]A 1 = [x2]A 1 = {x1, x2} = [x1]A 3 = [x2]A 3 and ½x3Š3 A ¼ fx3g fx3; x4; x9; x10g ¼ ½x3Š1 A. Then we cancel z1 and keep {z2, z3, z4} in Table 17. Afterwards, we obtain the compressed set-valued information system S5 shown in Table 20. We can continue to compress S5 as Example 4.3 in Sect. 4. 5.3 Alterations of attribute values In this subsection, we show a process of compressing dynamic set-valued information systems in terms of attri- bute value variation. Suppose S1 = (U1, A, V1, f1) is a set-valued information system, we get S2 = (U1, A, V2, f2) when revising f(xj, ai), where xj 2 U1 and ai 2 A. By utilizing Algorithm 4.2 there are three steps to compress S2 as follows. Step 1: Derive U1=Rà i by inducing [ -relations Rà i (the relation based on the attribute ai after revising f(xj, ai) is denoted by Rà i ). Step 2: Get U1=Rà [ A based on U1=Rà i and U1/Rl (1 B l B m, l = i). Step 3: Obtain S3 ¼ ðgðU1Þ; gðU1=Rà [ A Þ by defining g based on U1=Rà [ A : The computational complexity of constructing g is Oðn2 Þ þ Oððm À 1Þ Ã n2 Þ with the incremental algorithm. But the computational complexity is m à Oðn2 Þ þ Oððm À 1Þ Ã n2 Þ without S2. Similarly, we can compress the set- valued information system when there are more alterations of attribute values. Example 5.7 (Continuation of Example 4.3) Consider Table 3, we revise f(x8,a1) = {1} to f(x8,a1) = {0} and obtain Table 21. In Example 4.3, we have that Table 14 The set-valued information system S4 = S1[ S3 U4 = U1[ U3 a1 a2 a3 x1 {0, 1} {0, 2} {1, 2} x2 {0, 1} {0, 2} {1, 2} x3 {0, 1} {1} {0, 1} x4 {0, 1} {1} {0, 1} x5 {1, 2} {1} {1, 2} x6 {1, 2} {1} {1, 2} x7 {1, 2} {0, 2} {0, 1} x8 {1, 2} {0, 2} {0, 1} x9 {0, 1} {1} {0, 1} x10 {0, 1} {1} {0, 1} Table 15 The set-valued information system S5 U5 a1 a2 a3 y4 {1, 2} {0, 2} {0, 1} y5 {0, 1} {1} {0, 1} Table 16 The set-valued information system S6 = S2 [ S5 U6 = U2 [ U4 a1 a2 a3 y1 {0, 1} {0, 2} {1, 2} y2 {0, 1} {1} {0, 1} y3 {1, 2} {1} {1, 2} y4 {1, 2} {0, 2} {0, 1} y5 {0, 1} {1} {0, 1} Table 17 The set-valued information system S7 U7 a1 a2 a3 z1 {0, 1} {0, 2} {1, 2} z2 {0, 1} {1} {0, 1} z3 {1, 2} {1} {1, 2} z4 {1, 2} {0, 2} {0, 1} Table 18 The set-valued information system S3 U3 a1 a2 a3 x1 {0, 1} {0, 2} {1, 2} x2 {0, 1} {0, 2} {1, 2} x3 {0, 1} {1} {0, 1} Table 19 The set-valued information system S4 U4 = U1 - U3 a1 a2 a3 x4 {0, 1} {1} {0, 1} x5 {1, 2} {1} {1, 2} x6 {1, 2} {1} {1, 2} x7 {1, 2} {0, 2} {0, 1} x8 {1, 2} {0, 2} {0, 1} x9 {0, 1} {1} {0, 1} x10 {0, 1} {1} {0, 1} Table 20 The set-valued information system S5 U5 a1 a2 a3 z2 {0, 1} {1} {0, 1} z3 {1, 2} {1} {1, 2} z4 {1, 2} {0, 2} {0, 1} Int. J. Mach. Learn. Cyber. (2014) 5:775–788 785 123
  • 12. R1ðx1Þ ¼R1ðx7Þ ¼ fx1; x2; x4; x7g; R1ðx2Þ ¼R1ðx4Þ ¼ fx1; x2; x3; x4; x5; x6; x7; x8g; R1ðx3Þ ¼R1ðx5Þ ¼ R1ðx6Þ ¼ R1ðx8Þ ¼ fx2; x3; x4; x5; x6; x8g : Then we obtain Rà 1ðx8Þ ¼ fx1; x2; x4; x7; x8g and Rà 1ðx1Þ ¼Rà 1ðx7Þ ¼ fx1; x2; x4; x7; x8g; Rà 1ðx2Þ ¼Rà 1ðx4Þ ¼ fx1; x2; x3; x4; x5; x6; x7; x8g; Rà 1ðx3Þ ¼Rà 1ðx5Þ ¼ Rà 1ðx6Þ ¼ fx2; x3; x4; x5; x6g; Rà 1ðx8Þ ¼fx2; x3; x4; x5; x6; x8g: Consequently, we get that U1=Rà 1 ¼ ffx1; x1g; fx2; x4g; fx3; x5; x6g; fx8gg. Based on U1=Rà 1; U1=R2; U1=R3 and U1/ R4, we can construct a homomorphism and compress the original set-valued information system shown in Table 21 into a smaller one. For simplicity, we do not show the process of constructing the homomorphism. At the end of this section, we show the computational complexities of compressing the dynamic set-valued information system and attribute reduction of dynamic set- valued information system in Tables 22 and 23, respec- tively. In Tables 22 and 23, AA denotes adding attribute set {am?1, am?2, Oðm à n2 Þ, ak}; DA stands for deleting attribute set {al}; AO indicates adding object set U3; DO refers to as deleting object set U3; AAV is the alteration of an attribute value. 6 Conclusions In practical situations, it is difficult to construct attribute reduction of large-scale set-valued information systems and dynamic set-valued information systems. In this paper, we have introduced three relations for solving issues of set- valued information systems. Moreover, we have proposed an incremental algorithm for attribute reduction of set- valued information systems and studied their basic prop- erties. We have conducted attribute reduction of set-valued decision information systems. We have illustrated the process of compressing the set-valued information system with several examples. Afterwards, we have compressed Table 21 A set-valued information system U1 a1 a2 a3 a4 x1 {0} {0} {1, 2} {1, 2} x2 {0, 1, 2} {0, 1, 2} {1, 2} {0, 1, 2} x3 {1, 2} {0, 1} {1, 2} {1, 2} x4 {0, 1} {0, 2} {1, 2} {1} x5 {1, 2} {1, 2} {1, 2} {1} x6 {1} {1, 2} {0, 1} {0, 1} x7 {0} {0} {1, 2} {1, 2} x8 {0} {1, 2} {0, 1} {0, 1} Table 22 The computational complexity of compressing dynamic set-valued information system Alteration Incremental algorithm Non-incremental algorithm AA ðk À mÞ Ã OðjU1j2 Þ þ Oððk À 1Þ Ã jU1j2 Þ k à OðjU1j2 Þ þ Oððk À 1Þ Ã n2 Þ DA Oððm À 2Þ Ã jU1j2 Þ ðm À 1Þ Ã OðjU1j2 Þ þ Oððm À 2Þ Ã jU1j2 Þ AO m à OðjU3j2 Þ þ m à OðjU2 [ U5j2 Þ m à OðjU1 [ U3j2 Þ þ Oððm À 1Þ Ã jU1 [ U3j2 Þ þOððm À 1Þ Ã jU2 [ U5j2 Þ DO AAV m à OðjU5j2 Þ þ Oððm À 1Þ Ã jU5j2 Þ m à OðjU1 À U3j2 Þ þ Oððm À 1Þ Ã jU1 À U3j2 Þ OðjU1j2 Þ þ Oððm À 1Þ Ã jU1j2 Þ m à OðjU1j2 Þ þ Oððm À 1Þ Ã jU1j2 Þ Table 23 The computational complexity of attribute reduction for dynamic set-valued information system Alteration Incremental algorithm Non-incremental algorithm AA ðk À mÞ Ã OðjU1j2 Þ þ Oððk À 1Þ Ã jU1j2 Þ þ Oðk à jU2j2 Þ k à OðjU1j2 Þ þ Oððk À 1Þ Ã jU1j2 Þ þ Oðk à jU2j2 Þ DA Oððm À 2Þ Ã jU1j2 Þ þ Oððm À 1Þ Ã jU2j2 Þ ðm À 1Þ Ã OðjU1j2 Þ þ Oððm À 2Þ Ã jU1j2 Þ þOððm À 1Þ Ã jU2j2 Þ AO Oððm À 1Þ Ã jU2 [ U5j2 Þ þ Oðm à jU7j2 Þ m à OðjU1 [ U3j2 Þ þ Oððm À 1Þ Ã jU1 [ U3j2 Þ þm à OðjU3j2 Þ þ m à OðjU2 [ U5j2 Þ þOðm à jU7j2 Þ DO m à OðjU5j2 Þ þ Oððm À 1Þ Ã jU5j2 Þ m à OðjU1 À U3j2 Þ þ Oððm À 1Þ Ã jU1 À U3j2 Þ þOðm à jU5j2 Þ þOðm à jU5j2 Þ AAV OðjU1j2 Þ þ Oððm À 1Þ Ã jU1j2 Þ þ Oðm à jU1j2 Þ m à OðjU1j2 Þ þ Oððm À 1Þ Ã jU1j2 Þ þ Oðm à jU1j2 Þ 786 Int. J. Mach. Learn. Cyber. (2014) 5:775–788 123
  • 13. dynamic set-valued information systems by using an incremental algorithm. There are still some interesting problems which need be further discussed. For example, we will focus on com- pressing fuzzy set-valued information systems and dynamic fuzzy set-valued information systems. We will investigate the compression of interval-valued information systems, fuzzy interval-valued information systems, dynamic interval-valued information systems and dynamic fuzzy interval-valued information systems. Acknowledgments We would like to thank the anonymous reviewers very much for their professional comments and valuable suggestions. This work is supported by the National Natural Science Foundation of China (No. 11071061,11371130) and the National Basic Research Program of China (No. 2010CB334706, 2011CB311808). References 1. Banerjee M, Pal SK (1996) Roughness of a fuzzy set. Inf Sci 93(3-4):235–246 2. Bhatt RB, Gopal M (2005) On the compact computational domain of fuzzy-rough sets. Pattern Recognit Lett 26(11):1632–1640 3. Biswas R (1994) On rough sets and fuzzy rough sets. Bull Pol Acad Sci Math 42:345–349 4. Bobillo F, Straccia U (2012) Generalized fuzzy rough description logics. Inf Sci 189:43–62 5. Capotorti A, Barbanera E (2012) Credit scoring analysis using a fuzzy probabilistic rough set model. Comput Stat Data Anal 56(4):981–994 6. Chakrabarty K, Biswas R, Nanda S (2000) Fuzziness in rough sets. Fuzzy Sets Syst 110:247–251 7. Chen HM, Li TR, Qiao SJ, Ruan D (2010) A rough set based dynamic maintenance approach for approximations in coarsening and refining attribute values. Int J Intell Syst 25(10):1005–1026 8. Chen ZC, Qin KY (2008) Attribute reduction of set-valued information system based on variable precision tolerance realtion. Comput Eng Appl 44: 27–29 9. Chen ZC, Qin KY (2009) Attribute reduction of set-valued information system based on tolerance relation. Fuzzy Syst Math 23(1):150–154 10. Dai JH (2013) Rough set approach to incomplete numerical data. Inf Sci. doi:10.1016/j.ins.2013.04.023. 11. Dai JH, Tian HW (2013) Entropy measures and granularity measures for set-valued information systems. Inf Sci 240:72–82 12. Dai JH, Tian HW (2013) Fuzzy rough set model for set-valued data. Fuzzy Sets Syst 229:54–68 13. Dai JH, Wang WT, Tian HW, Liu L (2013) Attribute selection based on a new conditional entropy for incomplete decision systems. Knowl-Based Syst 39:207–213 14. Dai JH, Xu Q (2012) Approximations and uncertainty measures in incomplete information systems. Inf Sci 198:62–80 15. Dai JH, Xu Q (2013) Attribute selection based on information gain ratio in fuzzy rough set theory with application to tumor classification. Appl Soft Comput 13(1):211–221 16. Diker M, Ug˘ur AA (2012) Textures and covering based rough sets. Inf Sci 184(1):44–63 17. Dubois D, Prade H (1990) Rough fuzzy sets and fuzzy rough sets. Int J General Syst 17:191–209 18. Feng T, Zhang SP, Mi JS, Feng Q (2011) Reductions of a fuzzy covering decision system. Int J Model Identif Control 13(3):225–233 19. Gong ZT, Xiao ZY (2010) Communicating between information systems based on including degrees. Int J General Syst 39(2):189–206 20. Grzymala-Busse JW (2010) Rough set and CART approaches to mining incomplete data. In: 2010 international conference of soft computing and pattern recognition (SoCPaR), Paris, pp 214–219 21. Grzymala-Busse JW, Sedelow Jr. WA (1988) On rough sets and information system homomorphism. Bull Pol Acad Sci Tech Sci 36(3):233–239 22. Guan YY, Wang HK (2006) Set-valued information systems. 