Proceedings of the 3rd World Congress on Intelligent Control and Automation, June 28 - July 2, 2000, Hefei, P.R. China
A Graph-based Truth Maintenance System
Yue Xu
School of Computing
University of Tasmania
Launceston, TAS 7250
Australia
Abstract - The Assumption-based Truth Main-
tenance System proposed by de Kleer is the most
popular scheme for performing belief revision.
The ATMS is a logic-based scheme, which per-
forms its reasoning on the knowledge represented
by a propositional theory. In this paper, we pro-
pose a causal network called augmented causal
network and a graph-based truth maintenance
method based on the causal network. We prove
that the graph-based truth maintenance method
provides correct implementations of the ATMS.
1 Introduction
There are a number of approaches in the area of belief re-
vision. The development of truth maintenance systems,
which are derived from the original Truth Maintenance
System (TMS) of Doyle [3], provides a powerful mecha-
nism to perform belief revision. Of all the truth main-
tenance systems, the Assumption-based Truth Mainte-
nance System (ATMS) proposed by de Kleer [1] is the
most popular scheme. For the ATMS, the domain knowl-
edge is represented by justifications (rules) which are
propositional logic formulae. One limitation of the logic-
based representation is the lack of structural information
on the rule base organization. Generally, to find solu-
tions for a given query, the whole body of the rules has
to be searched since there is no information to guide
the searching. Comparatively, the graph-based repre-
sentation scheme which is based on a network structure
provides more structural information than that of the
logic-based representation scheme. Based on this consid-
eration, in this paper we propose an augmented causal
network and a TMS model called ACN-based ATMS.
We have extended the ACN-based ATMS to incorporate
probabilities for probabilistic abductive reasoning, which
will be presented in a forthcoming paper.
The remainder of this paper is organised as follows.
Firstly, in Section 2, a brief overview of the ATMS is
given. Secondly, in Section 3, a causal network which
we call Augmented Causal Network (ACN) is presented.
The ACN-based ATMS is described in Section 4. Finally,
Section 5 concludes this paper.
2 An overview of the ATMS
The ATMS is an inference engine for abductive reasoning
and belief revision, which is constrained to process Horn
clauses, and requires that the set of propositions of the
domain knowledge have a distinguished subset called as-
sumptions. Once an abduction is needed, the problem
solver transmits to the ATMS a set of Horn clauses, a
set of assumptions and a query clause, and requires the
ATMS to find the abductive explanations for the query
clause. A brief description of the ATMS is given as fol-
lows.
2.1 Informal Definitions
Let Θ be a set whose elements are propositional literals.
A propositional logic language with Θ is the collection
of formulae determined by one of the following cases:
(a) all propositional literals are formulae; (b) if α is a
formula, so is ¬α, where ¬ refers to negation; (c) if α and
β are formulae, so are α ∧ β, α ∨ β, α → β, and α ≡ β,
where ∧, ∨, →, and ≡ refer to and, or, implication, and
equivalence, respectively. A clause is a finite disjunction
α₁ ∨ ... ∨ αₘ of literals. Each literal is either a
positive literal or a negative literal. A Horn clause is a
clause with at most one positive literal. The following is
a Horn clause:

¬α₁ ∨ ... ∨ ¬αₘ ∨ α
0-7803-5995-X/00/$10.00 ©2000 IEEE.
where α is called the consequent of the clause, and α₁, ...,
αₘ the antecedents of the clause. The Horn clause is
equivalent to the material implication α₁ ∧ ... ∧ αₘ → α,
and the Horn clause ¬α₁ ∨ ... ∨ ¬αₘ is equivalent to the
material implication α₁ ∧ ... ∧ αₘ → ⊥, where ⊥ refers
to false or contradiction. In the ATMS, a Horn clause is
called by de Kleer a justification, which is denoted
as α₁, ..., αₘ ⇒ α [2]. A set of Horn clauses (or a set of
justifications) is communicated by the problem solver to
the ATMS.
One distinct feature of the ATMS is that it designates
a subset of the propositional literals (propositions) to
be assumptions - literals which are presumed to be
true unless there is evidence to the contrary. The as-
sumptions serve as the foundational beliefs. During its
reasoning, the ATMS maintains for each proposition a
chain of justifications to ensure that each proposition
is justified by itself when it is a foundational belief, or
justified by other propositions which are in turn justi-
fied by others (thus forming a chain of justifications). A
chain of justifications starts from a subset of assump-
tions. Thus the ATMS maintains for each proposition
a subset of assumptions which justifies the proposition
through a chain of justifications starting from the sub-
set of assumptions. A subset of assumptions is called an
ATMS environment or environment. A proposition p
is said to hold in environment E if p can be derived from
the union of E and the clause set, and the environment
E is called an environment of p. Because p can be de-
rived from the environment, the environment is treated
as an explanation to explain p. An environment is in-
consistent (called nogood by de Kleer) if ⊥ holds in it.
Each proposition may have more than one environ-
ment in which the proposition can be justified or derived.
Therefore, for a given query p, to explain p the ATMS
will find all the environments of p. That is, the ATMS
maintains for each proposition a set of environments
rather than necessarily only one environment. The set
of environments in which the proposition holds is called
the label of the proposition. Each element in the label
is an explanation of the proposition. Let C represent the
set of clauses which is provided by the problem solver, A
represent the set of assumptions, nogood refer to the in-
consistent environments, and label(p) = {E₁, ..., Eₙ} be the
label of proposition p, n > 0. Then the label label(p) has
the following properties, where ⊨ refers to entailment and
Eᵢ ∈ label(p).

1. Justifiability: Each environment Eᵢ is justified
by itself, i.e., Eᵢ ⊆ A.

2. Consistence: Eᵢ is not nogood, i.e., ∄N ∈ nogood
such that N ⊆ Eᵢ.

3. Minimality: No Eᵢ is a proper subset of any other
Eⱼ, i.e., ∄Eⱼ ∈ label(p) such that Eⱼ ⊂ Eᵢ.

4. Soundness: p holds in each Eᵢ, i.e., C ∪ Eᵢ ⊨ p.

5. Completeness: Every environment E in which p
holds is a superset of some Eᵢ.
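The soundness property can be illustrated with a small Python sketch. This is our own illustration, not part of the paper: the `holds` function and the tiny two-clause knowledge base are invented for the example, and it simply forward-chains Horn clauses from an environment.

```python
def holds(clauses, environment, p):
    """Return True if p is derivable from environment plus the Horn
    clauses, where each clause is a pair (antecedents, consequent)."""
    derived = set(environment)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in clauses:
            if consequent not in derived and set(antecedents) <= derived:
                derived.add(consequent)
                changed = True
    return p in derived

# An invented knowledge base: a, b => c and c => d.
clauses = [(("a", "b"), "c"), (("c",), "d")]
assert holds(clauses, {"a", "b"}, "d")   # {a, b} is a sound environment for d
assert not holds(clauses, {"a"}, "d")    # {a} alone is not
```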
2.2 Basic algorithms
The basic data structure of the ATMS is an ATMS node.
The ATMS creates a node for each proposition supplied
by the problem solver, and associates with the node a
label which stores the justificational information for the
proposition. The label represents the set of environ-
ments in which the proposition is derived or believed.
An ATMS node consists of three fields: Datum, Label,
and Justification. Datum is the proposition represented
by the node, Label is the label of the proposition, and
Justification is the set of justifications whose consequent
is the proposition being represented. Thus, the ATMS
node of proposition p, denoted as γp, takes the form:

γp : < p, label(p), justification(p) >

The ATMS starts its reasoning with the problem
solver supplying the ATMS with a set of justifications
C. For each justification (An, p) ∈ C, the ATMS invokes
algorithm label-update(p) to process the justification,
adding the justification into the node γp and updating
the label of the node γp. Before label-update(p) is
called, it is supposed that the labels of each antecedent
node as well as the consequent node of the justification
have already been created. Suppose there are m jus-
tifications in C which take p as the consequent. That
is, assume that a_{k1}, ..., a_{kn_k} ⇒ p, k = 1, ..., m, are
the justifications, and label(a_{ki}) and label(p) are the current
labels of each antecedent a_{ki} and the consequent p, re-
spectively. The algorithm label-update(p), which computes
the new label of p based on label(a_{ki}) and label(p), is de-
scribed as follows.
ALGORITHM label-update(p)
1. begin
2.   Lp := {}
3.   for each a_{k1}, ..., a_{kn_k} ⇒ p do
       Lp := Lp ∪ {x | x = x₁ ∪ ... ∪ x_{n_k}, xᵢ ∈ label(a_{ki})}
4.   endfor
5.   for each environment e ∈ Lp do
6.     if e subsumes any nogood environment in
       nogood then delete e from Lp.
7.     if e subsumes any environment in label(p)
       then delete e from Lp.
8.   endfor
9.   if p is ⊥ then nogood := nogood ∪ Lp; return.
10.  if Lp ≠ label(p) then
11.    label(p) := Lp;
       for each justification (An, q) ∈ C such that
       p ∈ An, label-update(q).
12. end.
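The algorithm admits a compact Python rendering. The sketch below is our own reading of the pseudocode, not the ATMS implementation: environments are frozensets, labels a dict, `"FALSE"` stands in for ⊥, and it merges freshly combined environments with the existing label before minimising, which is one common reading of steps 5-8.

```python
from itertools import product

def label_update(p, justs, labels, nogood):
    """Recompute label(p) from the labels of the antecedents of each
    justification for p, then propagate to consequents that mention p.
    justs: list of (antecedents, consequent) pairs;
    labels: dict node -> set of frozenset environments;
    nogood: set of frozenset inconsistent environments."""
    # Step 3: combine one environment from each antecedent of each justification.
    new = set(labels[p])
    for ants, q in justs:
        if q == p:
            for combo in product(*(labels[a] for a in ants)):
                new.add(frozenset().union(*combo))
    # Steps 5-8: drop inconsistent (superset of a nogood) and non-minimal environments.
    new = {e for e in new if not any(n <= e for n in nogood)}
    new = {e for e in new if not any(f < e for f in new)}
    # Step 9: the contradiction node grows the nogood set instead of a label.
    if p == "FALSE":
        nogood |= new
        return
    # Steps 10-11: on change, store the label and propagate to dependents.
    if new != labels[p]:
        labels[p] = new
        for ants, q in justs:
            if p in ants:
                label_update(q, justs, labels, nogood)

# Invented example: a => c, b => c, c => d, with assumptions a and b.
justs = [(("a",), "c"), (("b",), "c"), (("c",), "d")]
labels = {"a": {frozenset(["a"])}, "b": {frozenset(["b"])},
          "c": set(), "d": set()}
nogood = set()
label_update("c", justs, labels, nogood)
assert labels["d"] == {frozenset(["a"]), frozenset(["b"])}
```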
3 Augmented Causal Networks
Two kinds of knowledge, structural and probabilistic
knowledge, are used by most abductive problem solv-
ing systems. Structural knowledge specifies the enti-
ties which are involved in an application domain, and
also specifies the relationships amongst the entities such
as causal relationship, implication relationship, or sub-
sumption relationship. The structural knowledge is rep-
resented symbolically such as the formulae in proposi-
tional logic. Probabilistic knowledge specifies the plau-
sibility of the entities to occur and the uncertainty of
the associations amongst the entities. The probabilistic
knowledge is usually represented numerically. The prob-
abilistic causal network proposed by Peng & Reggia [4]
represents the two aspects of abductive knowledge with
the nodes in the network representing the entities, the
links in the network representing the causal relationships
amongst the entities, and the probabilities associated
with the links representing the strength of the causal re-
lationships. In this section, we present a probabilistic
causal network to represent abductive knowledge. The
causal network proposed here is different from the one
proposed by Peng & Reggia both in the network struc-
ture and the assumptions underlying the network. We
call the causal network the Augmented Causal Net-
work for distinction.
3.1 Basic Definitions
An augmented causal network consists of a directed graph
and a conditional probability distribution associated with
the graph. The definition is given as follows.
Definition 3.1 (Augmented Causal Network):
An augmented causal network is a four-tuple < H, L, P,
TYPE >, which is denoted as Φ(H, L, P, TYPE), where:

• H is a finite set of nodes, H = {h₁, ..., hₙ};

• L ⊆ H × H is the set of links or arcs; each el-
ement in L is a pair of nodes < hᵢ, hⱼ >
which indicates an arc from hᵢ to hⱼ, i.e., hᵢ → hⱼ,
hᵢ, hⱼ ∈ H;

• P is a set of probabilities. For each node hᵢ ∈
H, P(hᵢ) refers to the prior probability of hᵢ. For
each link < hᵢ, hⱼ > in L, there is a conditional
probability P(hⱼ/hᵢ) accompanying the link, which
is called the strength from hᵢ to hⱼ, representing
how strongly hᵢ causes or implies hⱼ. If there is no
link between hᵢ and hⱼ, P(hⱼ/hᵢ) is assumed to be
zero.

• TYPE is a map from H to {OR, AND, TRUE,
NULL}. For each node hᵢ ∈ H, TYPE(hᵢ) =
AND, TYPE(hᵢ) = OR, TYPE(hᵢ) = TRUE,
or TYPE(hᵢ) = NULL, which is called the type
of hᵢ.
The relationships represented by the links in the ACNs
are not limited to causal relationships, even though the
network is called a "causal network". For convenience,
we define two specific sets called the cause set and the effect set.
∀h ∈ H, the cause set of h is defined as causes(h) =
{hᵢ | < hᵢ, h > ∈ L}, and the effect set is defined as
effects(h) = {hᵢ | < h, hᵢ > ∈ L}. ∀h ∈ H, if causes(h) ≠
{}, among the nodes in causes(h) there are two pos-
sible logical relationships, i.e., the disjunctive relationship
and the conjunctive relationship. The disjunctive relation-
ship has received much attention and is considered
by almost all causal networks, such as the causal
network used by Peng & Reggia. In ACNs, both the
disjunctive relationship and the conjunctive relationship
are considered. In order to distinguish the two different
relationships, we use TYPE to denote the type of each
node. ∀hᵢ ∈ H, suppose causes(hᵢ) = {h_{i1}, ..., h_{ir}}.
TYPE(hᵢ) = OR indicates that the relation among the
nodes in causes(hᵢ) is disjunctive, i.e., h_{i1} ∨ ... ∨ h_{ir} → hᵢ
and r ≥ 1. TYPE(hᵢ) = AND indicates that the
relation is conjunctive, i.e., h_{i1} ∧ ... ∧ h_{ir} → hᵢ and
r ≥ 1. TYPE(hᵢ) = TRUE indicates that hᵢ is a
premise. TYPE(hᵢ) = NULL iff r = 0.
In the next section, we will show that a propositional
logic language with the formulae constrained to Horn
clauses can be represented by an augmented causal net-
work.
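As a concrete illustration, an ACN can be represented directly with Python containers. The class below is our own sketch under the definitions above (all field and variable names are invented), not part of the paper:

```python
class ACN:
    """A sketch of an Augmented Causal Network Φ(H, L, P, TYPE).
    Links are stored as <cause, effect> pairs; TYPE maps each node
    to 'OR', 'AND', 'TRUE' (premise) or 'NULL' (no causes)."""
    def __init__(self, nodes, links, priors, strengths, types):
        self.H = set(nodes)
        self.L = set(links)              # set of (h_i, h_j) arcs
        self.P = dict(priors)            # node -> prior probability P(h_i)
        self.strength = dict(strengths)  # (h_i, h_j) -> P(h_j / h_i)
        self.TYPE = dict(types)

    def causes(self, h):
        """causes(h) = {h_i | <h_i, h> in L}"""
        return {hi for (hi, hj) in self.L if hj == h}

    def effects(self, h):
        """effects(h) = {h_j | <h, h_j> in L}"""
        return {hj for (hi, hj) in self.L if hi == h}

# An invented two-node example: a causes b disjunctively.
net = ACN(nodes={"a", "b"}, links={("a", "b")},
          priors={"a": 0.2, "b": 0.0},
          strengths={("a", "b"): 0.9},
          types={"a": "NULL", "b": "OR"})
assert net.causes("b") == {"a"} and net.effects("a") == {"b"}
```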
3.2 Conversion algorithms
As we have introduced, the ATMS performs its ab-
ductive reasoning based on a set of Horn clauses of a
propositional logic language, i.e., the abductive knowl-
edge is represented by the set of Horn clauses. The algo-
rithm conversion, which is described below, can convert
a set of Horn clauses into an augmented causal network,
i.e., knowledge which is represented by a set of Horn
clauses can be represented by an augmented causal net-
work. Let C be a set of justifications and Θ be the
set of propositional literals which are involved in C. Algo-
rithm conversion(C, Φ(H, L, P, TYPE)) describes the
method to convert the justifications in C into an aug-
mented causal network Φ(H, L, P, TYPE).
ALGORITHM conversion(C, Φ(H, L, P, TYPE))
1. begin
2.   H := {T}, L := {}, P := {P(x) | x ∈ Θ}, i := 0.
3.   TYPE(T) := TRUE.
4.   for each a ∈ Θ do TYPE(a) := NULL endfor.
5.   for each justification J ∈ C with J = (⇒ a) do
6.     H := H ∪ {a};
7.     L := L ∪ {< T, a >};
8.     P := P ∪ {P(a/T)};
9.     TYPE(a) := TRUE.
10.  endfor
11.  for each justification J ∈ C with J = (a₁, ..., aₘ ⇒ a), m ≥ 1 do
12.    i := i + 1;
13.    H := H ∪ {a₁, ..., aₘ, a, hᵢ};
14.    for j := 1, ..., m do
15.      L := L ∪ {< aⱼ, hᵢ >};
16.    endfor
17.    L := L ∪ {< hᵢ, a >};
18.    P := P ∪ {P(a/hᵢ)};
19.    TYPE(hᵢ) := AND;
20.    TYPE(a) := OR;
21.  endfor.
22. end.
After the conversion, an augmented causal network
Φ(H, L, P, TYPE) is created.
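A Python sketch of this conversion follows; it is our own reading of the algorithm, with intermediate AND-node names invented and the link probabilities left out, since the paper defers the probability calculus to other work:

```python
def conversion(justifications, literals):
    """Convert Horn-clause justifications (antecedents, consequent)
    into an ACN skeleton (H, L, TYPE). Premises (empty antecedents)
    hang off the root node T; each rule a1,...,am => a goes through
    an intermediate AND node, and its consequent becomes an OR node."""
    H, L, TYPE = {"T"}, set(), {a: "NULL" for a in literals}
    TYPE["T"] = "TRUE"
    i = 0
    for ants, a in justifications:
        H.add(a)
        if not ants:                    # premise: => a
            L.add(("T", a))
            TYPE[a] = "TRUE"
        else:                           # a1,...,am => a via an AND node
            i += 1
            and_node = f"and{i}"        # invented naming scheme
            H.update(ants)
            H.add(and_node)
            TYPE[and_node] = "AND"
            for aj in ants:
                L.add((aj, and_node))
            L.add((and_node, a))
            TYPE[a] = "OR"
    return H, L, TYPE

H, L, TYPE = conversion([((), "a"), (("a", "b"), "c")], {"a", "b", "c"})
assert ("T", "a") in L and TYPE["c"] == "OR" and TYPE["and1"] == "AND"
```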
4 ACN-based ATMS
In this section, we first give the definitions of the
ACN-based ATMS and describe the algorithms to com-
pute the explanations for an abductive problem; then we
prove that the explanations computed by the ACN-based
ATMS satisfy the requirements of the ATMS environ-
ments. In this section, the probability calculation of the
ACN-based ATMS is ignored temporarily, since the aim
of this section is to study the correctness of the ACN-
based ATMS, not the probability calculations. We will
describe the probability calculation in other papers due
to the space limitation.
4.1 Definitions and Algorithms
Definition 4.1 (Function Γ) Let Φ(H, L, P, TYPE) be
an ACN. Function Γ is a map from subsets of H to other
subsets, i.e., Γ : 2^H → 2^H. Suppose X ∈ 2^H; Γ(X) is
defined as follows.

Γ(X) = X ∪ {h | h ∈ H ∧ causes(h) ≠ {} ∧ ((causes(h) ⊆
X ∧ TYPE(h) = AND) ∨ ((causes(h) ∩ X) ≠ {} ∧
TYPE(h) = OR) ∨ ({T} ⊆ causes(h)))}
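Function Γ admits a direct Python rendering. The sketch below is our own (the network is given as a causes map and a TYPE map, both invented representations); it also iterates Γ to a fixpoint, which is how the closure Cl(X) defined next is obtained:

```python
def gamma(X, causes, TYPE):
    """One application of Γ: add every node whose causes are
    satisfied by X (all causes for AND nodes, at least one cause
    for OR nodes), or which is caused by the premise node T."""
    out = set(X)
    for h, cs in causes.items():
        if not cs:
            continue
        if TYPE[h] == "AND" and cs <= X:
            out.add(h)
        elif TYPE[h] == "OR" and cs & X:
            out.add(h)
        elif "T" in cs:
            out.add(h)
    return out

def closure(X, causes, TYPE):
    """Iterate Γ until a fixpoint is reached: Cl(X)."""
    cur = set(X)
    while True:
        nxt = gamma(cur, causes, TYPE)
        if nxt == cur:
            return cur
        cur = nxt

# Invented example: c is an AND node caused by {a, b}.
causes = {"a": set(), "b": set(), "c": {"a", "b"}}
TYPE = {"a": "NULL", "b": "NULL", "c": "AND"}
assert closure({"a", "b"}, causes, TYPE) == {"a", "b", "c"}
assert closure({"a"}, causes, TYPE) == {"a"}
```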
Definition 4.2 (Closure) Let Φ(H, L, P, TYPE) be an
ACN, and X ⊆ H. The closure of X, denoted as Cl(X),
is defined as follows.

(1). Cl⁰(X) = X
(2). Cl^{i+1}(X) = Γ(Clⁱ(X))
(3). Cl(X) = Cl^k(X) if Cl^{k+1}(X) = Cl^k(X)

Definition 4.3 (Contradiction) Let Φ(H, L, P, TYPE)
be an ACN, and X ⊆ H. If ∃h ∈ H such that both
h ∈ Cl(X) and ¬h ∈ Cl(X), then X is called a
contradiction.

Definition 4.4 (ACN-based ATMS theory) An ACN-
based ATMS theory is a triple < Φ(H, L, P, TYPE),
S, contra >, where

• Φ(H, L, P, TYPE) is an augmented causal network
with H being the node set, L being the link set, P
being the probability set, and TYPE being the type
map.

• S is a non-empty subset of H. Elements of S are
called assumptions.

• contra is a subset of 2^S; each element of contra is
a contradiction.

As the ATMS theory, an ACN-based ATMS theory
also can be used to represent an abductive problem. H
in the ACN-based ATMS corresponds to D, and L, P
and TYPE together correspond to K in an abductive
problem defined by Definition 2.2.1.

Definition 4.5 (ACN-based ATMS Cover) Let
< Φ(H, L, P, TYPE), S, contra > be an ACN-based ATMS
theory, and C ⊆ H. C is said to be a cover if C satisfies
the following conditions.

• C ⊆ S.
• ∀N ∈ contra, N ⊄ C.

The closure Cl(X) is the set of nodes which can be
caused or derived by X. ∀h ∈ H, if h ∈ Cl(X) and
X is a cover, then X is a cover of h and h is said to
hold in X, denoted as X ∪ Φ(H, L, P, TYPE) ⊢ h,
or simply X ⊢ h. Generally, h may have more than
one cover in which it holds. The task of the ACN-
based ATMS is to find and maintain a set of covers for
each node in the ACN. The set of covers of node h is
called the candidate of h, denoted as candidate(h). Sup-
pose candidate(h) = {C₁, ..., Cₙ}; like the label of the
ATMS, the candidate of h has the following properties,
where Cᵢ ∈ candidate(h).

1. Justifiability: Each cover Cᵢ is justified by itself,
i.e., Cᵢ ⊆ S.

2. Consistence: Cᵢ is not a contradiction, i.e., ∄N ∈
contra so that N ⊆ Cᵢ.

3. Minimality: No Cᵢ is a proper subset of any other
Cⱼ, i.e., ∄Cⱼ ∈ candidate(h) so that Cⱼ ⊂ Cᵢ.

4. Soundness: h holds in each Cᵢ, i.e., Cᵢ ⊢ h.

5. Completeness: Every cover C in which h holds
is a superset of some Cᵢ.

For each h in H, the ACN-based ATMS creates a
data structure node to store the candidate of h, denoted
as DSN(h). The data structure node has three fields,
i.e., DSN(h) = < h, cand, status >, where cand refers to
the candidate of h, i.e., DSN(h).cand = candidate(h),
and status indicates the type of the data structure node,
which has four values, ASSU, DERI, FACT and DEAD,
referring to assumption node, derived node, fact node
and contradiction node, respectively. The following algo-
rithms reconstruct the reasoning of the ATMS using the
ACN-based ATMS. Algorithm ACN-ATMS(h) creates
the node DSN(h) if it doesn't exist, then computes the
candidate of h using Algorithm candidate-update(h)
after all the DSN(c) of each cause c ∈ causes(h) have
been created. The same as the ATMS, combined with
these algorithms, the ACN-based ATMS theory becomes
an abductive model. In a forthcoming paper, we will present
a probability calculus of the ACN-based ATMS abduc-
tive model, which corresponds to pl in Definition 2.2.2 of
an abductive model. The probability calculation of the
ACN-based ATMS is a difference from the ATMS.
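The recursive node-creation and candidate-propagation scheme just described can be illustrated with a Python sketch. This is our own rendering under the definitions above, not the paper's implementation: the network is a dictionary of invented fields, and the ⊥ (DEAD) branch is omitted for brevity.

```python
from itertools import product

def acn_atms(h, net, DSN):
    """Create DSN(h), recursively create the DSN of each cause of h,
    then compute candidate(h). net holds S, causes, effects, TYPE, contra."""
    if h in DSN:
        return
    if h in net["S"]:
        DSN[h] = {"cand": {frozenset([h])}, "status": "ASSU"}
    elif net["causes"][h] == {"T"}:
        DSN[h] = {"cand": {frozenset()}, "status": "FACT"}
    else:
        DSN[h] = {"cand": set(), "status": "DERI"}
    if net["causes"][h] and net["causes"][h] != {"T"}:
        for c in net["causes"][h]:
            acn_atms(c, net, DSN)
        candidate_update(h, net, DSN)

def candidate_update(h, net, DSN):
    """Recompute candidate(h) from its causes, then propagate the
    change to the effect nodes of h whose causes all have a DSN."""
    cs = sorted(net["causes"][h])
    if net["TYPE"][h] == "AND":      # one cover from each cause, unioned
        new = {frozenset().union(*combo)
               for combo in product(*(DSN[c]["cand"] for c in cs))}
    elif net["TYPE"][h] == "OR":     # any cover of any cause
        new = set().union(*(DSN[c]["cand"] for c in cs))
    else:                            # NULL or TRUE: keep the stored candidate
        new = set(DSN[h]["cand"])
    if DSN[h]["status"] == "ASSU":
        new |= {frozenset([h])}
    # Drop covers that contain a contradiction or are non-minimal.
    new = {c for c in new if not any(n <= c for n in net["contra"])}
    new = {c for c in new if not any(d < c for d in new)}
    if new != DSN[h]["cand"]:
        DSN[h]["cand"] = new
        for e in net["effects"][h]:
            if all(c in DSN for c in net["causes"][e]):
                candidate_update(e, net, DSN)

# Invented example: assumptions a and b jointly cause c via AND node and1.
net = {"S": {"a", "b"},
       "causes": {"a": set(), "b": set(), "and1": {"a", "b"}, "c": {"and1"}},
       "effects": {"a": {"and1"}, "b": {"and1"}, "and1": {"c"}, "c": set()},
       "TYPE": {"a": "NULL", "b": "NULL", "and1": "AND", "c": "OR"},
       "contra": set()}
DSN = {}
acn_atms("c", net, DSN)
assert DSN["c"]["cand"] == {frozenset({"a", "b"})}
```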
ALGORITHM ACN-ATMS(h)
1. begin
2.   if DSN(h) exists then return.
3.   if h ∈ S then DSN(h) := < h, {{h}}, ASSU >.
4.   if h ∉ S then
5.     if h = T then DSN(h) := < T, {{}}, FACT >, return.
6.     if causes(h) = {T} then DSN(h) := < h, {{}}, FACT >.
7.     if h = ⊥ then DSN(⊥) := < ⊥, {}, DEAD >.
8.     else DSN(h) := < h, {}, DERI >.
9.   if causes(h) ≠ {} then
10.    for each c ∈ causes(h) do
11.      ACN-ATMS(c);
12.    endfor
13.    candidate-update(h).
14. end.

ALGORITHM candidate-update(h)
1. Let causes(h) = {c₁, ..., cₘ}.
2. begin
3.   if TYPE(h) = NULL or TYPE(h) = TRUE
4.     then cand_new := DSN(h).cand.
5.   if TYPE(h) = AND then
6.     cand_new := {x | x = x₁ ∪ ... ∪ xₘ, xᵢ ∈ DSN(cᵢ).cand}.
7.   if TYPE(h) = OR then
8.     cand_new := DSN(c₁).cand ∪ ... ∪ DSN(cₘ).cand.
9.   if DSN(h).status = ASSU then
10.    cand_new := cand_new ∪ {{h}}.
11.  for each cover c ∈ cand_new do
12.    if ∃N ∈ contra such that N ⊆ c then cand_new := cand_new - {c}.
13.    if ∃c' ∈ cand_new such that c' ⊂ c then cand_new := cand_new - {c}.
14.  endfor
15.  if cand_new ≠ DSN(h).cand then
       DSN(h).cand := cand_new;
16.    if effects(h) ≠ {} then
17.      for each e ∈ effects(h) do
18.        if for all cᵢ ∈ causes(e), DSN(cᵢ) exists
19.          then candidate-update(e).
20.      endfor
21. end.

Both algorithms are recursive. Before computing the
candidate of h using Algorithm candidate-update(h),
Algorithm ACN-ATMS(h) creates the data structure
nodes for all the causes of h recursively. After computing
the candidate of h, candidate-update(h) will propagate
the updated candidate to the effect nodes of h recursively.
In the next subsection, we will prove that the candidate
computed by the ACN-based ATMS has the properties
which are required by the ATMS label.

5 Summary

In this paper, we presented a causal network called Aug-
mented Causal Network, and proposed a TMS model
called ACN-based ATMS. The main difference between
the ACN-based ATMS and the ATMS is the knowledge
representation. An efficient conversion algorithm was
given to convert justifications into an ACN. That means,
the problems which can be solved by the ATMS can be
solved by the ACN-based ATMS. A very important con-
cern behind the work of this paper is to incorporate prob-
ability calculations into abductive reasoning. The graph-
ical knowledge representation lets the ACN-based ATMS
easily incorporate the probability calculations into its
reasoning. We will present our work on this concern in
a forthcoming paper.

References

[1] J. de Kleer. An assumption-based TMS. Artificial
Intelligence, 28:127-162, 1986.

[2] J. de Kleer. A general labeling algorithm for
assumption-based truth maintenance. In Proc. of
AAAI-88, pages 188-192, 1988.

[3] J. Doyle. A truth maintenance system. Artificial
Intelligence, 12:231-272, 1979.

[4] Y. Peng and J. Reggia. Abductive Inference Mod-
els for Diagnostic Problem-Solving. Springer-Verlag
New York Inc., 1990.