Computing Science
Argumentation in Artificial Intelligence:
20 Years after Dung’s Work
Federico Cerutti
Department of Computing Science July 2015
University of Aberdeen
King’s College
Aberdeen AB24 3UE
Copyright © 2015, The University of Aberdeen
Abstract: Handouts for the IJCAI 2015 tutorial on Argumentation. This document is a collection of technical definitions as well as examples of various topics addressed in the tutorial. It is not supposed to be an exhaustive compendium of twenty years of research in argumentation theory.
This material is derived from a variety of publications from many researchers who hold the copyright and any other intellectual property of their work. Original publications are thoroughly cited and reported in the bibliography at the end of the document. Errors and misunderstandings rest with the author of this tutorial: please send an email to federico.cerutti@acm.org to report any.
Keywords: argumentation; tutorial; IJCAI2015
University of Aberdeen, 2015 Page 1
1 Dung’s Argumentation
Framework
Acknowledgement
This handout includes material from a number of collaborators, including Pietro Baroni, Massimiliano Giacomin, and Stefan Woltran.
Definition 1 ([Dun95]). A Dung argumentation framework AF is a pair 〈A ,→〉 where A is a set of arguments, and → is a binary relation on A , i.e. → ⊆ A × A . ♠
An argumentation framework has an obvious representation as a directed graph where the nodes are arguments and the edges are drawn from attacking to attacked arguments.
The set of attackers of an argument a1 will be denoted as a1⁻ ≜ {a2 : a2 → a1}; the set of arguments attacked by a1 will be denoted as a1⁺ ≜ {a2 : a1 → a2}. We also extend these notations to sets of arguments, i.e. given E ⊆ A , E⁻ ≜ {a2 | ∃a1 ∈ E, a2 → a1} and E⁺ ≜ {a2 | ∃a1 ∈ E, a1 → a2}. With a slight abuse of notation we define S → b ≡ ∃a ∈ S : a → b; similarly, b → S ≡ ∃a ∈ S : b → a.
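Definition 1 and the attack notation above can be sketched directly in Python; the three-argument framework used below (a attacks b, b attacks c) is an invented example, not from the source.

```python
# A minimal sketch of a Dung argumentation framework <A, ->> with the
# attacker/attacked notation; the example framework is invented.

class AF:
    """A framework as a set of arguments plus a set of attack pairs."""
    def __init__(self, arguments, attacks):
        self.A = set(arguments)
        self.att = set(attacks)  # subset of A x A

    def attackers(self, a):
        """a^- : the set of arguments attacking a."""
        return {x for (x, y) in self.att if y == a}

    def attacked_by(self, a):
        """a^+ : the set of arguments attacked by a."""
        return {y for (x, y) in self.att if x == a}

    def set_attacks(self, S, b):
        """S -> b iff some member of S attacks b."""
        return any((a, b) in self.att for a in S)

af = AF({'a', 'b', 'c'}, {('a', 'b'), ('b', 'c')})
print(af.attackers('b'))           # {'a'}
print(af.attacked_by('b'))         # {'c'}
print(af.set_attacks({'a'}, 'b'))  # True
```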
1.1 Principles for Extension-based Semantics [BG07]
Definition 2. Given an argumentation framework AF = 〈A ,→〉, a set S ⊆ A is D-conflict-free, denoted as D-cf(S), if and only if ∄a,b ∈ S such that a → b. A semantics σ satisfies the D-conflict-free principle if and only if ∀AF, ∀E ∈ Eσ(AF), E is D-conflict-free. ♠
Definition 3. Given an argumentation framework AF = 〈A ,→〉, an argument a ∈ A is D-acceptable w.r.t. a set S ⊆ A if and only if ∀b ∈ A : b → a ⇒ S → b.
The function FAF : 2^A → 2^A which, given a set S ⊆ A , returns the set of the D-acceptable arguments w.r.t. S, is called the D-characteristic function of AF. ♠
Definition 4. Given an argumentation framework AF = 〈A ,→〉, a set S ⊆ A is D-admissible (S ∈ AS(AF)) if and only if D-cf(S) and ∀a ∈ S, a is D-acceptable w.r.t. S. The set of all the D-admissible sets of AF is denoted as AS(AF). ♠
Dσ = {AF | Eσ(AF) ≠ ∅}
Definition 5. A semantics σ satisfies the D-admissibility principle if and only if ∀AF ∈ Dσ, Eσ(AF) ⊆ AS(AF), namely ∀E ∈ Eσ(AF) it holds that: a ∈ E ⇒ (∀b ∈ A , b → a ⇒ E → b). ♠
Definition 6. Given an argumentation framework AF = 〈A ,→〉, a ∈ A and S ⊆ A , we say that a is D-strongly-defended by S (denoted as D-sd(a,S)) iff ∀b ∈ A : b → a, ∃c ∈ S \ {a} : c → b and D-sd(c, S \ {a}). ♠
Definition 7. A semantics σ satisfies the D-strong admissibility principle if and only if ∀AF ∈ Dσ, ∀E ∈ Eσ(AF) it holds that: a ∈ E ⇒ D-sd(a,E). ♠
Definition 8. A semantics σ satisfies the D-reinstatement principle if and only if ∀AF ∈ Dσ, ∀E ∈ Eσ(AF) it holds that: ∀a ∈ A , (∀b ∈ A , b → a ⇒ E → b) ⇒ a ∈ E. ♠
Definition 9. A set of extensions E is D-I-maximal if and only if ∀E1,E2 ∈ E , if E1 ⊆ E2 then E1 = E2. A semantics σ satisfies the D-I-maximality principle if and only if ∀AF ∈ Dσ, Eσ(AF) is D-I-maximal. ♠
Definition 10. Given an argumentation framework AF = 〈A ,→〉, a non-empty set S ⊆ A is D-unattacked if and only if ∄a ∈ (A \ S) : a → S. The set of D-unattacked sets of AF is denoted as US(AF). ♠
Definition 11. Let AF = 〈A ,→〉 be an argumentation framework. The restriction of AF to S ⊆ A is the argumentation framework AF↓S = 〈S, → ∩ (S × S)〉. ♠
Definition 12. A semantics σ satisfies the D-directionality principle if and only if ∀AF = 〈A ,→〉, ∀S ∈ US(AF), AEσ(AF,S) = Eσ(AF↓S), where AEσ(AF,S) ≜ {(E ∩ S) | E ∈ Eσ(AF)} ⊆ 2^S. ♠
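The restriction operation of Definition 11 is straightforward to sketch; the three-cycle framework below is an invented example.

```python
# A sketch of the restriction AF restricted to S (Definition 11), using plain
# (arguments, attacks) pairs; the example framework is invented.

def restrict(arguments, attacks, S):
    """Return the framework AF restricted to the arguments in S."""
    S = set(S) & set(arguments)
    return S, {(a, b) for (a, b) in attacks if a in S and b in S}

A = {'a', 'b', 'c'}
R = {('a', 'b'), ('b', 'c'), ('c', 'a')}
args, att = restrict(A, R, {'a', 'b'})
print(args)  # {'a', 'b'}
print(att)   # {('a', 'b')}
```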
1.2 Acceptability of Arguments [PV02; BG09a]
Definition 13. Given a semantics σ and an argumentation framework AF = 〈A ,→〉 ∈ Dσ, an argument a ∈ A is:
• skeptically justified iff ∀E ∈ Eσ(AF), a ∈ E;
• credulously justified iff ∃E ∈ Eσ(AF), a ∈ E. ♠
Definition 14. Given a semantics σ and an argumentation framework AF = 〈A ,→〉 ∈ Dσ, an argument a ∈ A is:
• justified iff it is skeptically justified;
• defensible iff it is credulously justified but not skeptically justified;
• overruled iff it is not credulously justified. ♠
1.3 (Some) Semantics [Dun95]
Lemma 1 (Dung’s Fundamental Lemma, [Dun95, Lemma 10]). Given an
argumentation framework AF = 〈A ,→ 〉, let S ⊆ A be a D-admissible set
of arguments, and a,b be arguments which are acceptable with respect to
S. Then:10
1. S = S ∪{a} is D-admissible; and
2. b is D-acceptable with respect to S . ♣
Theorem 1 ([Dun95, Theorem 11]). Given an argumentation framework
AF = 〈A ,→ 〉, the set of all D-admissible sets of 〈A ,→ 〉 form a complete
partial order with respect to set inclusion. ♣15
Definition 15 (Complete Extension). Given an argumentation framework AF = 〈A ,→〉, S ⊆ A is a D-complete extension iff S is D-conflict-free and S = FAF(S). CO denotes the complete semantics. ♠
Definition 16 (Grounded Extension). Given an argumentation framework AF = 〈A ,→〉, the grounded extension of AF is the least complete extension of AF. GR denotes the grounded semantics. ♠
Definition 17 (Preferred Extension). Given an argumentation framework AF = 〈A ,→〉, a preferred extension of AF is a maximal (w.r.t. set inclusion) complete extension of AF. PR denotes the preferred semantics. ♠
Definition 18. Given an argumentation framework AF = 〈A ,→〉 and S ⊆ A , S⁺ ≜ {a ∈ A | ∃b ∈ S, b → a}. ♠
Definition 19 (Stable Extension). Given an argumentation framework AF = 〈A ,→〉, S ⊆ A is a stable extension of AF iff S is a preferred extension and S⁺ = A \ S. ST denotes the stable semantics. ♠
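The grounded extension (Definition 16) can be computed by iterating the D-characteristic function of Definition 3 from the empty set; the framework used below (a attacks b, b attacks c) is an invented example.

```python
# A sketch of the grounded extension as the least fixed point of the
# D-characteristic function F_AF; the example framework is invented.

def characteristic(arguments, attacks, S):
    """F_AF(S): the arguments whose every attacker is attacked by S."""
    def defended(a):
        return all(any((s, b) in attacks for s in S)
                   for (b, t) in attacks if t == a)
    return {a for a in arguments if defended(a)}

def grounded(arguments, attacks):
    """Iterate F_AF from the empty set up to the least fixed point."""
    S = set()
    while True:
        nxt = characteristic(arguments, attacks, S)
        if nxt == S:
            return S
        S = nxt

A = {'a', 'b', 'c'}
R = {('a', 'b'), ('b', 'c')}
print(grounded(A, R))  # {'a', 'c'}
```

Here F(∅) = {a} (the unattacked argument), then F({a}) = {a, c} since a defends c against b, and that set is a fixed point.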
University of Aberdeen, 2015 Page 4
Dung’s AF • Labelling-Based Semantics Representation
[Cam06]
                           CO   GR   PR   ST
D-conflict-free            Yes  Yes  Yes  Yes
D-admissibility            Yes  Yes  Yes  Yes
D-strong admissibility     No   Yes  No   No
D-reinstatement            Yes  Yes  Yes  Yes
D-I-maximality             No   Yes  Yes  Yes
D-directionality           Yes  Yes  Yes  No
Table 1.1: Satisfaction of general properties by argumentation semantics [BG07; BCG11]
Figure 1.1: Relationships among argumentation semantics: ST ⊆ PR ⊆ CO, and the grounded extension is one of the complete extensions.
1.4 Labelling-Based Semantics Representation
[Cam06]
Definition 20. Let ∆ = 〈A ,→〉 be an argumentation framework. A labelling Lab ∈ L(∆) is a complete labelling of ∆ iff it satisfies the following conditions for any a1 ∈ A :
• Lab(a1) = in ⇔ ∀a2 ∈ a1⁻, Lab(a2) = out;
• Lab(a1) = out ⇔ ∃a2 ∈ a1⁻ : Lab(a2) = in. ♠
The grounded and preferred labellings can then be defined on the basis of complete labellings.
Definition 21. Let ∆ = 〈A ,→〉 be an argumentation framework. A labelling Lab ∈ L(∆) is the grounded labelling of ∆ if it is the complete labelling of ∆ minimizing the set of arguments labelled in, and it is a preferred labelling of ∆ if it is a complete labelling of ∆ maximizing the set of arguments labelled in. ♠
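The two conditions of Definition 20 can be checked mechanically; the two-argument framework below is an invented example.

```python
# A sketch of the complete-labelling conditions: an argument is in iff all
# its attackers are out, and out iff some attacker is in. Example invented.

def is_complete_labelling(arguments, attacks, lab):
    """lab maps each argument to 'in', 'out' or 'undec'."""
    for a in arguments:
        attackers = {x for (x, y) in attacks if y == a}
        if (lab[a] == 'in') != all(lab[x] == 'out' for x in attackers):
            return False
        if (lab[a] == 'out') != any(lab[x] == 'in' for x in attackers):
            return False
    return True

A = {'a', 'b'}
R = {('a', 'b')}
print(is_complete_labelling(A, R, {'a': 'in', 'b': 'out'}))       # True
print(is_complete_labelling(A, R, {'a': 'undec', 'b': 'undec'}))  # False
```

The second labelling fails because a is unattacked, so the in condition forces Lab(a) = in.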
In order to show the connection between extensions and labellings, let us recall the definition of the function Ext2Lab, returning the labelling corresponding to a D-conflict-free set of arguments S.
Definition 22. Given an AF ∆ = 〈A ,→〉 and a D-conflict-free set S ⊆ A , the corresponding labelling Ext2Lab(S) is defined as Ext2Lab(S) ≡ Lab, where:
• Lab(a1) = in ⇔ a1 ∈ S;
• Lab(a1) = out ⇔ ∃a2 ∈ S s.t. a2 → a1;
          σ = CO      σ = GR      σ = PR   σ = ST
EXISTSσ   trivial     trivial     trivial  NP-c
CAσ       NP-c        polynomial  NP-c     NP-c
SAσ       polynomial  polynomial  Π₂ᵖ-c    coNP-c
VERσ      polynomial  polynomial  coNP-c   polynomial
NEσ       NP-c        polynomial  NP-c     NP-c
Table 1.2: Complexity of decision problems by argumentation semantics [DW09]
• Lab(a1) = undec ⇔ a1 ∉ S ∧ ∄a2 ∈ S s.t. a2 → a1. ♠
[Cam06] shows that there is a bijective correspondence between the complete, grounded, and preferred extensions and the complete, grounded, and preferred labellings, respectively.
Proposition 1. Given an AF ∆ = 〈A ,→〉, Lab is a complete (grounded, preferred) labelling of ∆ if and only if there is a complete (grounded, preferred) extension S of ∆ such that Lab = Ext2Lab(S). ♣
The set of complete labellings of ∆ is denoted as LCO(∆), the set of preferred labellings as LPR(∆), while LGR(∆) denotes the set including the grounded labelling.
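Ext2Lab from Definition 22 is easy to sketch; the framework used (a attacks b, b attacks c) is an invented example.

```python
# A sketch of Ext2Lab: from a conflict-free set S to its labelling.
# Members of S are in, arguments attacked by S are out, the rest undec.

def ext2lab(arguments, attacks, S):
    lab = {}
    for a in arguments:
        if a in S:
            lab[a] = 'in'
        elif any((s, a) in attacks for s in S):
            lab[a] = 'out'
        else:
            lab[a] = 'undec'
    return lab

A = {'a', 'b', 'c'}
R = {('a', 'b'), ('b', 'c')}
print(sorted(ext2lab(A, R, {'a', 'c'}).items()))
# [('a', 'in'), ('b', 'out'), ('c', 'in')]
```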
Figure 1.2: The ⪯ˢ⊕ relation for any argumentation framework (left) and for argumentation frameworks where stable extensions exist (right).
1.5 Skepticism Relationships [BG09b]
E1 ⪯ᴱ E2 denotes that E1 is at least as skeptical as E2.
Definition 23. Let ⪯ᴱ be a skepticism relation between sets of extensions. The skepticism relation between argumentation semantics ⪯ˢ is such that for any argumentation semantics σ1 and σ2, σ1 ⪯ˢ σ2 iff ∀AF ∈ Dσ1 ∩ Dσ2, EAF(σ1) ⪯ᴱ EAF(σ2). ♠
Definition 24. Given two sets of extensions E1 and E2 of an argumentation framework AF:
• E1 ⪯ᴱ∩+ E2 iff ∀E2 ∈ E2, ∃E1 ∈ E1 : E1 ⊆ E2;
• E1 ⪯ᴱ∪+ E2 iff ∀E1 ∈ E1, ∃E2 ∈ E2 : E1 ⊆ E2. ♠
Lemma 2. Given two argumentation semantics σ1 and σ2, if for any argumentation framework AF, EAF(σ1) ⊆ EAF(σ2), then σ1 ⪯ᴱ∩+ σ2 and σ1 ⪯ᴱ∪+ σ2 (i.e. σ1 ⪯ᴱ⊕ σ2). ♣
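The two relations of Definition 24 can be sketched as subset checks between collections of extensions; the grounded/preferred pair compared below is an invented example.

```python
# A sketch of the skepticism relations of Definition 24 between two
# sets of extensions, each extension a frozenset of arguments.

def leq_cap(E1, E2):
    """E1 is at least as skeptical (cap+): every E2-extension
    contains some E1-extension."""
    return all(any(e1 <= e2 for e1 in E1) for e2 in E2)

def leq_cup(E1, E2):
    """E1 is at least as skeptical (cup+): every E1-extension
    is contained in some E2-extension."""
    return all(any(e1 <= e2 for e2 in E2) for e1 in E1)

grounded_exts = [frozenset()]                          # e.g. empty grounded
preferred_exts = [frozenset({'a'}), frozenset({'b'})]  # two preferred

print(leq_cap(grounded_exts, preferred_exts))  # True
print(leq_cup(grounded_exts, preferred_exts))  # True
```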
1.6 Signatures [Dun+14]
Let 𝔄 be a countably infinite domain of arguments, and
AF𝔄 = {〈A ,→〉 | A ⊆ 𝔄, → ⊆ A × A }.
Definition 25. The signature Σσ of a semantics σ is defined as
Σσ = {σ(F) | F ∈ AF𝔄}
(i.e. the collection of all possible sets of extensions an AF can possess under a semantics). ♠
Given S ⊆ 2^𝔄, ArgsS = ⋃S∈S S and PairsS = {〈a,b〉 | ∃S ∈ S s.t. {a,b} ⊆ S}. S is called an extension-set if ArgsS is finite.
Definition 26. Let S ⊆ 2^𝔄. S is incomparable if ∀S,S′ ∈ S, S ⊆ S′ implies S = S′. ♠
Definition 27. An extension-set S ⊆ 2^𝔄 is tight if ∀S ∈ S and a ∈ ArgsS it holds that if S ∪ {a} ∉ S then there exists a b ∈ S such that 〈a,b〉 ∉ PairsS. ♠
Definition 28. S ⊆ 2^𝔄 is adm-closed if for each A,B ∈ S the following holds: if 〈a,b〉 ∈ PairsS for each a,b ∈ A ∪ B, then also A ∪ B ∈ S. ♠
Proposition 2. For each F ∈ AF𝔄:
• ST(F) is incomparable and tight;
• PR(F) is non-empty, incomparable and adm-closed. ♣
Theorem 2. The signatures for ST and PR are:
• ΣST = {S | S is incomparable and tight};
• ΣPR = {S ≠ ∅ | S is incomparable and adm-closed}. ♣
Consider the extension-set
S = { {a,d,e}, {b,c,e}, {a,b,d} }
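The incomparability and tightness conditions (Definitions 26 and 27) can be checked mechanically on this extension-set; running the sketch below shows it is incomparable but not tight (b occurs pairwise with each of a, d and e, yet {a,b,d,e} is not in the set).

```python
# A sketch checking Definitions 26 (incomparable) and 27 (tight) on the
# extension-set shown above.

S = [{'a', 'd', 'e'}, {'b', 'c', 'e'}, {'a', 'b', 'd'}]
args = set().union(*S)
pairs = {frozenset({x, y}) for s in S for x in s for y in s}

# Definition 26: no extension strictly contains another.
incomparable = all(not (s1 < s2) for s1 in S for s2 in S)

# Definition 27: if s u {a} is not in the set, some b in s must witness
# that {a, b} never occurs together in any extension.
def tight(exts):
    for s in exts:
        for a in args:
            if (s | {a}) not in exts:
                if all(frozenset({a, b}) in pairs for b in s):
                    return False
    return True

print(incomparable)  # True
print(tight(S))      # False
```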
1.7 Decomposability and Transparency [Bar+14]
Definition 29. Given an argumentation framework AF = (A ,→), a labelling-based semantics σ associates with AF a subset of L(AF), denoted as Lσ(AF). ♠
Definition 30. Given AF = (A ,→) and a set Args ⊆ A , the input of Args, denoted as Argsⁱⁿᵖ, is the set {B ∈ A \ Args | ∃A ∈ Args, (B, A) ∈ →}; the conditioning relation of Args, denoted as Argsᴿ, is defined as → ∩ (Argsⁱⁿᵖ × Args). ♠
Definition 31. An argumentation framework with input is a tuple (AF, I , LI , RI ), including an argumentation framework AF = (A ,→), a set of arguments I such that I ∩ A = ∅, a labelling LI ∈ L(I ) and a relation RI ⊆ I × A . A local function assigns to any argumentation framework with input a (possibly empty) set of labellings of AF, i.e. F(AF, I , LI , RI ) ∈ 2^L(AF). ♠
Definition 32. Given an argumentation framework with input (AF, I , LI , RI ), the standard argumentation framework w.r.t. (AF, I , LI , RI ) is defined as AF′ = (A ∪ I ′, → ∪ R′I ), where I ′ = I ∪ {A′ | A ∈ out(LI )} and R′I = RI ∪ {(A′, A) | A ∈ out(LI )} ∪ {(A, A) | A ∈ undec(LI )}. ♠
Definition 33. Given a semantics σ, the canonical local function of σ (also called local function of σ) is defined as Fσ(AF, I , LI , RI ) = {Lab↓A | Lab ∈ Lσ(AF′)}, where AF = (A ,→) and AF′ is the standard argumentation framework w.r.t. (AF, I , LI , RI ). ♠
Definition 34. A semantics σ is complete-compatible iff the following conditions hold:
1. For any argumentation framework AF = (A ,→), every labelling L ∈ Lσ(AF) satisfies the following conditions:
• if A ∈ A is initial, then L(A) = in;
• if B ∈ A and there is an initial argument A which attacks B, then L(B) = out;
• if C ∈ A is self-defeating, and there are no attackers of C besides C itself, then L(C) = undec.
2. For any set of arguments I and any labelling LI ∈ L(I ), the argumentation framework AF′ = (I ′, →′), where I ′ = I ∪ {A′ | A ∈ out(LI )} and →′ = {(A′, A) | A ∈ out(LI )} ∪ {(A, A) | A ∈ undec(LI )}, admits a (unique) labelling, i.e. |Lσ(AF′)| = 1. ♠
Definition 35. A semantics σ is fully decomposable (or simply decomposable) iff there is a local function F such that for every argumentation framework AF = (A ,→) and every partition P = {P1,...,Pn} of A , Lσ(AF) = U(P, AF, F), where
U(P, AF, F) ≜ {LP1 ∪ ... ∪ LPn | LPi ∈ F(AF↓Pi, Piⁱⁿᵖ, (⋃j=1...n, j≠i LPj)↓Piⁱⁿᵖ, Piᴿ)}. ♠
Definition 36. A complete-compatible semantics σ is top-down decomposable iff for any argumentation framework AF = (A ,→) and any partition P = {P1,...,Pn} of A , it holds that Lσ(AF) ⊆ U(P, AF, Fσ). ♠
Definition 37. A complete-compatible semantics σ is bottom-up decomposable iff for any argumentation framework AF = (A ,→) and any partition P = {P1,...,Pn} of A , it holds that Lσ(AF) ⊇ U(P, AF, Fσ). ♠
                            CO   ST   GR   PR
Full decomposability        Yes  Yes  No   No
Top-down decomposability    Yes  Yes  Yes  Yes
Bottom-up decomposability   Yes  Yes  No   No
Table 1.3: Decomposability properties of argumentation semantics.
2 Argumentation Schemes
Argumentation schemes [WRM08] are reasoning patterns which generate arguments:
• deductive/inductive inferences that represent forms of common types of arguments used in everyday discourse, and in special contexts (e.g. legal argumentation);
• neither deductive nor inductive, but defeasible, presumptive, or abductive.
Moreover, an argument satisfying a pattern may not be very strong by itself, but may be strong enough to provide evidence to warrant rational acceptance of its conclusion, given that its premises are acceptable.
According to Toulmin [Tou58] such an argument can be plausible and thus accepted after a balance of considerations in an investigation or discussion moved forward as new evidence is being collected. The investigation can then move ahead, even under conditions of uncertainty and lack of knowledge, using the conclusions tentatively accepted.
2.1 An example: Walton et al.'s Argumentation Schemes for Practical Reasoning
Suppose I am deliberating with my spouse on what to do with our pension investment fund — whether to buy stocks, bonds or some other type of investments. We consult with a financial adviser, an expert source of information who can tell us what is happening in the stock market, and so forth, at the present time [Wal97].
Premises for practical inference:
1. states that an agent ("I" or "my") has a particular goal;
2. states the means by which the agent can bring about that goal.
〈S0,S1,...,Sn〉 represents a sequence of states of affairs that can be ordered temporally from earlier to later. A state of affairs is meant to be like a statement, but one describing some event or occurrence that can be brought about by an agent. It may be a human action, or it may be a natural event.
Practical Inference
Premises:
• Goal Premise: Bringing about Sn is my goal.
• Means Premise: In order to bring about Sn, I need to bring about Si.
Conclusion:
• Therefore, I need to bring about Si.
Critical questions:
• Other-Means Question: Are there alternative possible actions to bring about Si that could also lead to the goal?
• Best-Means Question: Is Si the best (or most favourable) of the alternatives?
• Other-Goals Question: Do I have goals other than Si whose achievement is preferable and that should have priority?
• Possibility Question: Is it possible to bring about Si in the given circumstances?
• Side Effects Question: Would bringing about Si have known bad consequences that ought to be taken into account?
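The scheme above can be encoded as a simple data structure; the field names below are our own illustrative choice, not part of Walton's formalism.

```python
# One possible encoding of an argumentation scheme with its critical
# questions; names and structure are assumptions for illustration.

from dataclasses import dataclass, field

@dataclass
class Scheme:
    name: str
    premises: dict
    conclusion: str
    critical_questions: list = field(default_factory=list)

practical_inference = Scheme(
    name='Practical Inference',
    premises={
        'goal': 'Bringing about Sn is my goal',
        'means': 'In order to bring about Sn, I need to bring about Si',
    },
    conclusion='Therefore, I need to bring about Si',
    critical_questions=[
        'Other-Means: are there alternative actions leading to the goal?',
        'Best-Means: is Si the best of the alternatives?',
        'Other-Goals: do other, preferable goals have priority?',
        'Possibility: is Si possible in the given circumstances?',
        'Side-Effects: would Si have known bad consequences?',
    ],
)
print(len(practical_inference.critical_questions))  # 5
```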
2.2 AS and Dialogues
Dialogue for practical reasoning: all moves (propose, prefer, justify) are coordinated in a formal deliberation dialogue that has eight stages [HMP01].
1. Opening of the deliberation dialogue, and the raising of a governing question about what is to be done.
2. Discussion of: (a) the governing question; (b) desirable goals; (c) any constraints on the possible actions which may be considered; (d) perspectives by which proposals may be evaluated; and (e) any premises (facts) relevant to this evaluation.
3. Suggesting of possible action-options appropriate to the governing question.
4. Commenting on proposals from various perspectives.
5. Revising of: (a) the governing question, (b) goals, (c) constraints, (d) perspectives, and/or (e) action-options in the light of the comments presented; and the undertaking of any information-gathering or fact-checking required for resolution.
6. Recommending an option for action, and acceptance or non-acceptance of this recommendation by each participant.
7. Confirming acceptance of a recommended option by each participant.
8. Closing of the deliberation dialogue.
Proposals are initially made at stage 3, and then evaluated at stages 4, 5 and 6. Especially at stage 5, much argumentation taking the form of practical reasoning would seem to be involved.
As discussed in [Wal06], there are three dialectical adequacy conditions for defining the speech act of making a proposal.
The Proponent's Requirement (Condition 1). The proponent puts forward a statement that describes an action and says that both proponent and respondent (or the respondent group) should carry out this action. The proponent is committed to carrying out that action: the statement has the logical form of the conclusion of a practical inference, and also expresses an attitude toward that statement.
The Respondent's Requirement (Condition 2). The statement is put forward with the aim of offering reasons of a kind that will lead the respondent to become committed to it.
The Governing Question Requirement (Condition 3). The job of the proponent is to overcome doubts or conflicts of opinions, while the job of the respondent is to express them. Thus the role of the respondent is to ask questions that cast the prudential reasonableness of the action in the statement into doubt, and to mount attacks (counter-arguments and rebuttals) against it.
Condition 3 relates to the global structure of the dialogue, whereas conditions 1 and 2 are more localised to the part where the proposal was made. Condition 3 relates to the global burden of proof [Wal14] and the roles of the two parties in the dialogue as a whole.
Speech acts [MP02], like making a proposal, are seen as types of moves in a dialogue that are governed by rules. Three basic characteristics of any type of move have to be defined:
1. the pre-conditions of the move;
2. the conditions defining the move itself;
3. the post-conditions that state the result of the move.
Preconditions
• At least two agents (proponent and opponent);
• A governing question;
• A set of statements (propositions);
• The proponent proposes the proposition to the respondent if and only if:
1. there is a set of premises that the proponent is committed to, and that fit the premises of the argumentation scheme for practical reasoning;
2. the proponent is advocating these premises, that is, he is making a claim that they are true or applicable in the case at issue;
3. there is an inference from these premises fitting the argumentation scheme for practical reasoning; and
4. the proposition is the conclusion of the inference.
The Defining Conditions
The central defining condition sets out the structure of the move of making a proposal.
The Goal Statement: We have a goal G.
The Means Statement: Bringing about p is necessary (or sufficient) for us to bring about G.
Then the inference follows.
The Proposal Statement: We should (practically ought to) bring about p.
Proposal Statement in form of AS
Premises:
• Goal Statement: We have a goal G.
• Means Statement: Bringing about p is necessary (or sufficient) for us to bring about G.
Conclusion:
• We should (practically ought to) bring about p.
The Post-Conditions
The central post-condition is the response condition. The proposal must be open to critical questioning by the opponent. The proponent should be open to answering doubts and objections corresponding to any one of the five critical questions for practical reasoning, as well as to counter-proposals, and is in charge of giving reasons why her proposal is better than the alternatives.
The response condition set by these critical questions helps to explain how and why the maker of a proposal needs to be open to questioning and to requests for justification.
3 A Semantic-Web View of
Argumentation
Acknowledgement
This handout includes material from a number of collaborators, including Chris Reed. An overview can also be found in [Bex+13].
3.1 The Argument Interchange Format [Rah+11]
Figure 3.1: Original AIF Ontology [Che+06; Rah+11]. A graph (argument network) has nodes and edges; a node is either an information node (I-Node) or a scheme node (S-Node). S-Nodes comprise rule of inference application nodes (RA-Nodes), conflict application nodes (CA-Nodes), preference application nodes (PA-Nodes), and derived concept application nodes (e.g. defeat). Scheme applications use schemes: rule of inference schemes (logical or presumptive inference schemes), conflict schemes (e.g. logical conflict schemes), and preference schemes (logical or presumptive preference schemes).
3.2 An Ontology of Arguments [Rah+11]
Please download Protégé from http://protege.stanford.edu/ and the AIF OWL version from http://www.arg.dundee.ac.uk/wp-content/uploads/AIF.owl
Representation of the argument described in Figure 3.2:
___jobArg : PracticalReasoning_Inference
fulfils(___jobArg, PracticalReasoning_Scheme)
hasGoalPlan_Premise(___jobArg, ___jobArgGoalPlan)
hasConclusion(___jobArg, ___jobArgConclusion)
hasGoal_Premise(___jobArg, ___jobArgGoal)
___jobArgConclusion : EncouragedAction_Statement
fulfils(___jobArgConclusion, EncouragedAction_Desc)
University of Aberdeen, 2015 Page 18
Figure 3.2: An argument network linking instances of argument and scheme components: a Practical Inference instance whose goal premise is "Bringing about being rich is my goal", whose means premise is "In order to bring about being rich I need to bring about having a job", and whose conclusion is "Therefore I need to bring about having a job".
Figure 3.3: Examples of conflicts [Rah+11, Fig. 2]: a symmetric attack between two arguments A1 and A2, and an undercut attack where an argument A3 attacks an inference step (a modus ponens application) of another argument.
claimText(___jobArgConclusion, "Therefore I need to bring about having a job")
___jobArgGoal : Goal_Statement
fulfils(___jobArgGoal, Goal_Desc)
claimText(___jobArgGoal, "Bringing about being rich is my goal")
___jobArgGoalPlan : GoalPlan_Statement
fulfils(___jobArgGoalPlan, GoalPlan_Desc)
claimText(___jobArgGoalPlan, "In order to bring about being rich I need to bring about having a job")
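The assertions above are, in effect, a small set of subject-predicate-object triples, which can be sketched without any RDF tooling; the triple encoding below is an illustrative assumption, not the serialisation used by AIF.owl.

```python
# A sketch of the jobArg individuals as plain subject-predicate-object
# triples; the names mirror the assertions in the text.

triples = {
    ('jobArg', 'rdf:type', 'PracticalReasoning_Inference'),
    ('jobArg', 'fulfils', 'PracticalReasoning_Scheme'),
    ('jobArg', 'hasGoal_Premise', 'jobArgGoal'),
    ('jobArg', 'hasGoalPlan_Premise', 'jobArgGoalPlan'),
    ('jobArg', 'hasConclusion', 'jobArgConclusion'),
    ('jobArgConclusion', 'rdf:type', 'EncouragedAction_Statement'),
    ('jobArgConclusion', 'claimText',
     'Therefore I need to bring about having a job'),
}

def objects(subject, predicate):
    """All objects related to subject by predicate."""
    return {o for (s, p, o) in triples if s == subject and p == predicate}

print(objects('jobArg', 'hasConclusion'))  # {'jobArgConclusion'}
```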
Relevant portion of the AIF ontology

EncouragedAction_Statement
EncouragedAction_Statement ⊑ Statement

GoalPlan_Statement
GoalPlan_Statement ⊑ Statement

Goal_Statement
Goal_Statement ⊑ Statement

I-node
I-node ≡ Statement
I-node ⊑ Node
I-node ⊑ ¬S-node

Inference
Inference ≡ RA-node
Inference ⊑ ∃fulfils.Inference_Scheme
Inference ⊑ ≥1 hasPremise.Statement
Inference ⊑ Scheme_Application
Inference ⊑ =1 hasConclusion.(Scheme_Application ⊔ Statement)

Inference_Scheme
Inference_Scheme ⊑ Scheme ⊓ ≥1 hasPremise_Desc.Statement_Description ⊓ =1 hasConclusion_Desc.(Scheme ⊔ Statement_Description)

PracticalReasoning_Inference
PracticalReasoning_Inference ≡ Presumptive_Inference ⊓ ∃hasConclusion.EncouragedAction_Statement ⊓ ∃hasGoalPlan_Premise.GoalPlan_Statement ⊓ ∃hasGoal_Premise.Goal_Statement

RA-node
RA-node ≡ Inference
RA-node ⊑ S-node

S-node
S-node ≡ Scheme_Application
S-node ⊑ Node
S-node ⊑ ¬I-node

Scheme
Scheme ⊑ Form
Scheme ⊑ ¬Statement_Description

Scheme_Application
Scheme_Application ≡ S-node
Scheme_Application ⊑ ∃fulfils.Scheme
Scheme_Application ⊑ Thing
Scheme_Application ⊑ ¬Statement

Statement
Statement ≡ NegStatement
Statement ≡ I-node
Statement ⊑ Thing
Statement ⊑ ∃fulfils.Statement_Description
Statement ⊑ ¬Scheme_Application

Statement_Description
Statement_Description ⊑ Form
Statement_Description ⊑ ¬Scheme

fulfils
∃fulfils.Thing ⊑ Node

hasConclusion_Desc
∃hasConclusion_Desc.Thing ⊑ Inference_Scheme

hasGoalPlan_Premise
hasGoalPlan_Premise ⊑ hasPremise

hasGoal_Premise
hasGoal_Premise ⊑ hasPremise

claimText
∃claimText.DatatypeLiteral ⊑ Statement
⊤ ⊑ ∀claimText.DatatypeString
Individuals of EncouragedAction_Desc
EncouragedAction_Desc : Statement_Description
formDescription(EncouragedAction_Desc, "A should be brought about")
Individuals of GoalPlan_Desc
GoalPlan_Desc : Statement_Description
formDescription(GoalPlan_Desc, "Bringing about B is the way to bring about A")
Individuals of Goal_Desc
Goal_Desc : Statement_Description
formDescription(Goal_Desc, "The goal is to bring about A")
Individuals of PracticalReasoning_Scheme
PracticalReasoning_Scheme : PresumptiveInference_Scheme
hasPremise_Desc(PracticalReasoning_Scheme, Goal_Desc)
hasPremise_Desc(PracticalReasoning_Scheme, GoalPlan_Desc)
hasConclusion_Desc(PracticalReasoning_Scheme, EncouragedAction_Desc)
4 Argumentation Frameworks:
Graphs and Models
Acknowledgement
This handout includes material from a number of collaborators, including (in alphabetical order):
• Pietro Baroni;
• Trevor J. M. Bench-Capon;
• Claudette Cayrol;
• Paul E. Dunne;
• Anthony Hunter;10
• Hengfei Li;
• Sanjay Modgil;
• Nir Oren;
• Guillermo R. Simari.
4.1 Graphs15
Value-Based Argumentation Framework [BA09]
Example 1 ([AB08], derived from [Col92; Chr00]). The situation involves
two agents, called Hal and Carla, both of whom are diabetic. Hal, through
no fault of his own, has lost his supply of insulin and urgently needs to
take some to stay alive. Hal is aware that Carla has some insulin kept in20
her house, but Hal does not have permission to enter Carla’s house. The
question is whether Hal is justified in breaking into Carla’s house and
taking her insulin in order to save his life. Note that by taking Carla’s in-
sulin, Hal may be putting her life in jeopardy, since she will come to need
that insulin herself. One possible response is that if Hal has money, he25
can compensate Carla so that her insulin can be replaced before she needs
it. Alternatively if Hal has no money but Carla does, she can replace her
insulin herself, since her need is not immediately life threatening. There is, however, a serious problem if neither of them has money, since in that case Carla's life is really under threat.
Partial formalisation:
Figure 4.1: Graphical representation of Ex. 1: a1 promotes LC; a2 promotes LC and FC; a3 promotes LC and FH.
• a1 suggests that Hal should not take insulin, thus allowing Carla to be alive (which promotes the value of Life for Carla, LC);
• a2 suggests that Hal should take insulin and compensate Carla, thus both of them stay alive (which promotes the value of Life for Carla, and the Freedom — of using money — for Carla, FC);
• a3 suggests that Hal should take insulin and that Carla should buy insulin, thus both of them stay alive (which promotes the value of Life for Carla, and the Freedom — of using money — for Hal, FH).
a2 defeats a1, a3 defeats a1, a3 and a2 defeat each other. ♥
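In a value-based framework, an attack only succeeds as a defeat if the audience does not rank the attacked argument's value strictly above the attacker's. The sketch below assigns a single representative value per argument (a simplification of Ex. 1, where some arguments promote two values) and an invented audience ranking.

```python
# A sketch of value-based defeat: keep an attack (a, b) unless the
# audience prefers the value of b to the value of a. Values and ranking
# are a simplified, invented reading of Ex. 1.

def defeats(attacks, value_of, prefer):
    """The attacks surviving the audience's value ranking."""
    return {(a, b) for (a, b) in attacks
            if not prefer(value_of[b], value_of[a])}

attacks = {('a2', 'a1'), ('a3', 'a1'), ('a2', 'a3'), ('a3', 'a2')}
value_of = {'a1': 'LC', 'a2': 'FC', 'a3': 'FH'}
rank = {'LC': 2, 'FC': 1, 'FH': 0}          # audience: LC > FC > FH
prefer = lambda v, w: rank[v] > rank[w]

print(sorted(defeats(attacks, value_of, prefer)))  # [('a2', 'a3')]
```

With this audience, both attacks on a1 fail (LC is the top value), and of the mutual attacks between a2 and a3 only a2's survives.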
Extended Argumentation Framework [Mod09]
Example 2 (From [Mod09]).
• a1: "Today will be dry in London since the BBC forecast sunshine";
• a2: "Today will be wet in London since CNN forecast rain";
• a3: "But the BBC are more trustworthy than CNN";
• a4: "However, statistically CNN are more accurate forecasters than the BBC";
• a5: "Basing a comparison on statistics is more rigorous and rational than basing a comparison on your instincts about their relative trustworthiness".
a1 and a2 are mutually conflicting; a3 is a preference in favour of a1, a4 is a preference in favour of a2. a3 and a4 are mutually conflicting. a5 is a preference in favour of a4. ♥
Figure 4.2: Graphical representation of Ex. 2.
Figure 4.3: Graphical representation of Ex. 3.
AFRA: Argumentation Framework with Recursive Attacks [Bar+11; Bar+09]
Example 3 ([Bar+11; Bar+09]). Suppose Bob is deciding about his Christmas holidays.
• a1: There is a last minute offer for Gstaad: therefore I should go to Gstaad;
• a2: There is a last minute offer for Cuba: therefore I should go to Cuba;
• a3: I do like to ski;
• a4: The weather report informs that in Gstaad there have been no snowfalls for one month: therefore it is not possible to ski in Gstaad;
• a5: It is anyway possible to ski in Gstaad, thanks to a good amount of artificial snow. ♥
Definition 38 (AFRA). An Argumentation Framework with Recursive Attacks (AFRA) is a pair 〈A ,R〉 where:
• A is a set of arguments;
• R is a set of attacks, namely pairs (a1, X) s.t. a1 ∈ A and (X ∈ R or X ∈ A ).
Given an attack α = (a1, X) ∈ R, we say that a1 is the source of α, denoted as src(α) = a1, and X is the target of α, denoted as trg(α) = X.
When useful, we will denote an attack to an attack explicitly showing all the recursive steps implied by its definition; for instance (a1,(a2,a3)) means (a1,α) where α = (a2,a3). ♠
University of Aberdeen, 2015 Page 25
Frameworks • Graphs
Definition 39 (Semantics). Let Γ = 〈A ,R〉 be an AFRA. A set S ⊆ A ∪ R is:
• a complete extension if and only if S is admissible and every element of A ∪ R which is acceptable w.r.t. S belongs to S, i.e. FΓ(S) ⊆ S;
• the grounded extension of Γ iff it is the least fixed point of FΓ;
• a preferred extension of Γ iff it is a maximal (w.r.t. set inclusion) admissible set;
• a stable extension of Γ if and only if S is conflict-free and ∀V ∈ A ∪ R, V ∉ S, ∃α ∈ S s.t. α →R V. ♠
Theorem 3. In the case where an AFRA is also an AF, a bijective correspondence between the semantics notions according to the two formalisms holds. ♣
Theorem 4. Moreover, in the case where an AFRA is not an AF, it is possible to rewrite it as an AF with extra arguments. ♣
Bipolar Argumentation Framework [CL05]
Example 4 ([CL05, Example 1]). A murder has been committed and the
suspects are Liz, Mary and Peter. The following pieces of information
have been gathered:
• The type of murder suggests that the killer is a female (f );
• The killer is certainly small (s);
• Liz is tall and Mary and Peter are small;
• The killer has long hair and uses a lipstick (l);
• A witness claims that he saw the killer who was tall;
• The witness is reliable (w);
• Moreover we are told that the witness is short-sighted, so he is no
longer reliable (b).
The following arguments can be formed:
• a1 in favour of m, with premises {s, f ,(s∧ f ) → m};
• a2 in favour of ¬s, with premises {w,w → ¬s};
• a3 in favour of ¬w, with premises {b,b → ¬w};
• a4 in favour of f , with premises {l,l → f }.
Figure 4.4: Graphical representation of Ex. 4: rounded arrows represent
the support relationship.
a3 defeats a2; a2 defeats a1. But argument a4 confirms one of the
premises of a1, thus strengthening it. ♥
4.2 Deterministic Structured Argumentation
Defeasible Logic Programming (DeLP) [Sim89; SL92; GS04;
GS14]
A defeasible logic program (DeLP) is a set of:
• facts, i.e. ground literals representing atomic information or the
negation of atomic information using strong negation ¬;
• strict rules, Lo ←− L1,...,Ln, represent non-defeasible information.
Lo is the head, the body {Li}i>0 is a non-empty set of ground literals;
• defeasible rules, Lo −< L1,...,Ln, represent tentative information.
Lo is the head, the body {Li}i>0 is a non-empty set of ground literals.
A DeLP program is denoted by 〈Π,∆〉, where Π is the subset of
non-defeasible knowledge (strict rules and facts); and ∆ is the subset of
defeasible knowledge.
A defeasible derivation of a literal Q from a DeLP program 〈Π,∆〉, denoted
〈Π,∆〉 |∼ Q, is a finite sequence of ground literals L1,L2,...,Ln = Q where either:
1. Li is a fact;
2. there exists a rule Ri in 〈Π,∆〉 (either strict or defeasible) with head
Li and body B1,...,Bk, and every literal of the body is an element
Lj of the sequence appearing before Li (j < i).
A derivation from 〈Π,∅〉 is called a strict derivation.
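The notion of defeasible derivation is essentially forward chaining over facts and rules. The following Python sketch is an illustrative encoding (strings with a leading "~" stand for strongly negated literals; strict and defeasible rules are treated alike, as the definition above allows); the sample program is a fragment of Example 5 below.

```python
# Illustrative sketch of defeasible derivation, not the official DeLP
# implementation. A rule is a (head, body) pair of ground literals.

def derives(facts, rules, query):
    """Forward chaining: saturate the derivable literals, test the query."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            if head not in derived and all(b in derived for b in body):
                derived.add(head)
                changed = True
    return query in derived

facts = {"vacation", "waves", "cloudy"}
rules = [("~working", ["vacation"]),        # strict: ~working <- vacation
         ("~busy", ["~working"]),           # defeasible: ~busy -< ~working
         ("spare_time", ["~busy"]),         # spare_time -< ~busy
         ("nice", ["waves"]),               # nice -< waves
         ("surf", ["nice", "spare_time"])]  # surf -< nice, spare_time

assert derives(facts, rules, "surf")
assert not derives(facts, rules, "ill")
```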
Definition 40. Let H be a ground literal, 〈Π,∆〉 a DeLP program, and
A ⊆ ∆. The pair 〈A ,H〉 is an argument structure if:
• there exists a defeasible derivation for H from 〈Π,A 〉;
• there are no defeasible derivations from 〈Π,A 〉 of contradictory lit-
erals;
• and there is no proper subset A ′ ⊂ A such that A ′ satisfies the
previous two conditions. ♠
Definition 41. An argument 〈B,S〉 is a counter-argument for 〈A ,H〉 at
literal P, if there exists a sub-argument 〈C ,P〉 of 〈A ,H〉 such that P
and S disagree, that is, there exist two contradictory literals that have a
strict derivation from Π∪{S,P}. The literal P is referred to as the counter-argument
point and 〈C ,P〉 as the disagreement sub-argument. ♠
Let us assume an argument comparison criterion ≻.
Definition 42. Let 〈B,S〉 be a counter-argument for 〈A ,H〉 at point P,
and 〈C ,P〉 the disagreement sub-argument.
If 〈B,S〉 ≻ 〈C ,P〉, then 〈B,S〉 is a proper defeater for 〈A ,H〉.
If 〈B,S〉 ⊁ 〈C ,P〉 and 〈C ,P〉 ⊁ 〈B,S〉, then 〈B,S〉 is a blocking defeater
for 〈A ,H〉.
〈B,S〉 is a defeater for 〈A ,H〉 if 〈B,S〉 is either a proper or a blocking
defeater for 〈A ,H〉. ♠
Example 5. Let 〈Π1,∆1〉 be a DeLP-program such that:
Π1 = { monday; cloudy; dry_season; waves; grass_grown; hire_gardener; vacation;
¬working ←− vacation; few_surfers ←− ¬many_surfers; ¬surf ←− ill }
∆1 = { surf −< nice,spare_time; nice −< waves; spare_time −< ¬busy;
¬busy −< ¬working; ¬nice −< rain; rain −< cloudy; ¬rain −< dry_season; ... }
From 〈Π1,∆1〉, these are some arguments that can be derived:
〈A0,surf〉 = 〈{surf −< nice,spare_time; nice −< waves; spare_time −< ¬busy; ¬busy −< ¬working}, surf〉
〈A1,¬nice〉 = 〈{¬nice −< rain; rain −< cloudy},¬nice〉
〈A2,nice〉 = 〈{nice −< waves},nice〉
〈A3,rain〉 = 〈{rain −< cloudy},rain〉
〈A4,¬rain〉 = 〈{¬rain −< dry_season},¬rain〉
Figure 4.5: Arguments and their interactions from Example 5
〈A9,¬busy〉 = 〈{¬busy −< ¬working},¬busy〉
♥
Assumption Based Argumentation (ABA) [BTK93; Bon+97;
Ton12; Ton14; DT10]
Definition 43. An ABA is a tuple 〈L ,R,A ,¯〉 where:
• 〈L ,R〉 is a deductive system, with L the language and R a set of
rules, that we assume of the form σ0 ←− σ1,...,σm (m ≥ 0), with
σi ∈ L ; σ0 is referred to as the head and σ1,...,σm as the body of
the rule σ0 ←− σ1,...,σm;
• A ⊆ L is a (non-empty) set, referred to as assumptions;
• ¯ is a total mapping from A into L ; ā is referred to as the contrary
of a.
♠
Definition 44. A deduction for σ ∈ L supported by S ⊆ L and R ⊆ R,
denoted as S ⊢R σ, is a (finite) tree with nodes labelled by sentences in
L or by τ ∉ L , the root labelled by σ, leaves either τ or sentences in S,
non-leaves σ′ with, as children, the elements of the body of some rule in
R with head σ′, and R the set of all such rules. ♠
Definition 45. An argument for the claim σ ∈ L supported by A ⊆ A
(A ⊢ σ) is a deduction for σ supported by A (and some R ⊆ R). ♠
Definition 46. An argument A1 ⊢ σ1 attacks an argument A2 ⊢ σ2 iff σ1
is the contrary of one of the assumptions in A2. ♠
Figure 4.6: Graphical representation of Ex. 6.
Example 6.
R = { innocent(X) ←− notGuilty(X);
killer(oj) ←− DNAshows(oj),DNAshows(X) ⊃ killer(X);
DNAshows(X) ⊃ killer(X) ←− DNAfromReliableEvidence(X);
evidenceUnreliable(X) ←− collected(X,Y ),racist(Y );
DNAshows(oj) ←−;
collected(oj,mary) ←−;
racist(mary) ←− }
A = { notGuilty(oj);
DNAfromReliableEvidence(oj) }
Moreover, the contrary of notGuilty(oj) is killer(oj), and the contrary of
DNAfromReliableEvidence(oj) is evidenceUnreliable(oj). ♥
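Def. 46 reduces attack detection to a lookup of contraries. The following Python sketch checks attacks on a simplified rendering of Example 6: the construction of arguments via deductions is skipped, and arguments are given directly as assumption/claim pairs (an assumption of this example, not ABA machinery).

```python
# Illustrative sketch of Def. 46: A1 |- s1 attacks A2 |- s2 iff s1 is
# the contrary of some assumption in A2. Contraries from Example 6.

contrary = {"notGuilty(oj)": "killer(oj)",
            "DNAfromReliableEvidence(oj)": "evidenceUnreliable(oj)"}

def attacks(arg1, arg2):
    assumptions1, claim1 = arg1
    assumptions2, _claim2 = arg2
    return any(contrary[a] == claim1 for a in assumptions2)

innocent = (frozenset({"notGuilty(oj)"}), "innocent(oj)")
killer = (frozenset({"DNAfromReliableEvidence(oj)"}), "killer(oj)")
unreliable = (frozenset(), "evidenceUnreliable(oj)")

assert attacks(killer, innocent)     # killer(oj) is contrary of notGuilty(oj)
assert attacks(unreliable, killer)
assert not attacks(innocent, killer)
```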
ASPIC+ [Pra10; MP13; MP14]
Given a logical language L and a set of strict or defeasible inference
rules — resp. ϕ1,...,ϕn −→ ϕ and ϕ1,...,ϕn =⇒ ϕ — a strict inference rule
always holds (i.e. if the antecedents ϕ1,...,ϕn hold, the consequent ϕ
holds as well), while a defeasible inference “usually” holds. Arguments
are constructed w.r.t. a knowledge base with two types of formulae.
Definition 47. An argumentation system is a tuple AS = 〈L ,R,ν〉 where:
• ¯ : L → 2^L is a contrariness function s.t. if ϕ ∈ ψ̄ and:
– ψ ∉ ϕ̄, then ϕ is a contrary of ψ;
– ψ ∈ ϕ̄, then ϕ is a contradictory of ψ (ϕ = –ψ);
• R = Rd ∪ Rs is a set of strict (Rs) and defeasible (Rd) inference
rules such that Rd ∩Rs = ∅;
• ν : Rd → L is a partial function.¹
¹Informally, ν(r) is a wff in L which says that the defeasible rule r is applicable.
For any P ⊆ L , Cl(P ) denotes the closure of P under strict rules, viz. the
smallest set containing P and any consequent of any strict rule in Rs
whose antecedents are in Cl(P ).
P ⊆ L is consistent iff there are no ϕ,ψ ∈ P s.t. ϕ ∈ ψ̄; otherwise it is inconsistent.
A knowledge base in an AS is a set Kn ∪ Kp = K ⊆ L ; {Kn,Kp} is a
partition of K ; Kn contains axioms that cannot be attacked; Kp contains
ordinary premises that can be attacked.
An argumentation theory is a pair AT = 〈AS,K 〉. ♠
Definition 48. An argument a on the basis of an AT = 〈AS,K 〉, AS =
〈L ,R,ν〉 is:
1. ϕ if ϕ ∈ K with: Prem(a) = {ϕ}; Conc(a) = ϕ; Sub(a) = {ϕ}; Rules(a) =
DefRules(a) = ∅; TopRule(a) = undefined.
2. a1,...,an −→/=⇒ ψ if a1,...,an, with n ≥ 0, are arguments such
that there exists a strict/defeasible rule r = Conc(a1),...,Conc(an) −→/=⇒ ψ
in Rs/Rd, with:
Prem(a) = Prem(a1)∪...∪Prem(an); Conc(a) = ψ;
Sub(a) = Sub(a1)∪...∪Sub(an)∪{a};
Rules(a) = Rules(a1)∪...∪Rules(an)∪{r};
DefRules(a) = {d | d ∈ Rules(a)∩Rd};
TopRule(a) = r
a is strict if DefRules(a) = ∅, otherwise defeasible; firm if Prem(a) ⊆ Kn,
otherwise plausible.
P ⊢ ϕ iff there exists a strict argument a s.t. Conc(a) = ϕ and P ⊇ Prem(a). ♠
An argument can be attacked in its premises (undermining), conclusion
(rebuttal), or inference step (undercut). The definition of defeat
takes into account an argument ordering ⪯: a ≺ b iff a is “less preferred”
than b (a ≺ b iff a ⪯ b and not b ⪯ a).
Definition 49. Given arguments a and b, a defeats b iff a undercuts,
successfully rebuts or successfully undermines b, where:
• a undercuts b (on b′) iff Conc(a) is a contrary or contradictory of ν(r) for some b′ ∈ Sub(b) s.t. r =
TopRule(b′) ∈ Rd;
• a successfully rebuts b (on b′) iff Conc(a) is a contrary or contradictory of ϕ for some b′ ∈ Sub(b) of
the form b1,...,bn =⇒ ϕ, and not a ≺ b′;
• a successfully undermines b (on ϕ) iff Conc(a) is a contrary or contradictory of ϕ, ϕ ∈ Prem(b)∩Kp,
and not a ≺ ϕ. ♠
Definition 50. AF is the abstract argumentation framework defined by
AT = 〈AS,K 〉, AS = 〈L ,R,ν〉 if A is the smallest set of all finite argu-
ments constructed from K satisfying Def. 48; and → is the defeat relation
on A as defined in Def. 49. ♠
Definition 51 (Rationality postulates [CA07; MP14]). Given ∆, an AF
defined by an AT, and a semantics σ. ∀S ∈ E∆(σ), ∆ satisfies:
P1: direct consistency iff {Conc(a) | a ∈ S} is consistent;
P2: indirect consistency iff Cl({Conc(a) | a ∈ S}) is consistent;
P3: closure iff {Conc(a) | a ∈ S} = Cl({Conc(a) | a ∈ S});
P4: sub-argument closure iff ∀a ∈ S, Sub(a) ⊆ S. ♠
Note that P2 follows from P1 and P3.
An AT satisfies the postulates (i.e. it is Well-Formed) iff (let us consider
classical negation here instead of the contrariness function) [MP13; MP14]:
• it is closed under transposition² or under contraposition³;
• Cl(Kn) is consistent;
• the argument ordering is reasonable, namely:
– ∀a,b, if a is strict and firm, and b is plausible or defeasible,
then b ≺ a;
– ∀a,b, if b is strict and firm, then not b ≺ a;
– ∀a,a′,b such that a′ is a strict continuation of {a}, if not a ≺ b
then not a′ ≺ b, and if not b ≺ a, then not b ≺ a′;
– given a finite set of arguments {a1,...,an}, let a+i be some
strict continuation of {a1,...,ai−1,ai+1,...,an}. Then it is not
the case that ∀i, a+i ≺ ai.
An argument a is a strict continuation of a set of arguments {a1,...,an}
iff (Prem(a)∩Kp) = (Prem(a1)∩Kp)∪...∪(Prem(an)∩Kp); DefRules(a) =
DefRules(a1)∪...∪DefRules(an); Rules(a) ⊇ Rules(a1)∪...∪Rules(an); and
(Prem(a)∩Kn) ⊆ (Prem(a1)∩Kn)∪...∪(Prem(an)∩Kn).
Example 7. It is well known that (1) birds normally fly; while (2) penguins
are known not to fly, although (3) all penguins are birds. In these
terms, one can say that (4) penguins are abnormal birds with respect to
flying. (5) Tweety is observed to be a penguin, and (6) animals that are
observed to be penguins normally are penguins.
d1 : bird =⇒ canfly; d2 : penguin =⇒ ¬canfly; d3 : observed_penguin =⇒
penguin; f1 : penguin ⊃ bird; f2 : penguin ⊃ ¬ν(d1); f3 : observed_penguin. The
²If ϕ1,...,ϕn −→ ψ ∈ Rs, then ∀i = 1...n, ϕ1,...,ϕi−1,¬ψ,ϕi+1,...,ϕn −→ ¬ϕi ∈ Rs.
³∀P ⊆ L , l ∈ P , if P ⊢ ϕ, then P \{l}∪{¬ϕ} ⊢ ¬l.
derived arguments are: a1 : observed_penguin; a2 : a1 =⇒ penguin; a3 :
penguin ⊃ bird; a4 : a2,a3 =⇒ canfly; b1 : a2 =⇒ ¬canfly; c1 : a2 =⇒ ¬ν(d1).
♥
Deductive Argumentation [BH01; BH08; GH11; BH14]
The focus here is on simple logic and classical logic, but other options include
non-monotonic logics, conditional logics, temporal logics, description logics,
and paraconsistent logics.
Definition 52 (Base Logic). Let L be a language for a logic, and let ⊢
be the consequence relation for that logic. If α is an atom in L , then α is
a positive literal in L and ¬α is a negative literal in L .
For a literal β, the complement of β is defined as follows:
• If β is a positive literal, i.e. it is of the form α, then the complement
of β is the negative literal ¬α,
• if β is a negative literal, i.e. it is of the form ¬α, then the complement
of β is the positive literal α. ♠
Definition 53 (Deductive Argument). A deductive argument is an ordered
pair 〈Φ,α〉 where Φ ⊢ α. Φ is the support, or premises, or assumptions
of the argument, and α is the claim, or conclusion, of the argument.
For an argument a = 〈Φ,α〉, the function Support(a) returns Φ and the
function Claim(a) returns α. ♠20
Definition 54 (Constraints). An argument 〈Φ,α〉 satisfies the:
• consistency constraint when Φ is consistent (not essential, cf.
paraconsistent logic).
• minimality constraint when there is no Ψ ⊂ Φ such that Ψ ⊢ α. ♠
Definition 55 (Classical Logic Argument). A classical logic argument
from a set of formulae ∆ is a pair 〈Φ,α〉 such that
1. Φ ⊆ ∆
2. Φ ⊬ ⊥
3. Φ ⊢ α
4. there is no Φ′ ⊂ Φ such that Φ′ ⊢ α. ♠
Definition 56 (Counterargument). If 〈Φ,α〉 and 〈Ψ,β〉 are arguments,
then
• 〈Φ,α〉 rebuts 〈Ψ,β〉 iff α ≡ ¬β
• 〈Φ,α〉 undercuts 〈Ψ,β〉 iff α ≡ ¬⋀Ψ ♠
Definition 57 (Classical attacks). Let a and b be two classical arguments.
We define the following types of classical attack.
• a is a direct undercut of b if ¬Claim(a) ∈ Support(b)
• a is a classical defeater of b if Claim(a) ⊢ ¬⋀Support(b)
• a is a classical direct defeater of b if ∃φ ∈ Support(b) s.t. Claim(a) ⊢ ¬φ
• a is a classical undercut of b if ∃Ψ ⊆ Support(b) s.t. Claim(a) ≡ ¬⋀Ψ
• a is a classical direct undercut of b if ∃φ ∈ Support(b) s.t. Claim(a) ≡ ¬φ
• a is a classical canonical undercut of b if Claim(a) ≡ ¬⋀Support(b)
• a is a classical rebuttal of b if Claim(a) ≡ ¬Claim(b)
• a is a classical defeating rebuttal of b if Claim(a) ⊢ ¬Claim(b)
♠
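All of these attack relations come down to entailment and equivalence tests in the base logic. The following brute-force propositional sketch is illustrative only (formulas are encoded as Python predicates over models, which scales only to toy vocabularies; it is not [BH08]'s machinery).

```python
from itertools import product

# Illustrative brute-force entailment/equivalence over a fixed vocabulary.
ATOMS = ["p", "q"]

def models():
    for vals in product([False, True], repeat=len(ATOMS)):
        yield dict(zip(ATOMS, vals))

def entails(f, g):                 # every model of f is a model of g
    return all(g(m) for m in models() if f(m))

def equivalent(f, g):              # f and g agree on every model
    return all(f(m) == g(m) for m in models())

p = lambda m: m["p"]
not_p = lambda m: not m["p"]
p_and_q = lambda m: m["p"] and m["q"]

# <{~p}, ~p> is a classical direct undercut of <{p, q}, p & q>,
# since its claim is equivalent to the negation of the premise p:
assert equivalent(not_p, lambda m: not p(m))
# and a classical defeater, since ~p |- ~(p & q):
assert entails(not_p, lambda m: not p_and_q(m))
```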
An arrow from D1 to D2 indicates that D1 ⊆ D2. [Diagram: containment
hierarchy among defeaters, direct defeaters, undercuts, direct undercuts,
canonical undercuts, rebuts, and direct rebuts.]
Figure 4.7: Example of argumentation with classical logic.
A Logic for Clinical Knowledge [GHW09; HW12; Wil+15]
[Diagram: evidence on treatments T1 and T2, together with inference rules
for inductive arguments and meta-arguments, yields arguments; combined
with preferences on outcomes and their magnitude, these form an argument
graph from which (T1 > T2), (T1 = T2) or (T1 < T2) is concluded.]
Let us assume a set of evidence EVIDENCE = {e1,..., en}.
Definition 58 (Inductive Arguments). Given treatments τ1 and τ2, X ⊆
EVIDENCE, there are three kinds of inductive argument that can be formed.
1. 〈X,τ1 > τ2〉, meaning the evidence in X supports the claim that
treatment τ1 is superior to τ2.
2. 〈X,τ1 ∼ τ2〉, meaning the evidence in X supports the claim that
treatment τ1 is equivalent to τ2.
3. 〈X,τ1 < τ2〉, meaning the evidence in X supports the claim that
treatment τ1 is inferior to τ2.
♠
Given an inductive argument a = 〈X,·〉, support(a) = X.
ARG(EVIDENCE) denotes the set of inductive arguments that can be
generated from the evidence in EVIDENCE.
Definition 59 (Conflicts). If the claim of argument ai is ci and the claim
of argument aj is cj, then we say that ai conflicts with aj whenever:
1. ci = τ1 > τ2, and ( cj = τ1 ∼ τ2 or cj = τ1 < τ2 ).
2. ci = τ1 ∼ τ2, and ( cj = τ1 > τ2 or cj = τ1 < τ2 ).
3. ci = τ1 < τ2, and ( cj = τ1 > τ2 or cj = τ1 ∼ τ2 ). ♠
Definition 60 (Attack). For any pair of arguments ai and aj, and a preference
relation R, ai attacks aj with respect to R iff ai conflicts with aj
and it is not the case that aj is strictly preferred to ai according to R. ♠
A domain-specific benefit preference relation is defined in [HW12].
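Defs. 59–60 can be operationalised in a few lines. The following sketch is illustrative: the claim triples and the encoding of the preference relation as a set of ordered pairs are assumptions of this example, not [HW12]'s notation.

```python
# Illustrative sketch of Defs. 59-60. Claims are triples (t1, rel, t2)
# with rel in {">", "~", "<"}; preferences are ordered id pairs.

def conflicts(claim_i, claim_j):
    """Claims about the same ordered treatment pair conflict iff they
    assert different relations (cases 1-3 of Def. 59)."""
    (t1, rel_i, t2), (u1, rel_j, u2) = claim_i, claim_j
    return (t1, t2) == (u1, u2) and rel_i != rel_j

def attacks(arg_i, arg_j, preferred):
    """arg_i attacks arg_j iff they conflict and arg_j is not strictly
    preferred to arg_i according to the relation `preferred`."""
    return (conflicts(arg_i["claim"], arg_j["claim"])
            and (arg_j["id"], arg_i["id"]) not in preferred)

a1 = {"id": "a1", "claim": ("CP", ">", "NT")}
a2 = {"id": "a2", "claim": ("CP", "<", "NT")}

assert conflicts(a1["claim"], a2["claim"])
assert attacks(a1, a2, preferred=set())               # no preference: mutual
assert not attacks(a2, a1, preferred={("a1", "a2")})  # a1 strictly preferred
```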
Definition 61 (Meta-Arguments). For a ∈ ARG(EVIDENCE), if there is an
e ∈ SUPPORT(a) such that:
• e is not statistically significant, and the outcome indicator of e is not
a side-effect, then the following is a meta-argument that attacks a:
〈Not statistically significant〉;
• e is a non-randomised and non-blind trial, then the following is
a meta-argument that attacks a: 〈Non-randomized & non-blind
trials〉;
• e is a meta-analysis that concerns a narrow patient group, then the
following is a meta-argument that attacks a: 〈Meta-analysis for
a narrow patient group〉. ♠
Example 8. Example where CP is contraceptive pill and NT is no treatment.
Fictional data.
ID Left Right Indicator Risk ratio Outcome p
e1 CP NT Pregnancy 0.05 superior 0.01
e2 CP NT Ovarian cancer 0.99 superior 0.07
e3 CP NT Breast cancer 1.04 inferior 0.01
e4 CP NT DVT 1.02 inferior 0.05
♥
Figure 4.8: Arguments derived from Ex. 8, with preferences and meta-arguments.
4.3 Probabilistic Argumentation
Epistemic Approach [Thi12; Hun13; HT14; BGV14]
Definition 62 (Probability distribution over the models M of the language).
A function P : M → [0,1] such that ∑m∈M P(m) = 1. ♠
Definition 63 (Probability of a formula φ, cf. [Par94]).
P(φ) = ∑m∈Models(φ) P(m). ♠
Example 9.
Model a b P
m1 true true 0.8
m2 true false 0.2
m3 false true 0.0
m4 false false 0.0
• P(a) = 1
• P(a∧ b) = 0.8
• P(b ∨¬b) = 1
• P(¬a∨¬b) = 0.2
♥
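The computations in Example 9 follow mechanically from Defs. 62–63. A short sketch (illustrative encoding: models as dicts, formulas as Boolean functions over models):

```python
# Probability of a formula as the probability mass of its models
# (Defs. 62-63), using the distribution of Example 9.

P = [({"a": True,  "b": True},  0.8),
     ({"a": True,  "b": False}, 0.2),
     ({"a": False, "b": True},  0.0),
     ({"a": False, "b": False}, 0.0)]

def prob(formula):
    return sum(p for model, p in P if formula(model))

assert abs(prob(lambda m: m["a"]) - 1.0) < 1e-9
assert abs(prob(lambda m: m["a"] and m["b"]) - 0.8) < 1e-9
assert abs(prob(lambda m: m["b"] or not m["b"]) - 1.0) < 1e-9
assert abs(prob(lambda m: not m["a"] or not m["b"]) - 0.2) < 1e-9
```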
University of Aberdeen, 2015 Page 37
Frameworks • Probabilistic Argumentation
Definition 64 (Probability of an argument). The probability of an argument
〈Φ,α〉, denoted P(〈Φ,α〉), is P(φ1 ∧ ... ∧ φn), where Φ = {φ1,...,φn}. ♠
Example 10. Consider the following probability distributions over models:
Model a b Agent 1 Agent 2
m1 true true 0.5 0.0
m2 true false 0.5 0.0
m3 false true 0.0 0.6
m4 false false 0.0 0.4
Below is the probability of each argument according to each participant.
Argument Agent 1 Agent 2
a1 = 〈{a},a〉 1.0 0.0
a2 = 〈{b,b → ¬a},¬a〉 0.0 0.6
a3 = 〈{¬b},¬b〉 0.5 0.4
♥
Definition 65. For an argumentation framework AF = 〈A ,→〉 and a
probability assignment P, the epistemic extension is10
{a ∈ A | P(a) > 0.5}
♠
Definition 66 (From [Thi12; Hun13; BGV14]). Given an argumentation
framework 〈A ,→〉 and a probability function P:
COH P is coherent if for every a,b ∈ A , if a attacks b then P(a) ≤ 1−P(b).
SFOU P is semi-founded if P(a) ≥ 0.5 for every unattacked a ∈ A .
FOU P is founded if P(a) = 1 for every unattacked a ∈ A .
SOPT P is semi-optimistic if P(a) ≥ 1 − ∑b∈a⁻ P(b) for every a ∈ A with at
least one attacker.
OPT P is optimistic if P(a) ≥ 1 − ∑b∈a⁻ P(b) for every a ∈ A .
JUS P is justifiable if P is coherent and optimistic.
TER P is ternary if P(a) ∈ {0,0.5,1} for every a ∈ A .
RAT P is rational if for every a,b ∈ A , if a attacks b then P(a) > 0.5
implies P(b) ≤ 0.5.
NEU P is neutral if P(a) = 0.5 for every a ∈ A .
INV P is involutary if for every a,b ∈ A , if a attacks b, then P(a) =
1− P(b).
Let the event “a is accepted” be denoted as a, and let Eac(S) =
{a | a ∈ S}. Then P is weakly p-justifiable iff ∀a ∈ A , ∀b ∈ a⁻, P(a) ≤ 1 −
P(b). ♠
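Several of the properties above are directly checkable for a concrete assignment. A sketch over the single-attack framework a → b (the framework and the two assignments are assumptions of this example):

```python
# Illustrative checks of some Def. 66 properties for given probability
# assignments over a tiny AF where a attacks b.

A = ["a", "b"]
att = [("a", "b")]

def attackers(x):
    return [s for s, t in att if t == x]

def coherent(P):       # COH: P(a) <= 1 - P(b) across each attack
    return all(P[s] <= 1 - P[t] for s, t in att)

def founded(P):        # FOU: unattacked arguments get probability 1
    return all(P[x] == 1 for x in A if not attackers(x))

def optimistic(P):     # OPT: P(x) >= 1 - sum of attacker probabilities
    return all(P[x] >= 1 - sum(P[b] for b in attackers(x)) for x in A)

P1 = {"a": 1.0, "b": 0.0}
P2 = {"a": 0.9, "b": 0.1}

assert coherent(P1) and founded(P1) and optimistic(P1)   # justifiable
assert coherent(P2) and not founded(P2) and not optimistic(P2)
```

Note that P2 fails OPT only because of the unattacked argument a: for arguments without attackers the optimistic bound degenerates to P(a) ≥ 1.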
Proposition 3 ([BGV14]). For every argumentation framework, there is
at least one P that is de Finetti-coherent [Fin74] and weakly p-justifiable. ♣
Definition 67 (Correspondences between probabilistic and classical semantics).
Restriction on complete probability function P Classical semantics
No restriction complete extensions
No arguments a such that P(a) = 0.5 stable
Maximal no. of a such that P(a) = 1 preferred
Maximal no. of a such that P(a) = 0 preferred
Maximal no. of a such that P(a) = 0.5 grounded
Minimal no. of a such that P(a) = 1 grounded
Minimal no. of a such that P(a) = 0 grounded
Minimal no. of a such that P(a) = 0.5 semi-stable
♠
4.4 Structural Approach [Hun14]
Definition 68 (Subframework). For G = 〈A ,→〉 and G ′ = 〈A ′,→′〉,
G ′ ⊑ G iff A ′ ⊆ A and →′ = {〈a,b〉 ∈ → | a,b ∈ A ′}. ♠
Definition 69 (Graphs giving an extension). For an argumentation framework
G = 〈A ,→〉, a set of arguments Γ ⊆ A , and a semantics σ,
Qσ(Γ) = {G ′ ⊑ G | G ′ ⊨σ Γ},
where G ′ ⊨σ Γ denotes that Γ is a σ extension of G ′. ♠
Definition 70 (Probability of a set being an extension). The probability
that a set of arguments Γ is a σ extension, denoted Pσ(Γ), is
Pσ(Γ) = ∑G′∈Qσ(Γ) P(G ′),
where P is a probability distribution over the subframeworks of G . ♠
University of Aberdeen, 2015 Page 39
Frameworks • A Computational Framework
Example 11.
Subframework Probability
G1 a ↔ b 0.09
G2 a 0.81
G3 b 0.01
G4 (empty) 0.09
PGR({a,b}) = 0.00
PGR({a}) = P(G2) = 0.81
PGR({b}) = P(G3) = 0.01
PGR({}) = P(G1)+ P(G4) = 0.18
♥
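Example 11 can be verified by computing the grounded extension of each subframework and summing the matching probabilities. A sketch (illustrative encoding; the grounded extension is obtained by iterating the characteristic function from the empty set):

```python
# Reproducing Example 11 (Defs. 69-70) for the grounded semantics.

def grounded(args, att):
    """Least fixed point of the characteristic function: an argument is
    in iff each of its attackers is attacked by the current set."""
    S = set()
    while True:
        nxt = {a for a in args
               if all(any((c, b) in att for c in S)
                      for (b, t) in att if t == a)}
        if nxt == S:
            return S
        S = nxt

subframeworks = [                                  # (args, attacks, prob.)
    ({"a", "b"}, {("a", "b"), ("b", "a")}, 0.09),  # G1: a <-> b
    ({"a"}, set(), 0.81),                          # G2
    ({"b"}, set(), 0.01),                          # G3
    (set(), set(), 0.09),                          # G4: empty graph
]

def p_grounded(target):
    return sum(p for args, att, p in subframeworks
               if grounded(args, att) == set(target))

assert abs(p_grounded({"a"}) - 0.81) < 1e-9
assert abs(p_grounded({"b"}) - 0.01) < 1e-9
assert abs(p_grounded(set()) - 0.18) < 1e-9   # G1 and G4 contribute
```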
4.5 A Computational Framework
Definition 71 ([LON12; Li15]). A Li-PAF is a tuple 〈A ,PA ,→,P→〉, where
〈A ,→〉 is an argumentation framework, PA : A → (0..1] and P→ : → → (0..1]. ♠
Definition 72 ([LON12; Li15]). Given a Li-PAF 〈A ,PA ,→,P→〉, AF^I =
〈A ^I,→^I〉 is said to be induced iff: A ^I ⊆ A ; →^I ⊆ → ∩ (A ^I × A ^I);
∀a ∈ A s.t. PA (a) = 1, a ∈ A ^I; and ∀〈a,b〉 ∈ → with PA (a) = PA (b) = 1,
if P→(〈a,b〉) = 1, then 〈a,b〉 ∈ →^I. ♠
Under an assumption of independence, the probability of an inducible
∆^I = 〈A ^I,→^I〉, denoted P^I_PrAF(∆^I), is given by:
P^I_PrAF(∆^I) = ∏a∈A^I PA (a) · ∏a∈A∖A^I (1 − PA (a)) · ∏〈a,b〉∈→^I P→(〈a,b〉) · ∏〈a,b〉∈(→∩(A^I×A^I))∖→^I (1 − P→(〈a,b〉))
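This product can be checked on a tiny Li-PAF. The numbers below are an assumption of this example: one certain argument a, one uncertain argument b, and one uncertain attack.

```python
# Illustrative sketch of the induced-framework probability under the
# independence assumption. PA: argument probabilities; PR: attack
# probabilities; an induced framework is a (args, atts) pair.

from math import prod

A = {"a", "b"}
R = {("a", "b")}
PA = {"a": 1.0, "b": 0.5}
PR = {("a", "b"): 0.8}

def p_induced(args, atts):
    # Attacks that could appear: both endpoints present in `args`.
    possible = {(s, t) for (s, t) in R if s in args and t in args}
    return (prod(PA[x] for x in args)
            * prod(1 - PA[x] for x in A - args)
            * prod(PR[e] for e in atts)
            * prod(1 - PR[e] for e in possible - atts))

# Three inducible frameworks (a must appear, since PA(a) = 1):
assert abs(p_induced({"a", "b"}, {("a", "b")}) - 0.4) < 1e-9
assert abs(p_induced({"a", "b"}, set()) - 0.1) < 1e-9
assert abs(p_induced({"a"}, set()) - 0.5) < 1e-9
```

As expected, the probabilities of all inducible frameworks sum to 1.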
This assumption is relaxed in [LON13; Li15] by relying on a bipolar
argumentation framework, i.e. the evidential argumentation framework [ON08].
A correspondence with ASPIC+ is also drawn in [Li15], see Figure 4.9.
Figure 4.9: [Li15]’s probabilistic argumentation architecture.
5 A novel synthesis: Collaborative
Intelligence Spaces (CISpaces)
Acknowledgement
This handout includes material from a number of collaborators including
Alice Toniolo and Timothy J. Norman. Main reference: [Ton+15].
5.1 Introduction
Problem
• Intelligence analysis is critical for making well-informed decisions
• Complexities in current military operations increase the amount of
information available to intelligence analysts
CISpaces (Collaborative Intelligence Spaces)
• A toolkit developed to support collaborative intelligence analysis
• CISpaces aims to improve situational understanding of evolving
situations
5.2 Intelligence Analysis
Definition 73 ([DCD11]). The directed and coordinated acquisition and
analysis of information to assess capabilities, intent and opportunities for
exploitation by leaders at all levels. ♠
Fig. 5.1 summarises the Pirolli and Card Model [PC05].
Table 5.1 illustrates the problems of individual analysis and how
collaborative analysis can improve it.
Figure 5.1: The Pirolli & Card Model [PC05]: a foraging loop (searching,
filtering and schematising information from external data sources into a
shoebox and an evidence file) feeding a sense-making loop (build case, tell
story, re-evaluate hypotheses).
Individual analysis: scattered information and noise; hard to make
connections; missing information; cognitive biases; missing expertise.
Collaborative analysis: more effective and reliable; brings together
different expertise and resources; prevents biases.
Table 5.1: Individual vs. Collaborative Analysis
Figure 5.2: Initial information assigned to Joe: illness among young and
elderly people in Kishshire caused by bacteria; an unidentified illness
affecting the local livestock in Kishshire, the rural area of Kish.
Figure 5.3: Further events happening in Kish: people and livestock
illness; a water test showing a bacteria in the water supply; an answer
to a POI request (“GER-MAN” seen in Kish); an explosion in Kish Hall
Hotel.
Example of Intelligence Analysis Process
Goal: discover potential threats in Kish
Analysts: Joe, Miles and Ella
What Joe knows is summarised by Figs. 5.2 and 5.3
Main critical points and possible conclusions during the analysis:
• Causes of water contamination → waterborne/non-waterborne
bacteria;
• POI responsible for water contamination;
• Causes of hotel explosion.
5.3 Reasoning with Evidence
• Identify what to believe happened from the claims constructed upon
information (the sensemaking process);
• Derive conclusions from data aggregated from explicitly requested
information (the crowdsourcing process);
• Assess what is credible according to the history of data manipula-
tion (the provenance reasoning process).
5.4 Arguments for Sensemaking
Formal Linkage for Semantics Computation
A CISpaces graph, WAT, can be transformed into a corresponding
ASPIC-based argumentation theory. An edge in CISpaces is represented
textually as →, an info/claim node is written pi, and a link node is referred
to as type, where type ∈ {Pro,Con}. Then, [p1,...,pn → Pro → pφ] indicates
that the Pro-link has p1,..., pn as incoming nodes and an outgoing node
pφ.
Definition 74. A WAT is a tuple 〈K, AS〉 such that AS = 〈L ,¯,R〉 is
constructed as follows:
• L is a propositional logic language, and a node corresponds to a
proposition p ∈ L . The WAT set of propositions is Lw.
• The set R is formed by rules ri ∈ R corresponding to Pro links
between nodes such that: [p1,..., pn → Pro → pφ] is converted to
ri : p1,..., pn ⇒ pφ
• The contrariness function between elements is defined as: i) if [p1 →
Con → p2] and [p2 → Con → p1], p1 and p2 are contradictory; ii)
[p1 → Con → p2] and p1 is the only premise of the Con link, then p1
is a contrary of p2; iii) if [p1, p3 → Con → p2] then a rule is added
such that p1 and p3 form an argument with conclusion ph against
p2, ri : p1, p3 ⇒ ph and ph is a contrary of p2. ♠
Definition 75. K is composed of propositions pi,
K = {pj, pi,...}, such that: i) let a set of rules r1,...,rn ∈ R indicate a
cycle such that for each pi that is the consequent of a rule r there exists a
rule r′ containing pi as antecedent; then pi ∈ K if pi is an info-node;
ii) otherwise, pi ∈ K if pi is not the consequent of any rule r ∈ R. ♠
An Example of Argumentation Schemes for Intelligence
Analysis
Intelligence analysis broadly consists of three components: Activities
(Act) including actions performed by actors, and events happening in the
world; Entities (Et) including actors as individuals or groups, and objects
such as resources; and Facts (Ft) including statements about the state of
the world regarding entities and activities.
A hypothesis in intelligence analysis is composed of activities and events
that show how the situation has evolved. The argument from cause to
effect (ArgCE) forms the basis of these hypotheses. The scheme, adapted
from [WRM08], is:
Argument from cause to effect
Premises:
• Typically, if C (either a fact Fti or an activity Acti) occurs, then E
(either a fact Fti or an activity Acti) will occur
• In this case, C occurs
Conclusions:
In this case E will occur
Critical questions:
CQCE1 Is there evidence for C to occur?
CQCE2 Is there a general rule for C causing E ?
CQCE3 Is the relationship between C and E causal?
CQCE4 Are there any exceptions to the causal rule that prevent the
effect E from occurring?
CQCE5 Has C happened before E ?
CQCE6 Is there any other C that caused E ?
Formally:
rCE : rule(R,C ,E ), occur(C ), before(C ,E ), ruletype(R,causal),
noexceptions(R) ⇒ occur(E )
5.5 Arguments for Provenance
Provenance can be used to annotate how, where, when and by whom some
information was produced [MM13]. Figure 5.4 depicts the core model for
Figure 5.4: PROV Data Model [MM13]: Entities, Activities and Actors
related by Used, WasGeneratedBy, WasAssociatedWith, ActedOnBehalfOf,
WasAttributedTo, WasDerivedFrom and WasInformedBy.
Figure 5.5: Provenance of Joe’s information
representing provenance, and Figure 5.5 shows an example of provenance
for the pieces of information for analyst Joe w.r.t. the water contamination
problem in Kish.
Patterns representing relevant provenance information that may warrant
the credibility of a datum can be integrated into the analysis by applying
the argument scheme for provenance (ArgPV) [Ton+14]:
Argument Scheme for Provenance
Premises:
• Given pj about activity Acti, entity Eti, or
fact Fti (ppv1)
• GP(pj) includes pattern Pm of p-entities Apv, p-activities Ppv,
p-agents Agpv involved in producing pj (ppv2)
• GP(pj) infers that information pj is true
(ppv3)
Conclusions:
Acti/Eti/Fti in pj may plausibly be true
(ppvcn)
Critical questions:
CQPV1 Is pj consistent with other information?
CQPV2 Is pj supported by evidence?
CQPV3 Does GP(pj) contain p-elements that lead
us not to believe pj?
CQPV4 Is there any other p-element that should
have been included in GP(pj) to infer that
pj is credible?
6 Implementations
Acknowledgement
This handout includes material from a number of collaborators including
Massimiliano Giacomin, Mauro Vallati, and Stefan Woltran.
A comprehensive survey was recently published in [Cha+15].
6.1 Ad Hoc Procedures
NAD-Alg [NDA12; NAD14]
6.2 Constraint Satisfaction Programming
A Constraint Satisfaction Problem (CSP) P [BS12; RBW08] is a triple
P = 〈X,D,C〉 such that:10
• X = 〈x1,...,xn〉 is a tuple of variables;
• D = 〈D1,...,Dn〉 is a tuple of domains such that ∀i, xi takes values in Di;
• C = 〈C1,...,Ct〉 is a tuple of constraints, where ∀j, Cj = 〈RSj ,Sj〉,
with Sj ⊆ {xi | xi is a variable} the scope of the constraint and RSj
a relation over the domains of the variables in Sj.
A solution to the CSP P is A = 〈a1,...,an〉 where ∀i, ai ∈ Di and, ∀j, RSj
holds on the projection of A onto the scope Sj. If the set of solutions is
empty, the CSP is unsatisfiable.
CONArg2 [BS12]
In [BS12], the authors propose a mapping from AFs to CSPs.
Given an AF Γ, they first create a variable for each argument whose
domain is always {0,1} — ∀ai ∈ A ,∃xi ∈ X such that Di = {0,1}.
Subsequently, they describe constraints associated with the different
definitions of Dung’s argumentation framework: for instance, {a1,a2} ⊆ A is
D-conflict-free iff ¬(x1 = 1 ∧ x2 = 1).
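The mapping can be emulated by brute force in Python — an illustrative stand-in for a real CSP solver such as CONArg2, with a hypothetical three-argument framework:

```python
# One 0/1 variable per argument; an assignment is conflict-free iff no
# attack has both endpoints set to 1 (the [BS12] constraint). Exhaustive
# enumeration plays the role of the CSP solver here (sketch only).

from itertools import product

args = ["a1", "a2", "a3"]
att = [("a1", "a2"), ("a2", "a3")]

def conflict_free(assignment):
    return all(not (assignment[s] == 1 and assignment[t] == 1)
               for s, t in att)

solutions = [dict(zip(args, vals))
             for vals in product([0, 1], repeat=len(args))
             if conflict_free(dict(zip(args, vals)))]

# {a1, a3} is conflict-free; {a1, a2} is not.
assert {"a1": 1, "a2": 0, "a3": 1} in solutions
assert {"a1": 1, "a2": 1, "a3": 0} not in solutions
```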
6.3 Answer Set Programming
Answer Set Programming (ASP) [Fab13] is a declarative problem solving
paradigm. In ASP, representation is done using a rule-based language,
while reasoning is performed using implementations of general-purpose
algorithms, referred to as ASP solvers.
AspartixM [EGW10; Dvo+11]
AspartixM [Dvo+11] expresses argumentation semantics in Answer Set Programming (ASP): a single program is used to encode a particular argumentation semantics, and the instance of an argumentation framework is given as an input database. Tests for subset-maximality exploit the metasp optimisation frontend for the ASP package gringo/claspD.
Given an AF Γ, Aspartix encodes the requirements for a “semantics” (e.g. the D-conflict-free requirements) in an ASP program whose input database is:
{arg(a) | a ∈ A }∪{defeat(a1,a2) | 〈a1,a2〉 ∈→}
The following program fragment is thus used to check D-conflict-freeness [Dvo+11]:
πcf = { in(X) ← not out(X), arg(X);
out(X) ← not in(X), arg(X);
← in(X), in(Y), defeat(X,Y) }.
The encoding for the D-stable semantics additionally requires every argument left outside the guessed set to be defeated:
πST = { in(X) ← not out(X), arg(X);
out(X) ← not in(X), arg(X);
← in(X), in(Y), defeat(X,Y);
defeated(X) ← in(Y), defeat(Y,X);
← out(X), not defeated(X) }.
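The declarative reading of πST can be mimicked procedurally. The sketch below (my own illustration, not the Aspartix code) guesses an in/out partition, discards it if it is not conflict-free, and requires every "out" argument to be defeated:

```python
from itertools import combinations

def stable_extensions(args, attacks):
    """Brute-force analogue of the ASP program π_ST: every candidate
    set must be conflict-free, and every argument outside it must be
    attacked ('defeated') by some member of the set."""
    exts = []
    for r in range(len(args) + 1):
        for combo in combinations(args, r):
            e = set(combo)
            conflict_free = not any(a in e and b in e for (a, b) in attacks)
            all_out_defeated = all(
                any(a in e and b == c for (a, b) in attacks)
                for c in set(args) - e
            )
            if conflict_free and all_out_defeated:
                exts.append(e)
    return exts

# Example: mutual attack a <-> b, plus b -> c
exts = stable_extensions(["a", "b", "c"],
                         [("a", "b"), ("b", "a"), ("b", "c")])
```

On this example the stable extensions are {a,c} and {b}; an ASP solver arrives at the same answer sets via the guess-and-check structure of πST.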
6.4 Propositional Satisfiability Problems
In the propositional satisfiability problem (SAT) the goal is to determine whether a given Boolean formula is satisfiable. A variable assignment that satisfies a formula is a solution.
In SAT, formulae are commonly expressed in Conjunctive Normal Form
(CNF). A formula in CNF is a conjunction of clauses, where clauses are
disjunctions of literals, and a literal is either positive (a variable) or neg-
ative (the negation of a variable). If at least one of the literals in a clause
is true, then the clause is satisfied, and if all clauses in the formula are
satisfied then the formula is satisfied and a solution has been found.
PrefSAT [Cer+14b]
Requirements for a complete labelling as a CNF [Cer+14b]: for each argument ai ∈ A , three propositional variables are considered: Ii (which is true iff L ab(ai) = in), Oi (which is true iff L ab(ai) = out), Ui (which is true iff L ab(ai) = undec). Given |A | = k and a bijection φ : {1,...,k} → A :

⋀_{i∈{1,...,k}} [(Ii ∨ Oi ∨ Ui) ∧ (¬Ii ∨ ¬Oi) ∧ (¬Ii ∨ ¬Ui) ∧ (¬Oi ∨ ¬Ui)] (6.1)

⋀_{i | φ(i)⁻ = ∅} Ii (6.2)

⋀_{i | φ(i)⁻ ≠ ∅} [Ii ∨ ⋁_{j | φ(j)→φ(i)} ¬Oj] (6.3)

⋀_{i | φ(i)⁻ ≠ ∅} ⋀_{j | φ(j)→φ(i)} (¬Ii ∨ Oj) (6.4)

⋀_{i | φ(i)⁻ ≠ ∅} ⋀_{j | φ(j)→φ(i)} (¬Ij ∨ Oi) (6.5)

⋀_{i | φ(i)⁻ ≠ ∅} [¬Oi ∨ ⋁_{j | φ(j)→φ(i)} Ij] (6.6)

⋀_{i | φ(i)⁻ ≠ ∅} ⋀_{k | φ(k)→φ(i)} [Ui ∨ ¬Uk ∨ ⋁_{j | φ(j)→φ(i)} Ij] (6.7)

⋀_{i | φ(i)⁻ ≠ ∅} [⋀_{j | φ(j)→φ(i)} (¬Ui ∨ ¬Ij) ∧ (¬Ui ∨ ⋁_{j | φ(j)→φ(i)} Uj)] (6.8)

⋁_{i∈{1,...,k}} Ii (6.9)
As noted in [Cer+14b], the conjunction of the above formulae is redundant. However, the various non-redundant CNFs are not empirically equivalent [Cer+14b]: the overall performance is significantly affected by the chosen CNF encoding–SAT solver pair.
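To see what the CNF encodes, the following sketch (mine, for illustration; not the SAT-based procedure of [Cer+14b]) enumerates complete labellings by checking the labelling conditions directly: an argument is in iff all its attackers are out, out iff some attacker is in, and undec otherwise.

```python
from itertools import product

def complete_labellings(args, attacks):
    """Brute-force enumeration of complete labellings, checking the
    same conditions that clauses (6.1)-(6.8) express in CNF."""
    attackers = {a: [s for (s, t) in attacks if t == a] for a in args}
    labellings = []
    for labels in product(("in", "out", "undec"), repeat=len(args)):
        lab = dict(zip(args, labels))
        ok = True
        for a in args:
            atts = attackers[a]
            if lab[a] == "in" and not all(lab[b] == "out" for b in atts):
                ok = False  # 'in' requires every attacker 'out'
            if lab[a] == "out" and not any(lab[b] == "in" for b in atts):
                ok = False  # 'out' requires some attacker 'in'
            if lab[a] == "undec" and (any(lab[b] == "in" for b in atts)
                                      or all(lab[b] == "out" for b in atts)):
                ok = False  # 'undec' excludes both cases above
        if ok:
            labellings.append(lab)
    return labellings

# Example: mutual attack a <-> b
labs = complete_labellings(["a", "b"], [("a", "b"), ("b", "a")])
```

For the mutual attack there are three complete labellings: {a:in, b:out}, {a:out, b:in}, and the all-undec labelling, matching the grounded and the two preferred extensions.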
Algorithm 1 Enumerating the D-preferred extensions of an AF
PrefSAT(∆)
1: Input: ∆ = 〈A ,→〉
2: Output: Ep ⊆ 2^A
3: Ep := ∅
4: cnf := Π∆
5: repeat
6: cnfdf := cnf
7: prefcand := ε
8: repeat
9: lastcompfound := SatS(cnfdf)
10: if lastcompfound ≠ ε then
11: prefcand := lastcompfound
12: for a1 ∈ I-ARGS(lastcompfound) do
13: cnfdf := cnfdf ∧ I_{φ⁻¹(a1)}
14: end for
15: remaining := FALSE
16: for a1 ∈ A \ I-ARGS(lastcompfound) do
17: remaining := remaining ∨ I_{φ⁻¹(a1)}
18: end for
19: cnfdf := cnfdf ∧ remaining
20: end if
21: until (lastcompfound = ε ∨ I-ARGS(lastcompfound) = A )
22: if prefcand ≠ ε then
23: Ep := Ep ∪ {I-ARGS(prefcand)}
24: oppsolution := FALSE
25: for a1 ∈ A \ I-ARGS(prefcand) do
26: oppsolution := oppsolution ∨ I_{φ⁻¹(a1)}
27: end for
28: cnf := cnf ∧ oppsolution
29: end if
30: until (prefcand = ε)
31: if Ep = ∅ then
32: Ep := {∅}
33: end if
34: return Ep
Parallel-SCCp [Cer+14a; Cer+15]
Based on the SCC-Recursiveness Schema [BGG05].
[Figure: an example AF over arguments a–h, partitioned into its strongly connected components.]
Algorithm 1 Computing D-preferred labellings of an AF
P-PREF(∆)
1: Input: ∆ = 〈A ,→〉
2: Output: Ep ∈ 2^{L(∆)}
3: return P-SCC-REC(∆,A )

Algorithm 2 Greedy computation of base cases
GREEDY(L,C)
1: Input: L = (L1,...,Ln := {S^n_1,...,S^n_h}), C ⊆ A
2: Output: M = {...,(Si,Bi),...}
3: M := ∅
4: for S ∈ ⋃_{i=1}^{n} Li do in parallel
5: B := B-PR(∆↓S, S ∩ C)
6: M := M ∪ {(S,B)}
7: end for
8: return M

BOUNDCOND(∆,Si,L ab) returns (O, I) where O = {a1 ∈ Si | ∃a2 ∈ S ∩ a1⁻ : L ab(a2) = in} and I = {a1 ∈ Si | ∀a2 ∈ S ∩ a1⁻, L ab(a2) = out}, with S ≡ S1 ∪ ... ∪ S_{i−1}.
Algorithm 3 Determining the D-grounded labelling of an AF in a set C
GROUNDED(∆,C)
1: Input: ∆ = 〈A ,→〉, C ⊆ A
2: Output: (L ab,U) : U ⊆ A , L ab ∈ L_{A \U}
3: L ab := ∅
4: U := A
5: repeat
6: initialfound := ⊥
7: for a1 ∈ C do
8: if {a2 ∈ U | a2 → a1} = ∅ then
9: initialfound := ⊤
10: L ab := L ab ∪ {(a1,in)}
11: U := U \ {a1}
12: C := C \ {a1}
13: for a2 ∈ (U ∩ a1⁺) do
14: L ab := L ab ∪ {(a2,out)}
15: U := U \ {a2}
16: C := C \ {a2}
17: end for
18: end if
19: end for
20: until (¬initialfound)
21: return (L ab,U)
Algorithm 4 Computing D-preferred labellings of an AF in C
P-SCC-REC(∆,C)
1: Input: ∆ = 〈A ,→〉, C ⊆ A
2: Output: Ep ∈ 2^{L(∆)}
3: (L ab,U) := GROUNDED(∆,C)
4: Ep := {L ab}
5: ∆ := ∆↓U
6: L := (L1 := {S^1_1,...,S^1_k},...,Ln := {S^n_1,...,S^n_h}) = SCCS-LIST(∆)
7: M := {...,(Si,Bi),...} = GREEDY(L,C)
8: for l ∈ {1,...,n} do
9: El := {E^l_{S_1} := (),...,E^l_{S_k} := ()}
10: for S ∈ Ll do in parallel
11: for L ab ∈ Ep do in parallel
12: (O, I) := L-COND(∆,S,Ll,L ab)
13: if I = ∅ then
14: E^l_S[L ab] := {{(a1,out) | a1 ∈ O} ∪ {(a1,undec) | a1 ∈ S \ O}}
15: else
16: if I = S then
17: E^l_S[L ab] := B where (S,B) ∈ M
18: else
19: if O = ∅ then
20: E^l_S[L ab] := B-PR(∆↓S, I ∩ C)
21: else
22: E^l_S[L ab] := {{(a1,out) | a1 ∈ O}}
23: E^l_S[L ab] := E^l_S[L ab] ⊗ P-SCC-REC(∆↓_{S\O}, I ∩ C)
24: end if
25: end if
26: end if
27: end for
28: end for
29: for S ∈ Ll do
30: E′p := ∅
31: for L ab ∈ Ep do in parallel
32: E′p := E′p ∪ ({L ab} ⊗ E^l_S[L ab])
33: end for
34: Ep := E′p
35: end for
36: end for
37: return Ep
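The schema relies on decomposing the attack graph into strongly connected components and processing them along the induced partial order. A minimal sketch of that decomposition (my own illustration, not the parallel implementation of [Cer+14a; Cer+15]) using a Tarjan-style SCC computation:

```python
def scc_layers(args, attacks):
    """Compute the SCCs of the attack graph, then group them into
    layers such that each layer is attacked only by earlier layers,
    i.e. the order in which SCCS-LIST presents them."""
    graph = {a: [] for a in args}
    for s, t in attacks:
        graph[s].append(t)
    index, low, on_stack, stack = {}, {}, set(), []
    sccs, counter = [], [0]

    def strongconnect(v):  # Tarjan's algorithm
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph[v]:
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:  # v is the root of an SCC
            comp = set()
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.add(w)
                if w == v:
                    break
            sccs.append(frozenset(comp))

    for a in args:
        if a not in index:
            strongconnect(a)

    # layer i: SCCs whose external attackers all lie in layers < i
    comp_of = {a: c for c in sccs for a in c}
    layers, placed = [], set()
    while len(placed) < len(sccs):
        layer = [c for c in sccs if c not in placed
                 and all(comp_of[s] in placed or comp_of[s] == c
                         for (s, t) in attacks if t in c)]
        layers.append(layer)
        placed.update(layer)
    return layers

# Example: a <-> b, a -> c, c -> d
layers = scc_layers(["a", "b", "c", "d"],
                    [("a", "b"), ("b", "a"), ("a", "c"), ("c", "d")])
```

Here {a,b} forms the first layer, {c} the second, and {d} the third; P-SCC-REC computes labellings layer by layer, combining partial labellings with ⊗ and recursing only where needed.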
University of Aberdeen, 2015 Page 57

More Related Content

What's hot

Continuation calculus at Term Rewriting Seminar
Continuation calculus at Term Rewriting SeminarContinuation calculus at Term Rewriting Seminar
Continuation calculus at Term Rewriting Seminarbgeron
 
differentiate free
differentiate freedifferentiate free
differentiate freelydmilaroy
 
Presentation iaf 2014 v1
Presentation iaf 2014 v1Presentation iaf 2014 v1
Presentation iaf 2014 v1Fayçal Touazi
 
Epanaliptiko pros spiros_giannakaros_2021
Epanaliptiko pros spiros_giannakaros_2021Epanaliptiko pros spiros_giannakaros_2021
Epanaliptiko pros spiros_giannakaros_2021Christos Loizos
 
5.1 Defining and visualizing functions. Dynamic slides.
5.1 Defining and visualizing functions. Dynamic slides.5.1 Defining and visualizing functions. Dynamic slides.
5.1 Defining and visualizing functions. Dynamic slides.Jan Plaza
 
Postdoctoral research statement
Postdoctoral research statementPostdoctoral research statement
Postdoctoral research statementSusovan Pal
 
Best polynomial approximation
Best polynomial approximationBest polynomial approximation
Best polynomial approximationDadang Hamzah
 
Declare Your Language: Constraint Resolution 1
Declare Your Language: Constraint Resolution 1Declare Your Language: Constraint Resolution 1
Declare Your Language: Constraint Resolution 1Eelco Visser
 
Top school in noida
Top school in noidaTop school in noida
Top school in noidaEdhole.com
 
Existence and Uniqueness of Algebraic Closure
Existence and Uniqueness of Algebraic ClosureExistence and Uniqueness of Algebraic Closure
Existence and Uniqueness of Algebraic ClosureAyan Sengupta
 
Learning with Nets and Meshes
Learning with Nets and MeshesLearning with Nets and Meshes
Learning with Nets and MeshesDon Sheehy
 
Class1
 Class1 Class1
Class1issbp
 
Argumentation and Machine Learning: When the Whole is Greater than the Sum of...
Argumentation and Machine Learning: When the Whole is Greater than the Sum of...Argumentation and Machine Learning: When the Whole is Greater than the Sum of...
Argumentation and Machine Learning: When the Whole is Greater than the Sum of...Federico Cerutti
 

What's hot (18)

Continuation calculus at Term Rewriting Seminar
Continuation calculus at Term Rewriting SeminarContinuation calculus at Term Rewriting Seminar
Continuation calculus at Term Rewriting Seminar
 
differentiate free
differentiate freedifferentiate free
differentiate free
 
Prosomoiwsh 1 xenos
Prosomoiwsh 1 xenosProsomoiwsh 1 xenos
Prosomoiwsh 1 xenos
 
Presentation iaf 2014 v1
Presentation iaf 2014 v1Presentation iaf 2014 v1
Presentation iaf 2014 v1
 
Epanaliptiko pros spiros_giannakaros_2021
Epanaliptiko pros spiros_giannakaros_2021Epanaliptiko pros spiros_giannakaros_2021
Epanaliptiko pros spiros_giannakaros_2021
 
5.1 Defining and visualizing functions. Dynamic slides.
5.1 Defining and visualizing functions. Dynamic slides.5.1 Defining and visualizing functions. Dynamic slides.
5.1 Defining and visualizing functions. Dynamic slides.
 
Postdoctoral research statement
Postdoctoral research statementPostdoctoral research statement
Postdoctoral research statement
 
Best polynomial approximation
Best polynomial approximationBest polynomial approximation
Best polynomial approximation
 
10.1.1.226.4381
10.1.1.226.438110.1.1.226.4381
10.1.1.226.4381
 
Declare Your Language: Constraint Resolution 1
Declare Your Language: Constraint Resolution 1Declare Your Language: Constraint Resolution 1
Declare Your Language: Constraint Resolution 1
 
Top school in noida
Top school in noidaTop school in noida
Top school in noida
 
Existence and Uniqueness of Algebraic Closure
Existence and Uniqueness of Algebraic ClosureExistence and Uniqueness of Algebraic Closure
Existence and Uniqueness of Algebraic Closure
 
Learning with Nets and Meshes
Learning with Nets and MeshesLearning with Nets and Meshes
Learning with Nets and Meshes
 
Au4103292297
Au4103292297Au4103292297
Au4103292297
 
Poster of ECAI 2020
Poster of ECAI 2020Poster of ECAI 2020
Poster of ECAI 2020
 
Class1
 Class1 Class1
Class1
 
Argumentation and Machine Learning: When the Whole is Greater than the Sum of...
Argumentation and Machine Learning: When the Whole is Greater than the Sum of...Argumentation and Machine Learning: When the Whole is Greater than the Sum of...
Argumentation and Machine Learning: When the Whole is Greater than the Sum of...
 
Limits BY ATC
Limits BY ATCLimits BY ATC
Limits BY ATC
 

Similar to Argumentation in Artificial Intelligence: 20 years after Dung's work. Right margin for notes

Math 150 fall 2020 homework 1 due date friday, october 15,
Math 150 fall 2020 homework 1 due date friday, october 15,Math 150 fall 2020 homework 1 due date friday, october 15,
Math 150 fall 2020 homework 1 due date friday, october 15,MARRY7
 
2015 CMS Winter Meeting Poster
2015 CMS Winter Meeting Poster2015 CMS Winter Meeting Poster
2015 CMS Winter Meeting PosterChelsea Battell
 
9 normalization
9 normalization9 normalization
9 normalizationGRajendra
 
Congruence Distributive Varieties With Compact Intersection Property
Congruence Distributive Varieties With Compact Intersection PropertyCongruence Distributive Varieties With Compact Intersection Property
Congruence Distributive Varieties With Compact Intersection Propertyfilipke85
 
The Chase in Database Theory
The Chase in Database TheoryThe Chase in Database Theory
The Chase in Database TheoryJan Hidders
 
Relations and function class xii copy
Relations and function class xii   copyRelations and function class xii   copy
Relations and function class xii copycsanjeive
 
ContextFreeGrammars.pptx
ContextFreeGrammars.pptxContextFreeGrammars.pptx
ContextFreeGrammars.pptxPEzhumalai
 
ContextFreeGrammars (1).pptx
ContextFreeGrammars (1).pptxContextFreeGrammars (1).pptx
ContextFreeGrammars (1).pptxviswanath kani
 
Context Free Languages by S.Mandal-1.ppt
Context Free Languages by S.Mandal-1.pptContext Free Languages by S.Mandal-1.ppt
Context Free Languages by S.Mandal-1.ppt1sonalishipu
 
proving triangles are congruent.docx
proving triangles are congruent.docxproving triangles are congruent.docx
proving triangles are congruent.docxJOHNFRITSGERARDMOMBA1
 
Differential Equations Assignment Help
Differential Equations Assignment HelpDifferential Equations Assignment Help
Differential Equations Assignment HelpMaths Assignment Help
 
Partial ordering in soft set context
Partial ordering in soft set contextPartial ordering in soft set context
Partial ordering in soft set contextAlexander Decker
 
DBMS FDs and Normalization.pptx
DBMS FDs and Normalization.pptxDBMS FDs and Normalization.pptx
DBMS FDs and Normalization.pptxSakshamLal3
 

Similar to Argumentation in Artificial Intelligence: 20 years after Dung's work. Right margin for notes (20)

Cerutti -- TAFA2013
Cerutti -- TAFA2013Cerutti -- TAFA2013
Cerutti -- TAFA2013
 
Cs501 fd nf
Cs501 fd nfCs501 fd nf
Cs501 fd nf
 
Math 150 fall 2020 homework 1 due date friday, october 15,
Math 150 fall 2020 homework 1 due date friday, october 15,Math 150 fall 2020 homework 1 due date friday, october 15,
Math 150 fall 2020 homework 1 due date friday, october 15,
 
2015 CMS Winter Meeting Poster
2015 CMS Winter Meeting Poster2015 CMS Winter Meeting Poster
2015 CMS Winter Meeting Poster
 
9 normalization
9 normalization9 normalization
9 normalization
 
Congruence Distributive Varieties With Compact Intersection Property
Congruence Distributive Varieties With Compact Intersection PropertyCongruence Distributive Varieties With Compact Intersection Property
Congruence Distributive Varieties With Compact Intersection Property
 
Assignment#16
Assignment#16Assignment#16
Assignment#16
 
The Chase in Database Theory
The Chase in Database TheoryThe Chase in Database Theory
The Chase in Database Theory
 
proving triangles are congruent
proving triangles are congruentproving triangles are congruent
proving triangles are congruent
 
Context free grammar
Context free grammarContext free grammar
Context free grammar
 
Relations and function class xii copy
Relations and function class xii   copyRelations and function class xii   copy
Relations and function class xii copy
 
18560 lecture6
18560 lecture618560 lecture6
18560 lecture6
 
ContextFreeGrammars.pptx
ContextFreeGrammars.pptxContextFreeGrammars.pptx
ContextFreeGrammars.pptx
 
ContextFreeGrammars (1).pptx
ContextFreeGrammars (1).pptxContextFreeGrammars (1).pptx
ContextFreeGrammars (1).pptx
 
Context Free Languages by S.Mandal-1.ppt
Context Free Languages by S.Mandal-1.pptContext Free Languages by S.Mandal-1.ppt
Context Free Languages by S.Mandal-1.ppt
 
proving triangles are congruent.docx
proving triangles are congruent.docxproving triangles are congruent.docx
proving triangles are congruent.docx
 
Differential Equations Assignment Help
Differential Equations Assignment HelpDifferential Equations Assignment Help
Differential Equations Assignment Help
 
Partial ordering in soft set context
Partial ordering in soft set contextPartial ordering in soft set context
Partial ordering in soft set context
 
DBMS FDs and Normalization.pptx
DBMS FDs and Normalization.pptxDBMS FDs and Normalization.pptx
DBMS FDs and Normalization.pptx
 
Context free grammer.ppt
Context free grammer.pptContext free grammer.ppt
Context free grammer.ppt
 

More from Federico Cerutti

Security of Artificial Intelligence
Security of Artificial IntelligenceSecurity of Artificial Intelligence
Security of Artificial IntelligenceFederico Cerutti
 
Introduction to Evidential Neural Networks
Introduction to Evidential Neural NetworksIntroduction to Evidential Neural Networks
Introduction to Evidential Neural NetworksFederico Cerutti
 
Human-Argumentation Experiment Pilot 2013: Technical Material
Human-Argumentation Experiment Pilot 2013: Technical MaterialHuman-Argumentation Experiment Pilot 2013: Technical Material
Human-Argumentation Experiment Pilot 2013: Technical MaterialFederico Cerutti
 
Probabilistic Logic Programming with Beta-Distributed Random Variables
Probabilistic Logic Programming with Beta-Distributed Random VariablesProbabilistic Logic Programming with Beta-Distributed Random Variables
Probabilistic Logic Programming with Beta-Distributed Random VariablesFederico Cerutti
 
Supporting Scientific Enquiry with Uncertain Sources
Supporting Scientific Enquiry with Uncertain SourcesSupporting Scientific Enquiry with Uncertain Sources
Supporting Scientific Enquiry with Uncertain SourcesFederico Cerutti
 
Introduction to Formal Argumentation Theory
Introduction to Formal Argumentation TheoryIntroduction to Formal Argumentation Theory
Introduction to Formal Argumentation TheoryFederico Cerutti
 
Argumentation in Artificial Intelligence: 20 years after Dung's work. Left ma...
Argumentation in Artificial Intelligence: 20 years after Dung's work. Left ma...Argumentation in Artificial Intelligence: 20 years after Dung's work. Left ma...
Argumentation in Artificial Intelligence: 20 years after Dung's work. Left ma...Federico Cerutti
 
Algorithm Selection for Preferred Extensions Enumeration
Algorithm Selection for Preferred Extensions EnumerationAlgorithm Selection for Preferred Extensions Enumeration
Algorithm Selection for Preferred Extensions EnumerationFederico Cerutti
 
Formal Arguments, Preferences, and Natural Language Interfaces to Humans: an ...
Formal Arguments, Preferences, and Natural Language Interfaces to Humans: an ...Formal Arguments, Preferences, and Natural Language Interfaces to Humans: an ...
Formal Arguments, Preferences, and Natural Language Interfaces to Humans: an ...Federico Cerutti
 
Argumentation Extensions Enumeration as a Constraint Satisfaction Problem: a ...
Argumentation Extensions Enumeration as a Constraint Satisfaction Problem: a ...Argumentation Extensions Enumeration as a Constraint Satisfaction Problem: a ...
Argumentation Extensions Enumeration as a Constraint Satisfaction Problem: a ...Federico Cerutti
 
A SCC Recursive Meta-Algorithm for Computing Preferred Labellings in Abstract...
A SCC Recursive Meta-Algorithm for Computing Preferred Labellings in Abstract...A SCC Recursive Meta-Algorithm for Computing Preferred Labellings in Abstract...
A SCC Recursive Meta-Algorithm for Computing Preferred Labellings in Abstract...Federico Cerutti
 
Cerutti-AT2013-Graphical Subjective Logic
Cerutti-AT2013-Graphical Subjective LogicCerutti-AT2013-Graphical Subjective Logic
Cerutti-AT2013-Graphical Subjective LogicFederico Cerutti
 
Cerutti-AT2013-Trust and Risk
Cerutti-AT2013-Trust and RiskCerutti-AT2013-Trust and Risk
Cerutti-AT2013-Trust and RiskFederico Cerutti
 
Cerutti--Introduction to Argumentation (seminar @ University of Aberdeen)
Cerutti--Introduction to Argumentation (seminar @ University of Aberdeen)Cerutti--Introduction to Argumentation (seminar @ University of Aberdeen)
Cerutti--Introduction to Argumentation (seminar @ University of Aberdeen)Federico Cerutti
 
Cerutti--Knowledge Representation and Reasoning (postgrad seminar @ Universit...
Cerutti--Knowledge Representation and Reasoning (postgrad seminar @ Universit...Cerutti--Knowledge Representation and Reasoning (postgrad seminar @ Universit...
Cerutti--Knowledge Representation and Reasoning (postgrad seminar @ Universit...Federico Cerutti
 
Cerutti--Verification of Crypto Protocols (postgrad seminar @ University of B...
Cerutti--Verification of Crypto Protocols (postgrad seminar @ University of B...Cerutti--Verification of Crypto Protocols (postgrad seminar @ University of B...
Cerutti--Verification of Crypto Protocols (postgrad seminar @ University of B...Federico Cerutti
 
Cerutti--PhD viva voce defence
Cerutti--PhD viva voce defenceCerutti--PhD viva voce defence
Cerutti--PhD viva voce defenceFederico Cerutti
 

More from Federico Cerutti (20)

Security of Artificial Intelligence
Security of Artificial IntelligenceSecurity of Artificial Intelligence
Security of Artificial Intelligence
 
Introduction to Evidential Neural Networks
Introduction to Evidential Neural NetworksIntroduction to Evidential Neural Networks
Introduction to Evidential Neural Networks
 
Human-Argumentation Experiment Pilot 2013: Technical Material
Human-Argumentation Experiment Pilot 2013: Technical MaterialHuman-Argumentation Experiment Pilot 2013: Technical Material
Human-Argumentation Experiment Pilot 2013: Technical Material
 
Probabilistic Logic Programming with Beta-Distributed Random Variables
Probabilistic Logic Programming with Beta-Distributed Random VariablesProbabilistic Logic Programming with Beta-Distributed Random Variables
Probabilistic Logic Programming with Beta-Distributed Random Variables
 
Supporting Scientific Enquiry with Uncertain Sources
Supporting Scientific Enquiry with Uncertain SourcesSupporting Scientific Enquiry with Uncertain Sources
Supporting Scientific Enquiry with Uncertain Sources
 
Introduction to Formal Argumentation Theory
Introduction to Formal Argumentation TheoryIntroduction to Formal Argumentation Theory
Introduction to Formal Argumentation Theory
 
Argumentation in Artificial Intelligence: 20 years after Dung's work. Left ma...
Argumentation in Artificial Intelligence: 20 years after Dung's work. Left ma...Argumentation in Artificial Intelligence: 20 years after Dung's work. Left ma...
Argumentation in Artificial Intelligence: 20 years after Dung's work. Left ma...
 
Algorithm Selection for Preferred Extensions Enumeration
Algorithm Selection for Preferred Extensions EnumerationAlgorithm Selection for Preferred Extensions Enumeration
Algorithm Selection for Preferred Extensions Enumeration
 
Formal Arguments, Preferences, and Natural Language Interfaces to Humans: an ...
Formal Arguments, Preferences, and Natural Language Interfaces to Humans: an ...Formal Arguments, Preferences, and Natural Language Interfaces to Humans: an ...
Formal Arguments, Preferences, and Natural Language Interfaces to Humans: an ...
 
Argumentation Extensions Enumeration as a Constraint Satisfaction Problem: a ...
Argumentation Extensions Enumeration as a Constraint Satisfaction Problem: a ...Argumentation Extensions Enumeration as a Constraint Satisfaction Problem: a ...
Argumentation Extensions Enumeration as a Constraint Satisfaction Problem: a ...
 
A SCC Recursive Meta-Algorithm for Computing Preferred Labellings in Abstract...
A SCC Recursive Meta-Algorithm for Computing Preferred Labellings in Abstract...A SCC Recursive Meta-Algorithm for Computing Preferred Labellings in Abstract...
A SCC Recursive Meta-Algorithm for Computing Preferred Labellings in Abstract...
 
Cerutti-AT2013-Graphical Subjective Logic
Cerutti-AT2013-Graphical Subjective LogicCerutti-AT2013-Graphical Subjective Logic
Cerutti-AT2013-Graphical Subjective Logic
 
Cerutti-AT2013-Trust and Risk
Cerutti-AT2013-Trust and RiskCerutti-AT2013-Trust and Risk
Cerutti-AT2013-Trust and Risk
 
Cerutti--Introduction to Argumentation (seminar @ University of Aberdeen)
Cerutti--Introduction to Argumentation (seminar @ University of Aberdeen)Cerutti--Introduction to Argumentation (seminar @ University of Aberdeen)
Cerutti--Introduction to Argumentation (seminar @ University of Aberdeen)
 
Cerutti--Knowledge Representation and Reasoning (postgrad seminar @ Universit...
Cerutti--Knowledge Representation and Reasoning (postgrad seminar @ Universit...Cerutti--Knowledge Representation and Reasoning (postgrad seminar @ Universit...
Cerutti--Knowledge Representation and Reasoning (postgrad seminar @ Universit...
 
Cerutti--TAFA 2011
Cerutti--TAFA 2011Cerutti--TAFA 2011
Cerutti--TAFA 2011
 
Cerutti--Verification of Crypto Protocols (postgrad seminar @ University of B...
Cerutti--Verification of Crypto Protocols (postgrad seminar @ University of B...Cerutti--Verification of Crypto Protocols (postgrad seminar @ University of B...
Cerutti--Verification of Crypto Protocols (postgrad seminar @ University of B...
 
Cerutti--ARGAIP 2010
Cerutti--ARGAIP 2010Cerutti--ARGAIP 2010
Cerutti--ARGAIP 2010
 
Cerutti--ECSQARU 2009
Cerutti--ECSQARU 2009Cerutti--ECSQARU 2009
Cerutti--ECSQARU 2009
 
Cerutti--PhD viva voce defence
Cerutti--PhD viva voce defenceCerutti--PhD viva voce defence
Cerutti--PhD viva voce defence
 

Recently uploaded

Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxDenish Jangid
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibitjbellavia9
 
Food safety_Challenges food safety laboratories_.pdf
Food safety_Challenges food safety laboratories_.pdfFood safety_Challenges food safety laboratories_.pdf
Food safety_Challenges food safety laboratories_.pdfSherif Taha
 
21st_Century_Skills_Framework_Final_Presentation_2.pptx
21st_Century_Skills_Framework_Final_Presentation_2.pptx21st_Century_Skills_Framework_Final_Presentation_2.pptx
21st_Century_Skills_Framework_Final_Presentation_2.pptxJoelynRubio1
 
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptxHMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptxmarlenawright1
 
Kodo Millet PPT made by Ghanshyam bairwa college of Agriculture kumher bhara...
Kodo Millet  PPT made by Ghanshyam bairwa college of Agriculture kumher bhara...Kodo Millet  PPT made by Ghanshyam bairwa college of Agriculture kumher bhara...
Kodo Millet PPT made by Ghanshyam bairwa college of Agriculture kumher bhara...pradhanghanshyam7136
 
Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Jisc
 
Fostering Friendships - Enhancing Social Bonds in the Classroom
Fostering Friendships - Enhancing Social Bonds  in the ClassroomFostering Friendships - Enhancing Social Bonds  in the Classroom
Fostering Friendships - Enhancing Social Bonds in the ClassroomPooky Knightsmith
 
This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.christianmathematics
 
How to Add New Custom Addons Path in Odoo 17
How to Add New Custom Addons Path in Odoo 17How to Add New Custom Addons Path in Odoo 17
How to Add New Custom Addons Path in Odoo 17Celine George
 
Single or Multiple melodic lines structure
Single or Multiple melodic lines structureSingle or Multiple melodic lines structure
Single or Multiple melodic lines structuredhanjurrannsibayan2
 
Interdisciplinary_Insights_Data_Collection_Methods.pptx
Interdisciplinary_Insights_Data_Collection_Methods.pptxInterdisciplinary_Insights_Data_Collection_Methods.pptx
Interdisciplinary_Insights_Data_Collection_Methods.pptxPooja Bhuva
 
Philosophy of china and it's charactistics
Philosophy of china and it's charactisticsPhilosophy of china and it's charactistics
Philosophy of china and it's charactisticshameyhk98
 
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỬ TUYỂN SINH TIẾNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...Nguyen Thanh Tu Collection
 
Towards a code of practice for AI in AT.pptx
Towards a code of practice for AI in AT.pptxTowards a code of practice for AI in AT.pptx
Towards a code of practice for AI in AT.pptxJisc
 
General Principles of Intellectual Property: Concepts of Intellectual Proper...
General Principles of Intellectual Property: Concepts of Intellectual  Proper...General Principles of Intellectual Property: Concepts of Intellectual  Proper...
General Principles of Intellectual Property: Concepts of Intellectual Proper...Poonam Aher Patil
 
Wellbeing inclusion and digital dystopias.pptx
Wellbeing inclusion and digital dystopias.pptxWellbeing inclusion and digital dystopias.pptx
Wellbeing inclusion and digital dystopias.pptxJisc
 
ICT role in 21st century education and it's challenges.
ICT role in 21st century education and it's challenges.ICT role in 21st century education and it's challenges.
ICT role in 21st century education and it's challenges.MaryamAhmad92
 
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptx
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptxCOMMUNICATING NEGATIVE NEWS - APPROACHES .pptx
COMMUNICATING NEGATIVE NEWS - APPROACHES .pptxannathomasp01
 
UGC NET Paper 1 Mathematical Reasoning & Aptitude.pdf
UGC NET Paper 1 Mathematical Reasoning & Aptitude.pdfUGC NET Paper 1 Mathematical Reasoning & Aptitude.pdf
UGC NET Paper 1 Mathematical Reasoning & Aptitude.pdfNirmal Dwivedi
 

Recently uploaded (20)

Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptxBasic Civil Engineering first year Notes- Chapter 4 Building.pptx
Basic Civil Engineering first year Notes- Chapter 4 Building.pptx
 
Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibit
 
Food safety_Challenges food safety laboratories_.pdf
Food safety_Challenges food safety laboratories_.pdfFood safety_Challenges food safety laboratories_.pdf
Food safety_Challenges food safety laboratories_.pdf
 
21st_Century_Skills_Framework_Final_Presentation_2.pptx
With a slight abuse of notation we define S → b ≡ ∃a ∈ S : a → b. Similarly, b → S ≡ ∃a ∈ S : b → a.

1.1 Principles for Extension-based Semantics: [BG07]

Definition 2. Given an argumentation framework AF = 〈A ,→〉, a set S ⊆ A is D-conflict-free, denoted as D-cf(S), if and only if ∄a,b ∈ S such that a → b. A semantics σ satisfies the D-conflict-free principle if and only if ∀AF, ∀E ∈ Eσ(AF), E is D-conflict-free. ♠

Definition 3. Given an argumentation framework AF = 〈A ,→〉, an argument a ∈ A is D-acceptable w.r.t. a set S ⊆ A if and only if ∀b ∈ A : b → a ⇒ S → b. The function FAF : 2^A → 2^A which, given a set S ⊆ A , returns the set of the D-acceptable arguments w.r.t. S, is called the D-characteristic function of AF. ♠

Definition 4. Given an argumentation framework AF = 〈A ,→〉, a set S ⊆ A is D-admissible (S ∈ AS(AF)) if and only if D-cf(S) and ∀a ∈ S, a is D-acceptable w.r.t. S. The set of all the D-admissible sets of AF is denoted as AS(AF). ♠

Let Dσ = {AF | Eσ(AF) ≠ ∅}.

Definition 5. A semantics σ satisfies the D-admissibility principle if and only if ∀AF ∈ Dσ, Eσ(AF) ⊆ AS(AF), namely ∀E ∈ Eσ(AF) it holds that: a ∈ E ⇒ (∀b ∈ A , b → a ⇒ E → b). ♠

Definition 6. Given an argumentation framework AF = 〈A ,→〉, a ∈ A and S ⊆ A , we say that a is D-strongly-defended by S (denoted as D-sd(a,S)) iff ∀b ∈ A : b → a, ∃c ∈ S \ {a} : c → b and D-sd(c, S \ {a}). ♠

Definition 7. A semantics σ satisfies the D-strong admissibility principle if and only if ∀AF ∈ Dσ, ∀E ∈ Eσ(AF) it holds that a ∈ E ⇒ D-sd(a,E). ♠

Definition 8. A semantics σ satisfies the D-reinstatement principle if and only if ∀AF ∈ Dσ, ∀E ∈ Eσ(AF) it holds that: (∀b ∈ A , b → a ⇒ E → b) ⇒ a ∈ E. ♠

Definition 9. A set of extensions E is D-I-maximal if and only if ∀E1,E2 ∈ E , if E1 ⊆ E2 then E1 = E2. A semantics σ satisfies the D-I-maximality principle if and only if ∀AF ∈ Dσ, Eσ(AF) is D-I-maximal. ♠

Definition 10. Given an argumentation framework AF = 〈A ,→〉, a non-empty set S ⊆ A is D-unattacked if and only if ∄a ∈ (A \ S) : a → S. The set of D-unattacked sets of AF is denoted as US(AF). ♠

Definition 11. Let AF = 〈A ,→〉 be an argumentation framework. The restriction of AF to S ⊆ A is the argumentation framework AF↓S = 〈S, → ∩ (S × S)〉. ♠

Definition 12. A semantics σ satisfies the D-directionality principle if and only if ∀AF = 〈A ,→〉, ∀S ∈ US(AF), AEσ(AF,S) = Eσ(AF↓S), where AEσ(AF,S) = {(E ∩ S) | E ∈ Eσ(AF)} ⊆ 2^S. ♠

1.2 Acceptability of Arguments [PV02; BG09a]

Definition 13. Given a semantics σ and an argumentation framework AF = 〈A ,→〉 with AF ∈ Dσ, an argument a ∈ A is:
• skeptically justified iff ∀E ∈ Eσ(AF), a ∈ E;
• credulously justified iff ∃E ∈ Eσ(AF), a ∈ E. ♠
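The basic notions above (attack, conflict-freeness, D-acceptability, D-admissibility; Definitions 1–4) translate almost literally into executable form. The following is a minimal, illustrative Python sketch, with arguments encoded as strings and the attack relation as a set of (attacker, attacked) pairs; the function names are my own, not taken from any argumentation library.

```python
# Sketch of Definitions 1-4: arguments are strings, the attack
# relation is a set of (attacker, attacked) pairs.

def attackers(att, a):            # a's attackers, the set a^-
    return {x for (x, y) in att if y == a}

def is_conflict_free(att, S):     # D-cf(S): no attack inside S
    return not any(x in S and y in S for (x, y) in att)

def is_acceptable(att, a, S):     # every attacker of a is attacked by S
    return all(any((c, b) in att for c in S) for b in attackers(att, a))

def is_admissible(att, S):        # conflict-free and self-defending
    return is_conflict_free(att, S) and all(is_acceptable(att, a, S) for a in S)

# Example: a -> b, b -> c
att = {("a", "b"), ("b", "c")}
print(is_admissible(att, {"a", "c"}))  # True: a defends c against b
print(is_admissible(att, {"b"}))       # False: b does not defend itself from a
```

Note that `is_acceptable` vacuously holds for unattacked arguments, exactly as in Definition 3.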
Definition 14. Given a semantics σ and an argumentation framework AF = 〈A ,→〉 with AF ∈ Dσ, an argument a ∈ A is:
• justified iff it is skeptically justified;
• defensible iff it is credulously justified but not skeptically justified;
• overruled iff it is not credulously justified. ♠

1.3 (Some) Semantics [Dun95]

Lemma 1 (Dung's Fundamental Lemma, [Dun95, Lemma 10]). Given an argumentation framework AF = 〈A ,→〉, let S ⊆ A be a D-admissible set of arguments, and a, b be arguments which are acceptable with respect to S. Then:
1. S′ = S ∪ {a} is D-admissible; and
2. b is D-acceptable with respect to S′. ♣

Theorem 1 ([Dun95, Theorem 11]). Given an argumentation framework AF = 〈A ,→〉, the set of all D-admissible sets of 〈A ,→〉 forms a complete partial order with respect to set inclusion. ♣

Definition 15 (Complete Extension). Given an argumentation framework AF = 〈A ,→〉, S ⊆ A is a D-complete extension iff S is D-conflict-free and S = FAF(S). CO denotes the complete semantics. ♠

Definition 16 (Grounded Extension). Given an argumentation framework AF = 〈A ,→〉, the grounded extension of AF is the least complete extension of AF. GR denotes the grounded semantics. ♠

Definition 17 (Preferred Extension). Given an argumentation framework AF = 〈A ,→〉, a preferred extension of AF is a maximal (w.r.t. set inclusion) complete extension of AF. PR denotes the preferred semantics. ♠

Definition 18. Given an argumentation framework AF = 〈A ,→〉 and S ⊆ A , S+ = {a ∈ A | ∃b ∈ S : b → a}. ♠

Definition 19 (Stable Extension). Given an argumentation framework AF = 〈A ,→〉, S ⊆ A is a stable extension of AF iff S is a preferred extension and S+ = A \ S. ST denotes the stable semantics. ♠
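Since the D-characteristic function is monotonic, the grounded extension of Definition 16 can be obtained by iterating FAF from the empty set until a fixpoint is reached. A hedged Python sketch (arguments as strings, attacks as pairs; names are illustrative, not from any library):

```python
# Sketch: the grounded extension as the least fixpoint of the
# D-characteristic function F_AF (Definitions 3, 15, 16).

def characteristic(args, att, S):
    """F_AF(S): arguments all of whose attackers are attacked by S."""
    return {a for a in args
            if all(any((c, b) in att for c in S)
                   for b in {x for (x, y) in att if y == a})}

def grounded(args, att):
    S = set()
    while True:
        T = characteristic(args, att, S)
        if T == S:
            return S
        S = T

# Example: a -> b, b -> c, c -> d
args = {"a", "b", "c", "d"}
att = {("a", "b"), ("b", "c"), ("c", "d")}
print(sorted(grounded(args, att)))  # ['a', 'c']
```

Here a is unattacked, so it is in; a attacks b, which is out; hence c is defended and in, and d, attacked by c, stays out.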
                        CO    GR    PR    ST
D-conflict-free         Yes   Yes   Yes   Yes
D-admissibility         Yes   Yes   Yes   Yes
D-strong admissibility  No    Yes   No    No
D-reinstatement         Yes   Yes   Yes   Yes
D-I-maximality          No    Yes   Yes   Yes
D-directionality        Yes   Yes   Yes   No

Table 1.1: Satisfaction of general properties by argumentation semantics [BG07; BCG11]

Figure 1.1: Relationships among argumentation semantics.

1.4 Labelling-Based Semantics Representation [Cam06]

Definition 20. Let ∆ = 〈A ,→〉 be an argumentation framework. A labelling Lab ∈ L(∆) is a complete labelling of ∆ iff it satisfies the following conditions for any a1 ∈ A :
• Lab(a1) = in ⇔ ∀a2 ∈ a1−, Lab(a2) = out;
• Lab(a1) = out ⇔ ∃a2 ∈ a1− : Lab(a2) = in. ♠

The grounded and preferred labellings can then be defined on the basis of complete labellings.

Definition 21. Let ∆ = 〈A ,→〉 be an argumentation framework. A labelling Lab ∈ L(∆) is the grounded labelling of ∆ if it is the complete labelling of ∆ minimizing the set of arguments labelled in, and it is a preferred labelling of ∆ if it is a complete labelling of ∆ maximizing the set of arguments labelled in. ♠

In order to show the connection between extensions and labellings, let us recall the definition of the function Ext2Lab, returning the labelling corresponding to a D-conflict-free set of arguments S.

Definition 22. Given an AF ∆ = 〈A ,→〉 and a D-conflict-free set S ⊆ A , the corresponding labelling Ext2Lab(S) is defined as Ext2Lab(S) ≡ Lab, where:
• Lab(a1) = in ⇔ a1 ∈ S;
• Lab(a1) = out ⇔ ∃a2 ∈ S s.t. a2 → a1;
• Lab(a1) = undec ⇔ a1 ∉ S ∧ ∄a2 ∈ S s.t. a2 → a1. ♠

[Cam06] shows that there is a bijective correspondence between the complete, grounded, preferred extensions and the complete, grounded, preferred labellings, respectively.

Proposition 1. Given an AF ∆ = 〈A ,→〉, Lab is a complete (grounded, preferred) labelling of ∆ if and only if there is a complete (grounded, preferred) extension S of ∆ such that Lab = Ext2Lab(S). ♣

The set of complete labellings of ∆ is denoted as LCO(∆), the set of preferred labellings as LPR(∆), while LGR(∆) denotes the set including the grounded labelling.

            σ = CO      σ = GR      σ = PR      σ = ST
EXISTSσ     trivial     trivial     trivial     NP-c
CAσ         NP-c        polynomial  NP-c        NP-c
SAσ         polynomial  polynomial  Π2^p-c      coNP-c
VERσ        polynomial  polynomial  coNP-c      polynomial
NEσ         NP-c        polynomial  NP-c        NP-c

Table 1.2: Complexity of decision problems by argumentation semantics [DW09]
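Ext2Lab (Definition 22) is a direct case analysis and can be sketched in a few lines of illustrative Python (names are my own; arguments are strings, attacks are pairs):

```python
# Sketch of Ext2Lab (Definition 22): map a conflict-free set S
# to a labelling over the arguments.

def ext2lab(args, att, S):
    lab = {}
    for a in args:
        if a in S:
            lab[a] = "in"
        elif any((b, a) in att for b in S):
            lab[a] = "out"
        else:
            lab[a] = "undec"
    return lab

# Example: a -> b, b -> c, with the conflict-free set S = {a}
args = ["a", "b", "c"]
att = {("a", "b"), ("b", "c")}
print(ext2lab(args, att, {"a"}))  # {'a': 'in', 'b': 'out', 'c': 'undec'}
```

The example also shows why Proposition 1 needs complete extensions on the extension side: {a} is admissible but not complete, and indeed the resulting labelling leaves c undec although all of c's attackers are out.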
1.5 Skepticism Relationships [BG09b]

E1 ⪯E E2 denotes that E1 is at least as skeptical as E2.

Definition 23. Let ⪯E be a skepticism relation between sets of extensions. The skepticism relation ⪯S between argumentation semantics is such that for any argumentation semantics σ1 and σ2, σ1 ⪯S σ2 iff ∀AF ∈ Dσ1 ∩ Dσ2, EAF(σ1) ⪯E EAF(σ2). ♠

Definition 24. Given two sets of extensions E1 and E2 of an argumentation framework AF:
• E1 ⪯E∩+ E2 iff ∀E2 ∈ E2, ∃E1 ∈ E1 : E1 ⊆ E2;
• E1 ⪯E∪+ E2 iff ∀E1 ∈ E1, ∃E2 ∈ E2 : E1 ⊆ E2. ♠

Lemma 2. Given two argumentation semantics σ1 and σ2, if for any argumentation framework AF, EAF(σ1) ⊆ EAF(σ2), then σ1 ⪯S∩+ σ2 and σ1 ⪯S∪+ σ2 (i.e. σ1 ⪯S⊕ σ2). ♣

Figure 1.2: the ⪯S⊕ relation for any argumentation framework (left) and for argumentation frameworks where stable extensions exist (right).

1.6 Signatures [Dun+14]

Let 𝔄 be a countably infinite domain of arguments, and

AF𝔄 = {〈A ,→〉 | A ⊆ 𝔄, → ⊆ A × A }.

Definition 25. The signature Σσ of a semantics σ is defined as Σσ = {σ(F) | F ∈ AF𝔄} (i.e. the collection of all possible sets of extensions an AF can possess under a semantics). ♠

Given 𝕊 ⊆ 2^𝔄, Args𝕊 = ⋃S∈𝕊 S, and Pairs𝕊 = {〈a,b〉 | ∃S ∈ 𝕊 s.t. {a,b} ⊆ S}. 𝕊 is called an extension-set if Args𝕊 is finite.

Definition 26. Let 𝕊 ⊆ 2^𝔄. 𝕊 is incomparable if ∀S, S′ ∈ 𝕊, S ⊆ S′ implies S = S′. ♠
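Looking back at Definition 24, the two skepticism relations between sets of extensions are plain quantified subset checks. An illustrative Python sketch, with extension-sets encoded as sets of frozensets (function names are my own):

```python
# Sketch of Definition 24 over sets of extensions.

def leq_cap(E1, E2):
    """E1 at least as skeptical (cap+): every E2-extension contains some E1-extension."""
    return all(any(A <= B for A in E1) for B in E2)

def leq_cup(E1, E2):
    """E1 at least as skeptical (cup+): every E1-extension is contained in some E2-extension."""
    return all(any(A <= B for B in E2) for A in E1)

GR = {frozenset()}                     # e.g. a unique, empty grounded extension
PR = {frozenset("a"), frozenset("b")}  # e.g. two preferred extensions
print(leq_cap(GR, PR), leq_cup(GR, PR))  # True True
```

The example mirrors the well-known fact that grounded is at least as skeptical as preferred, since the grounded extension is contained in every complete (hence every preferred) extension.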
Definition 27. An extension-set 𝕊 ⊆ 2^𝔄 is tight if ∀S ∈ 𝕊 and ∀a ∈ Args𝕊 it holds that: if S ∪ {a} ∉ 𝕊 then there exists a b ∈ S such that 〈a,b〉 ∉ Pairs𝕊. ♠

Definition 28. 𝕊 ⊆ 2^𝔄 is adm-closed if for each A,B ∈ 𝕊 the following holds: if 〈a,b〉 ∈ Pairs𝕊 for each a,b ∈ A ∪ B, then also A ∪ B ∈ 𝕊. ♠

Proposition 2. For each F ∈ AF𝔄:
• ST(F) is incomparable and tight;
• PR(F) is non-empty, incomparable and adm-closed. ♣

Theorem 2. The signatures for ST and PR are:
• ΣST = {𝕊 | 𝕊 is incomparable and tight};
• ΣPR = {𝕊 ≠ ∅ | 𝕊 is incomparable and adm-closed}. ♣
Consider

𝕊 = { {a,d,e}, {b,c,e}, {a,b,d} }.
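The properties of Definitions 26–28 can be checked mechanically on this extension-set. The following is an illustrative sketch (not from [Dun+14]); it reports that 𝕊 is incomparable but neither tight nor adm-closed, so by Theorem 2 no AF has exactly these stable, or exactly these preferred, extensions.

```python
from itertools import combinations

# The extension-set under consideration.
S = [{"a", "d", "e"}, {"b", "c", "e"}, {"a", "b", "d"}]
args = set().union(*S)
pairs = {frozenset((a, b)) for E in S for a in E for b in E}  # Pairs_S

def incomparable(S):
    return all(not (A < B or B < A) for A, B in combinations(S, 2))

def tight(S):
    # adding any a to any E either stays in S, or a is "foreign" to some b in E
    return all(E | {a} in S or any(frozenset((a, b)) not in pairs for b in E)
               for E in S for a in args)

def adm_closed(S):
    # whenever all pairs over A ∪ B occur in Pairs_S, A ∪ B must be in S
    return all(A | B in S
               for A, B in combinations(S, 2)
               if all(frozenset((a, b)) in pairs for a in A | B for b in A | B))

print(incomparable(S), tight(S), adm_closed(S))  # True False False
```

Tightness fails because b can be added to {a,d,e} with every pair 〈b,·〉 in Pairs𝕊, yet {a,b,d,e} ∉ 𝕊; the same union witnesses the failure of adm-closure.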
1.7 Decomposability and Transparency [Bar+14]

Definition 29. Given an argumentation framework AF = (A ,→), a labelling-based semantics σ associates with AF a subset of L(AF), denoted as Lσ(AF). ♠

Definition 30. Given AF = (A ,→) and a set Args ⊆ A , the input of Args, denoted as Args^inp, is the set {B ∈ A \ Args | ∃A ∈ Args, (B, A) ∈ →}; the conditioning relation of Args, denoted as Args^R, is defined as → ∩ (Args^inp × Args). ♠

Definition 31. An argumentation framework with input is a tuple (AF, I, LI, RI), including an argumentation framework AF = (A ,→), a set of arguments I such that I ∩ A = ∅, a labelling LI ∈ L(I) and a relation RI ⊆ I × A . A local function F assigns to any argumentation framework with input a (possibly empty) set of labellings of AF, i.e. F(AF, I, LI, RI) ∈ 2^L(AF). ♠

Definition 32. Given an argumentation framework with input (AF, I, LI, RI), the standard argumentation framework w.r.t. (AF, I, LI, RI) is defined as AF′ = (A ∪ I′, → ∪ R′I), where I′ = I ∪ {A′ | A ∈ out(LI)} and R′I = RI ∪ {(A′, A) | A ∈ out(LI)} ∪ {(A, A) | A ∈ undec(LI)}. ♠

Definition 33. Given a semantics σ, the canonical local function of σ (also called local function of σ) is defined as Fσ(AF, I, LI, RI) = {Lab↓A | Lab ∈ Lσ(AF′)}, where AF = (A ,→) and AF′ is the standard argumentation framework w.r.t. (AF, I, LI, RI). ♠

Definition 34. A semantics σ is complete-compatible iff the following conditions hold:
1. For any argumentation framework AF = (A ,→), every labelling L ∈ Lσ(AF) satisfies the following conditions:
   • if A ∈ A is initial, then L(A) = in;
   • if B ∈ A and there is an initial argument A which attacks B, then L(B) = out;
   • if C ∈ A is self-defeating, and there are no attackers of C besides C itself, then L(C) = undec.
2. For any set of arguments I and any labelling LI ∈ L(I), the argumentation framework AF′ = (I′, →′), where I′ = I ∪ {A′ | A ∈ out(LI)} and →′ = {(A′, A) | A ∈ out(LI)} ∪ {(A, A) | A ∈ undec(LI)}, admits a (unique) labelling, i.e. |Lσ(AF′)| = 1. ♠

Definition 35. A semantics σ is fully decomposable (or simply decomposable) iff there is a local function F such that for every argumentation framework AF = (A ,→) and every partition P = {P1, ..., Pn} of A , Lσ(AF) = U(P, AF, F), where

U(P, AF, F) = {LP1 ∪ ... ∪ LPn | LPi ∈ F(AF↓Pi, Pi^inp, (∪j=1...n, j≠i LPj)↓Pi^inp, Pi^R)}. ♠

Definition 36. A complete-compatible semantics σ is top-down decomposable iff for any argumentation framework AF = (A ,→) and any partition P = {P1, ..., Pn} of A , it holds that Lσ(AF) ⊆ U(P, AF, Fσ). ♠

Definition 37. A complete-compatible semantics σ is bottom-up decomposable iff for any argumentation framework AF = (A ,→) and any partition P = {P1, ..., Pn} of A , it holds that Lσ(AF) ⊇ U(P, AF, Fσ). ♠

                            CO    ST    GR    PR
Full decomposability        Yes   Yes   No    No
Top-down decomposability    Yes   Yes   Yes   Yes
Bottom-up decomposability   Yes   Yes   No    No

Table 1.3: Decomposability properties of argumentation semantics.
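One possible reading of the construction in Definition 32, as an illustrative Python sketch: a fresh, unattacked attacker is added for every out-labelled input argument, and a self-attack for every undec-labelled one (primed arguments are encoded by suffixing an apostrophe; none of these names come from [Bar+14]).

```python
# Sketch of Definition 32: the standard argumentation framework
# of an AF with input (args, att) plus (I, lab_I, R_I).

def standard_af(args, att, I, lab_I, R_I):
    I_prime = set(I) | {a + "'" for a in I if lab_I[a] == "out"}
    R_prime = set(R_I)
    R_prime |= {(a + "'", a) for a in I if lab_I[a] == "out"}  # defeat out args
    R_prime |= {(a, a) for a in I if lab_I[a] == "undec"}      # keep undec args undec
    return set(args) | I_prime, set(att) | R_prime

args, att = {"x"}, set()
A, R = standard_af(args, att, I={"i", "j"},
                   lab_I={"i": "out", "j": "undec"},
                   R_I={("i", "x")})
print(sorted(A))  # ['i', "i'", 'j', 'x']
```

Under any complete-compatible semantics the added machinery forces the input arguments to reproduce exactly the labelling LI, so that the labellings of AF′ restricted to A give the canonical local function of Definition 33.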
2 Argumentation Schemes

Argumentation schemes [WRM08] are reasoning patterns which generate arguments:
• deductive/inductive inferences that represent forms of common types of arguments used in everyday discourse, and in special contexts (e.g. legal argumentation);
• neither deductive nor inductive, but defeasible, presumptive, or abductive.

Moreover, an argument satisfying a pattern may not be very strong by itself, but may be strong enough to provide evidence to warrant rational acceptance of its conclusion, given that its premises are acceptable.

According to Toulmin [Tou58], such an argument can be plausible and thus accepted after a balance of considerations in an investigation or discussion moved forward as new evidence is being collected. The investigation can then move ahead, even under conditions of uncertainty and lack of knowledge, using the conclusions tentatively accepted.

2.1 An example: Walton et al.'s Argumentation Schemes for Practical Reasoning

    Suppose I am deliberating with my spouse on what to do with our pension investment fund — whether to buy stocks, bonds or some other type of investments. We consult with a financial adviser, an expert source of information who can tell us what is happening in the stock market, and so forth, at the present time [Wal97].

Premises for practical inference:
1. states that an agent ("I" or "my") has a particular goal;
2. states a means by which the agent can bring about that goal.

〈S0, S1, ..., Sn〉 represents a sequence of states of affairs that can be ordered temporally from earlier to later. A state of affairs is meant to be like a statement, but one describing some event or occurrence that can be brought about by an agent. It may be a human action, or it may be a natural event.
Practical Inference

Premises:
  Goal Premise: Bringing about Sn is my goal.
  Means Premise: In order to bring about Sn, I need to bring about Si.
Conclusion:
  Therefore, I need to bring about Si.
Critical questions:
  Other-Means Question: Are there alternative possible actions to bring about Si that could also lead to the goal?
  Best-Means Question: Is Si the best (or most favourable) of the alternatives?
  Other-Goals Question: Do I have goals other than Si whose achievement is preferable and that should have priority?
  Possibility Question: Is it possible to bring about Si in the given circumstances?
  Side Effects Question: Would bringing about Si have known bad consequences that ought to be taken into account?

2.2 AS and Dialogues

Dialogue for practical reasoning: all moves (propose, prefer, justify) are coordinated in a formal deliberation dialogue that has eight stages [HMP01].

1. Opening of the deliberation dialogue, and the raising of a governing question about what is to be done.
2. Discussion of: (a) the governing question; (b) desirable goals; (c) any constraints on the possible actions which may be considered; (d) perspectives by which proposals may be evaluated; and (e) any premises (facts) relevant to this evaluation.
3. Suggesting of possible action-options appropriate to the governing question.
4. Commenting on proposals from various perspectives.
5. Revising of: (a) the governing question, (b) goals, (c) constraints, (d) perspectives, and/or (e) action-options in the light of the comments presented; and the undertaking of any information-gathering or fact-checking required for resolution.
6. Recommending an option for action, and acceptance or non-acceptance of this recommendation by each participant.
7. Confirming acceptance of a recommended option by each participant.
8. Closing of the deliberation dialogue.

Proposals are initially made at stage 3, and then evaluated at stages 4, 5 and 6.

Especially at stage 5, much argumentation taking the form of practical reasoning would seem to be involved.

As discussed in [Wal06], there are three dialectical adequacy conditions for defining the speech act of making a proposal.

The Proponent's Requirement (Condition 1). The proponent puts forward a statement that describes an action and says that both proponent and respondent (or the respondent group) should carry out this action. The proponent is committed to carrying out that action: the statement has the logical form of the conclusion of a practical inference, and also expresses an attitude toward that statement.

The Respondent's Requirement (Condition 2). The statement is put forward with the aim of offering reasons of a kind that will lead the respondent to become committed to it.

The Governing Question Requirement (Condition 3). The job of the proponent is to overcome doubts or conflicts of opinions, while the job of the respondent is to express them. Thus the role of the respondent is to ask questions that cast the prudential reasonableness of the action in the statement into doubt, and to mount attacks (counter-arguments and rebuttals) against it.

Condition 3 relates to the global structure of the dialogue, whereas conditions 1 and 2 are more localised to the part where the proposal was made. Condition 3 relates to the global burden of proof [Wal14] and the roles of the two parties in the dialogue as a whole.

Speech acts [MP02], like making a proposal, are seen as types of moves in a dialogue that are governed by rules. Three basic characteristics of any type of move have to be defined:
1. the pre-conditions of the move;
2. the conditions defining the move itself;
3. the post-conditions that state the result of the move.

Preconditions
• At least two agents (proponent and opponent);
• A governing question;
• A set of statements (propositions);
• The proponent proposes the proposition to the respondent if and only if:
  1. there is a set of premises that the proponent is committed to, and that fit the premises of the argumentation scheme for practical reasoning;
  2. the proponent is advocating these premises, that is, he is making a claim that they are true or applicable in the case at issue;
  3. there is an inference from these premises fitting the argumentation scheme for practical reasoning; and
  4. the proposition is the conclusion of the inference.

The Defining Conditions
The central defining condition sets out the conditions defining the structure of the move of making a proposal.

  The Goal Statement: We have a goal G.
  The Means Statement: Bringing about p is necessary (or sufficient) for us to bring about G.
  Then the inference follows.
  The Proposal Statement: We should (practically ought to) bring about p.

Proposal Statement in the form of an AS

Premises:
  Goal Statement: We have a goal G.
  Means Statement: Bringing about p is necessary (or sufficient) for us to bring about G.
Conclusion:
  We should (practically ought to) bring about p.

The Post-Conditions
The central post-condition is the response condition. The proposal must be open to critical questioning by the opponent. The proponent should be open to answering doubts and objections corresponding to any one of the five critical questions for practical reasoning, as well as to counter-proposals, and is in charge of giving reasons why her proposal is better than the alternatives.

The response condition set by these critical questions helps to explain how and why the maker of a proposal needs to be open to questioning and to requests for justification.
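As a data-structure view of the material above, a scheme can be encoded as named premises, a conclusion, and the critical questions that may challenge an instance. This is a purely illustrative Python sketch; the class and field names are my own, not from [WRM08] or any argumentation toolkit.

```python
from dataclasses import dataclass, field

@dataclass
class Scheme:
    """An argumentation scheme: named premises, a conclusion,
    and the critical questions that can challenge an instance."""
    name: str
    premises: dict
    conclusion: str
    critical_questions: list = field(default_factory=list)

practical_reasoning = Scheme(
    name="Practical Inference",
    premises={
        "Goal Premise": "Bringing about Sn is my goal",
        "Means Premise": "In order to bring about Sn, I need to bring about Si",
    },
    conclusion="Therefore, I need to bring about Si",
    critical_questions=[
        "Are there alternative possible actions that could also lead to the goal?",
        "Is Si the best (or most favourable) of the alternatives?",
        "Do other goals have priority over Si?",
        "Is it possible to bring about Si in the given circumstances?",
        "Would bringing about Si have known bad consequences?",
    ],
)
print(len(practical_reasoning.critical_questions))  # 5
```

In a dialogue implementation each critical question would be the precondition of a challenge move against an instance of the scheme, matching the response condition discussed above.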
3 A Semantic-Web View of Argumentation

Acknowledgement
This handout includes material from a number of collaborators including Chris Reed. An overview can also be found in [Bex+13].

3.1 The Argument Interchange Format [Rah+11]

Figure 3.1: Original AIF Ontology [Che+06; Rah+11].

3.2 An Ontology of Arguments [Rah+11]

Please download Protégé from http://protege.stanford.edu/ and the AIF OWL version from http://www.arg.dundee.ac.uk/wp-content/uploads/AIF.owl

Figure 3.2: An argument network linking instances of argument and scheme components.

Figure 3.3: Examples of conflicts [Rah+11, Fig. 2].

Representation of the argument described in Figure 3.2:

___jobArg : PracticalReasoning_Inference
  fulfils(___jobArg, PracticalReasoning_Scheme)
  hasGoal_Premise(___jobArg, ___jobArgGoal)
  hasGoalPlan_Premise(___jobArg, ___jobArgGoalPlan)
  hasConclusion(___jobArg, ___jobArgConclusion)

___jobArgConclusion : EncouragedAction_Statement
  fulfils(___jobArgConclusion, EncouragedAction_Desc)
  claimText(___jobArgConclusion, "Therefore I need to bring about having a job")

___jobArgGoal : Goal_Statement
  fulfils(___jobArgGoal, Goal_Desc)
  claimText(___jobArgGoal, "Bringing about being rich is my goal")

___jobArgGoalPlan : GoalPlan_Statement
  fulfils(___jobArgGoalPlan, GoalPlan_Desc)
  claimText(___jobArgGoalPlan, "In order to bring about being rich I need to bring about having a job")
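The same individuals can be written down as plain subject–predicate–object triples and queried. This is an ad-hoc, illustrative Python encoding, not the AIF OWL serialisation itself:

```python
# Illustrative triple encoding of (part of) the ___jobArg network above.
triples = {
    ("___jobArg", "rdf:type", "PracticalReasoning_Inference"),
    ("___jobArg", "fulfils", "PracticalReasoning_Scheme"),
    ("___jobArg", "hasGoal_Premise", "___jobArgGoal"),
    ("___jobArg", "hasGoalPlan_Premise", "___jobArgGoalPlan"),
    ("___jobArg", "hasConclusion", "___jobArgConclusion"),
    ("___jobArgConclusion", "rdf:type", "EncouragedAction_Statement"),
    ("___jobArgConclusion", "claimText",
     "Therefore I need to bring about having a job"),
}

def objects(s, p):
    """All objects of triples with subject s and predicate p."""
    return {o for (s2, p2, o) in triples if s2 == s and p2 == p}

print(objects("___jobArg", "hasConclusion"))  # {'___jobArgConclusion'}
```

A library such as rdflib would play the same role with proper IRIs; the point here is only that an AIF argument network is, structurally, an ordinary RDF-style graph.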
Relevant portion of the AIF ontology

EncouragedAction_Statement
  EncouragedAction_Statement ⊑ Statement

GoalPlan_Statement
  GoalPlan_Statement ⊑ Statement

Goal_Statement
  Goal_Statement ⊑ Statement

I-node
  I-node ≡ Statement
  I-node ⊑ Node
  I-node ⊑ ¬S-node

Inference
  Inference ≡ RA-node
  Inference ⊑ ∃fulfils.Inference_Scheme
  Inference ⊑ ≥1 hasPremise.Statement
  Inference ⊑ Scheme_Application
  Inference ⊑ =1 hasConclusion.(Scheme_Application ⊔ Statement)

Inference_Scheme
  Inference_Scheme ⊑ Scheme
  Inference_Scheme ⊑ ≥1 hasPremise_Desc.Statement_Description
  Inference_Scheme ⊑ =1 hasConclusion_Desc.(Scheme ⊔ Statement_Description)

PracticalReasoning_Inference
  PracticalReasoning_Inference ≡ Presumptive_Inference ⊓ ∃hasConclusion.EncouragedAction_Statement ⊓ ∃hasGoalPlan_Premise.GoalPlan_Statement ⊓ ∃hasGoal_Premise.Goal_Statement

RA-node
  RA-node ≡ Inference
  RA-node ⊑ S-node

S-node
  S-node ≡ Scheme_Application
  S-node ⊑ Node
  S-node ⊑ ¬I-node

Scheme
  Scheme ⊑ Form
  Scheme ⊑ ¬Statement_Description

Scheme_Application
  Scheme_Application ≡ S-node
  Scheme_Application ⊑ ∃fulfils.Scheme
  Scheme_Application ⊑ Thing
  Scheme_Application ⊑ ¬Statement

Statement
  Statement ≡ NegStatement
  Statement ≡ I-node
  Statement ⊑ Thing
  Statement ⊑ ∃fulfils.Statement_Description
  Statement ⊑ ¬Scheme_Application

Statement_Description
  Statement_Description ⊑ Form
  Statement_Description ⊑ ¬Scheme

fulfils
  ∃fulfils.Thing ⊑ Node

hasConclusion_Desc
  ∃hasConclusion_Desc.Thing ⊑ Inference_Scheme

hasGoalPlan_Premise
  hasGoalPlan_Premise ⊑ hasPremise

hasGoal_Premise
  hasGoal_Premise ⊑ hasPremise

claimText
  ∃claimText.DatatypeLiteral ⊑ Statement
  Statement ⊑ ∀claimText.DatatypeString

Individuals of EncouragedAction_Desc
  EncouragedAction_Desc : Statement_Description
  formDescription(EncouragedAction_Desc, "A should be brought about")

Individuals of GoalPlan_Desc
  GoalPlan_Desc : Statement_Description
  formDescription(GoalPlan_Desc, "Bringing about B is the way to bring about A")

Individuals of Goal_Desc
  Goal_Desc : Statement_Description
  formDescription(Goal_Desc, "The goal is to bring about A")

Individuals of PracticalReasoning_Scheme
  PracticalReasoning_Scheme : PresumptiveInference_Scheme
  hasPremise_Desc(PracticalReasoning_Scheme, Goal_Desc)
  hasPremise_Desc(PracticalReasoning_Scheme, GoalPlan_Desc)
  hasConclusion_Desc(PracticalReasoning_Scheme, EncouragedAction_Desc)
4 Argumentation Frameworks: Graphs and Models

Acknowledgement
This handout includes material from a number of collaborators including (in alphabetic order):
• Pietro Baroni;
• Trevor J. M. Bench-Capon;
• Claudette Cayrol;
• Paul E. Dunne;
• Anthony Hunter;
• Hengfei Li;
• Sanjay Modgil;
• Nir Oren;
• Guillermo R. Simari.

4.1 Graphs

Value-Based Argumentation Framework [BA09]

Example 1 ([AB08], derived from [Col92; Chr00]). The situation involves two agents, called Hal and Carla, both of whom are diabetic. Hal, through no fault of his own, has lost his supply of insulin and urgently needs to take some to stay alive. Hal is aware that Carla has some insulin kept in her house, but Hal does not have permission to enter Carla's house. The question is whether Hal is justified in breaking into Carla's house and taking her insulin in order to save his life. Note that by taking Carla's insulin, Hal may be putting her life in jeopardy, since she will come to need that insulin herself. One possible response is that if Hal has money, he can compensate Carla so that her insulin can be replaced before she needs it. Alternatively, if Hal has no money but Carla does, she can replace her insulin herself, since her need is not immediately life threatening. There is, however, a serious problem if neither of them has money, since in that case Carla's life is really under threat.

Partial formalisation:
• a1 suggests that Hal should not take the insulin, thus allowing Carla to be alive (which promotes the value of Life for Carla, LC);
• a2 suggests that Hal should take the insulin and compensate Carla, thus both of them stay alive (which promotes the value of Life for Carla, and the Freedom — of using money — for Carla, FC);
• a3 suggests that Hal should take the insulin and that Carla should buy insulin, thus both of them stay alive (which promotes the value of Life for Carla, and the Freedom — of using money — for Hal, FH).

a2 defeats a1, a3 defeats a1, and a3 and a2 defeat each other. ♥

Figure 4.1: Graphical representation of Ex. 1.

Extended Argumentation Framework [Mod09]

Example 2 (From [Mod09]).
• a1: "Today will be dry in London since the BBC forecast sunshine";
• a2: "Today will be wet in London since CNN forecast rain";
• a3: "But the BBC are more trustworthy than CNN";
• a4: "However, statistically CNN are more accurate forecasters than the BBC";
• a5: "Basing a comparison on statistics is more rigorous and rational than basing a comparison on your instincts about their relative trustworthiness".

a1 and a2 are mutually conflicting; a3 is a preference in favour of a1, a4 is a preference in favour of a2. a3 and a4 are mutually conflicting. a5 is a preference in favour of a4. ♥
Figure 4.2: Graphical representation of Ex. 2. Figure 4.3: Graphical representation of Ex. 3.

AFRA: Argumentation Framework with Recursive Attacks [Bar+11; Bar+09]
Example 3 ([Bar+11; Bar+09]). Suppose Bob is deciding about his Christmas holidays. • a1: There is a last minute offer for Gstaad: therefore I should go to Gstaad; • a2: There is a last minute offer for Cuba: therefore I should go to Cuba; • a3: I do like to ski; • a4: The weather report informs that in Gstaad there were no snowfalls since one month: therefore it is not possible to ski in Gstaad; • a5: It is anyway possible to ski in Gstaad, thanks to a good amount of artificial snow. ♥
Definition 38 (AFRA). An Argumentation Framework with Recursive Attacks (AFRA) is a pair 〈A,R〉 where: • A is a set of arguments; • R is a set of attacks, namely pairs (a1,X) s.t. a1 ∈ A and (X ∈ R or X ∈ A). Given an attack α = (a1,X) ∈ R, we say that a1 is the source of α, denoted as src(α) = a1, and X is the target of α, denoted as trg(α) = X. When useful, we will denote an attack to an attack by explicitly showing all the recursive steps implied by its definition; for instance (a1,(a2,a3)) means (a1,α) where α = (a2,a3). ♠
Definition 39 (Semantics). Let Γ = 〈A,R〉 be an AFRA. A set S ⊆ A ∪ R is: • a complete extension if and only if S is admissible and every element of A ∪ R which is acceptable w.r.t. S belongs to S, i.e. FΓ(S) ⊆ S; • the grounded extension of Γ iff it is the least fixed point of FΓ; • a preferred extension of Γ iff it is a maximal (w.r.t. set inclusion) admissible set; • a stable extension of Γ if and only if S is conflict-free and ∀V ∈ A ∪ R, V ∉ S, ∃α ∈ S s.t. α →R V. ♠
Theorem 3.
In the case where an AFRA is also an AF, a bijective correspondence between the semantic notions according to the two formalisms holds. ♣ Theorem 4.
  • 39. Moreover, in the case where an AFRA is not an AF, it is possible to rewrite it as an AF with extra arguments. ♣15 Bipolar Argumentation Framework [CL05] Example 4 ([CL05, Example 1]). A murder has been performed and the suspects are Liz, Mary and Peter. The following pieces of information have been gathered: • The type of murder suggests us that the killer is a female (f );20 • The killer is certainly small (s); • Liz is tall and Mary and Peter are small; • The killer has long hair and uses a lipstick (l); • A witness claims that he saw the killer who was tall; • The witness is reliable (w);25 • Moreover we are told that the witness is short-sighted, so he is no more reliable (b). The following arguments can be formed: • a1 in favour of m, with premises {s, f ,(s∧ f ) → m}; • a2 in favour of ¬s, with premises {w,w → ¬s};30 • a3 in favour of ¬w, with premises {b,b → ¬w}; • a4 in favour of f , with premises {l,l → f } University of Aberdeen, 2015 Page 26
Figure 4.4: Graphical representation of Ex. 4: rounded arrows represent the support relationship.
a3 defeats a2; a2 defeats a1. But the argument a4 confirms one of the premises of a1, thus strengthening it. ♥

4.2 Deterministic Structured Argumentation

Defeasible Logic Programming (DeLP) [Sim89; SL92; GS04; GS14]
A defeasible logic program (DeLP) is a set of: • facts, i.e. ground literals representing atomic information or the negation of atomic information using strong negation ¬; • strict rules, L0 ←− L1,...,Ln, representing non-defeasible information; L0 is the head, and the body {Li}i>0 is a non-empty set of ground literals; • defeasible rules, L0 −< L1,...,Ln, representing tentative information; L0 is the head, and the body {Li}i>0 is a non-empty set of ground literals.
A DeLP program is denoted by 〈Π,∆〉, where Π is the subset of non-defeasible knowledge (strict rules and facts) and ∆ is the subset of defeasible knowledge.
A defeasible derivation of a literal Q from a DeLP program 〈Π,∆〉, denoted 〈Π,∆〉 |∼ Q, is a finite sequence of ground literals L1,L2,...,Ln = Q where either: 1. Li is a fact; or 2. there exists a rule Ri in 〈Π,∆〉 (either strict or defeasible) with head Li and body B1,...,Bk, and every literal of the body is an element Lj of the sequence appearing before Li (j < i). A derivation from 〈Π,∅〉 is called a strict derivation.
Definition 40. Let H be a ground literal, 〈Π,∆〉 a DeLP program, and A ⊆ ∆. The pair 〈A,H〉 is an argument structure if: • (1) there exists a defeasible derivation for H from 〈Π,A〉; • (2) there are no defeasible derivations from 〈Π,A〉 of contradictory literals;
• (3) there is no proper subset A′ ⊂ A such that A′ satisfies (1) and (2). ♠
Definition 41. An argument 〈B,S〉 is a counter-argument for 〈A,H〉 at literal P, if there exists a sub-argument 〈C,P〉 of 〈A,H〉 such that P and S disagree, that is, there exist two contradictory literals that have a strict derivation from Π ∪ {S,P}. The literal P is referred to as the counter-argument point and 〈C,P〉 as the disagreement sub-argument. ♠
Let us assume an argument comparison criterion ≻.
Definition 42. Let 〈B,S〉 be a counter-argument for 〈A,H〉 at point P, and 〈C,P〉 the disagreement sub-argument. If 〈B,S〉 ≻ 〈C,P〉, then 〈B,S〉 is a proper defeater for 〈A,H〉. If 〈B,S〉 ⊁ 〈C,P〉 and 〈C,P〉 ⊁ 〈B,S〉, then 〈B,S〉 is a blocking defeater for 〈A,H〉. 〈B,S〉 is a defeater for 〈A,H〉 if 〈B,S〉 is either a proper or a blocking defeater for 〈A,H〉. ♠
Example 5. Let 〈Π1,∆1〉 be a DeLP program such that:
Π1 = { monday; cloudy; dry_season; waves; grass_grown; hire_gardener; vacation; ¬working ←− vacation; few_surfers ←− ¬many_surfers; ¬surf ←− ill }
∆1 = { surf −< nice, spare_time; nice −< waves; spare_time −< ¬busy; ¬busy −< ¬working; ¬nice −< rain; rain −< cloudy; ¬rain −< dry_season; ... }
From 〈Π1,∆1〉, these are some of the arguments that can be derived:
〈A0,surf〉 = 〈{surf −< nice, spare_time; nice −< waves; spare_time −< ¬busy; ¬busy −< ¬working}, surf〉
〈A1,¬nice〉 = 〈{¬nice −< rain; rain −< cloudy}, ¬nice〉
〈A2,nice〉 = 〈{nice −< waves}, nice〉
〈A3,rain〉 = 〈{rain −< cloudy}, rain〉
〈A4,¬rain〉 = 〈{¬rain −< dry_season}, ¬rain〉
Figure 4.5: Arguments and their interactions from Example 5.
〈A9,¬busy〉 = 〈{¬busy −< ¬working}, ¬busy〉 ♥

Assumption-Based Argumentation (ABA) [BTK93; Bon+97; Ton12; Ton14; DT10]
Definition 43. An ABA is a tuple 〈L,R,A,‾〉 where: • 〈L,R〉 is a deductive system, with L the language and R a set of rules, that we assume of the form σ0 ←− σ1,...,σm (m ≥ 0), with σi ∈ L; σ0 is referred to as the head and σ1,...,σm as the body of the rule σ0 ←− σ1,...,σm; • A ⊆ L is a (non-empty) set, referred to as assumptions; • ‾ is a total mapping from A into L; ā is referred to as the contrary of a. ♠
Definition 44. A deduction for σ ∈ L supported by S ⊆ L and R′ ⊆ R, denoted as S ⊢R′ σ, is a (finite) tree with nodes labelled by sentences in L or by τ ∉ L, the root labelled by σ, leaves either τ or sentences in S, non-leaves σ′ with, as children, the elements of the body of some rule in R with head σ′, and R′ the set of all such rules. ♠
Definition 45. An argument for the claim σ ∈ L supported by A ⊆ A (A ⊢ σ) is a deduction for σ supported by A (and some R ⊆ R). ♠
Definition 46. An argument A1 ⊢ σ1 attacks an argument A2 ⊢ σ2 iff σ1 is the contrary of one of the assumptions in A2. ♠
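The notions of deduction and attack can be sketched operationally for flat ABA: claims are derived from a set of assumptions by forward chaining, and one assumption set attacks another when it derives the contrary of one of the latter's assumptions. The rule and contrary names below are a simplified, illustrative echo of Example 6, not the example itself.

```python
# A rough operational sketch of flat ABA (Defs. 43-46): deductions by
# forward chaining, attacks determined by contraries. Strings stand in
# for sentences; rules with empty bodies act as facts.

def derivable(assumptions, rules):
    """Claims deducible from the assumptions via the rules (cf. Def. 44)."""
    known = set(assumptions)
    changed = True
    while changed:
        changed = False
        for head, body in rules:
            if head not in known and all(b in known for b in body):
                known.add(head)
                changed = True
    return known

def attacks(A1, A2, rules, contrary):
    """A1 attacks A2 iff A1 derives the contrary of an assumption in A2."""
    return any(contrary[a] in derivable(A1, rules) for a in A2)

rules = [("innocent", ("notGuilty",)),
         ("killer", ("dna_evidence",)),
         ("unreliable", ())]  # a fact: the evidence is unreliable
contrary = {"notGuilty": "killer", "dna_evidence": "unreliable"}

print(attacks({"dna_evidence"}, {"notGuilty"}, rules, contrary))  # True
print(attacks(set(), {"dna_evidence"}, rules, contrary))          # True
```

The second call shows that even the empty assumption set attacks {dna_evidence}, because the unreliability fact derives the contrary of that assumption without needing any assumption of its own.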
Figure 4.6: Graphical representation of Ex. 6.
Example 6. R = { innocent(X) ←− notGuilty(X); killer(oj) ←− DNAshows(oj), DNAshows(X) ⊃ killer(X); DNAshows(X) ⊃ killer(X) ←− DNAfromReliableEvidence(X); evidenceUnreliable(X) ←− collected(X,Y), racist(Y); DNAshows(oj) ←−; collected(oj,mary) ←−; racist(mary) ←− }
A = { notGuilty(oj); DNAfromReliableEvidence(oj) }
Moreover, the contrary of notGuilty(oj) is killer(oj), and the contrary of DNAfromReliableEvidence(oj) is evidenceUnreliable(oj). ♥

ASPIC+ [Pra10; MP13; MP14]
Given a logical language L and a set of strict or defeasible inference rules — resp. ϕ1,...,ϕn −→ ϕ and ϕ1,...,ϕn =⇒ ϕ — a strict inference always holds, i.e. if the antecedents ϕ1,...,ϕn hold, the consequent ϕ holds as well, while a defeasible inference "usually" holds. Arguments are constructed w.r.t. a knowledge base with two types of formulae.
Definition 47. An argumentation system is a tuple AS = 〈L,‾,R,ν〉 where: • ‾ : L → 2^L is a contrariness function s.t. if ϕ ∈ ψ̄ and: – ψ ∉ ϕ̄, then ϕ is a contrary of ψ; – ψ ∈ ϕ̄, then ϕ is a contradictory of ψ (ϕ = −ψ); • R = Rd ∪ Rs is a set of strict (Rs) and defeasible (Rd) inference rules such that Rd ∩ Rs = ∅; • ν : Rd → L is a partial function (informally, ν(r) is a wff in L which says that the defeasible rule r is applicable).
For any P ⊆ L, Cl(P) denotes the closure of P under strict rules, viz. the smallest set containing P and any consequent of any strict rule in Rs whose antecedents are in Cl(P). P ⊆ L is consistent iff there are no ϕ,ψ ∈ P s.t. ϕ ∈ ψ̄; otherwise it is inconsistent. A knowledge base in an AS is a set Kn ∪ Kp = K ⊆ L, where {Kn,Kp} is a partition of K; Kn contains axioms that cannot be attacked; Kp contains ordinary premises that can be attacked. An argumentation theory is a pair AT = 〈AS,K〉. ♠ Definition 48.
An argument a on the basis of an AT = 〈AS,K〉, AS = 〈L,‾,R,ν〉, is:
1. ϕ if ϕ ∈ K, with: Prem(a) = {ϕ}; Conc(a) = ϕ; Sub(a) = {ϕ}; Rules(a) = DefRules(a) = ∅; TopRule(a) = undefined.
2. a1,...,an −→/=⇒ ψ if a1,...,an, with n ≥ 0, are arguments such that there exists a strict/defeasible rule r = Conc(a1),...,Conc(an) −→/=⇒ ψ in Rs/Rd, with: Prem(a) = ⋃_{i=1..n} Prem(ai); Conc(a) = ψ; Sub(a) = ⋃_{i=1..n} Sub(ai) ∪ {a}; Rules(a) = ⋃_{i=1..n} Rules(ai) ∪ {r}; DefRules(a) = {d | d ∈ Rules(a) ∩ Rd}; TopRule(a) = r.
a is strict if DefRules(a) = ∅, otherwise defeasible; firm if Prem(a) ⊆ Kn, otherwise plausible. P ⊢ ϕ iff there exists a strict argument a s.t. Conc(a) = ϕ and P ⊇ Prem(a). ♠
An argument can be attacked in its premises (undermining), conclusion (rebuttal), or inference step (undercut). The definition of defeat takes into account an argument ordering ⪯: a ⪯ b iff a is "less preferred" than b (a ≺ b iff a ⪯ b and not b ⪯ a). Definition 49.
Given arguments a and b, a defeats b iff a undercuts, successfully rebuts or successfully undermines b, where: • a undercuts b (on b′) iff Conc(a) is a contrary of ν(r) for some b′ ∈ Sub(b) s.t. r = TopRule(b′) ∈ Rd; • a successfully rebuts b (on b′) iff Conc(a) is a contrary of ϕ for some b′ ∈ Sub(b) of the form b1,...,bn =⇒ ϕ, and a ⊀ b′; • a successfully undermines b (on ϕ) iff Conc(a) is a contrary of ϕ, ϕ ∈ Prem(b) ∩ Kp, and a ⊀ ϕ. ♠
Definition 50. AF is the abstract argumentation framework defined by AT = 〈AS,K〉, AS = 〈L,‾,R,ν〉, if A is the smallest set of all finite arguments constructed from K satisfying Def. 48, and → is the defeat relation on A as defined in Def. 49. ♠
Definition 51 (Rationality postulates [CA07; MP14]). Given ∆, an AF defined by an AT, and a semantics σ, ∀S ∈ E∆(σ), ∆ satisfies: P1: direct consistency iff {Conc(a) | a ∈ S} is consistent; P2: indirect consistency iff Cl({Conc(a) | a ∈ S}) is consistent; P3: closure iff {Conc(a) | a ∈ S} = Cl({Conc(a) | a ∈ S}); P4: sub-argument closure iff ∀a ∈ S, Sub(a) ⊆ S. ♠
Note that P2 follows from P1 and P3.
An AT satisfies the postulates (i.e. it is Well-Formed) iff (let us consider classical negation here instead of the contrariness function) [MP13; MP14]: • it is closed under transposition2 or under contraposition;3 • Cl(Kn) is consistent; • the argument ordering ⪯ is reasonable, namely: – ∀a,b, if a is strict and firm, and b is plausible or defeasible, then b ≺ a; – ∀a,b, if b is strict and firm, then b ⊀ a; – ∀a,a′,b such that a′ is a strict continuation of {a}, if a ⊀ b then a′ ⊀ b, and if b ⊀ a, then b ⊀ a′; – given a finite set of arguments {a1,...,an}, let a+i be some strict continuation of {a1,...,ai−1,ai+1,...,an}; then it is not the case that ∀i, a+i ≺ ai.
An argument a is a strict continuation of a set of arguments {a1,...,an} iff (Prem(a) ∩ Kp) = ⋃_{i=1..n} (Prem(ai) ∩ Kp); DefRules(a) = ⋃_{i=1..n} DefRules(ai); Rules(a) ⊇ ⋃_{i=1..n} Rules(ai); and (Prem(a) ∩ Kn) ⊆ ⋃_{i=1..n} (Prem(ai) ∩ Kn).
Example 7. It is well known that (1) birds normally fly; while (2) penguins are known not to fly, although (3) all penguins are birds. In these terms, one can say that (4) penguins are abnormal birds with respect to flying. (5) Tweety is observed to be a penguin, and (6) animals that are observed to be penguins normally are penguins.
2If ϕ1,...,ϕn −→ ψ ∈ Rs, then ∀i = 1...n, ϕ1,...,ϕi−1,¬ψ,ϕi+1,...,ϕn −→ ¬ϕi ∈ Rs.
3∀P ⊆ L, l ∈ P: if P ⊢ ϕ, then (P \ {l}) ∪ {¬ϕ} ⊢ ¬l.
d1 : bird =⇒ canfly; d2 : penguin =⇒ ¬canfly; d3 : observed_penguin =⇒ penguin; f1 : penguin ⊃ bird; f2 : penguin ⊃ ¬d1; f3 : observed_penguin. The 2If ϕ1,...,ϕn −→ ψ ∈ Rs, then ∀i = 1...n, ϕ1,...,ϕi−1,¬ψ,ϕi+1,...,ϕn =⇒ ¬ϕi ∈ Rs. 3∀P ⊆ L , l ∈ P , if P A ϕ, then P {l}∪{¬ϕ} A ¬l University of Aberdeen, 2015 Page 32
derived arguments are: a1 : observed_penguin; a2 : a1 =⇒ penguin; a3 : penguin ⊃ bird; a4 : a2,a3 =⇒ canfly; b1 : a2 =⇒ ¬canfly; c1 : a2 =⇒ ¬ν(d1). ♥

Deductive Argumentation [BH01; BH08; GH11; BH14]
The focus here is on simple logic and classical logic, but other options include non-monotonic logics, conditional logics, temporal logics, description logics, and paraconsistent logics.
Definition 52 (Base Logic). Let L be a language for a logic, and let ⊢i be the consequence relation for that logic. If α is an atom in L, then α is a positive literal in L and ¬α is a negative literal in L. For a literal β, the complement of β is defined as follows: • if β is a positive literal, i.e. it is of the form α, then the complement of β is the negative literal ¬α; • if β is a negative literal, i.e. it is of the form ¬α, then the complement of β is the positive literal α. ♠
Definition 53 (Deductive Argument). A deductive argument is an ordered pair 〈Φ,α〉 where Φ ⊢i α. Φ is the support, or premises, or assumptions of the argument, and α is the claim, or conclusion, of the argument. For an argument a = 〈Φ,α〉, the function Support(a) returns Φ and the function Claim(a) returns α. ♠
Definition 54 (Constraints). An argument 〈Φ,α〉 satisfies the: • consistency constraint when Φ is consistent (not essential, cf. paraconsistent logic); • minimality constraint when there is no Ψ ⊂ Φ such that Ψ ⊢ α. ♠
Definition 55 (Classical Logic Argument). A classical logic argument from a set of formulae ∆ is a pair 〈Φ,α〉 such that 1. Φ ⊆ ∆ 2. Φ ⊬ ⊥ 3. Φ ⊢ α 4. there is no Φ′ ⊂ Φ such that Φ′ ⊢ α. ♠
Definition 56 (Counterargument). If 〈Φ,α〉 and 〈Ψ,β〉 are arguments, then • 〈Φ,α〉 rebuts 〈Ψ,β〉 iff α ≡ ¬β
• 〈Φ,α〉 undercuts 〈Ψ,β〉 iff α ≡ ¬(φ1 ∧ ... ∧ φn) for some {φ1,...,φn} ⊆ Ψ ♠
Definition 57 (Classical attacks). Let a and b be two classical arguments. We define the following types of classical attack. • a is a direct undercut of b if ¬Claim(a) ∈ Support(b). • a is a classical defeater of b if Claim(a) ⊢ ¬⋀Support(b). • a is a classical direct defeater of b if ∃φ ∈ Support(b) s.t. Claim(a) ⊢ ¬φ. • a is a classical undercut of b if ∃Ψ ⊆ Support(b) s.t. Claim(a) ≡ ¬⋀Ψ. • a is a classical direct undercut of b if ∃φ ∈ Support(b) s.t. Claim(a) ≡ ¬φ. • a is a classical canonical undercut of b if Claim(a) ≡ ¬⋀Support(b). • a is a classical rebuttal of b if Claim(a) ≡ ¬Claim(b). • a is a classical defeating rebuttal of b if Claim(a) ⊢ ¬Claim(b). ♠
An arrow from D1 to D2 indicates that D1 ⊆ D2. (Figure: inclusions among Defeater, Direct defeater, Undercut, Direct undercut, Canonical undercut, Rebut, Direct rebut.)
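The three conditions of Def. 55 can be checked by brute force over truth assignments: consistency, entailment of the claim, and subset-minimality of the support. The encoding below (formulas as Python predicates over a model) is our illustrative choice, not part of the deductive argumentation literature.

```python
# Checking the classical-logic argument conditions of Def. 55 by brute
# force over truth assignments of a fixed set of atoms.
from itertools import product, combinations

ATOMS = ["p", "q"]

def models(formulas):
    """Yield the truth assignments satisfying all formulas."""
    for bits in product([True, False], repeat=len(ATOMS)):
        m = dict(zip(ATOMS, bits))
        if all(f(m) for f in formulas):
            yield m

def entails(support, claim):
    return all(claim(m) for m in models(support))

def consistent(support):
    return any(True for _ in models(support))

def is_argument(support, claim):
    """Def. 55: consistent, entailing, subset-minimal support."""
    if not consistent(support) or not entails(support, claim):
        return False
    return all(not entails(list(sub), claim)
               for n in range(len(support))
               for sub in combinations(support, n))

p = lambda m: m["p"]
p_implies_q = lambda m: (not m["p"]) or m["q"]
q = lambda m: m["q"]
print(is_argument([p, p_implies_q], q))     # True
print(is_argument([p, p_implies_q, q], q))  # False: support not minimal
```

The same `entails` helper suffices to check the attack conditions of Def. 57, e.g. whether Claim(a) entails the negated conjunction of Support(b).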
Figure 4.7: Example of argumentation with classical logic (mutually exclusive arguments for give(diuretic) and give(betablocker) from bp(high), with a further argument from symptom(emphysema) concluding ¬ok(betablocker)).

A Logic for Clinical Knowledge [GHW09; HW12; Wil+15]
(Pipeline: evidence on treatments T1 and T2, together with inference rules for inductive arguments and meta-arguments and preferences on outcomes and their magnitude, yields arguments and an argument graph, concluding (T1 > T2) or (T1 = T2) or (T1 < T2).)
Let us assume a set of evidence EVIDENCE = {e1,...,en}.
Definition 58 (Inductive Arguments). Given treatments τ1 and τ2, and X ⊆ EVIDENCE, there are three kinds of inductive argument that can be formed. 1. 〈X, τ1 > τ2〉, meaning the evidence in X supports the claim that treatment τ1 is superior to τ2. 2. 〈X, τ1 ∼ τ2〉, meaning the evidence in X supports the claim that treatment τ1 is equivalent to τ2. 3. 〈X, τ1 < τ2〉, meaning the evidence in X supports the claim that treatment τ1 is inferior to τ2.
♠
Given an inductive argument a = 〈X,c〉, support(a) = X. ARG(EVIDENCE) denotes the set of inductive arguments that can be generated from the evidence in EVIDENCE.
Definition 59 (Conflicts). If the claim of argument ai is ci and the claim of argument aj is cj, then we say that ai conflicts with aj whenever: 1. ci = τ1 > τ2, and (cj = τ1 ∼ τ2 or cj = τ1 < τ2); 2. ci = τ1 ∼ τ2, and (cj = τ1 > τ2 or cj = τ1 < τ2); 3. ci = τ1 < τ2, and (cj = τ1 > τ2 or cj = τ1 ∼ τ2). ♠
Definition 60 (Attack). For any pair of arguments ai and aj, and a preference relation R, ai attacks aj with respect to R iff ai conflicts with aj and it is not the case that aj is strictly preferred to ai according to R. ♠
A domain-specific benefit preference relation is defined in [HW12].
Definition 61 (Meta-Arguments). For a ∈ ARG(EVIDENCE), if there is an e ∈ SUPPORT(a) such that: • e is not statistically significant, and the outcome indicator of e is not a side-effect, then the following is a meta-argument that attacks a: 〈Not statistically significant〉; • e is a non-randomised and non-blind trial, then the following is a meta-argument that attacks a: 〈Non-randomised & non-blind trials〉; • e is a meta-analysis that concerns a narrow patient group, then the following is a meta-argument that attacks a: 〈Meta-analysis for a narrow patient group〉. ♠
Example 8. Example where CP is contraceptive pill and NT is no treatment. Fictional data.

ID  Left  Right  Indicator       Risk ratio  Outcome   p
e1  CP    NT     Pregnancy       0.05        superior  0.01
e2  CP    NT     Ovarian cancer  0.99        superior  0.07
e3  CP    NT     Breast cancer   1.04        inferior  0.01
e4  CP    NT     DVT             1.02        inferior  0.05

♥
Figure 4.8: Arguments derived from Ex. 8, with preferences and meta-arguments.

4.3 Probabilistic Argumentation

Epistemic Approach [Thi12; Hun13; HT14; BGV14]
Definition 62 (Probability distribution over the models M of the language). A function P : M → [0,1] such that ∑_{m∈M} P(m) = 1. ♠
Definition 63 (Probability of a formula φ, cf. [Par94]). P(φ) = ∑_{m∈Models(φ)} P(m). ♠
Example 9.

Model  a      b      P
m1     true   true   0.8
m2     true   false  0.2
m3     false  true   0.0
m4     false  false  0.0

• P(a) = 1 • P(a ∧ b) = 0.8 • P(b ∨ ¬b) = 1 • P(¬a ∨ ¬b) = 0.2 ♥
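Defs. 62 and 63 can be sketched directly: fix a distribution over the four models of the atoms {a, b} and sum the mass of a formula's models. The numbers below reproduce Example 9.

```python
# P(phi) as the total mass of phi's models (Defs. 62-63, Example 9).

P = {(True, True): 0.8, (True, False): 0.2,
     (False, True): 0.0, (False, False): 0.0}

def prob(phi):
    """phi is a predicate over (a, b); sum the mass of its models."""
    return sum(p for (a, b), p in P.items() if phi(a, b))

print(prob(lambda a, b: a))               # 1.0
print(prob(lambda a, b: a and b))         # 0.8
print(prob(lambda a, b: b or not b))      # 1.0
print(prob(lambda a, b: not a or not b))  # 0.2
```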
Definition 64 (Probability of an argument). The probability of an argument 〈Φ,α〉, denoted P(〈Φ,α〉), is P(φ1 ∧ ... ∧ φn), where Φ = {φ1,...,φn}. ♠
Example 10. Consider the following probability distributions over models.

Model  a      b      Agent 1  Agent 2
m1     true   true   0.5      0.0
m2     true   false  0.5      0.0
m3     false  true   0.0      0.6
m4     false  false  0.0      0.4

Below is the probability of each argument according to each participant.

Argument                 Agent 1  Agent 2
a1 = 〈{a},a〉             1.0      0.0
a2 = 〈{b,b → ¬a},¬a〉     0.0      0.6
a3 = 〈{¬b},¬b〉           0.5      0.4

♥
Definition 65. For an argumentation framework AF = 〈A,→〉 and a probability assignment P, the epistemic extension is {a ∈ A | P(a) > 0.5}. ♠
Definition 66 (From [Thi12; Hun13; BGV14]). Given an argumentation framework 〈A,→〉, a probability function P is:
COH coherent if for every a,b ∈ A, if a attacks b then P(a) ≤ 1 − P(b).
SFOU semi-founded if P(a) ≥ 0.5 for every unattacked a ∈ A.
FOU founded if P(a) = 1 for every unattacked a ∈ A.
SOPT semi-optimistic if P(a) ≥ 1 − ∑_{b∈a⁻} P(b) for every a ∈ A with at least one attacker.
OPT optimistic if P(a) ≥ 1 − ∑_{b∈a⁻} P(b) for every a ∈ A.
JUS justifiable if P is coherent and optimistic.
TER ternary if P(a) ∈ {0,0.5,1} for every a ∈ A.
RAT rational if for every a,b ∈ A, if a attacks b then P(a) > 0.5 implies P(b) ≤ 0.5.
NEU neutral if P(a) = 0.5 for every a ∈ A.
INV P is involutary if for every a,b ∈ A, if a attacks b, then P(a) = 1 − P(b). ♠
Let the event "a is accepted" be denoted εa, and let Eac(S) = {εa | a ∈ S}. Then P is weakly p-justifiable iff ∀a ∈ A, ∀b ∈ a⁻, P(a) ≤ 1 − P(b).
Proposition 3 ([BGV14]). For every argumentation framework, there is at least one P that is de Finetti coherent [Fin74] and weakly p-justifiable. ♣
Definition 67 (Correspondences between probabilistic and classical semantics).

Restriction on complete probability function P   Classical semantics
No restriction                                   complete extensions
No arguments a such that P(a) = 0.5              stable
Maximal no. of a such that P(a) = 1              preferred
Maximal no. of a such that P(a) = 0              preferred
Maximal no. of a such that P(a) = 0.5            grounded
Minimal no. of a such that P(a) = 1              grounded
Minimal no. of a such that P(a) = 0              grounded
Minimal no. of a such that P(a) = 0.5            semi-stable

♠

4.4 Structural Approach [Hun14]
Definition 68 (Subframework). For G = 〈A,→〉 and G′ = 〈A′,→′〉, G′ ⊑ G iff A′ ⊆ A and →′ = {〈a,b〉 ∈ → | a,b ∈ A′}. ♠
Definition 69 (Graphs giving an extension). For an argument framework G = 〈A,→〉, a set of arguments Γ ⊆ A, and a semantics σ, Qσ(Γ) = {G′ ⊑ G | G′ ⊨σ Γ}, where G′ ⊨σ Γ denotes that Γ is a σ extension of G′. ♠
Definition 70 (Probability of a set being an extension). The probability that a set of arguments Γ is a σ extension, denoted Pσ(Γ), is Pσ(Γ) = ∑_{G′∈Qσ(Γ)} P(G′), where P is a probability distribution over subframeworks of G. ♠
Example 11.

Subframework   Probability
G1: a ↔ b      0.09
G2: a          0.81
G3: b          0.01
G4: (empty)    0.09

PGR({a,b}) = 0.00
PGR({a}) = P(G2) = 0.81
PGR({b}) = P(G3) = 0.01
PGR({}) = P(G1) + P(G4) = 0.18

♥

4.5 A Computational Framework
Definition 71 ([LON12; Li15]). A Li-PAF is a tuple 〈A,PA,→,P→〉, where 〈A,→〉 is an argumentation framework, PA : A → (0,1] and P→ : → → (0,1]. ♠
Definition 72 ([LON12; Li15]). Given a Li-PAF 〈A,PA,→,P→〉, AFI = 〈AI,→I〉 is said to be induced iff AI ⊆ A; →I ⊆ → ∩ (AI × AI); ∀a ∈ A s.t. PA(a) = 1, a ∈ AI; and ∀〈a,b〉 ∈ → where PA(a) = PA(b) = 1, if P→(〈a,b〉) = 1, then 〈a,b〉 ∈ →I. ♠
Under an assumption of independence, the probability of an inducible ∆I = 〈AI,→I〉, denoted P^I_PrAF(∆I), is given by the following equation:

P^I_PrAF(∆I) = ∏_{a∈AI} PA(a) · ∏_{a∈A\AI} (1 − PA(a)) · ∏_{〈a,b〉∈→I} P→(〈a,b〉) · ∏_{〈a,b〉∈(→∩(AI×AI))\→I} (1 − P→(〈a,b〉))

The independence assumption is relaxed in [LON13; Li15] by relying on a bipolar argumentation framework, i.e. the evidential argumentation framework [ON08]. A correspondence with ASPIC+ is also drawn in [Li15], see Figure 4.9.
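Example 11 (Sec. 4.4) can be checked computationally: compute the grounded extension of each subframework as the least fixed point of the characteristic function, then sum the mass of the subframeworks whose grounded extension equals the queried set.

```python
# Example 11 sketched: distribution over the four subframeworks of
# a <-> b, and P_GR(S) as the mass of subframeworks whose grounded
# extension is exactly S.

def grounded(args, attacks):
    """Least fixed point of the characteristic function F."""
    E = set()
    while True:
        # a is acceptable w.r.t. E if every attacker of a is attacked by E
        nxt = {a for a in args
               if all(any((c, b) in attacks for c in E)
                      for (b, t) in attacks if t == a)}
        if nxt == E:
            return E
        E = nxt

subframeworks = {
    "G1": ({"a", "b"}, {("a", "b"), ("b", "a")}),
    "G2": ({"a"}, set()),
    "G3": ({"b"}, set()),
    "G4": (set(), set()),
}
P = {"G1": 0.09, "G2": 0.81, "G3": 0.01, "G4": 0.09}

def p_gr(S):
    return sum(P[g] for g, (args, att) in subframeworks.items()
               if grounded(args, att) == set(S))

print(p_gr({"a"}))   # 0.81
print(p_gr({"b"}))   # 0.01
print(p_gr(set()))   # 0.18  (G1 and G4 both have empty grounded extension)
```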
Figure 4.9: [Li15]'s probabilistic argumentation architecture (structured, evidential and extended evidential frameworks converted to ASPIC+ argumentation systems, with probabilities associated and semantics preserved).
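The independence equation for P^I_PrAF above can be sketched as a product over argument and attack inclusion or exclusion. The probabilities below are made up purely for illustration.

```python
# P^I_PrAF under independence: multiply the probabilities of including
# each induced argument/attack and of excluding the rest.
from math import prod

def p_induced(A, attacks, P_A, P_att, A_I, att_I):
    in_args = prod(P_A[a] for a in A_I)
    out_args = prod(1 - P_A[a] for a in A - A_I)
    in_att = prod(P_att[e] for e in att_I)
    # attacks whose endpoints were both induced but which were left out
    candidates = {e for e in attacks if e[0] in A_I and e[1] in A_I}
    out_att = prod(1 - P_att[e] for e in candidates - att_I)
    return in_args * out_args * in_att * out_att

A = {"a", "b"}
attacks = {("a", "b")}
P_A = {"a": 1.0, "b": 0.5}
P_att = {("a", "b"): 0.8}
print(p_induced(A, attacks, P_A, P_att, {"a", "b"}, {("a", "b")}))  # 0.4
print(p_induced(A, attacks, P_A, P_att, {"a"}, set()))              # 0.5
```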
5 A novel synthesis: Collaborative Intelligence Spaces (CISpaces)

Acknowledgement
This handout includes material from a number of collaborators including Alice Toniolo and Timothy J. Norman. Main reference: [Ton+15].

5.1 Introduction
Problem • Intelligence analysis is critical for making well-informed decisions • Complexities in current military operations increase the amount of information available to intelligence analysts
CISpaces (Collaborative Intelligence Spaces) • A toolkit developed to support collaborative intelligence analysis • CISpaces aims to improve situational understanding of evolving situations

5.2 Intelligence Analysis
Definition 73 ([DCD11]). The directed and coordinated acquisition and analysis of information to assess capabilities, intent and opportunities for exploitation by leaders at all levels. ♠
Fig. 5.1 summarises the Pirolli and Card Model [PC05]. Table 5.1 illustrates the problems of individual analysis and how collaborative analysis can improve it.
Figure 5.1: The Pirolli & Card Model [PC05] (a foraging loop searches, filters and schematises external data sources into a shoebox and an evidence file; a sense-making loop builds a case, tells the story and reevaluates hypotheses).

Individual analysis: scattered information and noise; hard to make connections; missing information; cognitive biases; missing expertise.
Collaborative analysis: more effective and reliable; brings together different expertise and resources; prevents biases.

Table 5.1: Individual vs. Collaborative Analysis
Figure 5.2: Initial information assigned to Joe (illness among young and elderly people in Kishshire caused by bacteria; an unidentified illness is affecting the local livestock in Kishshire, the rural area of Kish).

Figure 5.3: Further events happening in Kish (people and livestock illness; a water test shows a bacteria in the water supply; answer to POI: "GER-MAN" seen in Kish; explosion in Kish Hall Hotel).

Example of Intelligence Analysis Process
Goal: discover potential threats in Kish. Analysts: Joe, Miles and Ella. What Joe knows is summarised by Figs. 5.2 and 5.3. Main critical points and possible conclusions during the analysis: • causes of water contamination → waterborne/non-waterborne bacteria; • POI responsible for water contamination; • causes of hotel explosion.
  • 60. CISpaces • Reasoning with Evidence 5.3 Reasoning with Evidence • Identify what to believe happened from the claims constructed upon information (the sensemaking process); • Derive conclusions from data aggregated from explicitly requested information (the crowdsourcing process);5 • Assess what is credible according to the history of data manipula- tion (the provenance reasoning process). 5.4 Arguments for Sensemaking Formal Linkage for Semantics Computation A CISpace graph, WAT, can be transformed into a corresponding ASPIC-10 based argumentation theory. An edge in CISpaces is represented textu- ally as →, an info/claim node is written pi and a link node is referred to as type where type = {Pro,Con}. Then, [p1,...,pn → Pro → pφ] indicates that the Pro-link has p1,..., pn as incoming nodes and an outgoing node pφ.15 Definition 74. A WAT is a tuple 〈K, AS〉 such that AS= 〈L ,¯,R〉 is con- structed as follows: • L is a propositional logic language, and a node corresponds to a proposition p ∈ L . The WAT set of propositions is Lw. • The set R is formed by rules ri ∈ R corresponding to Pro links20 between nodes such that: [p1,..., pn → Pro → pφ] is converted to ri : p1,..., pn ⇒ pφ • The contrariness function between elements is defined as: i) if [p1 → Con → p2] and [p2 → Con → p1], p1 and p2 are contradictory; ii) [p1 → Con → p2] and p1 is the only premise of the Con link, then p125 is a contrary of p2; iii) if [p1, p3 → Con → p2] then a rule is added such that p1 and p3 form an argument with conclusion ph against p2, ri : p1, p3 ⇒ ph and ph is a contrary of p2. ♠ Definition 75. K is composed of propositions pi, K = {pj, pi,...}, such that: i) let a set of rules r1,...,rn ∈ R indicate a cycle30 such that for all pi that are consequents of a rule r exists r containing pi as antecedent, then pi ∈ K if pi is an info-node; ii) otherwise, pi ∈ K if pi is not consequent of any rule r ∈ R. ♠ University of Aberdeen, 2015 Page 45
An Example of Argumentation Schemes for Intelligence Analysis
Intelligence analysis broadly consists of three components: Activities (Act) including actions performed by actors, and events happening in the world; Entities (Et) including actors as individuals or groups, and objects such as resources; and Facts (Ft) including statements about the state of the world regarding entities and activities.
A hypothesis in intelligence analysis is composed of activities and events that show how the situation has evolved. The argument from cause to effect (ArgCE) forms the basis of these hypotheses. The scheme, adapted from [WRM08], is:
Argument from cause to effect
Premises: • Typically, if C (either a fact Fti or an activity Acti) occurs, then E (either a fact Fti or an activity Acti) will occur • In this case, C occurs
Conclusions: In this case E will occur
Critical questions: CQCE1 Is there evidence for C to occur? CQCE2 Is there a general rule for C causing E? CQCE3 Is the relationship between C and E causal? CQCE4 Are there any exceptions to the causal rule that prevent the effect E from occurring? CQCE5 Has C happened before E? CQCE6 Is there any other C that caused E?
Formally: rCE : rule(R,C,E), occur(C), before(C,E), ruletype(R,causal), noexceptions(R) ⇒ occur(E)

5.5 Arguments for Provenance
Provenance can be used to annotate how, where, when and by whom some information was produced [MM13]. Figure 5.4 depicts the core model for
representing provenance, and Figure 5.5 shows an example of provenance for the pieces of information for analyst Joe w.r.t. the water contamination problem in Kish.

Figure 5.4: PROV Data Model [MM13] (entities, activities and actors linked by relations such as wasGeneratedBy, used, wasAssociatedWith, wasDerivedFrom, wasAttributedTo, actedOnBehalfOf and wasInformedBy).

Figure 5.5: Provenance of Joe's information.

Patterns representing relevant provenance information that may warrant the credibility of a datum can be integrated into the analysis by applying the argument scheme for provenance (ArgPV) [Ton+14]:
  • 63. CISpaces • Arguments for Provenance Argument Scheme for Provenance Premises: • Given pj about activity Acti, entity Eti, or fact Fti (ppv1) • GP(pj) includes pattern Pm of p-entities Apv, p-activities Ppv, p-agents Agpv in- volved in producing pj (ppv2) • GP(pj) infers that information pj is true (ppv3) Conclusions: Acti/Eti/Fti in pj may plausibly be true (ppvcn) Critical questions: CQPV1 Is pj consistent with other information? CQPV2 Is pj supported by evidence? CQPV3 Does GP(pj) contain p-elements that lead us not to believe pj? CQPV4 Is there any other p-element that should have been included in GP(pj) to infer that pj is credible? University of Aberdeen, 2015 Page 48
6 Implementations

Acknowledgement
This handout includes material from a number of collaborators including Massimiliano Giacomin, Mauro Vallati, and Stefan Woltran.

A comprehensive survey has recently been published in [Cha+15].

6.1 Ad Hoc Procedures

NAD-Alg [NDA12; NAD14]

6.2 Constraint Satisfaction Programming

A Constraint Satisfaction Problem (CSP) P [BS12; RBW08] is a triple P = 〈X,D,C〉 such that:

• X = 〈x1,...,xn〉 is a tuple of variables;
• D = 〈D1,...,Dn〉 is a tuple of domains, where Di is the domain of the variable xi;
• C = 〈C1,...,Ct〉 is a tuple of constraints, where each Cj = 〈RSj,Sj〉 consists of a scope Sj ⊆ {x1,...,xn} and a relation RSj over the domains of the variables in Sj, i.e. a subset of the Cartesian product of the domains in {Di | xi ∈ Sj}.

A solution to the CSP P is a tuple A = 〈a1,...,an〉 where ∀i, ai ∈ Di and, for every j, RSj holds on the projection of A onto the scope Sj. If the set of solutions is empty, the CSP is unsatisfiable.
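These definitions can be illustrated with a minimal generate-and-test sketch (my own illustration, not an implementation from the literature; the function name `solve_csp` is invented): every assignment drawn from the domains is enumerated and checked against all constraints. The example instance uses one 0/1 variable per argument of a small AF, with one binary constraint per attack forbidding both endpoints from being selected together.

```python
from itertools import product

def solve_csp(variables, domains, constraints):
    """Generate-and-test CSP solving: enumerate every assignment drawn
    from the domains, and keep those on which all constraints hold."""
    solutions = []
    for values in product(*(domains[v] for v in variables)):
        assignment = dict(zip(variables, values))
        if all(check(assignment) for check in constraints):
            solutions.append(assignment)
    return solutions

# One {0,1} variable per argument of the AF a -> b -> c; each attack
# (x, y) yields the constraint "not (x = 1 and y = 1)".
attacks = {("a", "b"), ("b", "c")}
X = ["a", "b", "c"]
D = {x: (0, 1) for x in X}
C = [lambda s, ab=ab: not (s[ab[0]] == 1 and s[ab[1]] == 1) for ab in attacks]

sols = solve_csp(X, D, C)
print([{x for x in X if s[x]} for s in sols])  # the conflict-free sets
```

For this instance the five solutions correspond exactly to the conflict-free sets ∅, {a}, {b}, {c}, and {a,c}.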
CONArg2 [BS12]

In [BS12], the authors propose a mapping from AFs to CSPs. Given an AF Γ, they first create a variable for each argument, whose domain is always {0,1}: ∀ai ∈ A, ∃xi ∈ X such that Di = {0,1}. Subsequently, they describe constraints associated with the different definitions of Dung's argumentation framework: for instance, {a1,a2} ⊆ A is D-conflict-free iff ¬(x1 = 1 ∧ x2 = 1).

6.3 Answer Set Programming

Answer Set Programming (ASP) [Fab13] is a declarative problem-solving paradigm. In ASP, representation is done using a rule-based language, while reasoning is performed using implementations of general-purpose algorithms, referred to as ASP solvers.

AspartixM [EGW10; Dvo+11]

AspartixM [Dvo+11] expresses argumentation semantics in Answer Set Programming (ASP): a single program is used to encode a particular argumentation semantics, and the instance of an argumentation framework is given as an input database. Tests for subset-maximality exploit the metasp optimisation frontend for the ASP package gringo/claspD.

Given an AF Γ, Aspartix encodes the requirements for a semantics (e.g. the D-conflict-free requirements) in an ASP program whose database consists of:

{arg(a) | a ∈ A} ∪ {defeat(a1,a2) | 〈a1,a2〉 ∈ →}

The following program fragment is thus used to check D-conflict-freeness [Dvo+11]:

πcf = { in(X) ← not out(X), arg(X);
        out(X) ← not in(X), arg(X);
        ← in(X), in(Y), defeat(X,Y) }.

Adding the requirement that every out-labelled argument be defeated by an in-labelled one yields the encoding for D-stable extensions:

πST = { in(X) ← not out(X), arg(X);
        out(X) ← not in(X), arg(X);
        ← in(X), in(Y), defeat(X,Y);
        defeated(X) ← in(Y), defeat(Y,X);
        ← out(X), not defeated(X) }.

6.4 Propositional Satisfiability Problems

In the propositional satisfiability problem (SAT), the goal is to determine whether a given Boolean formula is satisfiable. A variable assignment that satisfies a formula is a solution.
In SAT, formulae are commonly expressed in Conjunctive Normal Form (CNF). A formula in CNF is a conjunction of clauses, where clauses are disjunctions of literals, and a literal is either positive (a variable) or negative (the negation of a variable). If at least one of the literals in a clause is true, then the clause is satisfied; if all clauses in the formula are satisfied, then the formula is satisfied and a solution has been found.

PrefSAT [Cer+14b]

The requirements for a complete labelling can be expressed as a CNF [Cer+14b]: for each argument ai ∈ A, three propositional variables are considered: Ii (true iff Lab(ai) = in), Oi (true iff Lab(ai) = out), and Ui (true iff Lab(ai) = undec). Given |A| = k and a bijection φ : {1,...,k} → A:

∧_{i∈{1,...,k}} [ (Ii ∨ Oi ∨ Ui) ∧ (¬Ii ∨ ¬Oi) ∧ (¬Ii ∨ ¬Ui) ∧ (¬Oi ∨ ¬Ui) ]   (6.1)

∧_{i : φ(i)⁻ = ∅} Ii   (6.2)

∧_{i : φ(i)⁻ ≠ ∅} [ Ii ∨ ∨_{j : φ(j)→φ(i)} ¬Oj ]   (6.3)

∧_{i : φ(i)⁻ ≠ ∅} ∧_{j : φ(j)→φ(i)} ( ¬Ii ∨ Oj )   (6.4)

∧_{i : φ(i)⁻ ≠ ∅} ∧_{j : φ(j)→φ(i)} ( ¬Ij ∨ Oi )   (6.5)

∧_{i : φ(i)⁻ ≠ ∅} [ ¬Oi ∨ ∨_{j : φ(j)→φ(i)} Ij ]   (6.6)

∧_{i : φ(i)⁻ ≠ ∅} ∧_{k : φ(k)→φ(i)} [ Ui ∨ ¬Uk ∨ ∨_{j : φ(j)→φ(i)} Ij ]   (6.7)

∧_{i : φ(i)⁻ ≠ ∅} [ ∧_{j : φ(j)→φ(i)} ( ¬Ui ∨ ¬Ij ) ∧ ( ¬Ui ∨ ∨_{j : φ(j)→φ(i)} Uj ) ]   (6.8)

∨_{i∈{1,...,k}} Ii   (6.9)

Constraint (6.1) forces exactly one label per argument; (6.2)–(6.6) encode the in and out conditions of complete labellings; (6.7) and (6.8) encode the undec conditions; and (6.9) additionally requires at least one argument to be labelled in.
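The intent of constraints (6.1)–(6.8) can be cross-checked with a brute-force enumeration of complete labellings (an illustrative sketch of the semantics, not the CNF machinery used by PrefSAT; the function name is mine): an argument is labelled in iff all its attackers are out, and out iff at least one attacker is in.

```python
from itertools import product

def complete_labellings(args, attacks):
    """Brute-force all complete labellings of an AF: an argument is 'in'
    iff all its attackers are 'out', and 'out' iff some attacker is 'in'."""
    attackers = {a: [b for (b, c) in attacks if c == a] for a in args}
    result = []
    for labels in product(("in", "out", "undec"), repeat=len(args)):
        lab = dict(zip(args, labels))
        ok = all(
            (lab[a] == "in") == all(lab[b] == "out" for b in attackers[a])
            and (lab[a] == "out") == any(lab[b] == "in" for b in attackers[a])
            for a in args
        )
        if ok:
            result.append(lab)
    return result

# Mutual attack a <-> b: three complete labellings
A = ["a", "b"]
R = {("a", "b"), ("b", "a")}
for lab in complete_labellings(A, R):
    print(lab)
```

For the mutual attack this prints the three complete labellings: a in / b out, a out / b in, and both undec — the last one is exactly what clause (6.9) excludes when searching for non-empty extensions.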
As noted in [Cer+14b], the conjunction of the above formulae is redundant. However, the non-redundant CNFs are not equivalent from an empirical point of view [Cer+14b]: the overall performance is significantly affected by the chosen pairing of CNF encoding and SAT solver.
Algorithm 1 Enumerating the D-preferred extensions of an AF
PrefSAT(∆)
1: Input: ∆ = Γ
2: Output: Ep ⊆ 2^A
3: Ep := ∅
4: cnf := Π∆
5: repeat
6:   cnfdf := cnf
7:   prefcand := ε
8:   repeat
9:     lastcompfound := SatS(cnfdf)
10:    if lastcompfound ≠ ε then
11:      prefcand := lastcompfound
12:      for a1 ∈ I-ARGS(lastcompfound) do
13:        cnfdf := cnfdf ∧ Iφ⁻¹(a1)
14:      end for
15:      remaining := FALSE
16:      for a1 ∈ A \ I-ARGS(lastcompfound) do
17:        remaining := remaining ∨ Iφ⁻¹(a1)
18:      end for
19:      cnfdf := cnfdf ∧ remaining
20:    end if
21:  until (lastcompfound = ε ∨ I-ARGS(lastcompfound) = A)
22:  if prefcand ≠ ε then
23:    Ep := Ep ∪ {I-ARGS(prefcand)}
24:    oppsolution := FALSE
25:    for a1 ∈ A \ I-ARGS(prefcand) do
26:      oppsolution := oppsolution ∨ Iφ⁻¹(a1)
27:    end for
28:    cnf := cnf ∧ oppsolution
29:  end if
30: until (prefcand = ε)
31: if Ep = ∅ then
32:   Ep := {∅}
33: end if
34: return Ep
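For very small frameworks, the preferred extensions that PrefSAT enumerates can be computed by a naive reference implementation (my own sketch, unrelated to the SAT-based machinery): enumerate the admissible sets and keep the ⊆-maximal ones.

```python
from itertools import combinations

def preferred_extensions(args, attacks):
    """Naive enumeration of preferred extensions: the maximal (w.r.t.
    set inclusion) admissible sets of the AF."""
    def conflict_free(s):
        return not any((a, b) in attacks for a in s for b in s)

    def defends(s, a):
        # every attacker of a is attacked by some member of s
        return all(any((c, b) in attacks for c in s)
                   for (b, target) in attacks if target == a)

    admissible = [set(s) for r in range(len(args) + 1)
                  for s in combinations(args, r)
                  if conflict_free(s) and all(defends(set(s), a) for a in s)]
    return [s for s in admissible
            if not any(s < t for t in admissible)]

# a <-> b, b -> c: two preferred extensions, {b} and {a, c}
A = ["a", "b", "c"]
R = {("a", "b"), ("b", "a"), ("b", "c")}
print(preferred_extensions(A, R))
```

Note the exponential cost of this sketch: avoiding exactly this blow-up, by iteratively growing complete labellings via SAT calls, is the point of Algorithm 1.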
Parallel-SCCp [Cer+14a; Cer+15]

Based on the SCC-Recursiveness Schema [BGG05].

[Figure omitted: an example argumentation framework over the arguments a, b, c, d, e, f, g, h, grouped into its strongly connected components.]
Algorithm 1 Computing D-preferred labellings of an AF
P-PREF(∆)
1: Input: ∆ = Γ
2: Output: Ep ∈ 2^L(∆)
3: return P-SCC-REC(∆, A)

Algorithm 2 Greedy computation of base cases
GREEDY(L, C)
1: Input: L = (L1, ..., Ln := {S^n_1, ..., S^n_h}), C ⊆ A
2: Output: M = {..., (Si, Bi), ...}
3: M := ∅
4: for S ∈ ⋃_{i=1}^{n} Li do in parallel
5:   B := B-PR(∆↓S, S ∩ C)
6:   M := M ∪ {(S, B)}
7: end for
8: return M

BOUNDCOND(∆, Si, Lab) returns (O, I), where O = {a1 ∈ Si | ∃a2 ∈ S ∩ a1⁻ : Lab(a2) = in} and I = {a1 ∈ Si | ∀a2 ∈ S ∩ a1⁻, Lab(a2) = out}, with S ≡ S1 ∪ ... ∪ Si−1.
Algorithm 3 Determining the D-grounded labelling of an AF in a set C
GROUNDED(∆, C)
1: Input: ∆ = Γ, C ⊆ A
2: Output: (Lab, U) : U ⊆ A, Lab ∈ L(A\U)
3: Lab := ∅
4: U := A
5: repeat
6:   initialfound := ⊥
7:   for a1 ∈ C do
8:     if {a2 ∈ U | a2 → a1} = ∅ then
9:       initialfound := ⊤
10:      Lab := Lab ∪ {(a1, in)}
11:      U := U \ {a1}
12:      C := C \ {a1}
13:      for a2 ∈ (U ∩ a1⁺) do
14:        Lab := Lab ∪ {(a2, out)}
15:        U := U \ {a2}
16:        C := C \ {a2}
17:      end for
18:    end if
19:  end for
20: until (¬initialfound)
21: return (Lab, U)
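Algorithm 3 is straightforward to transcribe. The sketch below (restricted to C = A, an assumption of this illustration; names are mine) repeatedly labels in the arguments with no still-undecided attacker, labels out the arguments they attack, and leaves everything else undec.

```python
def grounded(args, attacks):
    """Grounded labelling of an AF: iteratively accept arguments with no
    still-undecided attacker, reject the arguments they attack, and
    label whatever remains 'undec'."""
    lab = {}
    undecided = set(args)
    changed = True
    while changed:
        changed = False
        for a in list(undecided):
            if a not in undecided:      # may have been labelled this pass
                continue
            # 'in' iff no still-undecided argument attacks a
            if not any((b, a) in attacks for b in undecided):
                lab[a] = "in"
                undecided.remove(a)
                for c in list(undecided):
                    if (a, c) in attacks:
                        lab[c] = "out"
                        undecided.remove(c)
                changed = True
    for a in undecided:
        lab[a] = "undec"
    return lab

# a -> b -> c plus the mutual attack d <-> e
A = ["a", "b", "c", "d", "e"]
R = {("a", "b"), ("b", "c"), ("d", "e"), ("e", "d")}
print(grounded(A, R))  # a: in, b: out, c: in, d/e: undec
```

The mutual attack d ↔ e never produces an unattacked argument, so both stay undec — exactly the behaviour of the repeat/until loop in Algorithm 3.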
Algorithm 4 Computing D-preferred labellings of an AF in C
P-SCC-REC(∆, C)
1: Input: ∆ = Γ, C ⊆ A
2: Output: Ep ∈ 2^L(∆)
3: (Lab, U) := GROUNDED(∆, C)
4: Ep := {Lab}
5: ∆ := ∆↓U
6: L := (L1 := {S^1_1, ..., S^1_k}, ..., Ln := {S^n_1, ..., S^n_h}) = SCCS-LIST(∆)
7: M := {..., (Si, Bi), ...} = GREEDY(L, C)
8: for l ∈ {1,...,n} do
9:   El := {E^{S1}_l := (), ..., E^{Sk}_l := ()}
10:  for S ∈ Ll do in parallel
11:    for Lab ∈ Ep do in parallel
12:      (O, I) := L-COND(∆, S, Ll, Lab)
13:      if I = ∅ then
14:        E^S_l[Lab] := {{(a1, out) | a1 ∈ O} ∪ {(a1, undec) | a1 ∈ S \ O}}
15:      else
16:        if I = S then
17:          E^S_l[Lab] := B where (S, B) ∈ M
18:        else
19:          if O = ∅ then
20:            E^S_l[Lab] := B-PR(∆↓S, I ∩ C)
21:          else
22:            E^S_l[Lab] := {{(a1, out) | a1 ∈ O}}
23:            E^S_l[Lab] := E^S_l[Lab] ⊗ P-SCC-REC(∆↓(S\O), I ∩ C)
24:          end if
25:        end if
26:      end if
27:    end for
28:  end for
29:  for S ∈ Ll do
30:    E′p := ∅
31:    for Lab ∈ Ep do in parallel
32:      E′p := E′p ∪ ({Lab} ⊗ E^S_l[Lab])
33:    end for
34:    Ep := E′p
35:  end for
36: end for
37: return Ep
6.5 Which One?

    "We need to be smart"
    — Holger H. Hoos, Invited Keynote Talk at ECAI 2014

Features for AFs [VCG14; CGV14]

Directed Graph (26 features)
• Structure: # vertices (|A|), # edges (|→|), # vertices / # edges (|A|/|→|), # edges / # vertices (|→|/|A|), density
• Degree (# attackers): average, stdev, max, min
• SCCs: #, average, stdev, max, min
• Structure: # self-def, # unattacked, flow hierarchy, Eulerian, aperiodic
• CPU-time: ...
Undirected Graph (24 features)
• Structure: # edges, # vertices / # edges, # edges / # vertices, density
• Degree: average, stdev, max, min
• SCCs: #, average, stdev, max, min
• Structure: transitivity
• 3-cycles: #, average, stdev, max, min
• CPU-time: ...

Average CPU-time (with standard deviation) needed for extracting the features:

Directed Graph Features (DG)
Class       CPU-time mean   stdev   # feat
Graph Size  0.001           0.009   5
Degree      0.003           0.009   4
SCC         0.046           0.036   5
Structure   2.304           2.868   5

Undirected Graph Features (UG)
Class       CPU-time mean   stdev   # feat
Graph Size  0.001           0.003   4
Degree      0.002           0.004   4
SCC         0.011           0.009   5
Structure   0.799           0.684   1
Triangles   0.787           0.671   5

Best Features for Runtime Prediction [CGV14]
Determined by a greedy forward search based on the Correlation-based Feature Selection (CFS) attribute evaluator.
Solver      B1              B2                B3
AspartixM   num. arguments  density (DG)      size max. SCC
PrefSAT     density (DG)    num. SCCs         aperiodicity
NAD-Alg     density (DG)    CPU-time density  CPU-time Eulerian
SSCp        density (DG)    num. SCCs         size max. SCC

Predicting the (log) Runtime [CGV14]

RMSE of regression (lower is better):

            B1    B2    B3    DG    UG    SCC   All
AspartixM   0.66  0.49  0.49  0.48  0.49  0.52  0.48
PrefSAT     1.39  0.93  0.93  0.89  0.92  0.94  0.89
NAD-Alg     1.48  1.47  1.47  0.77  0.57  1.61  0.55
SSCp        1.36  0.80  0.78  0.75  0.75  0.79  0.74

The error on the log runtime is defined as √( Σ_{i=1}^{n} ( log10(ti) − log10(yi) )² / n ), where ti is the measured runtime and yi the predicted one.

Best Features for Classification [CGV14]
Determined by a greedy forward search based on the Correlation-based Feature Selection (CFS) attribute evaluator.

C-B1            C-B2          C-B3
num. arguments  density (DG)  min attackers

Classification [CGV14]

Classification (higher is better):

                 C-B1   C-B2   C-B3   DG     UG     SCC    All
Accuracy         48.5%  70.1%  69.9%  78.9%  79.0%  55.3%  79.5%
Prec. AspartixM  35.0%  64.6%  63.7%  74.5%  74.9%  42.2%  76.1%
Prec. PrefSAT    53.7%  67.8%  68.1%  79.6%  80.5%  60.4%  80.1%
Prec. NAD-Alg    26.5%  69.2%  69.0%  81.7%  85.1%  35.3%  86.0%
Prec. SSCp       54.3%  73.0%  72.7%  76.6%  76.8%  57.8%  77.2%

Selecting the Best Algorithm [CGV14]

Metric: Fastest (max. 1007)

AspartixM            106
NAD-Alg              170
PrefSAT              278
SSCp                 453
EPMs Regression      755
EPMs Classification  788
Metric: IPC (max. 1007)

NAD-Alg              210.1
AspartixM            288.3
PrefSAT              546.7
SSCp                 662.4
EPMs Regression      887.7
EPMs Classification  928.1

IPC score¹: for each AF, each system gets a score of T*/T, where T is its execution time and T* is the best execution time among the compared systems, or a score of 0 if it fails on that framework. Runtimes below 0.01 seconds get by default the maximal score of 1. The IPC score thus considers, at the same time, the runtimes and the number of solved instances.

¹ http://ipc.informatik.uni-freiburg.de/
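The IPC score is easy to compute from a runtime table; the function below is a sketch under the rules just stated (the name `ipc_scores` and the encoding of failures as None are my own choices).

```python
def ipc_scores(runtimes, threshold=0.01):
    """Compute IPC scores. `runtimes` maps each system name to a list of
    per-instance runtimes, aligned on the same sequence of AFs; None
    marks a failure on that instance."""
    systems = list(runtimes)
    n_instances = len(next(iter(runtimes.values())))
    scores = {s: 0.0 for s in systems}
    for i in range(n_instances):
        solved = [runtimes[s][i] for s in systems if runtimes[s][i] is not None]
        if not solved:
            continue                # nobody solved this instance
        best = min(solved)
        for s in systems:
            t = runtimes[s][i]
            if t is None:
                continue            # failure: score 0 on this instance
            elif t < threshold:
                scores[s] += 1.0    # very fast runs get the maximal score
            else:
                scores[s] += best / t
    return scores

# Two instances; system B fails on the second one
print(ipc_scores({"A": [2.0, 1.0], "B": [4.0, None]}))
# -> {'A': 2.0, 'B': 0.5}
```

On the first instance A is the best (score 1) and B gets 2.0/4.0 = 0.5; on the second, B's failure contributes 0, so the totals separate systems that are merely slow from those that do not solve instances at all.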
Bibliography

[AB08] K. Atkinson and T. J. M. Bench-Capon. "Addressing moral problems through practical reasoning". In: Journal of Applied Logic 6.2 (2008), pp. 135–151.

[BA09] Trevor Bench-Capon and Katie Atkinson. "Abstract Argumentation and Values". In: Argumentation in Artificial Intelligence. Ed. by Guillermo Simari and Iyad Rahwan. Springer US, 2009, pp. 45–64.

[Bar+09] Pietro Baroni et al. "Encompassing Attacks to Attacks in Abstract Argumentation Frameworks". In: ECSQARU 2009. Ed. by C. Sossai and G. Chemello. LNAI, Springer-Verlag, 2009, pp. 2–7.

[Bar+11] Pietro Baroni et al. "AFRA: Argumentation framework with recursive attacks". In: International Journal of Approximate Reasoning (Special Issue ECSQARU 2009) 52.1 (2011), pp. 19–37. DOI: 10.1016/j.ijar.2010.05.004.

[Bar+14] Pietro Baroni et al. "On the Input/Output behavior of argumentation frameworks". In: Artificial Intelligence 217 (2014), pp. 144–197.

[BCG11] P. Baroni, M. Caminada, and M. Giacomin. "An introduction to argumentation semantics". In: Knowledge Engineering Review 26.4 (2011), pp. 365–410.

[Ben03] Trevor J. M. Bench-Capon. "Try to See it My Way: Modelling Persuasion in Legal Discourse". In: Artificial Intelligence and Law 11.4 (2003), pp. 271–287. DOI: 10.1023/B:ARTI.0000045997.45038.8f.

[Bex+13] Floris Bex et al. "Implementing the argument web". In: Communications of the ACM 56.10 (Oct. 2013), p. 66.
  • 78. BIBLIOGRAPHY • BIBLIOGRAPHY [BG07] Pietro Baroni and Massimiliano Giacomin. “On principle-based evaluation of extension-based argumentation semantics”. In: Artificial Intelligence (Special issue on Argumentation in A.I.) 171.10/15 (2007), pp. 675–700. [BG09a] Pietro Baroni and Massimiliano Giacomin. “Semantics of Ab-5 stract Argument Systems”. In: Argumentation in Artificial Intelligence. Ed. by Guillermo Simari and Iyad Rahwan. Springer US, 2009, pp. 25–44. [BG09b] Pietro Baroni and Massimiliano Giacomin. “Skepticism rela- tions for comparing argumentation semantics”. In: Interna-10 tional Journal of Approximate Reasoning 50.6 (June 2009), pp. 854–866. ISSN: 0888-613X. DOI: 10.1016/j.ijar.2009. 02.006. URL: http://linkinghub.elsevier.com/retrieve/ pii/S0888613X09000383http://dx.doi.org/10.1016/j. ijar.2009.02.006http://dl.acm.org/citation.cfm?15 id=1542547.1542704. [BGG05] Pietro Baroni, Massimiliano Giacomin, and Giovanni Guida. “SCC-recursiveness: a general schema for argumentation se- mantics”. In: Artificial Intelligence 168.1-2 (2005), pp. 165– 210.20 [BGV14] Pietro Baroni, Massimiliano Giacomin, and Paolo Vicig. “On Rationality Conditions for Epistemic Probabilities in Abstract Argumentation”. In: Computational Models of Argument - Pro- ceedings of COMMA 20142. Ed. by Simon Parsons et al. 2014, pp. 121–132.25 [BH01] Philippe Besnard and Anthony Hunter. “A logic-based the- ory of deductive arguments”. In: Artificial Intelligence 128 (2001), pp. 203–235. ISSN: 00043702. DOI: 10.1016/S0004- 3702(01)00071-6. URL: http://www.sciencedirect.com/ science/article/pii/S0004370201000716.30 [BH08] Philippe Besnard and Anthony Hunter. Elements of Argu- mentation. MIT Press, 2008. [BH14] Philippe Besnard and Anthony Hunter. “Constructing argu- ment graphs with deductive arguments: a tutorial”. en. In: Argument and Computation 5.1 (Feb. 2014), pp. 5–30. URL:35 http://www.tandfonline.com/doi/abs/10.1080/19462166. 2013.869765#.VM5j8eptPUY. 
[Bon+97] Andrei Bondarenko et al. "An abstract, argumentation-theoretic approach to default reasoning". In: Artificial Intelligence 93 (1997), pp. 63–101. DOI: 10.1016/S0004-3702(97)00015-5.
  • 79. BIBLIOGRAPHY • BIBLIOGRAPHY [BS12] Stefano Bistarelli and Francesco Santini. “Modeling and Solv- ing AFs with a Constraint-Based Tool: ConArg”. In: Theory and Applications of Formal Argumentation. Vol. 7132. Springer, 2012, pp. 99–116. ISBN: 978-3-642-29183-8. [BTK93] Andrei Bondarenko, Francesca Toni, and Robert A. Kowalski.5 “An Assumption-Based Framework for Non-Monotonic Rea- soning”. In: Proceedings Second International Workshop on Logic Programming and Non-Monotonic Reasoning. Ed. by A Nerode and L Pereira. MIT Press, 1993. URL: citeseer.nj. nec.com/bondarenko93assumptionbased.html.10 [CA07] Martin Caminada and Leila Amgoud. “On the evaluation of argumentation formalisms”. In: Artificial Intelligence 171.5- 6 (Apr. 2007), pp. 286–310. ISSN: 00043702. DOI: 10.1016/j. artint.2007.02.003. URL: http://linkinghub.elsevier. com/retrieve/pii/S0004370207000410.15 [Cam06] Martin Caminada. “On the Issue of Reinstatement in Ar- gumentation”. In: Proceedings of the 10th European Confer- ence on Logics in Artificial Intelligence (JELIA 2006). 2006, pp. 111–123. ISBN: 3-540-39625-X. [Cer+14a] Federico Cerutti et al. “A SCC Recursive Meta-Algorithm for20 Computing Preferred Labellings in Abstract Argumentation”. In: 14th International Conference on Principles of Knowledge Representation and Reasoning. Ed. by Chitta Baral and Giuseppe De Giacomo. 2014, pp. 42–51. URL: http://www.aaai.org/ ocs/index.php/KR/KR14/paper/view/7974.25 [Cer+14b] Federico Cerutti et al. “Computing Preferred Extensions in Abstract Argumentation: A SAT-Based Approach”. In: TAFA 2013. Ed. by Elizabeth Black, Sanjay Modgil, and Nir Oren. Vol. 8306. Lecture Notes in Computer Science. Springer-Verlag Berlin Heidelberg, 2014, pp. 176–193. URL: http://link.30 springer.com/chapter/10.1007%2F978-3-642-54373- 9_12. [Cer+15] Federico Cerutti et al. “Exploiting Parallelism for Hard Prob- lems in Abstract Argumentation”. In: 29th AAAI Conference - AAAI 2015. 2015, pp. 1475–1481. 
URL: http://www.aaai.org/ocs/index.php/AAAI/AAAI15/paper/viewFile/9451/9421.

[CGV14] Federico Cerutti, Massimiliano Giacomin, and Mauro Vallati. "Algorithm Selection for Preferred Extensions Enumeration". In: 5th Conference on Computational Models of Argument.
  • 80. BIBLIOGRAPHY • BIBLIOGRAPHY Ed. by Simon Parsons et al. 2014, pp. 221–232. URL: http: //ebooks.iospress.nl/volumearticle/37791. [Cha+15] Günther Charwat et al. “Methods for solving reasoning prob- lems in abstract argumentation — A survey”. In: Artificial Intelligence 220 (Mar. 2015), pp. 28–63. ISSN: 00043702. DOI:5 10.1016/j.artint.2014.11.008. URL: http://www. sciencedirect.com/science/article/pii/S0004370214001404. [Che+06] Carlos Iván Chesnevar et al. “Towards an argument inter- change format”. English. In: The Knowledge Engineering Re- view 21.04 (Dec. 2006), p. 293. ISSN: 0269-8889. DOI: 10 .10 1017/S0269888906001044. URL: http://journals.cambridge. org/abstract_S0269888906001044. [Chr00] George C Christie. The Notion of an Ideal Audience in Legal Argument. Kluwer Academic Publishers, 2000. [CL05] C Cayrol and M C Lagasquie-Schiex. “On the Acceptability15 of Arguments in Bipolar Argumentation Frameworks”. In: Symbolic and Quantitative Approaches to Reasoning with Un- certainty. Ed. by Lluís Godo. Vol. 3571. Lecture Notes in Com- puter Science. Springer Berlin Heidelberg, 2005, pp. 378– 389.20 [Col92] J L Coleman. Risks and Wrongs. Cambridge University Press, 1992. [DCD11] DCDC. Understanding and Intelligence Support to Joint Op- erations. Tech. rep. 2011. [DT10] Phan Minh Dung and Phan Minh Thang. “Towards (Prob-25 abilistic) Argumentation for Jury-based Dispute Resolution”. In: Proocedings of the Third International Conference on Com- putational Models of Argument (COMMA 2010). Ed. by Pietro Baroni et al. IOS Press, Aug. 2010, pp. 171–182. ISBN: 978-1- 60750-618-8. URL: http://dl.acm.org/citation.cfm?id=30 1860828.1860846. [Dun+14] Paul E. Dunne et al. “Characteristics of Multiple Viewpoints in Abstract Argumentation”. In: Proceedings of the 14th Con- ference on Principles of Knowledge Representation and Rea- soning. 2014, pp. 72–81.35 [Dun95] Phan Minh Dung. 
"On the Acceptability of Arguments and Its Fundamental Role in Nonmonotonic Reasoning, Logic Programming, and n-Person Games". In: Artificial Intelligence 77.2 (1995), pp. 321–357.
[Dvo+11] Wolfgang Dvořák et al. "Making Use of Advances in Answer-Set Programming for Abstract Argumentation Systems". In: Proceedings of the 19th International Conference on Applications of Declarative Programming and Knowledge Management (INAP 2011). 2011.

[DW09] Paul E. Dunne and Michael Wooldridge. "Complexity of abstract argumentation". In: Argumentation in AI. Ed. by I. Rahwan and G. Simari. Springer-Verlag, 2009. Chap. 5, pp. 85–104.

[EGW10] Uwe Egly, Sarah Alice Gaggl, and Stefan Woltran. "Answer-set programming encodings for argumentation frameworks". In: Argument & Computation 1.2 (June 2010), pp. 147–177. DOI: 10.1080/19462166.2010.486479.

[Fab13] Wolfgang Faber. "Answer Set Programming". In: Reasoning Web. Semantic Technologies for Intelligent Data Access. Vol. 8067. Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2013, pp. 162–193.

[Fin74] Bruno de Finetti. Theory of Probability. Vol. I. Wiley, 1974.

[GH11] Nikos Gorogiannis and Anthony Hunter. "Instantiating abstract argumentation with classical logic arguments: Postulates and properties". In: Artificial Intelligence 175.9-10 (June 2011), pp. 1479–1497. DOI: 10.1016/j.artint.2010.12.003.

[GHW09] Nikos Gorogiannis, Anthony Hunter, and Matthew Williams. "An argument-based approach to reasoning with clinical knowledge". In: International Journal of Approximate Reasoning 51.1 (Dec. 2009), pp. 1–22. DOI: 10.1016/j.ijar.2009.06.015.

[GS04] Alejandro J. García and Guillermo R. Simari. "Defeasible logic programming: an argumentative approach".
In: Theory and Practice of Logic Programming 4.1+2 (2004), pp. 95–138. DOI: 10.1017/S1471068403001674.
  • 82. BIBLIOGRAPHY • BIBLIOGRAPHY [GS14] Alejandro J. García and Guillermo R. Simari. “Defeasible logic programming: DeLP-servers, contextual queries, and expla- nations for answers”. en. In: Argument & Computation 5.1 (Feb. 2014), pp. 63–88. ISSN: 1946-2166. DOI: 10.1080/19462166. 2013.869767. URL: http://www.tandfonline.com/doi/5 ref/10.1080/19462166.2013.869767#.VV2rticVhBc. [HMP01] D Hitchcock, P McBurney, and P Parsons. “A Framework for Deliberation Dialogues, Argument and Its Applications”. In: Proceedings of the Fourth Biennial Conference of the Ontario Society for the Study of Argumentation (OSSA 2001). Ed. by10 H V Hansen et al. 2001. [HT14] Anthony Hunter and Matthias Thimm. “Probabilistic Argu- mentation with Incomplete Information”. In: ECAI 2014 - 21st European Conference on Artificial Intelligence2. 2014, pp. 1033 –1034.15 [Hun13] Anthony Hunter. “A probabilistic approach to modelling un- certain logical arguments”. In: International Journal of Ap- proximate Reasoning 54.1 (Jan. 2013), pp. 47–81. ISSN: 0888613X. DOI: 10.1016/j.ijar.2012.08.003. URL: http://www. sciencedirect.com/science/article/pii/S0888613X12001442.20 [Hun14] Anthony Hunter. “Probabilistic qualification of attack in ab- stract argumentation”. In: International Journal of Approxi- mate Reasoning 55.2 (Jan. 2014), pp. 607–638. ISSN: 0888613X. DOI: 10.1016/j.ijar.2013.09.002. URL: http://www. sciencedirect.com/science/article/pii/S0888613X13001710.25 [HW12] Anthony Hunter and Matthew Williams. “Aggregating evi- dence about the positive and negative effects of treatments.” In: Artificial intelligence in medicine 56.3 (Nov. 2012), pp. 173– 90. ISSN: 1873-2860. DOI: 10.1016/j.artmed.2012.09.004. URL: http://www.sciencedirect.com/science/article/30 pii/S0933365712001194. [Li15] Hengfei Li. “Probabilistic Argumentation”. PhD thesis. U. Ab- erdeen, 2015. [LON12] Hengfei Li, Nir Oren, and TimothyJ. Norman. “Probabilistic Argumentation Frameworks”. 
In: Theory and Applications of Formal Argumentation. Ed. by Sanjay Modgil, Nir Oren, and Francesca Toni. Vol. 7132. Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2012, pp. 1–16.

[LON13] Hengfei Li, Nir Oren, and Timothy J. Norman. "Relaxing Independence Assumptions in Probabilistic Argumentation". In: ArgMAS 2013. 2013.
  • 83. BIBLIOGRAPHY • BIBLIOGRAPHY [MM13] L Moreau and P Missier. PROV-DM: The PROV Data Model. Available at http://www.w3.org/TR/prov-dm/. Apr. 2013. [Mod09] Sanjay Modgil. “Reasoning about preferences in argumenta- tion frameworks”. In: Artificial Intelligence 173.9-10 (2009), pp. 901–934.5 [MP02] Peter McBurney and Simon Parsons. “Games that agents play: A formal framework for dialogues between autonomous agents”. In: Journal of Logic, Language and Information 11.3 (2002), pp. 315–334. URL: http://www.springerlink.com/index/ N809NP4PPR3HFTDV.pdf.10 [MP13] Sanjay Modgil and Henry Prakken. “A general account of argumentation with preferences”. In: Artificial Intelligence 195 (2013), pp. 361–397. ISSN: 00043702. DOI: 10.1016/j. artint.2012.10.008. URL: http://www.sciencedirect. com/science/article/pii/S0004370212001361.15 [MP14] Sanjay Modgil and Henry Prakken. “The ASPIC+ framework for structured argumentation: a tutorial”. In: Argument & Computation 5.1 (2014), pp. 31–62. ISSN: 1946-2166. DOI: 10 . 1080 / 19462166 . 2013 . 869766. URL: http : / / www . tandfonline.com/doi/abs/10.1080/19462166.2013.20 869766. [NAD14] Samer Nofal, Katie Atkinson, and Paul E. Dunne. “Algorithms for decision problems in argument systems under preferred semantics”. In: Artificial Intelligence 207 (2014), pp. 23–51. URL: http://www.sciencedirect.com/science/article/25 pii/S0004370213001161. [NDA12] S Nofal, P E Dunne, and K Atkinson. “On Preferred Exten- sion Enumeration in Abstract Argumentation”. In: Proceed- ings of 3rd International Conference on Computational Mod- els of Arguments (COMMA 2012). 2012, pp. 205–216.30 [ON08] Nir Oren and Timothy J. Norman. “Semantics for Evidence- Based Argumentation”. In: (June 2008), pp. 276–284. URL: http://dl.acm.org/citation.cfm?id=1566134.1566160. [Par94] Jeff Paris. The Uncertain Reasoner’s Companion: A Mathe- matical Perspective. Cambridge University Press, 1994.35 [PC05] P. Pirolli and S. Card. 
“The sensemaking process and lever- age points for analyst technology as identified through cogni- tive task analysis”. In: Proceedings of the International Con- ference on Intelligence Analysis. 2005. [Per80] Chaïm Perelman. Justice, Law, and Argument. D. Reidel Pub-40 lishing Company, Dordrecht, 1980. University of Aberdeen, 2015 Page 68
[PO69] Chaïm Perelman and Lucie Olbrechts-Tyteca. The New Rhetoric: A Treatise on Argumentation. Notre Dame, Ind.: University of Notre Dame Press, 1969.

[Pra10] Henry Prakken. "An abstract framework for argumentation with structured arguments". In: Argument & Computation 1.2 (June 2010), pp. 93–124. DOI: 10.1080/19462160903564592.

[PV02] Henry Prakken and Gerard Vreeswijk. "Logics for Defeasible Argumentation". In: Handbook of Philosophical Logic 4 (2002), pp. 218–319. DOI: 10.1007/978-94-017-0456-4_3.

[Rah+11] Iyad Rahwan et al. "Representing and classifying arguments on the Semantic Web". In: The Knowledge Engineering Review 26.04 (Nov. 2011), pp. 487–511. DOI: 10.1017/S0269888911000191.

[RBW08] Francesca Rossi, Peter van Beek, and Toby Walsh. "Chapter 4: Constraint Programming". In: Handbook of Knowledge Representation. Ed. by Frank van Harmelen, Vladimir Lifschitz, and Bruce Porter. Vol. 3. Foundations of Artificial Intelligence. Elsevier, 2008, pp. 181–211. DOI: 10.1016/S1574-6526(07)03004-0.

[Sim89] Guillermo Ricardo Simari. "A mathematical treatment of defeasible reasoning and its implementation". PhD thesis. Washington University, Jan. 1989.

[SL92] Guillermo R. Simari and Ronald P. Loui. "A mathematical treatment of defeasible reasoning and its implementation". In: Artificial Intelligence 53.2–3 (1992), pp. 125–157.

[Thi12] Matthias Thimm. "A Probabilistic Semantics for Abstract Argumentation". In: Proceedings of the 20th European Conference on Artificial Intelligence (ECAI'12). Aug.
2012.

[Ton12] Francesca Toni. "Reasoning on the Web with Assumption-Based Argumentation". In: Reasoning Web. Semantic Technologies for Advanced Query Answering. Ed. by Thomas Eiter and Thomas Krennwallner. Vol. 7487. Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2012, pp. 370–386.
[Ton+14] Alice Toniolo et al. "Making Informed Decisions with Provenance and Argumentation Schemes". In: Eleventh International Workshop on Argumentation in Multi-Agent Systems (ArgMAS 2014). 2014.

[Ton14] Francesca Toni. "A tutorial on assumption-based argumentation". In: Argument & Computation 5.1 (Feb. 2014), pp. 89–117. DOI: 10.1080/19462166.2013.869878.

[Ton+15] Alice Toniolo et al. "Agent Support to Reasoning with Different Types of Evidence in Intelligence Analysis". In: Proceedings of the 14th International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2015). 2015, pp. 781–789.

[Tou58] S. Toulmin. The Uses of Argument. Cambridge, UK: Cambridge University Press, 1958.

[VCG14] Mauro Vallati, Federico Cerutti, and Massimiliano Giacomin. "Argumentation Frameworks Features: an Initial Study". In: 21st European Conference on Artificial Intelligence. Ed. by T. Schaub, G. Friedrich, and B. O'Sullivan. 2014, pp. 1117–1118.

[Wal06] Douglas N. Walton. "How to make and defend a proposal in a deliberation dialogue". In: Artificial Intelligence and Law 14.3 (Sept. 2006), pp. 177–239. DOI: 10.1007/s10506-006-9025-x.

[Wal14] Douglas N. Walton. Burden of Proof, Presumption and Argumentation. Cambridge University Press, 2014.

[Wal97] Douglas N. Walton. Appeal to Expert Opinion. University Park: Pennsylvania State University, 1997.

[Wil+15] Matt Williams et al. "An updated systematic review of lung chemo-radiotherapy using a new evidence aggregation method". In: Lung Cancer 87.3 (Mar. 2015), pp. 290–295.
ISSN: 1872-8332. DOI: 10.1016/j.lungcan.2014.12.004.
[WRM08] Douglas N. Walton, Chris Reed, and Fabrizio Macagno. Argumentation Schemes. New York: Cambridge University Press, 2008.