This document provides an overview of Ordered Weighted Averaging (OWA) operators for information fusion. It defines OWA operators and describes their properties such as boundedness, monotonicity, and idempotency. It discusses approaches for learning OWA operator weights from data using methods like the O'Hagan method and exponential OWA operators. Exponential OWA operators, including optimistic and pessimistic variants, allow generating OWA weights given a desired degree of orness. The document also gives an example of applying OWA operators for result fusion in multi-agent systems, such as a financial investment planning system.
5. Fusion Function Properties
Idempotency:
∀i: aᵢ = a ⟹ F(a₁, …, aₙ) = a
Commutativity:
F(a, b) = F(b, a)
Monotonicity:
∀i: aᵢ > bᵢ ⟹ F(a₁, …, aₙ) > F(b₁, …, bₙ)
6. Examples of Averaging Operators
Mean: F(a₁, a₂, …, aₙ) = (a₁ + a₂ + … + aₙ)/n
Max
Min
Median
Mode
OWA Operators
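These simple averaging operators can be tried directly in Python with standard-library functions; the scores below are hypothetical, and OWA operators (introduced next) generalize all of them:

```python
import statistics

# A few simple averaging (fusion) operators applied to the same argument list.
scores = [0.2, 0.9, 0.5, 0.5, 0.7]

print(sum(scores) / len(scores))   # mean
print(max(scores))                 # max
print(min(scores))                 # min
print(statistics.median(scores))   # median
print(statistics.mode(scores))     # mode (most frequent value)
```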
7. OWA Operators Definition
A mapping F: ℝⁿ → ℝ is called an OWA operator of dimension n if it has an associated weighting vector W = (w₁, …, wₙ), with wⱼ ∈ [0, 1] and Σⱼ wⱼ = 1, such that
F(a₁, …, aₙ) = Σⱼ wⱼ bⱼ,
where bⱼ is the j-th largest of the arguments a₁, …, aₙ.
From: D. Filev and R.R. Yager, “Learning OWA Operator Weights from Data”, In
Proceedings of the Third IEEE Conference on Fuzzy Systems, pp. 468-473, 1994.
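A minimal sketch of this definition (sort the arguments in descending order, then take the weighted sum); the function name and example values are ours:

```python
def owa(values, weights):
    """Ordered Weighted Averaging: sort values in descending order,
    then take the weighted sum with the fixed weighting vector W."""
    assert abs(sum(weights) - 1.0) < 1e-9 and all(w >= 0 for w in weights)
    b = sorted(values, reverse=True)          # b_1 >= b_2 >= ... >= b_n
    return sum(w * bj for w, bj in zip(weights, b))

# Special cases: W = (1,0,...,0) -> max, W = (0,...,0,1) -> min,
# W = (1/n,...,1/n) -> arithmetic mean.
print(owa([0.6, 0.3, 1.0], [1.0, 0.0, 0.0]))   # 1.0 (max)
print(owa([0.6, 0.3, 1.0], [0.0, 0.0, 1.0]))   # 0.3 (min)
```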
10. Mathematical Properties of the OWA Operators
Bounded: min(a₁, …, aₙ) ≤ F(a₁, …, aₙ) ≤ max(a₁, …, aₙ)
Monotonicity: if aᵢ ≥ cᵢ for all i, then F(a₁, …, aₙ) ≥ F(c₁, …, cₙ)
Symmetric (or Commutativity): F is unchanged under any permutation of its arguments
Idempotency: F(a, a, …, a) = a
11. Mathematical Properties of the OWA Operators
Associativity:
Stability for a linear function: F(r·a₁ + t, …, r·aₙ + t) = r·F(a₁, …, aₙ) + t, for r > 0
Invariance:
12. Behavioral Properties of the OWA Operators
Decisional behavior:
– expresses the behavior of the decision-maker, such as tolerant, optimistic, pessimistic, or strict. These behaviors are usually called disjunctive and conjunctive behaviors.
Interpretability of the parameters:
– the parameters should have a nearly self-evident semantic interpretation. This property rules out black-box methodologies.
Weights on the arguments:
– it is crucial to be able to express weights on the arguments, which can be understood as privileging some of them.
13. Measure Definition
Orness: characterizes the degree to which the aggregation is like an OR (Max) operation:
orness(W) = (1/(n − 1)) Σⱼ (n − j) wⱼ
Some special cases:
– W = (1, 0, …, 0) (Max): orness = 1
– W = (0, …, 0, 1) (Min): orness = 0
– W = (1/n, …, 1/n) (Mean): orness = 1/2
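Using Yager's formula orness(W) = (1/(n − 1)) Σⱼ (n − j) wⱼ, the special cases can be checked numerically (function name is ours):

```python
def orness(weights):
    """orness(W) = (1/(n-1)) * sum_j (n - j) * w_j, with j = 1..n."""
    n = len(weights)
    return sum((n - j) * w for j, w in enumerate(weights, start=1)) / (n - 1)

print(orness([1.0, 0.0, 0.0]))   # 1.0 -> pure max (OR-like)
print(orness([0.0, 0.0, 1.0]))   # 0.0 -> pure min (AND-like)
print(orness([1/3, 1/3, 1/3]))   # 0.5 -> arithmetic mean
```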
14. Measure Definition
Dispersion: the dispersion (or entropy) associated with a weighting vector,
disp(W) = −Σⱼ wⱼ ln wⱼ.
It was suggested as a measure of how much of the information in the arguments is used during an aggregation based on W.
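Using the standard entropy formula disp(W) = −Σⱼ wⱼ ln wⱼ (with the convention 0·ln 0 = 0), a minimal sketch:

```python
import math

def dispersion(weights):
    """Entropy of the weighting vector: disp(W) = -sum_j w_j * ln(w_j),
    with the convention 0 * ln(0) = 0."""
    return -sum(w * math.log(w) for w in weights if w > 0)

print(dispersion([1.0, 0.0, 0.0]))   # 0.0 -> only one argument is used
print(dispersion([1/3, 1/3, 1/3]))   # ln(3) -> all arguments used equally
```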
15. Learning OWA Operators
O’Hagan method: uses the Orness and Dispersion measures to generate OWA weights that achieve a predefined degree of Orness (α) while maximizing the entropy.
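It is known that the maximum-entropy weights of O'Hagan's program form a geometric progression in j, so instead of solving the constrained optimization symbolically one can find the common ratio numerically. A sketch (the function name and the bisection bracket are ours):

```python
def meowa_weights(n, alpha, iters=200):
    """Maximum-entropy OWA weights for a desired orness alpha (O'Hagan).
    The optimal weights form a geometric progression w_j ~ q**(j-1);
    we find the ratio q by bisection, using the fact that the orness of
    this family is monotonically decreasing in q."""
    def orness_of(q):
        w = [q ** j for j in range(n)]
        s = sum(w)
        return sum((n - 1 - j) * (x / s) for j, x in enumerate(w)) / (n - 1)

    lo, hi = 1e-9, 1e9                 # orness_of is decreasing in q
    for _ in range(iters):
        mid = (lo * hi) ** 0.5         # bisect on a log scale
        if orness_of(mid) > alpha:
            lo = mid
        else:
            hi = mid
    q = (lo * hi) ** 0.5
    w = [q ** j for j in range(n)]
    s = sum(w)
    return [x / s for x in w]

w = meowa_weights(5, 0.7)
print([round(x, 4) for x in w])        # decreasing weights that sum to 1
```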
16. Learning OWA Operators From Observations
Available information:
– a collection of m samples (observations), each comprising an n-tuple of values (a_k1, …, a_kn), reordered in descending order as (b_k1, …, b_kn)
– the corresponding aggregated value dₖ
We look for a vector of OWA weights that minimizes the instantaneous error eₖ = ½(d̂ₖ − dₖ)² with respect to the weights wᵢ, where d̂ₖ = Σᵢ wᵢ b_ki is the current estimated aggregated value.
17. Learning OWA Operators From Observations
Constraints: wᵢ ∈ [0, 1] for all i, and Σᵢ wᵢ = 1.
Transformation: we shall express the OWA weights as wᵢ = e^(λᵢ) / Σⱼ e^(λⱼ), so that both constraints are satisfied automatically for any values of the parameters λᵢ.
18. Unconstrained Minimization Problem
Minimize the instantaneous error eₖ with respect to the parameters λᵢ.
Using the gradient descent method, we obtain the following rule for updating the parameters λᵢ:
λᵢ(k + 1) = λᵢ(k) − β wᵢ (b_ki − d̂ₖ)(d̂ₖ − dₖ),
where β denotes the learning rate (0 ≤ β ≤ 1).
19. Unconstrained Minimization Problem
Therefore, the parameters λᵢ determining the OWA weights are updated by back-propagating the error (d̂ₖ − dₖ) between the current estimated aggregated value and the actual aggregated value, scaled by the factors wᵢ and (b_ki − d̂ₖ).
These factors are the current OWA weight wᵢ and the difference (b_ki − d̂ₖ) between the i-th reordered argument b_ki and the current estimate d̂ₖ.
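The learning procedure described above can be sketched as follows; the training data here are hypothetical (generated by the max operator, so the true weights are W = (1, 0, 0)), and the function name and hyperparameter values are ours:

```python
import math, random

def learn_owa_weights(samples, targets, beta=0.35, epochs=500):
    """Learn OWA weights from observations by gradient descent.
    Weights are parameterized as w_i = exp(l_i) / sum_j exp(l_j) so the
    constraints w_i >= 0, sum w_i = 1 hold automatically; the parameters
    l_i are updated by back-propagating the instantaneous error."""
    n = len(samples[0])
    lam = [0.0] * n
    for _ in range(epochs):
        for a, d in zip(samples, targets):
            b = sorted(a, reverse=True)                   # reordered arguments
            exps = [math.exp(l) for l in lam]
            s = sum(exps)
            w = [e / s for e in exps]
            d_hat = sum(wi * bi for wi, bi in zip(w, b))  # current estimate
            # update: l_i <- l_i - beta * w_i * (b_i - d_hat) * (d_hat - d)
            for i in range(n):
                lam[i] -= beta * w[i] * (b[i] - d_hat) * (d_hat - d)
    exps = [math.exp(l) for l in lam]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical data generated by the max operator (true W = (1, 0, 0)).
random.seed(0)
samples = [[random.random() for _ in range(3)] for _ in range(40)]
targets = [max(a) for a in samples]
print([round(w, 3) for w in learn_owa_weights(samples, targets)])  # w_1 dominates
```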
23. Learning OWA Weights: Example 2
Applying the learning algorithm:
– Initial values:
– Learning rate:
– After 50 iterations:
24. Exponential OWA Operators
O’Hagan method: involves the solution of a constrained nonlinear programming problem.
Exponential OWA Operators: a very simple relationship exists between the Orness function and the parameter that determines the OWA weights.
Usefulness: generation of OWA weights satisfying a given degree of Orness.
25. Optimistic Exponential OWA Operators
Definition (Optimistic Exponential OWA Operators):
w₁ = α, w₂ = α(1 − α), …, wⱼ = α(1 − α)^(j−1), …, wₙ = (1 − α)^(n−1),
where the parameter α belongs to the unit interval (0 ≤ α ≤ 1).
A recursive expression for these weights: wⱼ₊₁ = (1 − α) wⱼ for j = 1, …, n − 2, and wₙ = wₙ₋₁ (1 − α)/α.
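Under the closed form w₁ = α, wⱼ = α(1 − α)^(j−1) for j < n, wₙ = (1 − α)^(n−1), the weights can be generated directly (function name is ours):

```python
def optimistic_exponential_weights(n, alpha):
    """w_1 = alpha, w_j = alpha * (1 - alpha)**(j - 1) for j < n,
    and w_n = (1 - alpha)**(n - 1), so the weights sum to 1."""
    w = [alpha * (1 - alpha) ** (j - 1) for j in range(1, n)]
    w.append((1 - alpha) ** (n - 1))
    return w

print(optimistic_exponential_weights(4, 0.5))   # [0.5, 0.25, 0.125, 0.125]
```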
26. Optimistic Exponential OWA Operators
An alternative view to the process of generation of the Optimistic
Exponential OWA weights:
Assume W is a weighting vector of dimension n; we extend it to a vector V of dimension n + 1 as follows:
vⱼ = wⱼ for j = 1, …, n − 1, vₙ = α wₙ, and vₙ₊₁ = (1 − α) wₙ.
Thus, in extending the vector, we simply divide the weight in the last place of the old vector proportionally between the last two places of the new one.
27. Optimistic Exponential OWA Operators
Properties: each weight lies in the unit interval (0 ≤ wⱼ ≤ 1) and the weights sum to one (Σⱼ wⱼ = 1).
Thus, from a formal point of view, these weights can serve as OWA weights.
Special cases: α = 1 yields W = (1, 0, …, 0) (Max), and α = 0 yields W = (0, …, 0, 1) (Min).
29. Optimistic Exponential OWA Operators
For the Optimistic Exponential OWA Operators, the Orness function
is a monotonically increasing function of parameter α.
As the number of arguments increases, this aggregation becomes
more and more OR-like.
30. Optimistic Exponential OWA Operators
The functional relationship between the Orness and the parameter α for different numbers of arguments n = 2, 3, …, 10:
– For a fixed n, the relationship is monotonically increasing with respect to α: the Orness of the aggregation increases as α increases.
– For a fixed α, the relationship is monotonically increasing with respect to n: the Orness of the aggregation increases as n increases.
– For n = 2, the Orness of this aggregation is always equal to α.
– For any n > 2, the degree of Orness is higher than the value of the parameter α.
31. Pessimistic Exponential OWA Operators
Definition (Pessimistic Exponential OWA Operators):
w₁ = α^(n−1), and wⱼ = (1 − α) α^(n−j) for j = 2, …, n,
where the parameter α belongs to the unit interval (0 ≤ α ≤ 1).
A recursive expression for these weights: w₂ = w₁ (1 − α)/α, and wⱼ₊₁ = wⱼ/α for j = 2, …, n − 1.
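Under the closed form w₁ = α^(n−1), wⱼ = (1 − α) α^(n−j) for j ≥ 2, the weights can be generated directly (function name is ours):

```python
def pessimistic_exponential_weights(n, alpha):
    """w_1 = alpha**(n - 1), w_j = (1 - alpha) * alpha**(n - j) for j >= 2,
    so the weights sum to 1 and emphasize the smaller arguments."""
    return [alpha ** (n - 1)] + [(1 - alpha) * alpha ** (n - j)
                                 for j in range(2, n + 1)]

print(pessimistic_exponential_weights(4, 0.5))   # [0.125, 0.125, 0.25, 0.5]
```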
32. Pessimistic Exponential OWA Operators
An alternative view to the process of generation of the Pessimistic
Exponential OWA weights:
Assume W is a weighting vector of dimension n; we extend it to a vector V of dimension n + 1 as follows:
v₁ = α w₁, v₂ = (1 − α) w₁, and vⱼ = wⱼ₋₁ for j = 3, …, n + 1.
Thus, in extending the vector, we proportionally divide the weight in the first place of the old vector.
33. Pessimistic Exponential OWA Operators
Properties: each weight lies in the unit interval (0 ≤ wⱼ ≤ 1) and the weights sum to one (Σⱼ wⱼ = 1).
Thus, from a formal point of view, these weights can serve as OWA weights.
Special cases: α = 1 yields W = (1, 0, …, 0) (Max), and α = 0 yields W = (0, …, 0, 1) (Min).
34. Pessimistic Exponential OWA Operators
For the Pessimistic Exponential OWA Operators, the Orness function is a monotonically increasing function of the parameter α and a monotonically decreasing function of the number of arguments n.
As the number of arguments increases, this aggregation becomes more and more AND-like.
35. Pessimistic Exponential OWA Operators
The functional relationship between the Orness and the parameter α for different numbers of arguments n = 2, 3, …, 10:
– For a fixed n, the relationship is monotonically increasing with respect to α: the Orness of the aggregation increases as α increases.
– For a fixed α, the relationship is monotonically decreasing with respect to n: the Orness of the aggregation decreases as n increases.
– For n = 2, the Orness of this aggregation is always equal to α.
– For any n > 2, the degree of Orness is lower than the value of the parameter α.
36. Exponential OWA Operators
The Optimistic and Pessimistic Exponential OWA Operators have one
very useful property:
– Given a value of n and a desired degree of Orness, one can simply
obtain from previous figures the associated value of α. Then,
the OWA weights can easily be generated according to the
corresponding equations.
This simple technique eliminates the need to calculate the OWA weights via O'Hagan's more elaborate procedure, which requires solving a constrained optimization problem.
37. Exponential OWA Operators: An Example
Let us assume the number of aggregate objects is 5. Also, assume
the desired degree of Orness is 0.6.
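Assuming the optimistic exponential weights defined earlier, the value of α can also be found numerically instead of being read off the figures, since the Orness is monotonically increasing in α; a sketch for n = 5 and desired Orness 0.6 (all function names are ours):

```python
def optimistic_weights(n, alpha):
    # w_1 = alpha, w_j = alpha*(1-alpha)**(j-1) for j < n, w_n = (1-alpha)**(n-1)
    return ([alpha * (1 - alpha) ** (j - 1) for j in range(1, n)]
            + [(1 - alpha) ** (n - 1)])

def orness(w):
    n = len(w)
    return sum((n - j) * wj for j, wj in enumerate(w, start=1)) / (n - 1)

def alpha_for_orness(n, target, iters=60):
    """Invert the orness curve numerically: orness is monotonically
    increasing in alpha, so plain bisection on (0, 1) suffices."""
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        mid = (lo + hi) / 2
        if orness(optimistic_weights(n, mid)) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

a = alpha_for_orness(5, 0.6)
w = optimistic_weights(5, a)
print(round(a, 4), [round(x, 4) for x in w], round(orness(w), 4))
```

Note that the recovered α is below 0.6, consistent with the earlier observation that for n > 2 the Orness of the optimistic operator exceeds α.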
38. Exponential OWA Operators: An Example
Let us assume the number of aggregate objects is 5. Also, assume
the desired degree of Orness is 0.9.
39. An Application: Result Fusion in Multi-Agent Systems
In multi-agent systems, each agent may have its own expertise. When the agents are asked to accomplish the same task, their results may differ.
they are asked to accomplish the same task, the results may be different.
In such situations, it is required to fuse the results to obtain a final one.
This paper [Z. Zhang and C. Zhang, “Result Fusion in Multi-Agent Systems
Based on OWA Operator”, 23rd Australasian Computer Science Conference
(ACSC 2000), 2000]:
– proposes an information-fusion approach for when multiple agents are asked to collect information for the same query from different resources.
– discusses the group-decision fusion problem, in which several agents with similar knowledge make decisions using the same information.
40. An Application: Result Fusion in Multi-Agent Systems
Application domain:
– Financial Investment Planner Using Intelligent Agent Technologies
Sub-systems:
– User interface and result visualization agents
– Decision-making agents
– Information retrieval and information filter agents
– Data mining agents