Practical representations of probability sets: a
guided tour with applications
Sébastien Destercke
in collaboration with E. Miranda, I. Montes, M. Troffaes, D.
Dubois, O. Strauss, C. Baudrit, P.H. Wuillemin.
CNRS researcher, Laboratoire Heudiasyc, Compiègne
Madrid Seminar
Prac Rep 1
Introduction Basics Practical Representations Applications
Plan
- Introduction
- Basics of imprecise probabilities
- A tour of practical representations
- Illustrative applications
Where is Compiègne?

[Figure: map locating Compiègne.]
Heudiasyc and LABEX MS2T activities
Heudiasyc
- 140 members
- 6M budget
- 4 teams:
  - Uncertainty and machine learning
  - Automatic control and robotics
  - Artificial intelligence
  - Operational research and networks

LABEX MS2T
- Topic: systems of systems
- 3 laboratories:
  - Heudiasyc
  - BMBI: biomechanics
  - Roberval: mechanics

If interested in collaborations, let me know.
Talk in a nutshell
What is this talk about?
1. (very) Basics of imprecise probability
2. A review of practical representations
3. Some applications

What is this talk not about?
- Deep mathematics of imprecise probabilities (you can ask Nacho or Quique)
- Imprecise parametric models
Imprecise probabilities
What?
Representing uncertainty as a convex set P of probabilities rather than a
single one
Why?
- precise probabilities are inadequate to model lack of information;
- it generalizes both set-valued and probabilistic uncertainty;
- it can model situations where probabilistic information is partial;
- it axiomatically allows alternatives to remain incomparable.
Probabilities
A probability mass function on a finite space X = {x_1, ..., x_n} is equivalent
to an n-dimensional vector
    p := (p(x_1), ..., p(x_n))
restricted to the set P_X of all probabilities, i.e.
    p(x) ≥ 0 for all x ∈ X, and \sum_{x∈X} p(x) = 1.
The set P_X is the (n−1)-unit simplex.
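The two conditions above are trivial to check mechanically; a minimal sketch in Python (the helper name `in_simplex` is ours, not from the talk):

```python
def in_simplex(p, tol=1e-9):
    """True iff p is a probability vector: p(x) >= 0 for all x and sum(p) = 1."""
    return all(pi >= -tol for pi in p) and abs(sum(p) - 1.0) <= tol

print(in_simplex([0.2, 0.5, 0.3]))   # True: the point used on the next slide
print(in_simplex([0.6, 0.5, -0.1]))  # False: negative mass
```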
Point in unit simplex
Example: p(x_1) = 0.2, p(x_2) = 0.5, p(x_3) = 0.3

[Figure: this point plotted in the unit simplex, with barycentric coordinates
proportional to p(x_1), p(x_2), p(x_3).]
Imprecise probability
Set P defined by a set of n constraints of the form
    \underline{E}(f_i) ≤ \sum_{x∈X} f_i(x) p(x) ≤ \overline{E}(f_i)
where the f_i : X → R are bounded functions.

Example
    2 p(x_2) − p(x_3) ≥ 0,
obtained with f(x_1) = 0, f(x_2) = 2, f(x_3) = −1, \underline{E}(f) = 0.

Lower/upper probabilities
Bounds \underline{P}(A), \overline{P}(A) on an event A are equivalent to
    \underline{P}(A) ≤ \sum_{x∈A} p(x) ≤ \overline{P}(A)
Set P example
    2 p(x_2) − p(x_3) ≥ 0

[Figure: the part of the unit simplex satisfying this constraint.]
Credal set example
    2 p(x_2) − p(x_3) ≥ 0
    2 p(x_1) − p(x_2) − p(x_3) ≥ 0

[Figure: the credal set P obtained by intersecting the unit simplex with these
two constraints.]
Natural extension
From an initial set P defined by constraints, we can compute
- the lower expectation \underline{E}(g) of any function g as
    \underline{E}(g) = inf_{p∈P} E_p(g)
- the lower probability \underline{P}(A) of any event A as
    \underline{P}(A) = inf_{p∈P} p(A)
Some usual problems
- Computing \underline{E}(g) = inf_{p∈P} E_p(g) for a new function g
- Updating: P(θ|x) ∝ L(x|θ) P(θ)
- Computing the conditional \underline{E}(f|A)
- Simulating/sampling from P
- Building a joint model over variables X_1, ..., X_n
These can be difficult to perform in general → practical representations reduce
the computational cost.
What makes a representation "practical"
- A reasonable, algorithmically enumerable number of extreme points

Reminder: p ∈ P is extreme iff p = λp_1 + (1−λ)p_2 with λ ∈ (0,1) implies
p_1 = p_2 = p. We denote by E(P) the set of extreme points of P.

- An n-monotonicity property of \underline{P}:

2-monotonicity (super-modularity, convexity):
    \underline{P}(A ∪ B) + \underline{P}(A ∩ B) ≥ \underline{P}(A) + \underline{P}(B) for all A, B ⊆ X

∞-monotonicity:
    \underline{P}(∪_{i=1}^n A_i) ≥ \sum_{∅≠A⊆{A_1,...,A_n}} (−1)^{|A|+1} \underline{P}(∩_{A_i∈A} A_i)
    for all A_1, ..., A_n ⊆ X and n > 0
Extreme points: illustration
- p = (1, 0, 0)
- p = (0, 1, 0)
- p = (0.25, 0.25, 0.5)

[Figure: these three points drawn in the simplex as the vertices of a credal set.]
Extreme points: utility
- Computing \underline{E}(g) → minimize E_p(g) over the extreme points
- Updating → update the extreme points, take the convex hull
- Conditional \underline{E}(f|A) → minimize E_p(f|A) over the extreme points
- Simulating P → take convex mixtures of extreme points
- Joint over variables X_1, ..., X_n → convex hull of the joint extreme points
Again, this works when the number of extreme points is limited, or when an inner
approximation (by sampling) is acceptable.
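For instance, the first bullet becomes a one-liner once E(P) is available. A sketch (the function name is ours), reusing the three points of the previous illustration and the function f with f(x_1) = 0, f(x_2) = 2, f(x_3) = −1 from the earlier example:

```python
def lower_expectation(ext_points, g):
    """E_low(g) = minimum over extreme points p of the expectation sum_x g(x) p(x)."""
    return min(sum(gx * px for gx, px in zip(g, p)) for p in ext_points)

# Extreme points from the illustration, function f as above
ext = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.25, 0.25, 0.5)]
f = (0.0, 2.0, -1.0)
print(lower_expectation(ext, f))  # 0.0, attained at (1,0,0) and (0.25,0.25,0.5)
```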
2-monotonicity
Computing \underline{E}(g): the Choquet integral
    \underline{E}(g) = inf g + \int_{inf g}^{sup g} \underline{P}({g ≥ t}) dt
In finite spaces → sort the n values of g and compute \underline{P}(A) for n events.

Conditioning
    \underline{P}(A|B) = \underline{P}(A∩B) / (\underline{P}(A∩B) + \overline{P}(A^c∩B))
and \underline{P}(·|B) remains 2-monotone (can be used to get \underline{E}(f|A)).
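In finite spaces the Choquet integral is exactly the sort-and-sum just described. A sketch (names ours), evaluated on the lower probability \underline{P}(A) = (1 − ε) P_0(A) for A ≠ X, which is ∞-monotone hence 2-monotone, so the integral gives the exact lower expectation:

```python
from itertools import combinations

def choquet_lower(space, g, low):
    """Discrete Choquet integral of g w.r.t. the lower probability `low`
    (a dict from frozensets to values): inf g plus the level-set increments."""
    vals = sorted(set(g.values()))
    total = vals[0]  # low(full space) = 1 contributes inf g
    for prev, v in zip(vals, vals[1:]):
        level = frozenset(x for x in space if g[x] >= v)
        total += (v - prev) * low[level]
    return total

# Linear-vacuous lower probability: low(A) = (1 - eps) * P0(A) for A != X
space, eps = ("x1", "x2", "x3"), 0.2
p0 = {"x1": 0.5, "x2": 0.3, "x3": 0.2}
low = {frozenset(A): (1 - eps) * sum(p0[x] for x in A)
       for r in range(len(space) + 1) for A in combinations(space, r)}
low[frozenset(space)] = 1.0
g = {"x1": 0.0, "x2": 2.0, "x3": -1.0}
print(round(choquet_lower(space, g, low), 6))  # 0.12
```

The result agrees with the closed form of this model, (1 − ε) E_{P0}(g) + ε inf g = 0.8 · 0.4 − 0.2 = 0.12.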
∞-monotonicity
If \underline{P} is ∞-monotone, its Möbius inverse m : 2^X → R,
    m(A) = \sum_{B⊆A} (−1)^{|A∖B|} \underline{P}(B),
is non-negative and sums up to one; \underline{P} is then often called a belief function.

Simulating P
Sample a set A with probability m(A) and work with the associated set A.

Joint model of X_1, ..., X_N
If m_1, m_2 correspond to the inverses on X_1, X_2, consider the joint m_{12} s.t.
    m_{12}(A × B) = m_1(A) · m_2(B)
- still ∞-monotone
- outer-approximates other definitions of independence between P_1, P_2
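The Möbius inversion can be checked on the same linear-vacuous capacity used earlier, whose mass function is known in closed form: (1 − ε) p_0(x) on each singleton and ε on X. A sketch (names ours):

```python
from itertools import combinations

def mobius(space, low):
    """m(A) = sum over B subset of A of (-1)^{|A \\ B|} * low(B)."""
    m = {}
    for r in range(len(space) + 1):
        for A in combinations(space, r):
            A = frozenset(A)
            m[A] = sum((-1) ** (len(A) - s) * low[frozenset(B)]
                       for s in range(len(A) + 1)
                       for B in combinations(sorted(A), s))
    return m

# Linear-vacuous lower probability, an infinity-monotone capacity
space, eps = ("x1", "x2", "x3"), 0.2
p0 = {"x1": 0.5, "x2": 0.3, "x3": 0.2}
low = {frozenset(A): (1 - eps) * sum(p0[x] for x in A)
       for r in range(len(space) + 1) for A in combinations(space, r)}
low[frozenset(space)] = 1.0
m = mobius(space, low)
print(round(m[frozenset({"x1"})], 6), round(m[frozenset(space)], 6))  # 0.4 0.2
```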
2-monotonicity and extreme points [3]
Generating extreme points when \underline{P} is 2-monotone:
1. Pick a permutation σ : [1,n] → [1,n] of X
2. Consider the sets A^σ_i = {x_{σ(1)}, ..., x_{σ(i)}}
3. Define P^σ({x_{σ(i)}}) = \underline{P}(A^σ_i) − \underline{P}(A^σ_{i−1}) for i = 1, ..., n (with A^σ_0 = ∅)
4. Then P^σ ∈ E(P)

Some comments
- The maximal value of |E(P)| is n!
- We can have P^{σ_1} = P^{σ_2} with σ_1 ≠ σ_2 → |E(P)| is often less than n!
Example
- X = {x_1, x_2, x_3}
- σ(1) = 2, σ(2) = 3, σ(3) = 1
- A^σ_0 = ∅, A^σ_1 = {x_2}, A^σ_2 = {x_2, x_3}, A^σ_3 = X
- P^σ({x_{σ(1)}}) = P^σ({x_2}) = \underline{P}({x_2}) − \underline{P}(∅) = \underline{P}({x_2})
- P^σ({x_{σ(2)}}) = P^σ({x_3}) = \underline{P}({x_2, x_3}) − \underline{P}({x_2})
- P^σ({x_{σ(3)}}) = P^σ({x_1}) = \underline{P}(X) − \underline{P}({x_2, x_3}) = 1 − \underline{P}({x_2, x_3})
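Running the procedure over all permutations can be sketched as follows (names ours), on a linear-vacuous capacity with P_0 = (0.5, 0.3, 0.2) and ε = 0.2; its three expected extreme points are (1 − ε) p_0 plus ε added to one state, and indeed only 3 of the 3! = 6 permutations give distinct points:

```python
from itertools import combinations, permutations

def extreme_points(space, low):
    """Chateauneuf-Jaffray generation: for each permutation sigma,
    p(x_sigma(i)) = low(A_i) - low(A_{i-1}) with A_i the first i states of sigma."""
    pts = set()
    for sigma in permutations(space):
        p, prev = {}, frozenset()
        for i, x in enumerate(sigma, start=1):
            cur = frozenset(sigma[:i])
            p[x] = low[cur] - low[prev]
            prev = cur
        pts.add(tuple(round(p[x], 9) for x in space))
    return pts

# Linear-vacuous lower probability: low(A) = (1 - eps) * P0(A) for A != X
space, eps = ("x1", "x2", "x3"), 0.2
p0 = {"x1": 0.5, "x2": 0.3, "x3": 0.2}
low = {frozenset(A): (1 - eps) * sum(p0[x] for x in A)
       for r in range(len(space) + 1) for A in combinations(space, r)}
low[frozenset(space)] = 1.0
print(sorted(extreme_points(space, low)))  # 3 distinct points
```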
Plan
- Introduction
- Basics of imprecise probabilities
- A tour of practical representations
  - Basics
  - Possibility distributions
  - P-boxes
  - Probability intervals
  - Elementary comparative probabilities
- Illustrative applications
Basics Possibility distributions P-boxes Probability intervals Elem. Compa.
Two very basic models
Precise probability
- \underline{P}({x_i}) = \overline{P}({x_i}) = P({x_i})
- ∞-monotone, n constraints, |E(P)| = 1

Vacuous model P_X
Only the support X of the probability is known:
- \underline{P}(X) = 1
- ∞-monotone, 1 constraint, |E(P)| = n (the Dirac distributions)

Easily extends to the vacuous model on a set A (can be used in robust
optimisation, decision under risk, interval analysis).
A concise graph
[Diagram: the representations covered in this tour, ordered by generality. Proba
and Vacuous are special cases of Linear-vacuous and Pari-mutuel; Possibilities,
P-boxes, Prob. intervals and Comparative assessments follow, grouped into
∞-monotone and 2-monotone families. An arrow "Model A → Model B" reads "A is a
special case of B".]
Neighbourhood models
Build a neighbourhood around a given probability P_0.

Linear-vacuous / ε-contamination
- \underline{P}(A) = (1 − ε) P_0(A) + ε \underline{P}_X(A)
- ∞-monotone, n+1 constraints, |E(P)| = n
- ε ∈ [0,1]: unreliability of the information P_0

Pari-mutuel [16]
- \underline{P}(A) = max{(1 + ε) P_0(A) − ε, 0}
- 2-monotone, n+1 constraints, |E(P)| = ? (n?)
- ε ∈ [0,1]: unreliability of the information P_0

Other models exist, such as odds-ratio or distance-based ones (all q s.t.
d(p, q) < δ) → often not attractive in terms of |E(P)|/monotonicity, but they
may have nice properties (odds-ratio: updating; squared/log distances: convex
continuous neighbourhoods).
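Both neighbourhood lower probabilities are one-liners; a sketch (function names are ours) evaluated on the P_0 and ε of the illustration that follows:

```python
def linvac_low(A, p0, eps):
    """Linear-vacuous lower probability: (1 - eps) * P0(A), and 1 on the whole space."""
    if set(A) == set(p0):
        return 1.0
    return (1 - eps) * sum(p0[x] for x in A)

def parimutuel_low(A, p0, eps):
    """Pari-mutuel lower probability: max((1 + eps) * P0(A) - eps, 0)."""
    return max((1 + eps) * sum(p0[x] for x in A) - eps, 0.0)

p0, eps = {"x1": 0.5, "x2": 0.3, "x3": 0.2}, 0.2
print(linvac_low(["x1"], p0, eps), parimutuel_low(["x1"], p0, eps))  # both ~0.4
print(linvac_low(["x3"], p0, eps), parimutuel_low(["x3"], p0, eps))  # ~0.16 vs ~0.04
```

Note how the pari-mutuel model is much more conservative on low-probability events, which is visible in the simplex picture below.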
Illustration
[Figure: the pari-mutuel and linear-vacuous credal sets drawn in the simplex for
P_0 = (0.5, 0.3, 0.2) and ε = 0.2.]
Possibility distributions [10]
Definition
A distribution π : X → [0,1] with π(x) = 1 for at least one x. The induced lower
probability is
    \underline{P}(A) = min_{x∈A^c} (1 − π(x)),
which is a necessity measure.

Characteristics of P
- Requires at most n values
- \underline{P} is an ∞-monotone measure
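Computing the induced lower probability only needs the values of π outside A; a sketch with a hypothetical three-state distribution (names ours):

```python
def necessity(A, space, pi):
    """N(A) = min over x in the complement of A of (1 - pi(x)); N(X) = 1."""
    outside = [x for x in space if x not in A]
    return 1.0 if not outside else min(1.0 - pi[x] for x in outside)

space = ("x1", "x2", "x3")
pi = {"x1": 1.0, "x2": 0.6, "x3": 0.2}  # normalized: pi(x1) = 1
print(necessity(("x1",), space, pi))        # 0.4
print(necessity(("x1", "x2"), space, pi))   # 0.8
print(necessity(("x2",), space, pi))        # 0.0, since pi(x1) = 1 lies outside
```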
Possibility distributions
Alternative definition
Provide nested events
    A_1 ⊆ ... ⊆ A_n
together with lower confidence bounds
    \underline{P}(A_i) = α_i, with α_{i+1} ≥ α_i.

Extreme points [19]
- The maximum number is 2^{n−1}
- Algorithm using the nested structure of the sets A_i
A basic distribution: simple support
- A set E of most plausible values
- A confidence degree α = \underline{P}(E)

Extends to multiple sets E_1, ..., E_p → confidence degrees over nested sets [18].

Example: pH value ∈ [4.5, 5.5] with α = 0.8 (∼ "quite probable").

[Figure: the resulting possibility distribution π, equal to 1 on [4.5, 5.5] and
to 1 − α = 0.2 on the rest of the scale from 3 to 7.]
Partially specified probabilities [1] [8]
The triangular distribution with mode M and support [a, b] encompasses all
probabilities with
- mode/reference value M
- support domain [a, b].

Getting back to the pH example: M = 5, [a, b] = [3, 7].

[Figure: triangular possibility distribution π over pH, peaking at 1 for pH = 5
and reaching 0 at pH = 3 and pH = 7.]
Normalized likelihood as possibilities [9] [2]
    π(θ) = L(θ|x) / max_{θ*∈Θ} L(θ*|x)

Binomial situation:
- θ = success probability
- x = number of observed successes
- x = 4 successes out of 11
- x = 20 successes out of 55

[Figure: the two normalized likelihoods over θ, both peaking at θ = 4/11; the
one built from 55 observations is narrower.]
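For the binomial case the construction is direct; a sketch (the function name is ours) reproducing the two settings above, which also shows that more observations concentrate the distribution:

```python
def binomial_possibility(theta, k, n):
    """pi(theta) = L(theta | k successes out of n) / L(theta_mle)."""
    def lik(t):
        return t ** k * (1 - t) ** (n - k)
    return lik(theta) / lik(k / n)

print(binomial_possibility(4 / 11, 4, 11))             # 1.0: the mode of pi
print(round(binomial_possibility(0.5, 4, 11), 3))      # plausibility of theta = 0.5
print(round(binomial_possibility(0.5, 20, 55), 3))     # smaller: 55 observations narrow pi
```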
Other examples
- Statistical inequalities (e.g., Chebyshev's inequality) [8]
- Linguistic information (fuzzy sets) [5]
- Approaches based on nested models
P-boxes [6]
Definition
When X is ordered, bounds on events of the kind
    A_i = {x_1, ..., x_i},
each bounded by
    \underline{F}(x_i) ≤ P(A_i) ≤ \overline{F}(x_i).

[Figure: staircase lower and upper cumulative bounds over x_1, ..., x_7.]

Characteristics of P
- Requires at most 2n values
- \underline{P} is an ∞-monotone measure
In general
Definition
A set of nested events
    A_1 ⊆ ... ⊆ A_n,
each bounded by
    α_i ≤ P(A_i) ≤ β_i.

Extreme points [15]
- At most equal to the Pell number K_n = 2K_{n−1} + K_{n−2}
- Algorithm based on a tree-structure construction
P-box on reals [11]
A pair [\underline{F}, \overline{F}] of cumulative distributions, i.e. bounds
over events [−∞, x]:
- percentiles given by experts;
- Kolmogorov-Smirnov bounds.
Can be extended to any pre-ordered space [6], [21] ⇒ multivariate spaces!

Expert providing percentiles:
    0 ≤ P([−∞, 12]) ≤ 0.2
    0.2 ≤ P([−∞, 24]) ≤ 0.4
    0.6 ≤ P([−∞, 36]) ≤ 0.8

[Figure: the resulting staircase p-box, delimiting regions E_1, ..., E_5.]
Probability intervals [4]
Definition
On elements {x_1, ..., x_n}, each mass is bounded:
    p(x_i) ∈ [\underline{p}(x_i), \overline{p}(x_i)]

Characteristics of P
- Requires at most 2n values
- \underline{P} is a 2-monotone measure

Extreme points [4]
- Specific algorithm to extract them
- If n is even, the maximum number is \binom{n+1}{n/2} \frac{n}{2}
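For reachable probability intervals, the lower probability of any event has a closed form (de Campos et al. [4]): the larger of the lower masses inside A and one minus the upper masses outside A. A sketch with hypothetical bounds (names ours):

```python
def interval_low(A, low, up):
    """Lower probability of A: max(sum of low on A, 1 - sum of up outside A)."""
    inside = sum(low[x] for x in A)
    outside = sum(up[x] for x in up if x not in A)
    return max(inside, 1.0 - outside)

# Hypothetical reachable intervals on three states
low = {"x1": 0.1, "x2": 0.2, "x3": 0.3}
up = {"x1": 0.4, "x2": 0.5, "x3": 0.6}
print(interval_low(("x1", "x2"), low, up))  # max(0.3, 1 - 0.6) = 0.4
```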
Probability intervals: example
Linguistic assessment → numerical translation:
- x is very probable         → p(x) ≥ 0.75
- x has a good chance        → 0.4 ≤ p(x) ≤ 0.85
- x is very unlikely         → p(x) ≤ 0.25
- x's probability is about α → α − 0.1 ≤ p(x) ≤ α + 0.1
Comparative probabilities
Definition
Comparative probabilities on X: assessments
    P(A) ≥ P(B),
read "event A is at least as probable as event B".

Some comments
- studied from the axiomatic point of view [13, 20]
- few studies on their numerical aspects [17]
- interesting for qualitative uncertainty modelling/representation, expert
  elicitation, ...
A specific case: elementary comparisons [14]
Elementary comparisons
Comparative probability orderings of the states X = {x_1, ..., x_n}, in the form
of a subset L of {1, ..., n} × {1, ..., n}. The set of probability measures
compatible with this information is
    P(L) = {p ∈ P_X | ∀(i,j) ∈ L, p(x_i) ≥ p(x_j)}.
Why focus on this case?

Practical interest
- multinomial models (e.g., imprecise priors for the Dirichlet model),
  modal-value elicitation
- direct extension to define imprecise belief functions

Easy to represent/manipulate
- through a graph G = (X, L) with the states as nodes and the relation L as edges
- Example: given X = {x_1, ..., x_5} and L = {(1,3), (1,4), (2,5), (4,5)}, the
  associated graph G has edges x_1 → x_3, x_1 → x_4, x_2 → x_5 and x_4 → x_5.
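Membership in P(L) is a handful of comparisons; a sketch using the relation L from the example above (`compatible` is our name; indices are 1-based as on the slide):

```python
def compatible(p, L):
    """True iff the probability vector p satisfies p(x_i) >= p(x_j) for all (i, j) in L."""
    return all(p[i - 1] >= p[j - 1] for (i, j) in L)

L = {(1, 3), (1, 4), (2, 5), (4, 5)}
print(compatible((0.2, 0.2, 0.2, 0.2, 0.2), L))  # True: uniform satisfies any L
print(compatible((0.1, 0.3, 0.3, 0.2, 0.1), L))  # False: p(x1) < p(x3)
```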
Some properties
Characteristics of P
- Requires at most n² values
- No guarantee that \underline{P} is a 2-monotone measure

Extreme points [14]
- Algorithm identifying subsets of disconnected nodes
- The maximal number is 2^{n−1}
A concise list (acc. to my knowledge)
Name         | Monot. | Max. #constraints | Max. |E(P)|          | Algo. to get E(P)
-------------|--------|-------------------|----------------------|------------------
Proba        | ∞      | n                 | 1                    | Yes
Vacuous      | ∞      | 1                 | n                    | Yes
2-monotone   | 2      | 2^n               | n!                   | Yes [3]
∞-monotone   | ∞      | 2^n               | n!                   | No
Lin-vacuous  | ∞      | n+1               | n                    | Yes
Pari-mutuel  | 2      | n+1               | ? (n?)               | No
Possibility  | ∞      | n                 | 2^{n−1}              | Yes [19]
P-box (gen.) | ∞      | 2n                | K_n (Pell)           | Yes [15]
Prob. int.   | 2      | 2n                | \binom{n+1}{n/2} n/2 | Yes [4]
Elem. Compa. | ×      | n²                | 2^{n−1}              | Yes [14]
Some open questions
- study the numerical aspects of comparative probabilities with numbers/general
  events;
- study the potential link between possibilities and elementary comparative
  probabilities (they share the same number of extreme points and induce an
  ordering between states);
- study restricted bounds/information over specific families of events other
  than nested/elementary ones (e.g., events of at most k states);
- look at probability sets induced by bounding specific distances to p_0, in
  particular the L_1, L_2, L_∞ norms.
Plan
- Introduction
- Basics of imprecise probabilities
- A tour of practical representations
- Illustrative applications
  - Numerical signal processing [7]
  - Camembert ripening [12]
Numerical filtering Camembert ripening
Signal processing: introduction
- A filter is characterized by its impulse response μ.
- Filtering: convolving the kernel μ with the observed signal f(x).
Link with probability
- If μ is positive and ∫ μ(x) dx = 1, then μ is equivalent to a probability
  density.
- Convolution then amounts to computing a mathematical expectation E_μ(f).
- Numerical filtering: discretize (sample) μ and f, with μ(x_i) ≥ 0 and
  \sum_{x_i} μ(x_i) = 1.

[Figure: a signal f and a kernel μ, in continuous and sampled form.]
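With a discretized kernel that is non-negative and sums to one, each filtered sample is literally an expectation of the signal under the shifted kernel. A minimal sketch (names ours; replicating border samples is our choice, not necessarily the talk's):

```python
def filter_at(signal, kernel, i):
    """Filtered value at index i: expectation of the signal under the centred kernel."""
    c = len(kernel) // 2
    n = len(signal)
    return sum(w * signal[min(max(i + k - c, 0), n - 1)]
               for k, w in enumerate(kernel))

kernel = [0.25, 0.5, 0.25]          # a pmf: non-negative, sums to 1
signal = [0.0, 0.0, 1.0, 0.0, 0.0]
print([filter_at(signal, kernel, i) for i in range(len(signal))])
# [0.0, 0.25, 0.5, 0.25, 0.0]
```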
Which bandwidth?
[Figure: the kernel μ drawn with an unspecified bandwidth Δ.]

Which Δ should we pick?
→ use imprecise probabilistic models to represent sets of bandwidths
→ possibilities/p-boxes with sets centred around x
Example on simulated signal
[Figure: a simulated signal (amplitude vs. time in msec) together with the
maxitive upper/lower envelopes and the cloudy upper/lower envelopes of the
filtered signal.]
Application: pepper/salt noise removal
[Figure: original image and salt-and-pepper noisy image.]
Results
Zoom on two parts of the image, comparing CWMF, ROAD and our method.
[Figure: side-by-side denoising results.]
Motivations
A complex system
The Camembert-type cheese ripening process: cheese making, then ripening from
day 1 to day 14 (∼13°C, ∼95% humidity), then warehouse storage (∼4°C). The
quantity of interest is the probability of the day-14 state given the initial
one.
- Multi-scale modelling: from microbial activities to sensory properties
- Dynamic probabilistic model
- Knowledge is fragmented, heterogeneous and incomplete
- Difficult to learn precise model parameters
Use of ε-contamination for a robustness analysis of the model.
Experiments
The network
[Figure: a dynamic credal network where, in each time slice t, the temperature
T(t) influences the variables Km(t) and lo(t), which in turn influence Km(t+1)
and lo(t+1) in the next slice.]

Unrolled over 14 time steps (days): T(1), Km(1), lo(1), T(2), Km(2), lo(2), ...,
Km(14), lo(14), covering cheese making, ripening (∼13°C, ∼95% humidity) and
warehouse storage (∼4°C).
Propagation results
Forward propagation with T(t) = 12°C for all t ∈ [1, τ] (average ripening room
temperature): compute
    Ext{Km(t) | Km(1), lo(1), T(1), ..., T(τ)}
    Ext{lo(t) | Km(1), lo(1), T(1), ..., T(τ)}
[Figure: resulting bounds, without and with added physical constraints.]
Conclusions
Use of practical representations:
- +: "easy" robustness analysis of precise methods, or approximation of
  imprecise ones
- +: allows experts to express imprecision or partial information
- +: often easier to explain/represent than general models
- −: usually focuses on specific events
- −: their form may not be preserved by information processing
References I
[1] C. Baudrit and D. Dubois.
Practical representations of incomplete probabilistic knowledge.
Computational Statistics and Data Analysis, 51(1):86–108, 2006.
[2] M. Cattaneo.
Likelihood-based statistical decisions.
In Proc. 4th International Symposium on Imprecise Probabilities and
Their Applications, pages 107–116, 2005.
[3] A. Chateauneuf and J.-Y. Jaffray.
Some characterizations of lower probabilities and other monotone
capacities through the use of Möbius inversion.
Mathematical Social Sciences, 17(3):263–283, 1989.
References II
[4] L. de Campos, J. Huete, and S. Moral.
Probability intervals: a tool for uncertain reasoning.
Int. J. of Uncertainty, Fuzziness and Knowledge-Based Systems,
2:167–196, 1994.
[5] G. de Cooman and P. Walley.
A possibilistic hierarchical model for behaviour under uncertainty.
Theory and Decision, 52:327–374, 2002.
[6] S. Destercke, D. Dubois, and E. Chojnacki.
Unifying practical uncertainty representations: I generalized
p-boxes.
Int. J. of Approximate Reasoning, 49:649–663, 2008.
References III
[7] S. Destercke and O. Strauss.
Filtering with clouds.
Soft Computing, 16(5):821–831, 2012.
[8] D. Dubois, L. Foulloy, G. Mauris, and H. Prade.
Probability-possibility transformations, triangular fuzzy sets, and
probabilistic inequalities.
Reliable Computing, 10:273–297, 2004.
[9] D. Dubois, S. Moral, and H. Prade.
A semantics for possibility theory based on likelihoods.
Journal of Mathematical Analysis and Applications, 205(2):359–380, 1997.
References IV
[10] D. Dubois and H. Prade.
Practical methods for constructing possibility distributions.
International Journal of Intelligent Systems, 31(3):215–239, 2016.
[11] S. Ferson, L. Ginzburg, V. Kreinovich, D. Myers, and K. Sentz.
Constructing probability boxes and Dempster-Shafer structures.
Technical report, Sandia National Laboratories, 2003.
[12] M. Hourbracq, C. Baudrit, P.-H. Wuillemin, and S. Destercke.
Dynamic credal networks: introduction and use in robustness
analysis.
In Proceedings of the Eighth International Symposium on Imprecise
Probability: Theories and Applications, pages 159–169, 2013.
References V
[13] B. O. Koopman.
The axioms and algebra of intuitive probability.
Annals of Mathematics, pages 269–292, 1940.
[14] E. Miranda and S. Destercke.
Extreme points of the credal sets generated by comparative
probabilities.
Journal of Mathematical Psychology, 64:44–57, 2015.
[15] I. Montes and S. Destercke.
On extreme points of p-boxes and belief functions.
In Int. Conf. on Soft Methods in Probability and Statistics (SMPS),
2016.
References VI
[16] R. Pelessoni, P. Vicig, and M. Zaffalon.
Inference and risk measurement with the pari-mutuel model.
International journal of approximate reasoning, 51(9):1145–1158,
2010.
[17] G. Regoli.
Comparative probability orderings.
Technical report, Society for Imprecise Probabilities: Theories and
Applications, 1999.
[18] S. Sandri, D. Dubois, and H. Kalfsbeek.
Elicitation, assessment and pooling of expert judgments using
possibility theory.
IEEE Trans. on Fuzzy Systems, 3(3):313–335, August 1995.
References VII
[19] G. Schollmeyer.
On the number and characterization of the extreme points of the
core of necessity measures on finite spaces.
ISIPTA conference, 2016.
[20] P. Suppes, G. Wright, and P. Ayton.
Qualitative theory of subjective probability.
Subjective probability, pages 17–38, 1994.
[21] M. C. M. Troffaes and S. Destercke.
Probability boxes on totally preordered spaces for multivariate
modelling.
Int. J. Approx. Reasoning, 52(6):767–791, 2011.
NRDR 2.0 Tutorial - The Non-Coding RNA Databases Resource Tutorial
Vinicius Coutinho
 
Video door phone
Video door phoneVideo door phone
Video door phone
De Cocoon
 
3 1 tipos de conhecimento
3 1 tipos de conhecimento3 1 tipos de conhecimento
3 1 tipos de conhecimento
Arnaldo Aguiar
 
Startup agency
Startup agencyStartup agency
Startup agency
Phong Duy
 
1. pengetian ai
1. pengetian ai1. pengetian ai
1. pengetian ai
Triodi Freed
 
Serendib tea
Serendib teaSerendib tea
Serendib tea
Serendib Tea
 
miRQuest - How to use Tutorial
miRQuest - How to use TutorialmiRQuest - How to use Tutorial
miRQuest - How to use Tutorial
Vinicius Coutinho
 
Information Fusion - Reconciliation data workshop
Information Fusion - Reconciliation data workshopInformation Fusion - Reconciliation data workshop
Information Fusion - Reconciliation data workshop
Sebastien Destercke
 
Pertemuan 1
Pertemuan 1Pertemuan 1
Pertemuan 1
Fandy Barestu
 
On combination and conflict - Belief function school lecture
On combination and conflict - Belief function school lectureOn combination and conflict - Belief function school lecture
On combination and conflict - Belief function school lecture
Sebastien Destercke
 
Imprecise probability theory - Summer School 2014
Imprecise probability theory - Summer School 2014Imprecise probability theory - Summer School 2014
Imprecise probability theory - Summer School 2014
Sebastien Destercke
 
Cara kerja mesin fax
Cara kerja mesin faxCara kerja mesin fax
Cara kerja mesin faxguntarii
 
Cctv surveillance system
Cctv surveillance systemCctv surveillance system
Cctv surveillance system
De Cocoon
 
Serendib tea
Serendib teaSerendib tea
Serendib tea
Serendib Tea
 

Viewers also liked (18)

παραδείγματα σύννεφα ετικετών
παραδείγματα  σύννεφα ετικετώνπαραδείγματα  σύννεφα ετικετών
παραδείγματα σύννεφα ετικετών
 
Home security system
Home security systemHome security system
Home security system
 
Tutorial SUM 2012: some of the things you wanted to know about uncertainty (b...
Tutorial SUM 2012: some of the things you wanted to know about uncertainty (b...Tutorial SUM 2012: some of the things you wanted to know about uncertainty (b...
Tutorial SUM 2012: some of the things you wanted to know about uncertainty (b...
 
NRDR 2.0 Tutorial - The Non-Coding RNA Databases Resource Tutorial
NRDR 2.0 Tutorial - The Non-Coding RNA Databases Resource TutorialNRDR 2.0 Tutorial - The Non-Coding RNA Databases Resource Tutorial
NRDR 2.0 Tutorial - The Non-Coding RNA Databases Resource Tutorial
 
Video door phone
Video door phoneVideo door phone
Video door phone
 
גליל ים
גליל יםגליל ים
גליל ים
 
3 1 tipos de conhecimento
3 1 tipos de conhecimento3 1 tipos de conhecimento
3 1 tipos de conhecimento
 
Startup agency
Startup agencyStartup agency
Startup agency
 
1. pengetian ai
1. pengetian ai1. pengetian ai
1. pengetian ai
 
Serendib tea
Serendib teaSerendib tea
Serendib tea
 
miRQuest - How to use Tutorial
miRQuest - How to use TutorialmiRQuest - How to use Tutorial
miRQuest - How to use Tutorial
 
Information Fusion - Reconciliation data workshop
Information Fusion - Reconciliation data workshopInformation Fusion - Reconciliation data workshop
Information Fusion - Reconciliation data workshop
 
Pertemuan 1
Pertemuan 1Pertemuan 1
Pertemuan 1
 
On combination and conflict - Belief function school lecture
On combination and conflict - Belief function school lectureOn combination and conflict - Belief function school lecture
On combination and conflict - Belief function school lecture
 
Imprecise probability theory - Summer School 2014
Imprecise probability theory - Summer School 2014Imprecise probability theory - Summer School 2014
Imprecise probability theory - Summer School 2014
 
Cara kerja mesin fax
Cara kerja mesin faxCara kerja mesin fax
Cara kerja mesin fax
 
Cctv surveillance system
Cctv surveillance systemCctv surveillance system
Cctv surveillance system
 
Serendib tea
Serendib teaSerendib tea
Serendib tea
 

Similar to Madrid easy

Computing Information Flow Using Symbolic-Model-Checking_.pdf
Computing Information Flow Using Symbolic-Model-Checking_.pdfComputing Information Flow Using Symbolic-Model-Checking_.pdf
Computing Information Flow Using Symbolic-Model-Checking_.pdf
Polytechnique Montréal
 
MAPE regression, seminar @ QUT (Brisbane)
MAPE regression, seminar @ QUT (Brisbane)MAPE regression, seminar @ QUT (Brisbane)
MAPE regression, seminar @ QUT (Brisbane)
Arnaud de Myttenaere
 
Introduction to modern Variational Inference.
Introduction to modern Variational Inference.Introduction to modern Variational Inference.
Introduction to modern Variational Inference.
Tomasz Kusmierczyk
 
Probability based learning (in book: Machine learning for predictve data anal...
Probability based learning (in book: Machine learning for predictve data anal...Probability based learning (in book: Machine learning for predictve data anal...
Probability based learning (in book: Machine learning for predictve data anal...
Duyen Do
 
Simplified Runtime Analysis of Estimation of Distribution Algorithms
Simplified Runtime Analysis of Estimation of Distribution AlgorithmsSimplified Runtime Analysis of Estimation of Distribution Algorithms
Simplified Runtime Analysis of Estimation of Distribution Algorithms
Per Kristian Lehre
 
Simplified Runtime Analysis of Estimation of Distribution Algorithms
Simplified Runtime Analysis of Estimation of Distribution AlgorithmsSimplified Runtime Analysis of Estimation of Distribution Algorithms
Simplified Runtime Analysis of Estimation of Distribution Algorithms
PK Lehre
 
Slides: Hypothesis testing, information divergence and computational geometry
Slides: Hypothesis testing, information divergence and computational geometrySlides: Hypothesis testing, information divergence and computational geometry
Slides: Hypothesis testing, information divergence and computational geometry
Frank Nielsen
 
Distributed solution of stochastic optimal control problem on GPUs
Distributed solution of stochastic optimal control problem on GPUsDistributed solution of stochastic optimal control problem on GPUs
Distributed solution of stochastic optimal control problem on GPUs
Pantelis Sopasakis
 
Nested sampling
Nested samplingNested sampling
Nested sampling
Christian Robert
 
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
The Statistical and Applied Mathematical Sciences Institute
 
11.[29 35]a unique common fixed point theorem under psi varphi contractive co...
11.[29 35]a unique common fixed point theorem under psi varphi contractive co...11.[29 35]a unique common fixed point theorem under psi varphi contractive co...
11.[29 35]a unique common fixed point theorem under psi varphi contractive co...
Alexander Decker
 
Equational axioms for probability calculus and modelling of Likelihood ratio ...
Equational axioms for probability calculus and modelling of Likelihood ratio ...Equational axioms for probability calculus and modelling of Likelihood ratio ...
Equational axioms for probability calculus and modelling of Likelihood ratio ...
Advanced-Concepts-Team
 
Double Robustness: Theory and Applications with Missing Data
Double Robustness: Theory and Applications with Missing DataDouble Robustness: Theory and Applications with Missing Data
Double Robustness: Theory and Applications with Missing Data
Lu Mao
 
Low Complexity Regularization of Inverse Problems
Low Complexity Regularization of Inverse ProblemsLow Complexity Regularization of Inverse Problems
Low Complexity Regularization of Inverse Problems
Gabriel Peyré
 
ma112011id535
ma112011id535ma112011id535
ma112011id535
matsushimalab
 
Maximum likelihood estimation of regularisation parameters in inverse problem...
Maximum likelihood estimation of regularisation parameters in inverse problem...Maximum likelihood estimation of regularisation parameters in inverse problem...
Maximum likelihood estimation of regularisation parameters in inverse problem...
Valentin De Bortoli
 
Section1 stochastic
Section1 stochasticSection1 stochastic
Section1 stochastic
cairo university
 
sada_pres
sada_pressada_pres
sada_pres
Stephane Senecal
 
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
The Statistical and Applied Mathematical Sciences Institute
 
Automatic Bayesian method for Numerical Integration
Automatic Bayesian method for Numerical Integration Automatic Bayesian method for Numerical Integration
Automatic Bayesian method for Numerical Integration
Jagadeeswaran Rathinavel
 

Similar to Madrid easy (20)

Computing Information Flow Using Symbolic-Model-Checking_.pdf
Computing Information Flow Using Symbolic-Model-Checking_.pdfComputing Information Flow Using Symbolic-Model-Checking_.pdf
Computing Information Flow Using Symbolic-Model-Checking_.pdf
 
MAPE regression, seminar @ QUT (Brisbane)
MAPE regression, seminar @ QUT (Brisbane)MAPE regression, seminar @ QUT (Brisbane)
MAPE regression, seminar @ QUT (Brisbane)
 
Introduction to modern Variational Inference.
Introduction to modern Variational Inference.Introduction to modern Variational Inference.
Introduction to modern Variational Inference.
 
Probability based learning (in book: Machine learning for predictve data anal...
Probability based learning (in book: Machine learning for predictve data anal...Probability based learning (in book: Machine learning for predictve data anal...
Probability based learning (in book: Machine learning for predictve data anal...
 
Simplified Runtime Analysis of Estimation of Distribution Algorithms
Simplified Runtime Analysis of Estimation of Distribution AlgorithmsSimplified Runtime Analysis of Estimation of Distribution Algorithms
Simplified Runtime Analysis of Estimation of Distribution Algorithms
 
Simplified Runtime Analysis of Estimation of Distribution Algorithms
Simplified Runtime Analysis of Estimation of Distribution AlgorithmsSimplified Runtime Analysis of Estimation of Distribution Algorithms
Simplified Runtime Analysis of Estimation of Distribution Algorithms
 
Slides: Hypothesis testing, information divergence and computational geometry
Slides: Hypothesis testing, information divergence and computational geometrySlides: Hypothesis testing, information divergence and computational geometry
Slides: Hypothesis testing, information divergence and computational geometry
 
Distributed solution of stochastic optimal control problem on GPUs
Distributed solution of stochastic optimal control problem on GPUsDistributed solution of stochastic optimal control problem on GPUs
Distributed solution of stochastic optimal control problem on GPUs
 
Nested sampling
Nested samplingNested sampling
Nested sampling
 
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
 
11.[29 35]a unique common fixed point theorem under psi varphi contractive co...
11.[29 35]a unique common fixed point theorem under psi varphi contractive co...11.[29 35]a unique common fixed point theorem under psi varphi contractive co...
11.[29 35]a unique common fixed point theorem under psi varphi contractive co...
 
Equational axioms for probability calculus and modelling of Likelihood ratio ...
Equational axioms for probability calculus and modelling of Likelihood ratio ...Equational axioms for probability calculus and modelling of Likelihood ratio ...
Equational axioms for probability calculus and modelling of Likelihood ratio ...
 
Double Robustness: Theory and Applications with Missing Data
Double Robustness: Theory and Applications with Missing DataDouble Robustness: Theory and Applications with Missing Data
Double Robustness: Theory and Applications with Missing Data
 
Low Complexity Regularization of Inverse Problems
Low Complexity Regularization of Inverse ProblemsLow Complexity Regularization of Inverse Problems
Low Complexity Regularization of Inverse Problems
 
ma112011id535
ma112011id535ma112011id535
ma112011id535
 
Maximum likelihood estimation of regularisation parameters in inverse problem...
Maximum likelihood estimation of regularisation parameters in inverse problem...Maximum likelihood estimation of regularisation parameters in inverse problem...
Maximum likelihood estimation of regularisation parameters in inverse problem...
 
Section1 stochastic
Section1 stochasticSection1 stochastic
Section1 stochastic
 
sada_pres
sada_pressada_pres
sada_pres
 
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
QMC Program: Trends and Advances in Monte Carlo Sampling Algorithms Workshop,...
 
Automatic Bayesian method for Numerical Integration
Automatic Bayesian method for Numerical Integration Automatic Bayesian method for Numerical Integration
Automatic Bayesian method for Numerical Integration
 

Recently uploaded

LEARNING TO LIVE WITH LAWS OF MOTION .pptx
LEARNING TO LIVE WITH LAWS OF MOTION .pptxLEARNING TO LIVE WITH LAWS OF MOTION .pptx
LEARNING TO LIVE WITH LAWS OF MOTION .pptx
yourprojectpartner05
 
Juaristi, Jon. - El canon espanol. El legado de la cultura española a la civi...
Juaristi, Jon. - El canon espanol. El legado de la cultura española a la civi...Juaristi, Jon. - El canon espanol. El legado de la cultura española a la civi...
Juaristi, Jon. - El canon espanol. El legado de la cultura española a la civi...
frank0071
 
Introduction_Ch_01_Biotech Biotechnology course .pptx
Introduction_Ch_01_Biotech Biotechnology course .pptxIntroduction_Ch_01_Biotech Biotechnology course .pptx
Introduction_Ch_01_Biotech Biotechnology course .pptx
QusayMaghayerh
 
Candidate young stellar objects in the S-cluster: Kinematic analysis of a sub...
Candidate young stellar objects in the S-cluster: Kinematic analysis of a sub...Candidate young stellar objects in the S-cluster: Kinematic analysis of a sub...
Candidate young stellar objects in the S-cluster: Kinematic analysis of a sub...
Sérgio Sacani
 
The cost of acquiring information by natural selection
The cost of acquiring information by natural selectionThe cost of acquiring information by natural selection
The cost of acquiring information by natural selection
Carl Bergstrom
 
Describing and Interpreting an Immersive Learning Case with the Immersion Cub...
Describing and Interpreting an Immersive Learning Case with the Immersion Cub...Describing and Interpreting an Immersive Learning Case with the Immersion Cub...
Describing and Interpreting an Immersive Learning Case with the Immersion Cub...
Leonel Morgado
 
SDSS1335+0728: The awakening of a ∼ 106M⊙ black hole⋆
SDSS1335+0728: The awakening of a ∼ 106M⊙ black hole⋆SDSS1335+0728: The awakening of a ∼ 106M⊙ black hole⋆
SDSS1335+0728: The awakening of a ∼ 106M⊙ black hole⋆
Sérgio Sacani
 
Microbiology of Central Nervous System INFECTIONS.pdf
Microbiology of Central Nervous System INFECTIONS.pdfMicrobiology of Central Nervous System INFECTIONS.pdf
Microbiology of Central Nervous System INFECTIONS.pdf
sammy700571
 
Direct Seeded Rice - Climate Smart Agriculture
Direct Seeded Rice - Climate Smart AgricultureDirect Seeded Rice - Climate Smart Agriculture
Direct Seeded Rice - Climate Smart Agriculture
International Food Policy Research Institute- South Asia Office
 
gastroretentive drug delivery system-PPT.pptx
gastroretentive drug delivery system-PPT.pptxgastroretentive drug delivery system-PPT.pptx
gastroretentive drug delivery system-PPT.pptx
Shekar Boddu
 
AJAY KUMAR NIET GreNo Guava Project File.pdf
AJAY KUMAR NIET GreNo Guava Project File.pdfAJAY KUMAR NIET GreNo Guava Project File.pdf
AJAY KUMAR NIET GreNo Guava Project File.pdf
AJAY KUMAR
 
cathode ray oscilloscope and its applications
cathode ray oscilloscope and its applicationscathode ray oscilloscope and its applications
cathode ray oscilloscope and its applications
sandertein
 
IMPORTANCE OF ALGAE AND ITS BENIFITS.pptx
IMPORTANCE OF ALGAE  AND ITS BENIFITS.pptxIMPORTANCE OF ALGAE  AND ITS BENIFITS.pptx
IMPORTANCE OF ALGAE AND ITS BENIFITS.pptx
OmAle5
 
Male reproduction physiology by Suyash Garg .pptx
Male reproduction physiology by Suyash Garg .pptxMale reproduction physiology by Suyash Garg .pptx
Male reproduction physiology by Suyash Garg .pptx
suyashempire
 
Authoring a personal GPT for your research and practice: How we created the Q...
Authoring a personal GPT for your research and practice: How we created the Q...Authoring a personal GPT for your research and practice: How we created the Q...
Authoring a personal GPT for your research and practice: How we created the Q...
Leonel Morgado
 
2001_Book_HumanChromosomes - Genéticapdf
2001_Book_HumanChromosomes - Genéticapdf2001_Book_HumanChromosomes - Genéticapdf
2001_Book_HumanChromosomes - Genéticapdf
lucianamillenium
 
Sustainable Land Management - Climate Smart Agriculture
Sustainable Land Management - Climate Smart AgricultureSustainable Land Management - Climate Smart Agriculture
Sustainable Land Management - Climate Smart Agriculture
International Food Policy Research Institute- South Asia Office
 
TOPIC OF DISCUSSION: CENTRIFUGATION SLIDESHARE.pptx
TOPIC OF DISCUSSION: CENTRIFUGATION SLIDESHARE.pptxTOPIC OF DISCUSSION: CENTRIFUGATION SLIDESHARE.pptx
TOPIC OF DISCUSSION: CENTRIFUGATION SLIDESHARE.pptx
shubhijain836
 
Mending Clothing to Support Sustainable Fashion_CIMaR 2024.pdf
Mending Clothing to Support Sustainable Fashion_CIMaR 2024.pdfMending Clothing to Support Sustainable Fashion_CIMaR 2024.pdf
Mending Clothing to Support Sustainable Fashion_CIMaR 2024.pdf
Selcen Ozturkcan
 
11.1 Role of physical biological in deterioration of grains.pdf
11.1 Role of physical biological in deterioration of grains.pdf11.1 Role of physical biological in deterioration of grains.pdf
11.1 Role of physical biological in deterioration of grains.pdf
PirithiRaju
 

Recently uploaded (20)

LEARNING TO LIVE WITH LAWS OF MOTION .pptx
LEARNING TO LIVE WITH LAWS OF MOTION .pptxLEARNING TO LIVE WITH LAWS OF MOTION .pptx
LEARNING TO LIVE WITH LAWS OF MOTION .pptx
 
Juaristi, Jon. - El canon espanol. El legado de la cultura española a la civi...
Juaristi, Jon. - El canon espanol. El legado de la cultura española a la civi...Juaristi, Jon. - El canon espanol. El legado de la cultura española a la civi...
Juaristi, Jon. - El canon espanol. El legado de la cultura española a la civi...
 
Introduction_Ch_01_Biotech Biotechnology course .pptx
Introduction_Ch_01_Biotech Biotechnology course .pptxIntroduction_Ch_01_Biotech Biotechnology course .pptx
Introduction_Ch_01_Biotech Biotechnology course .pptx
 
Candidate young stellar objects in the S-cluster: Kinematic analysis of a sub...
Candidate young stellar objects in the S-cluster: Kinematic analysis of a sub...Candidate young stellar objects in the S-cluster: Kinematic analysis of a sub...
Candidate young stellar objects in the S-cluster: Kinematic analysis of a sub...
 
The cost of acquiring information by natural selection
The cost of acquiring information by natural selectionThe cost of acquiring information by natural selection
The cost of acquiring information by natural selection
 
Describing and Interpreting an Immersive Learning Case with the Immersion Cub...
Describing and Interpreting an Immersive Learning Case with the Immersion Cub...Describing and Interpreting an Immersive Learning Case with the Immersion Cub...
Describing and Interpreting an Immersive Learning Case with the Immersion Cub...
 
SDSS1335+0728: The awakening of a ∼ 106M⊙ black hole⋆
SDSS1335+0728: The awakening of a ∼ 106M⊙ black hole⋆SDSS1335+0728: The awakening of a ∼ 106M⊙ black hole⋆
SDSS1335+0728: The awakening of a ∼ 106M⊙ black hole⋆
 
Microbiology of Central Nervous System INFECTIONS.pdf
Microbiology of Central Nervous System INFECTIONS.pdfMicrobiology of Central Nervous System INFECTIONS.pdf
Microbiology of Central Nervous System INFECTIONS.pdf
 
Direct Seeded Rice - Climate Smart Agriculture
Direct Seeded Rice - Climate Smart AgricultureDirect Seeded Rice - Climate Smart Agriculture
Direct Seeded Rice - Climate Smart Agriculture
 
gastroretentive drug delivery system-PPT.pptx
gastroretentive drug delivery system-PPT.pptxgastroretentive drug delivery system-PPT.pptx
gastroretentive drug delivery system-PPT.pptx
 
AJAY KUMAR NIET GreNo Guava Project File.pdf
AJAY KUMAR NIET GreNo Guava Project File.pdfAJAY KUMAR NIET GreNo Guava Project File.pdf
AJAY KUMAR NIET GreNo Guava Project File.pdf
 
cathode ray oscilloscope and its applications
cathode ray oscilloscope and its applicationscathode ray oscilloscope and its applications
cathode ray oscilloscope and its applications
 
IMPORTANCE OF ALGAE AND ITS BENIFITS.pptx
IMPORTANCE OF ALGAE  AND ITS BENIFITS.pptxIMPORTANCE OF ALGAE  AND ITS BENIFITS.pptx
IMPORTANCE OF ALGAE AND ITS BENIFITS.pptx
 
Male reproduction physiology by Suyash Garg .pptx
Male reproduction physiology by Suyash Garg .pptxMale reproduction physiology by Suyash Garg .pptx
Male reproduction physiology by Suyash Garg .pptx
 
Authoring a personal GPT for your research and practice: How we created the Q...
Authoring a personal GPT for your research and practice: How we created the Q...Authoring a personal GPT for your research and practice: How we created the Q...
Authoring a personal GPT for your research and practice: How we created the Q...
 
2001_Book_HumanChromosomes - Genéticapdf
2001_Book_HumanChromosomes - Genéticapdf2001_Book_HumanChromosomes - Genéticapdf
2001_Book_HumanChromosomes - Genéticapdf
 
Sustainable Land Management - Climate Smart Agriculture
Sustainable Land Management - Climate Smart AgricultureSustainable Land Management - Climate Smart Agriculture
Sustainable Land Management - Climate Smart Agriculture
 
TOPIC OF DISCUSSION: CENTRIFUGATION SLIDESHARE.pptx
TOPIC OF DISCUSSION: CENTRIFUGATION SLIDESHARE.pptxTOPIC OF DISCUSSION: CENTRIFUGATION SLIDESHARE.pptx
TOPIC OF DISCUSSION: CENTRIFUGATION SLIDESHARE.pptx
 
Mending Clothing to Support Sustainable Fashion_CIMaR 2024.pdf
Mending Clothing to Support Sustainable Fashion_CIMaR 2024.pdfMending Clothing to Support Sustainable Fashion_CIMaR 2024.pdf
Mending Clothing to Support Sustainable Fashion_CIMaR 2024.pdf
 
11.1 Role of physical biological in deterioration of grains.pdf
11.1 Role of physical biological in deterioration of grains.pdf11.1 Role of physical biological in deterioration of grains.pdf
11.1 Role of physical biological in deterioration of grains.pdf
 

Madrid easy

  • 1. Practical representations of probability sets: a guided tour with applications Sébastien Destercke in collaboration with E. Miranda, I. Montes, M. Troffaes, D. Dubois, O. Strauss, C. Baudrit, P.H. Wuillemin. CNRS researcher, Laboratoire Heudiasyc, Compiègne Madrid Seminar Prac Rep 1
  • 2. Introduction Basics Practical Representations Applications Plan G Introduction G Basics of imprecise probabilities G A tour of practical representations G Illustrative applications Prac Rep 2
  • 3. Introduction Basics Practical Representations Applications Where is Compiègne Prac Rep 3
  • 4. Introduction Basics Practical Representations Applications Heudiasyc and LABEX MS2T activities Heudiasyc G 140 members G 6M budget G 4 teams: H Uncertainty and machine learning H Automatic control and robotics H Artificial intelligence H Operational research and networks LABEX MS2T G Topic: systems of systems G 3 laboratories: H Heudiasyc H BMBI: biomechanics H Roberval: mechanics If interested in collaborations, let me know Prac Rep 4
  • 5. Introduction Basics Practical Representations Applications Talk in a nutshell What is this talk about 1. (very) Basics of imprecise probability 2. A review of practical representations 3. Some applications What is this talk not about G Deep mathematics of imprecise probabilities (you can ask Nacho or Quique) G Imprecise parametric models Prac Rep 5
  • 6. Introduction Basics Practical Representations Applications Plan G Introduction G Basics of imprecise probabilities G A tour of practical representations G Illustrative applications Prac Rep 6
  • 7. Introduction Basics Practical Representations Applications Imprecise probabilities What? Representing uncertainty as a convex set P of probabilities rather than a single one Why? G precise probabilities are inadequate to model lack of information; G generalizes both set-valued and probabilistic uncertainty; G can model situations where probabilistic information is partial; G axiomatically allows alternatives to remain incomparable Prac Rep 7
  • 8. Introduction Basics Practical Representations Applications Probabilities A probability mass on the finite space X = {x1,...,xn} is equivalent to an n-dimensional vector p := (p(x1),...,p(xn)). We are limited to the set PX of all probabilities such that p(x) ≥ 0 for all x ∈ X and Σx∈X p(x) = 1. The set PX is the (n−1)-unit simplex. Prac Rep 8
  • 9. Introduction Basics Practical Representations Applications Point in unit simplex p(x1) = 0.2, p(x2) = 0.5, p(x3) = 0.3 [Figure: the point p shown in the unit simplex with barycentric axes p(x1), p(x2), p(x3)] Prac Rep 9
  • 10. Introduction Basics Practical Representations Applications Imprecise probability Set P defined by n constraints of the form E(fi) ≤ Σx∈X fi(x)p(x) ≤ E(fi) (lower and upper bounds on the expectation of fi), where the fi : X → R are bounded functions. Example 2p(x2)−p(x3) ≥ 0, i.e. f(x1) = 0, f(x2) = 2, f(x3) = −1, E(f) = 0 Lower/upper probabilities Bounds P(A), P(A) on an event A are equivalent to P(A) ≤ Σx∈A p(x) ≤ P(A) Prac Rep 10
  • 11. Introduction Basics Practical Representations Applications Set P example 2p(x2)−p(x3) ≥ 0 [Figure: the half of the unit simplex cut out by this constraint] Prac Rep 11
  • 12. Introduction Basics Practical Representations Applications Credal set example 2p(x2)−p(x3) ≥ 0 2p(x1)−p(x2)−p(x3) ≥ 0 [Figure: the credal set P induced by both constraints, shown in the unit simplex] Prac Rep 12
  • 13. Introduction Basics Practical Representations Applications Natural extension From an initial set P defined by constraints, we can compute G the lower expectation E(g) of any function g as E(g) = infp∈P Ep(g) G the lower probability P(A) of any event A as P(A) = infp∈P p(A) Prac Rep 13
  • 14. Introduction Basics Practical Representations Applications Some usual problems G Computing E(g) = infp∈P Ep(g) of a new function g G Updating P(θ|x) ∝ L(x|θ)P(θ) G Computing the conditional E(f|A) G Simulating/sampling from P G Building a joint over variables X1,...,Xn These can be difficult to perform in general → practical representations reduce the computational cost Prac Rep 14
  • 15. Introduction Basics Practical Representations Applications What makes a representation "practical" G A reasonable, algorithmically enumerable number of extreme points reminder: p ∈ P is extreme iff p = λp1 + (1−λ)p2 with λ ∈ (0,1) implies p1 = p2 = p. We will denote by E(P) the set of extreme points of P G n-monotonicity of P 2-monotonicity (super-modularity, convexity) P(A∪B) + P(A∩B) ≥ P(A) + P(B) for any A,B ⊆ X ∞-monotonicity P(A1 ∪···∪ An) ≥ Σ∅≠A⊆{A1,...,An} (−1)^(|A|+1) P(∩Ai∈A Ai) for any A1,...,An ⊆ X and n > 0 Prac Rep 15
  • 16. Introduction Basics Practical Representations Applications Extreme points: illustration G p(x1) = 1, p(x2) = 0, p(x3) = 0 G p(x1) = 0, p(x2) = 1, p(x3) = 0 G p(x1) = 0.25, p(x2) = 0.25, p(x3) = 0.5 [Figure: these three extreme points shown in the unit simplex] Prac Rep 16
  • 17. Introduction Basics Practical Representations Applications Extreme points: utility G Computing E(g) → minimal E over the extreme points G Updating → update the extreme points, take the convex hull G Conditional E(f|A) → minimal E(f|A) over the extreme points G Simulating P → take convex mixtures of the extreme points G Joint over variables X1,...,Xn → convex hull of the joint extreme points Again, this is practical if the number of extreme points is limited, or if an inner approximation (by sampling) is acceptable. Prac Rep 17
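The extreme-point recipes above can be sketched in a few lines; here the credal set is assumed to be handed to us as a (hypothetical) list of extreme points on a 3-element space:

```python
import random

def lower_expectation(extreme_points, g):
    """E(g) = minimal expectation of g over the extreme points."""
    return min(sum(p[i] * g[i] for i in range(len(g))) for p in extreme_points)

def sample_credal(extreme_points, rng=random):
    """Draw one p from P as a random convex mixture of the extreme points."""
    w = [rng.random() for _ in extreme_points]
    s = sum(w)
    w = [wi / s for wi in w]
    return [sum(w[k] * p[i] for k, p in enumerate(extreme_points))
            for i in range(len(extreme_points[0]))]

# Hypothetical credal set on X = {x1, x2, x3} with three extreme points,
# and the function f(x1)=0, f(x2)=2, f(x3)=-1 from slide 10
ext = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.25, 0.25, 0.5)]
g = [0.0, 2.0, -1.0]
print(lower_expectation(ext, g))  # min of 0, 2 and 0 -> 0.0
```

Any convex mixture of extreme points stays inside P, which is all the sampler exploits.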
  • 18. Introduction Basics Practical Representations Applications 2-monotonicity Computing E(g) by the Choquet integral E(g) = inf g + ∫ from inf g to sup g of P({g ≥ t}) dt In finite spaces → sort the n values of g and compute P(A) for n events Conditioning P(A|B) = P(A∩B) / (P(A∩B) + P(Ac∩B)), with P the upper probability in the second denominator term; P(·|B) remains 2-monotone (and can be used to get E(f|A)) Prac Rep 18
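The finite-space Choquet integral is just a sort plus n calls to P; a sketch, using a linear-vacuous model (which is 2-monotone, see the neighbourhood-model slide) as a hypothetical lower probability:

```python
def choquet_lower(g, lower_prob):
    """E(g) = inf g + sum of (g_(k) - g_(k-1)) * P({g >= g_(k)}),
    over the ascending sorted values of g; lower_prob maps a frozenset
    of indices (an event) to its lower probability."""
    n = len(g)
    order = sorted(range(n), key=lambda i: g[i])  # ascending values of g
    ev = g[order[0]]
    for k in range(1, n):
        level = frozenset(order[k:])  # event {x : g(x) >= g(order[k])}
        ev += (g[order[k]] - g[order[k - 1]]) * lower_prob(level)
    return ev

# Linear-vacuous neighbourhood of p0 with contamination eps:
# P(A) = (1 - eps) * P0(A) for A != X, and P(X) = 1.
p0 = [0.5, 0.3, 0.2]
eps = 0.2
def lv_lower(A, p0=p0, eps=eps):
    if len(A) == len(p0):
        return 1.0
    return (1 - eps) * sum(p0[i] for i in A)

g = [1.0, 4.0, 2.0]
print(choquet_lower(g, lv_lower))  # matches (1-eps)*E_p0(g) + eps*min(g) = 1.88
```

For this model the closed form (1−ε)E_P0(g) + ε·min g gives an independent check of the sorted-values computation.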
  • 19. Introduction Basics Practical Representations Applications ∞-monotonicity If P is ∞-monotone, its Möbius inverse m : 2X → R, m(A) = ΣB⊆A (−1)^(|A\B|) P(B), is positive and sums up to one; P is then often called a belief function Simulating P Sampling a set A according to m and considering the associated set A Joint model of X1,...,XN If m1, m2 are the inverses for X1, X2, consider the joint m12 s.t. m12(A×B) = m1(A)·m2(B) G still ∞-monotone G outer-approximates other definitions of independence between P1, P2 Prac Rep 19
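The Möbius inverse can be computed by brute force over subsets on a small space; a sketch, again with the (hypothetical) linear-vacuous model as P, for which the mass should land on the singletons and on X itself:

```python
from itertools import combinations

def all_subsets(n):
    """All subsets of {0, ..., n-1} as frozensets."""
    for r in range(n + 1):
        for c in combinations(range(n), r):
            yield frozenset(c)

def mobius(lower_prob, n):
    """m(A) = sum over B subset of A of (-1)^(|A \\ B|) * P(B)."""
    m = {}
    for A in all_subsets(n):
        m[A] = sum((-1) ** len(A - B) * lower_prob(B)
                   for B in all_subsets(n) if B <= A)
    return m

p0 = [0.5, 0.3, 0.2]
eps = 0.2
def lv_lower(A):
    return 1.0 if len(A) == 3 else (1 - eps) * sum(p0[i] for i in A)

m = mobius(lv_lower, 3)
# Expected masses: (1-eps)*p0(x) on each singleton, eps on X, 0 elsewhere
```

Since the masses are nonnegative and sum to one, sampling m and reading off the drawn set A is exactly the simulation scheme described on the slide.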
  • 20. Introduction Basics Practical Representations Applications 2-monotonicity and extreme points [3] Generating extreme points if P is 2-monotone: 1. Pick a permutation σ : [1,n] → [1,n] of X 2. Consider the sets Aσi = {xσ(1),...,xσ(i)} 3. Define Pσ({xσ(i)}) = P(Aσi) − P(Aσi−1) for i = 1,...,n (with Aσ0 = ∅) 4. Then Pσ ∈ E(P) Some comments G The maximal value of |E(P)| is n! G We can have Pσ1 = Pσ2 with σ1 ≠ σ2 → |E(P)| is often less than n! Prac Rep 20
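The permutation procedure is a direct loop; a sketch on a hypothetical 2-monotone linear-vacuous model, where the 3! = 6 permutations collapse to only 3 distinct extreme points, illustrating the last remark:

```python
from itertools import permutations

def extreme_points(lower_prob, n):
    """One candidate extreme point per permutation sigma:
    P_sigma({x_sigma(i)}) = P(A_i) - P(A_{i-1}),
    with A_i = {x_sigma(1), ..., x_sigma(i)} and A_0 = the empty set."""
    pts = set()
    for sigma in permutations(range(n)):
        p = [0.0] * n
        prev = 0.0
        for i in range(1, n + 1):
            cur = lower_prob(frozenset(sigma[:i]))
            p[sigma[i - 1]] = cur - prev
            prev = cur
        pts.add(tuple(round(v, 9) for v in p))  # round to merge duplicates
    return pts

p0 = [0.5, 0.3, 0.2]
eps = 0.2
def lv_lower(A):
    return 1.0 if len(A) == 3 else (1 - eps) * sum(p0[i] for i in A)

pts = extreme_points(lv_lower, 3)
# Linear-vacuous extreme points: (1-eps)*p0 + eps*delta_x for each x
print(len(pts))  # 3, not 3! = 6
```

Here only the last element of each permutation matters, so permutations sharing a last element give the same Pσ.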
  • 21. Introduction Basics Practical Representations Applications Example G X = {x1,x2,x3} G σ(1) = 2, σ(2) = 3, σ(3) = 1 G Aσ0 = ∅, Aσ1 = {x2}, Aσ2 = {x2,x3}, Aσ3 = X G Pσ({xσ(1)}) = Pσ({x2}) = P({x2}) − P(∅) = P({x2}) G Pσ({xσ(2)}) = Pσ({x3}) = P({x2,x3}) − P({x2}) G Pσ({xσ(3)}) = Pσ({x1}) = P(X) − P({x2,x3}) = 1 − P({x2,x3}) Prac Rep 21
  • 22. Introduction Basics Practical Representations Applications Plan G Introduction G Basics of imprecise probabilities G A tour of practical representations H Basics H Possibility distributions H P-boxes H Probability intervals H Elementary Comparative probabilities G Illustrative applications Prac Rep 22
  • 23. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Two very basic models Probability G P({xi}) = P̄({xi}) = P({xi}) (lower and upper bounds coincide) G ∞-monotone, n constraints, |E| = 1 Vacuous model PX Only the support X of the probability is known G P(X) = 1 G ∞-monotone, 1 constraint, |E(P)| = n (the n Dirac distributions) Easily extends to a vacuous model on a set A (can be used in robust optimisation, decision under risk, interval analysis) Prac Rep 23
  • 24. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. A concise graph ProbaVacuous Linear-vacuous Pari-mutuel Possibilities P-boxes Prob. int Compa. ∞-monotone 2-monotone Model A Model B A special case of B Prac Rep 24
  • 25. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Neighbourhood models Build a neighbourhood around a given probability P0 Linear vacuous / ε-contamination G P(A) = (1−ε)P0(A) + ε PX(A) G ∞-monotone, n+1 constraints, |E(P)| = n G ε ∈ [0,1]: unreliability of information P0 Pari-mutuel [16] G P(A) = max{(1+ε)P0(A) − ε, 0} G 2-monotone, n+1 constraints, |E(P)| = ? (n?) G ε ∈ [0,1]: unreliability of information P0 Other models exist, such as odds-ratio or distance-based ones (all q s.t. d(p,q) < δ) → often not attractive in terms of |E(P)|/monotonicity, but they may have nice properties (odds-ratio: updating; square/log distances: convex continuous neighbourhood) Prac Rep 25
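Both neighbourhood lower probabilities are one-liners; a minimal sketch, using the event-as-set-of-states encoding as an illustrative assumption:

```python
def linear_vacuous(p0, eps, A):
    """Linear-vacuous lower probability: P(A) = (1-eps)*P0(A) for A != X,
    since the vacuous part contributes only P(X) = 1 on the full space."""
    if set(A) == set(p0):
        return 1.0
    return (1 - eps) * sum(p0[x] for x in A)

def pari_mutuel(p0, eps, A):
    """Pari-mutuel lower probability: P(A) = max{(1+eps)*P0(A) - eps, 0}."""
    return max((1 + eps) * sum(p0[x] for x in A) - eps, 0.0)
```

On the next slide's example, P0 = (0.5, 0.3, 0.2) and ε = 0.2, both models give P({x1}) = 0.4, but they differ on low-probability events such as {x3}.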
  • 26. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Illustration (figure: simplex over {x1,x2,x3} showing the pari-mutuel and linear-vacuous credal sets around P0) P0 = (0.5,0.3,0.2), ε = 0.2 Prac Rep 26
  • 27. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. A concise graph ProbaVacuous Linear-vacuous Pari-mutuel Possibilities P-boxes Prob. int Compa. ∞-monotone 2-monotone Model A Model B A special case of B Prac Rep 27
  • 28. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Possibility distributions [10] Definition Distribution π : X → [0,1] with π(x) = 1 for at least one x P given by P(A) = min_{x∈Aᶜ} (1 − π(x)), which is a necessity measure Characteristics of P G Requires at most n values G P is an ∞-monotone measure Prac Rep 28
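The induced necessity measure can be sketched directly from the definition; the dict encoding of π and the convention N(X) = 1 are the only additions:

```python
def necessity(pi, A):
    """Lower probability induced by a possibility distribution pi:
    N(A) = min over x outside A of (1 - pi(x)); N(X) = 1 by convention."""
    outside = [x for x in pi if x not in A]
    if not outside:
        return 1.0
    return min(1 - pi[x] for x in outside)
```

Note that any event missing a state with π(x) = 1 gets N(A) = 0, as expected of a necessity measure.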
  • 29. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Possibility distributions Alternative definition Provide nested events A1 ⊆ ... ⊆ An and lower confidence bounds P(Ai) = αi with αi+1 ≥ αi Extreme points [19] G Maximum number is 2^(n−1) G Algorithm using the nested structure of the sets Ai Prac Rep 29
  • 30. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. A basic distribution: simple support G Set E of most plausible values G Confidence degree α = P(E) Extends to multiple sets E1,...,Ep → confidence degrees over nested sets [18] Example: pH value ∈ [4.5,5.5] with α = 0.8 (∼ "quite probable") (figure: the resulting possibility distribution π, equal to 1 on [4.5,5.5] and to 1−α = 0.2 outside) Prac Rep 30
  • 31. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Partially specified probabilities [1] [8] The triangular distribution with mode M and support [a,b] encompasses all probabilities with G mode/reference value M G support [a,b] Getting back to pH G M = 5 G [a,b] = [3,7] (figure: triangular possibility distribution π with mode 5 and support [3,7]) Prac Rep 31
  • 32. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Normalized likelihood as possibilities [9] [2] π(θ) = L(θ|x) / max_{θ∗∈Θ} L(θ∗|x) Binomial situation: G θ = success probability G x = number of observed successes G x = 4 succ. out of 11 G x = 20 succ. out of 55 (figure: the two normalized likelihoods, both maximal at θ = 4/11) Prac Rep 32
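For the binomial case the normalization is particularly simple, since the binomial coefficients cancel in the ratio; a minimal sketch:

```python
def binomial_possibility(k, n, theta):
    """Normalized binomial likelihood as a possibility degree:
    pi(theta) = L(theta | k successes in n trials) / L(theta_hat),
    with theta_hat = k / n the maximum-likelihood estimate."""
    def lik(t):
        return t ** k * (1 - t) ** (n - k)
    return lik(theta) / lik(k / n)
```

For the slide's two samples (4/11 and 20/55) the mode is the same, θ = 4/11, but the larger sample yields a much narrower distribution.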
  • 33. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Other examples G Statistical inequalities (e.g., Chebyshev inequality) [8] G Linguistic information (fuzzy sets) [5] G Approaches based on nested models Prac Rep 33
  • 34. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. A concise graph ProbaVacuous Linear-vacuous Pari-mutuel Possibilities P-boxes Prob. int Compa. ∞-monotone 2-monotone Model A Model B A special case of B Prac Rep 34
  • 35. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. P-boxes [6] Definition When X is ordered, bounds on events of the kind Ai = {x1,...,xi}, each bounded by F(xi) ≤ P(Ai) ≤ F̄(xi) (figure: a discrete p-box over x1,...,x7) Characteristics of P G Requires at most 2n values G P is an ∞-monotone measure Prac Rep 35
  • 36. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. In general Definition A set of nested events A1 ⊆ ... ⊆ An, each bounded by αi ≤ P(Ai) ≤ βi (figure: interval bounds over nested events on x1,...,x7) Extreme points [15] G At most equal to the Pell number Kn = 2K_{n−1} + K_{n−2} G Algorithm based on a tree structure construction Prac Rep 36
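The Pell recursion above grows like (1+√2)^n, much slower than the n! bound of general 2-monotone models. A minimal sketch with the standard Pell seeds K1 = 1, K2 = 2 (the exact indexing with respect to the number of states is per [15]):

```python
def pell(n):
    """Pell numbers via K_n = 2*K_{n-1} + K_{n-2}, seeds K_1 = 1, K_2 = 2."""
    a, b = 1, 2               # K_1, K_2
    if n == 1:
        return a
    for _ in range(n - 2):
        a, b = b, 2 * b + a
    return b
```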
  • 37. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. P-box on reals [11] A pair [F, F̄] of cumulative distributions Bounds over events [−∞,x] G Percentiles given by experts; G Kolmogorov-Smirnov bounds; Can be extended to any pre-ordered space [6], [21] ⇒ multivariate spaces! Expert providing percentiles: 0 ≤ P([−∞,12]) ≤ 0.2, 0.2 ≤ P([−∞,24]) ≤ 0.4, 0.6 ≤ P([−∞,36]) ≤ 0.8 (figure: the induced p-box with focal elements E1,...,E5) Prac Rep 37
  • 38. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. A concise graph ProbaVacuous Linear-vacuous Pari-mutuel Possibilities P-boxes Prob. int Compa. ∞-monotone 2-monotone Model A Model B A special case of B Prac Rep 38
  • 39. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Probability intervals [4] Definition Elements {x1,...,xn}, each bounded by p(xi) ∈ [p(xi), p̄(xi)] (figure: interval-valued probabilities over x1,...,x6) Characteristics of P G Requires at most 2n values G P is a 2-monotone measure Extreme points [4] G Specific algorithm to extract them G If n is even, the maximum number is C(n+1, n/2) · n/2 Prac Rep 39
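The lower probability induced by (reachable) probability intervals has a closed form, P(A) = max( Σ_{x∈A} p(x), 1 − Σ_{x∉A} p̄(x) ) [4]; a minimal sketch with a made-up example:

```python
def interval_lower_prob(lo, hi, A):
    """Lower probability induced by probability intervals [lo(x), hi(x)]:
    P(A) = max( sum of lo over A, 1 - sum of hi over the complement ).
    Assumes the intervals are reachable (coherent)."""
    in_A = sum(lo[x] for x in A)
    out_A = sum(hi[x] for x in lo if x not in A)
    return max(in_A, 1.0 - out_A)
```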
  • 40. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Probability intervals: example Linguistic assessment G x is very probable G x has a good chance G x is very unlikely G the probability of x is about α ⇒ Numerical translation G p(x) ≥ 0.75 G 0.4 ≤ p(x) ≤ 0.85 G p(x) ≤ 0.25 G α−0.1 ≤ p(x) ≤ α+0.1 Prac Rep 40
  • 41. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. A concise graph ProbaVacuous Linear-vacuous Pari-mutuel Possibilities P-boxes Prob. int Compa. ∞-monotone 2-monotone Model A Model B A special case of B Prac Rep 41
  • 42. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Comparative probabilities: definitions Comparative probabilities on X : assessments P(A) ≥ P(B), read "event A is at least as probable as event B" Some comments G studied from the axiomatic point of view [13, 20] G few studies on their numerical aspects [17] G interesting for qualitative uncertainty modeling/representation, expert elicitation, . . . Prac Rep 42
  • 43. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. A specific case: elementary comparisons [14] Elementary comparisons Comparative probability orderings of the states X = {x1,...,xn}, in the form of a subset L of {1,...,n}×{1,...,n} The set of probability measures compatible with this information is P(L) = {p ∈ PX | ∀(i,j) ∈ L, p(xi) ≥ p(xj)} Prac Rep 43
  • 44. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Why focus on this case? Practical interest G multinomial models (e.g., imprecise prior for Dirichlet), modal value elicitation G direct extension to define imprecise belief functions Easy to represent/manipulate G Through a graph G = (X, L) with states as nodes and the relation L as edges G Example: given X = {x1,...,x5}, L = {(1,3),(1,4),(2,5),(4,5)}, its associated graph G has edges x1 → x3, x1 → x4, x2 → x5, x4 → x5 Prac Rep 44
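Membership in P(L) is a direct check of the elementary comparisons; a minimal sketch using the slide's example, with indices shifted to 0-based:

```python
def compatible(p, L):
    """p belongs to P(L) iff p(x_i) >= p(x_j) for every comparison (i, j) in L.
    `p` is a probability vector, `L` a list of 0-based index pairs."""
    return all(p[i] >= p[j] for i, j in L)
```

The uniform distribution is always compatible (all comparisons hold with equality), which is why P(L) is never empty.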
  • 45. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Some properties Characteristics of P G Requires at most n² values G No guarantee that P is a 2-monotone measure Extreme points [14] G Algorithm identifying subsets of disconnected nodes G Maximal number is 2^(n−1) Prac Rep 45
  • 46. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. A concise list (acc. to my knowledge)
Name         | Monot. | Max. |const.| | Max. |E(P)|       | Algo. to get E(P)
Proba        | ∞      | n             | 1                 | Yes
Vacuous      | ∞      | 1             | n                 | Yes
2-mon        | 2      | 2^n           | n!                | Yes [3]
∞-mon        | ∞      | 2^n           | n!                | No
Lin-vac.     | ∞      | n+1           | n                 | Yes
Pari-mutuel  | 2      | n+1           | ? (n)             | No
Possibility  | ∞      | n             | 2^(n−1)           | Yes [19]
P-box (gen.) | ∞      | 2n            | Kn (Pell)         | Yes [15]
Prob. int.   | 2      | 2n            | C(n+1, n/2) · n/2 | Yes [4]
Elem. Compa. | ×      | n²            | 2^(n−1)           | Yes [14]
Prac Rep 46
  • 47. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. A concise final graph ProbaVacuous Linear-vacuous Pari-mutuel Possibilities P-boxes Prob. int Compa. ∞-monotone 2-monotone Model A Model B A special case of B Prac Rep 47
  • 48. Introduction Basics Practical Representations Applications Basics Possibility distributions P-boxes Probability intervals Elem. Compa. Some open questions G study the numerical aspects of comparative probabilities with numbers/general events. G study the potential link between possibilities and elementary comparative probabilities (share same number of extreme points, induce ordering between states). G study restricted bounds/information over specific families of events, other than nested/elementary ones (e.g., events of at most k states). G look at probability sets induced by bounding specific distances to p0, in particular L1,L2,L∞ norms. Prac Rep 48
  • 49. Introduction Basics Practical Representations Applications Plan G Introduction G Basics of imprecise probabilities G A tour of practical representations G Illustrative applications H Numerical Signal processing [7] H Camembert ripening [12] Prac Rep 49
  • 50. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening Signal processing: introduction G Impulse response µ of the filter G Filtering: convolving the kernel µ with the observed signal f(x) Prac Rep 50
  • 51. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening Link with probability G If µ is positive and ∫ µ(x) dx = 1 G µ is equivalent to a probability density G Convolution: compute the mathematical expectation Eµ(f) G Numerical filtering: discretize (sample) µ and f, with µ > 0 and Σ_{xi} µ(xi) = 1 Prac Rep 51
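Once discretized, the filter output at each index is just an expectation of the signal under the kernel seen as a probability mass function; a minimal sketch (the truncation at the signal borders is an illustrative choice):

```python
def filter_at(f, mu, k):
    """Discrete filtering as an expectation: the output at index k is
    E_mu(f) for the kernel mu (summing to 1, like a pmf) recentred at k."""
    half = len(mu) // 2
    return sum(w * f[k + i - half]
               for i, w in enumerate(mu)
               if 0 <= k + i - half < len(f))
```

Replacing the single kernel µ by a set of kernels (next slide) turns this expectation into a pair of lower/upper expectations, i.e. an envelope around the filtered signal.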
  • 52. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening Which bandwidth ∆ for the kernel µ around x? → use imprecise probabilistic models to represent sets of bandwidths → possibilities/p-boxes with sets centred around x Prac Rep 52
  • 53. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening Example on a simulated signal (figure: original signal with maxitive and cloudy upper/lower envelopes; axes: time (msec) vs. signal amplitude) Prac Rep 53
  • 54. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening Application: salt-and-pepper noise removal Original image Noisy image Prac Rep 54
  • 55. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening Results Zoom on 2 parts, comparing CWMF, ROAD and our method Prac Rep 55
  • 56. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening Motivations A complex system The Camembert-type cheese ripening process: cheese making → cheese ripening (∼13°C, ∼95% humidity, days j1 to j14) → warehouse (∼4°C) G Multi-scale modeling: from microbial activities to sensory properties G Dynamic probabilistic model G Knowledge is fragmented, heterogeneous and incomplete G Difficult to learn precise model parameters Use of ε-contamination for a robustness analysis of the model Prac Rep 56
  • 57. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening Experiments The network: variables T(t), Km(t), lo(t) in time slice t, linked to Km(t+1), lo(t+1) in slice t+1; unrolled over 14 time steps (days), from cheese making through ripening (∼13°C, ∼95% humidity) to warehouse (∼4°C) Prac Rep 57
  • 58. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening Propagation results Forward propagation, ∀t ∈ {1,...,τ}, T(t) = 12°C (average ripening room temperature): Ext{Km(t) | Km(1), lo(1), T(1),...,T(τ)} and Ext{lo(t) | Km(1), lo(1), T(1),...,T(τ)} (figures: without physical constraints vs. with added physical constraints) Prac Rep 58
  • 59. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening Conclusions Use of practical representations G +: "Easy" robustness analysis of precise methods, or approximation of imprecise ones G +: allow experts to express imprecision or partial information G +: often easier to explain/represent than general ones G -: usually focus on specific events G -: their form may not be conserved by information processing Prac Rep 59
  • 60. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening References I [1] C. Baudrit and D. Dubois. Practical representations of incomplete probabilistic knowledge. Computational Statistics and Data Analysis, 51(1):86–108, 2006. [2] M. Cattaneo. Likelihood-based statistical decisions. In Proc. 4th International Symposium on Imprecise Probabilities and Their Applications, pages 107–116, 2005. [3] A. Chateauneuf and J.-Y. Jaffray. Some characterizations of lower probabilities and other monotone capacities through the use of Möbius inversion. Mathematical Social Sciences, 17(3):263–283, 1989. Prac Rep 60
  • 61. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening References II [4] L. de Campos, J. Huete, and S. Moral. Probability intervals: a tool for uncertain reasoning. I. J. of Uncertainty, Fuzziness and Knowledge-Based Systems, 2:167–196, 1994. [5] G. de Cooman and P. Walley. A possibilistic hierarchical model for behaviour under uncertainty. Theory and Decision, 52:327–374, 2002. [6] S. Destercke, D. Dubois, and E. Chojnacki. Unifying practical uncertainty representations: I generalized p-boxes. Int. J. of Approximate Reasoning, 49:649–663, 2008. Prac Rep 61
  • 62. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening References III [7] S. Destercke and O. Strauss. Filtering with clouds. Soft Computing, 16(5):821–831, 2012. [8] D. Dubois, L. Foulloy, G. Mauris, and H. Prade. Probability-possibility transformations, triangular fuzzy sets, and probabilistic inequalities. Reliable Computing, 10:273–297, 2004. [9] D. Dubois, S. Moral, and H. Prade. A semantics for possibility theory based on likelihoods. Journal of Mathematical Analysis and Applications, 205(2):359–380, 1997. Prac Rep 62
  • 63. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening References IV [10] D. Dubois and H. Prade. Practical methods for constructing possibility distributions. International Journal of Intelligent Systems, 31(3):215–239, 2016. [11] S. Ferson, L. Ginzburg, V. Kreinovich, D. Myers, and K. Sentz. Constructing probability boxes and Dempster-Shafer structures. Technical report, Sandia National Laboratories, 2003. [12] M. Hourbracq, C. Baudrit, P.-H. Wuillemin, and S. Destercke. Dynamic credal networks: introduction and use in robustness analysis. In Proceedings of the Eighth International Symposium on Imprecise Probability: Theories and Applications, pages 159–169, 2013. Prac Rep 63
  • 64. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening References V [13] B. O. Koopman. The axioms and algebra of intuitive probability. Annals of Mathematics, pages 269–292, 1940. [14] E. Miranda and S. Destercke. Extreme points of the credal sets generated by comparative probabilities. Journal of Mathematical Psychology, 64:44–57, 2015. [15] I. Montes and S. Destercke. On extreme points of p-boxes and belief functions. In Int. Conf. on Soft Methods in Probability and Statistics (SMPS), 2016. Prac Rep 64
  • 65. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening References VI [16] R. Pelessoni, P. Vicig, and M. Zaffalon. Inference and risk measurement with the pari-mutuel model. International journal of approximate reasoning, 51(9):1145–1158, 2010. [17] G. Regoli. Comparative probability orderings. Technical report, Society for Imprecise Probabilities: Theories and Applications, 1999. [18] S. Sandri, D. Dubois, and H. Kalfsbeek. Elicitation, assessment and pooling of expert judgments using possibility theory. IEEE Trans. on Fuzzy Systems, 3(3):313–335, August 1995. Prac Rep 65
  • 66. Introduction Basics Practical Representations Applications Numerical filtering Camembert ripening References VII [19] G. Schollmeyer. On the number and characterization of the extreme points of the core of necessity measures on finite spaces. ISIPTA conference, 2016. [20] P. Suppes, G. Wright, and P. Ayton. Qualitative theory of subjective probability. Subjective probability, pages 17–38, 1994. [21] M. C. M. Troffaes and S. Destercke. Probability boxes on totally preordered spaces for multivariate modelling. Int. J. Approx. Reasoning, 52(6):767–791, 2011. Prac Rep 66