1. Practical representations of probability sets: a guided tour with applications
Sébastien Destercke
in collaboration with E. Miranda, I. Montes, M. Troffaes, D. Dubois, O. Strauss, C. Baudrit, P.H. Wuillemin.
CNRS researcher, Laboratoire Heudiasyc, Compiègne
Madrid Seminar
Prac Rep 1
2. Introduction Basics Practical Representations Applications
Plan
G Introduction
G Basics of imprecise probabilities
G A tour of practical representations
G Illustrative applications
4. Heudiasyc and LABEX MS2T activities
Heudiasyc
G 140 members
G 6M budget
G 4 teams:
H Uncertainty and machine learning
H Automatic control and robotics
H Artificial intelligence
H Operational research and networks
LABEX MS2T
G Topic: systems of systems
G 3 laboratories:
H Heudiasyc
H BMBI: biomechanics
H Roberval: mechanics
If you are interested in collaborations, let me know
5. Talk in a nutshell
What is this talk about
1. (very) Basics of imprecise probability
2. A review of practical representations
3. Some applications
What is this talk not about
G Deep mathematics of imprecise probabilities (you can ask Nacho or Quique)
G Imprecise parametric models
6. Plan
G Introduction
G Basics of imprecise probabilities
G A tour of practical representations
G Illustrative applications
7. Imprecise probabilities
What?
Representing uncertainty as a convex set P of probabilities rather than a
single one
Why?
G precise probabilities are inadequate to model lack of information;
G generalizes both set-valued and probabilistic uncertainty;
G can model situations where probabilistic information is partial;
G axiomatically allows alternatives to remain incomparable
8. Probabilities
A probability mass function on a finite space X = {x1,...,xn} is equivalent to an n-dimensional vector
p := (p(x1),...,p(xn))
restricted to the set PX of all probabilities, i.e., such that
p(x) ≥ 0 for all x ∈ X and ∑x∈X p(x) = 1.
The set PX is the (n−1)-unit simplex.
9. Point in unit simplex
p(x1) = 0.2, p(x2) = 0.5, p(x3) = 0.3
[Figure: the point p shown in the unit simplex, with barycentric coordinates proportional to p(x1), p(x2), p(x3).]
10. Imprecise probability
Set P defined as a set of n constraints
E(fi) ≤ ∑x∈X fi(x)p(x) ≤ Ē(fi),
where the fi : X → R are bounded functions (E and Ē denote the lower and upper expectation bounds)
Example
The constraint 2p(x2) − p(x3) ≥ 0 corresponds to
f(x1) = 0, f(x2) = 2, f(x3) = −1, E(f) = 0
Lower/upper probabilities
Bounds P(A), P̄(A) on an event A are equivalent to
P(A) ≤ ∑x∈A p(x) ≤ P̄(A)
11. Set P example
2p(x2)−p(x3) ≥ 0
[Figure: the part of the unit simplex satisfying the constraint.]
12. Credal set example
2p(x2)−p(x3) ≥ 0
2p(x1)−p(x2)−p(x3) ≥ 0
[Figure: the credal set P induced by the two constraints, drawn in the unit simplex.]
13. Natural extension
From an initial set P defined by constraints, we can compute
G The lower expectation E(g) of any function g, as
E(g) = inf_{p∈P} Ep(g)
G The lower probability P(A) of any event A, as
P(A) = inf_{p∈P} p(A)
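To make this concrete, both bounds can be computed by linear programming over the constraints defining P. A minimal sketch, assuming numpy and scipy are available, reusing the two example constraints 2p(x2) − p(x3) ≥ 0 and 2p(x1) − p(x2) − p(x3) ≥ 0 of the previous slides:

```python
# Sketch: natural extension by linear programming over a constraint-defined P.
import numpy as np
from scipy.optimize import linprog

A_ub = np.array([[0.0, -2.0, 1.0],    # -(2 p(x2) - p(x3)) <= 0
                 [-2.0, 1.0, 1.0]])   # -(2 p(x1) - p(x2) - p(x3)) <= 0
b_ub = np.zeros(2)
A_eq = np.ones((1, 3))                # p(x1) + p(x2) + p(x3) = 1
b_eq = np.array([1.0])

def lower_expectation(g):
    """E(g) = inf_{p in P} sum_x g(x) p(x), solved as an LP."""
    res = linprog(g, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, 1)] * 3)
    return res.fun

# Lower probability of an event = lower expectation of its indicator.
print(lower_expectation([1.0, 0.0, 0.0]))   # P({x1}) = 1/3
print(lower_expectation([0.0, 2.0, -1.0]))  # E(f) for the f of slide 10: 0
```

Upper bounds follow by minimizing −g and negating the result.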
14. Some usual problems
G Computing E(g) = inf_{p∈P} Ep(g) for a new function g
G Updating: P(θ|x) ∝ L(x|θ)P(θ)
G Computing the conditional E(f|A)
G Simulating/sampling from P
G Building a joint model over variables X1,...,Xn
All can be difficult to perform in general → practical representations reduce the computational cost
15. What makes a representation "practical"
G A reasonable, algorithmically enumerable number of extreme points
Reminder
p ∈ P is extreme iff p = λp1 + (1−λ)p2 with p1, p2 ∈ P and λ ∈ (0,1) implies p1 = p2 = p.
We will denote by E(P) the set of extreme points of P
G n-monotonicity of P
2-monotonicity (super-modularity, convexity)
P(A∪B) + P(A∩B) ≥ P(A) + P(B) for any A,B ⊆ X
∞-monotonicity
P(∪_{i=1..n} Ai) ≥ ∑_{∅≠A⊆{A1,...,An}} (−1)^{|A|+1} P(∩_{Ai∈A} Ai) for any A1,...,An ⊆ X and n > 0
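On small spaces, 2-monotonicity can be tested by brute force over all pairs of events. A sketch (the two capacities below, a linear-vacuous one with uniform P0 and a deliberately non-2-monotone one, are illustrative choices):

```python
# Sketch: brute-force check of 2-monotonicity on a finite space.
from itertools import chain, combinations

X = (0, 1, 2)

def powerset(xs):
    return [frozenset(s) for s in chain.from_iterable(
        combinations(xs, r) for r in range(len(xs) + 1))]

def is_2_monotone(P, xs):
    """Check P(A|B union) + P(A&B) >= P(A) + P(B) for all events A, B."""
    events = powerset(xs)
    return all(P(A | B) + P(A & B) + 1e-12 >= P(A) + P(B)
               for A in events for B in events)

# Linear-vacuous capacity with uniform P0 and eps = 0.4 (illustrative):
low_lv = lambda A: 1.0 if A == frozenset(X) else 0.6 * len(A) / 3
# A capacity that is NOT 2-monotone: 0.5 on every proper non-empty event.
low_bad = lambda A: 0.0 if not A else (1.0 if A == frozenset(X) else 0.5)

print(is_2_monotone(low_lv, X))   # True
print(is_2_monotone(low_bad, X))  # False ({x1} and {x2} violate it)
```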
16. Extreme points: illustration
G p(x1) = 1,p(x2) = 0,p(x3) = 0
G p(x1) = 0,p(x2) = 1,p(x3) = 0
G p(x1) = 0.25,p(x2) = 0.25,p(x3) = 0.5
[Figure: the three points drawn in the unit simplex.]
17. Extreme points: utility
G Computing E(g) → take the minimal Ep(g) over the extreme points
G Updating → update the extreme points, take the convex hull
G Conditional E(f|A) → take the minimal Ep(f|A) over the extreme points
G Simulating P → take convex mixtures of the extreme points
G Joint over variables X1,...,Xn → convex hull of the joint extreme points
Again, this works if the number of extreme points is limited, or if an inner approximation (by sampling) is acceptable.
18. 2-monotonicity
Computing E(g)
Choquet integral:
E(g) = inf g + ∫_{inf g}^{sup g} P({g ≥ t}) dt
In finite spaces → sort the n values of g and compute P(A) for n events
Conditioning
P(A|B) = P(A∩B) / (P(A∩B) + P̄(Ac∩B))
(with P̄ the upper probability), and P(·|B) remains 2-monotone (can be used to get E(f|A))
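The finite-space Choquet integral is a short computation. A sketch, reusing an illustrative linear-vacuous capacity (not a capacity from the talk):

```python
# Sketch: discrete Choquet integral w.r.t. a 2-monotone lower probability.
def choquet(g, lowP, xs):
    """E(g) = min(g) + sum over level sets of (step height) * lowP({g >= t})."""
    vals = sorted(set(g[x] for x in xs))
    e = vals[0]
    for lo, hi in zip(vals, vals[1:]):
        level = frozenset(x for x in xs if g[x] >= hi)   # level set {g >= hi}
        e += (hi - lo) * lowP(level)
    return e

X = (0, 1, 2)
def lowP(A):  # linear-vacuous, uniform P0, eps = 0.4 (illustrative)
    return 1.0 if A == frozenset(X) else 0.6 * len(A) / 3

g = {0: 0.0, 1: 2.0, 2: -1.0}
print(choquet(g, lowP, X))  # -1 + 1*0.4 + 2*0.2 = -0.2
```

As the slide says, the cost is one sort of the n values of g plus n evaluations of P.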
19. ∞-monotonicity
If P is ∞-monotone, its Möbius inverse m : 2^X → R, given by
m(A) = ∑_{B⊆A} (−1)^{|A\B|} P(B),
is non-negative and sums up to one; P is then often called a belief function
Simulating P
Sample a set A according to m and consider the associated set
Joint model of X1,...,XN
If m1, m2 correspond to the inverses for X1, X2, consider the joint m12 s.t.
m12(A×B) = m1(A) · m2(B)
G still ∞-monotone
G outer-approximates other definitions of independence between P1, P2
20. 2-monotonicity and extreme points [3]
Generating extreme points when P is 2-monotone:
1. Pick a permutation σ : [1,n] → [1,n] of X
2. Consider the sets Aσi = {xσ(1),...,xσ(i)}
3. Define Pσ({xσ(i)}) = P(Aσi) − P(Aσi−1) for i = 1,...,n (with Aσ0 = ∅)
4. Then Pσ ∈ E(P)
Some comments
G The maximal value of |E(P)| is n!
G We can have Pσ1 = Pσ2 with σ1 ≠ σ2 → |E(P)| is often less than n!
21. Example
G X = {x1,x2,x3}
G σ(1) = 2, σ(2) = 3, σ(3) = 1
G Aσ0 = ∅, Aσ1 = {x2}, Aσ2 = {x2,x3}, Aσ3 = X
G Pσ({xσ(1)}) = Pσ({x2}) = P({x2}) − P(∅) = P({x2})
G Pσ({xσ(2)}) = Pσ({x3}) = P({x2,x3}) − P({x2})
G Pσ({xσ(3)}) = Pσ({x1}) = P(X) − P({x2,x3}) = 1 − P({x2,x3})
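The permutation procedure of [3] is easy to code. A sketch on an illustrative linear-vacuous capacity (uniform P0, eps = 0.4), for which distinct permutations collapse onto only n = 3 extreme points:

```python
# Sketch: generating extreme points of a 2-monotone lowP via permutations.
from itertools import permutations

X = (0, 1, 2)
def lowP(A):  # linear-vacuous, uniform P0, eps = 0.4 (illustrative)
    return 1.0 if A == frozenset(X) else 0.6 * len(A) / 3

extremes = set()
for sigma in permutations(X):
    p = [0.0] * len(X)
    prev = frozenset()
    for x in sigma:                       # build A_1 subset ... subset A_n
        cur = prev | {x}
        p[x] = lowP(cur) - lowP(prev)     # P_sigma({x}) = P(A_i) - P(A_{i-1})
        prev = cur
    extremes.add(tuple(round(v, 10) for v in p))

for p in sorted(extremes):
    print(p)
print(len(extremes))  # 3, far fewer than 3! = 6 permutations
```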
22. Plan
G Introduction
G Basics of imprecise probabilities
G A tour of practical representations
H Basics
H Possibility distributions
H P-boxes
H Probability intervals
H Elementary Comparative probabilities
G Illustrative applications
23. Two very basic models
Probability
G P({xi}) = P̄({xi}) = p(xi)
G ∞-monotone, n constraints, |E(P)| = 1
Vacuous model PX
Only the support X of the probability is known
G P(X) = 1
G ∞-monotone, 1 constraint, |E(P)| = n (the Dirac distributions)
Easily extends to a vacuous model on a set A (can be used in robust optimisation, decision under risk, interval analysis)
24. A concise graph
[Graph: the models Proba, Vacuous, Linear-vacuous, Pari-mutuel, Possibilities, P-boxes, Prob. int. and Compa., with an arrow from model A to model B meaning that A is a special case of B; the ∞-monotone and 2-monotone families are outlined.]
25. Neighbourhood models
Build a neighbourhood around a given probability P0
Linear-vacuous / ε-contamination
G P(A) = (1−ε)P0(A) + εPX(A)
G ∞-monotone, n+1 constraints, |E(P)| = n
G ε ∈ [0,1]: unreliability of the information P0
Pari-mutuel [16]
G P(A) = max{(1+ε)P0(A) − ε, 0}
G 2-monotone, n+1 constraints, |E(P)| = ? (n?)
G ε ∈ [0,1]: unreliability of the information P0
Other models exist, such as odds-ratio or distance-based ones (all q s.t. d(p,q) ≤ δ) → often not attractive in terms of |E(P)|/monotonicity, but they may have nice properties (odds-ratio: updating; squared/log distances: convex continuous neighbourhoods)
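Both neighbourhood lower probabilities are one-line formulas. A sketch, with illustrative values for P0(A) and ε:

```python
# Sketch: lower probabilities of the two neighbourhood models.
def linvac_low(P0A, eps, is_whole_space=False):
    """Linear-vacuous: P(A) = (1-eps) P0(A) for A != X, and P(X) = 1."""
    return 1.0 if is_whole_space else (1 - eps) * P0A

def parimutuel_low(P0A, eps):
    """Pari-mutuel: P(A) = max{(1+eps) P0(A) - eps, 0}."""
    return max((1 + eps) * P0A - eps, 0.0)

print(linvac_low(0.5, 0.2))      # 0.4
print(parimutuel_low(0.5, 0.2))  # 0.4
print(parimutuel_low(0.1, 0.2))  # 0.0: small probabilities are squashed to 0
```

The last line shows the qualitative difference: the pari-mutuel model sends low-probability events all the way to lower probability 0, while the linear-vacuous model only shrinks them.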
27. A concise graph

[Same hierarchy graph as slide 24.]
28. Possibility distributions [10]
Definition
A distribution π : X → [0,1] with π(x) = 1 for at least one x.
P is given by
P(A) = min_{x∈Ac} (1 − π(x)),
which is a necessity measure
Characteristics of P
G Requires at most n values
G P is an ∞-monotone measure
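The necessity measure and its dual possibility measure are direct to compute from π. A sketch with an illustrative distribution:

```python
# Sketch: lower/upper probabilities induced by a possibility distribution.
def necessity(A, pi, xs):
    """P(A) = min over x outside A of (1 - pi(x)); N(X) = 1 by convention."""
    outside = [1 - pi[x] for x in xs if x not in A]
    return min(outside) if outside else 1.0

def possibility(A, pi):
    """Dual upper probability: Pbar(A) = max over x in A of pi(x)."""
    return max(pi[x] for x in A) if A else 0.0

xs = ("a", "b", "c")
pi = {"a": 1.0, "b": 0.6, "c": 0.3}   # normalized: pi(a) = 1
print(necessity({"a"}, pi, xs))       # min(1-0.6, 1-0.3) = 0.4
print(possibility({"b", "c"}, pi))    # max(0.6, 0.3) = 0.6
```

Note the duality: necessity(A) = 1 − possibility of the complement of A.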
29. Possibility distributions
Alternative definition
Provide nested events
A1 ⊆ ... ⊆ An
and lower confidence bounds
P(Ai) = αi with αi+1 ≥ αi
Extreme points [19]
G The maximum number is 2^{n−1}
G Algorithm using the nested structure of the sets Ai
30. A basic distribution: simple support
G Set E of most plausible values
G Confidence degree α = P(E)
Extends to multiple nested sets E1,...,Ep → confidence degrees over nested sets [18]
Example: pH value ∈ [4.5,5.5] with α = 0.8 (∼ "quite probable")
[Figure: the corresponding π, equal to 1 on [4.5,5.5] and to 1−α = 0.2 outside.]
31. Partially specified probabilities [1, 8]
The triangular possibility distribution with mode M and support [a,b] encompasses all probabilities with
G mode/reference value M
G support domain [a,b]
Getting back to pH:
G M = 5
G [a,b] = [3,7]
[Figure: the triangular π, equal to 1 at pH = 5 and to 0 outside [3,7].]
32. Normalized likelihood as possibilities [2, 9]
π(θ) = L(θ|x) / max_{θ∗∈Θ} L(θ∗|x)
Binomial situation:
G θ = success probability
G x = number of observed successes
G x = 4 successes out of 11 trials
G x = 20 successes out of 55 trials
[Figure: the resulting distributions π(θ), maximal at θ = 4/11.]
33. Other examples
G Statistical inequalities (e.g., Chebyshev inequality) [8]
G Linguistic information (fuzzy sets) [5]
G Approaches based on nested models
34. A concise graph

[Same hierarchy graph as slide 24.]
35. P-boxes [6]
Definition
When X is ordered, bounds on events of the kind
Ai = {x1,...,xi},
each bounded by
F(xi) ≤ P(Ai) ≤ F̄(xi)
[Figure: a discrete p-box over x1,...,x7.]
Characteristics of P
G Requires at most 2n values
G P is an ∞-monotone measure
36. In general
Definition
A set of nested events
A1 ⊆ ... ⊆ An,
each bounded by
αi ≤ P(Ai) ≤ βi
[Figure: a generalized p-box over x1,...,x7.]
Extreme points [15]
G At most equal to the Pell number Kn = 2Kn−1 + Kn−2
G Algorithm based on a tree-structure construction
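The Pell recursion gives a quick feel for how this bound grows; the base cases K1 = 1, K2 = 2 below are the standard Pell values, assumed here rather than taken from the slide:

```python
# Sketch: growth of the Pell bound K_n = 2 K_{n-1} + K_{n-2} on |E(P)|.
def pell(n, k1=1, k2=2):
    ks = [k1, k2]
    while len(ks) < n:
        ks.append(2 * ks[-1] + ks[-2])
    return ks[:n]

print(pell(7))  # [1, 2, 5, 12, 29, 70, 169]
```

So the bound grows roughly like (1 + sqrt(2))^n, well below the n! of a generic 2-monotone model.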
37. P-box on reals [11]
A pair [F, F̄] of cumulative distributions: bounds over events [−∞,x]
G Percentiles given by experts;
G Kolmogorov-Smirnov bounds;
Can be extended to any pre-ordered space [6], [21] ⇒ multivariate spaces!
Expert providing percentiles:
0 ≤ P([−∞,12]) ≤ 0.2
0.2 ≤ P([−∞,24]) ≤ 0.4
0.6 ≤ P([−∞,36]) ≤ 0.8
[Figure: the resulting p-box and the associated sets E1,...,E5.]
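The expert assessments above extend to a full p-box by taking the most conservative step functions between the assessed points; this completion rule is an assumption of the sketch below:

```python
# Sketch: a p-box [F_low, F_up] from the expert percentile bounds above.
import bisect

xs     = [12, 24, 36]        # assessed points x
F_low_ = [0.0, 0.2, 0.6]     # lower bounds on P([-inf, x])
F_up_  = [0.2, 0.4, 0.8]     # upper bounds on P([-inf, x])

def F_low(x):
    """Best lower bound: carried over from the largest assessed point <= x."""
    i = bisect.bisect_right(xs, x)
    return F_low_[i - 1] if i > 0 else 0.0

def F_up(x):
    """Best upper bound: carried over from the smallest assessed point >= x."""
    i = bisect.bisect_left(xs, x)
    return F_up_[i] if i < len(xs) else 1.0

print(F_low(18), F_up(18))  # 0.0 0.4
print(F_low(36), F_up(36))  # 0.6 0.8
```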
38. A concise graph

[Same hierarchy graph as slide 24.]
39. Probability intervals [4]
Definition
Elements {x1,...,xn}, each bounded by
p(xi) ∈ [p(xi), p̄(xi)]
[Figure: interval bounds on the probabilities of x1,...,x6.]
Characteristics of P
G Requires at most 2n values
G P is a 2-monotone measure
Extreme points [4]
G Specific algorithm to extract them
G If n is even, the maximum number is (n/2)·C(n, n/2)
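Since at a vertex of the credal set at most one coordinate can lie strictly between its bounds, the extreme points can be enumerated by brute force for small n. A sketch (this enumeration is a generic geometric argument, not the specific algorithm of [4]):

```python
# Sketch: enumerating the extreme points of a probability interval.
from itertools import product

def interval_extremes(lo, hi, tol=1e-9):
    n = len(lo)
    pts = set()
    for free in range(n):                 # coordinate taking the residual mass
        others = [i for i in range(n) if i != free]
        for pattern in product((0, 1), repeat=n - 1):
            p = [0.0] * n
            for i, bit in zip(others, pattern):
                p[i] = hi[i] if bit else lo[i]   # others at lower/upper bound
            rest = 1.0 - sum(p)
            if lo[free] - tol <= rest <= hi[free] + tol:
                p[free] = rest
                pts.add(tuple(round(v, 9) for v in p))
    return sorted(pts)

# p(x_i) in [0.2, 0.5] for each of three elements:
for p in interval_extremes([0.2, 0.2, 0.2], [0.5, 0.5, 0.5]):
    print(p)
```

This example yields the six permutations of (0.2, 0.3, 0.5).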
40. Probability intervals: example
Linguistic assessment ⇒ numerical translation:
G x is very probable → p(x) ≥ 0.75
G x has a good chance → 0.4 ≤ p(x) ≤ 0.85
G x is very unlikely → p(x) ≤ 0.25
G the probability of x is about α → α − 0.1 ≤ p(x) ≤ α + 0.1
41. A concise graph

[Same hierarchy graph as slide 24.]
42. Comparative probabilities
Definitions
Comparative probabilities on X: assessments of the form
P(A) ≥ P(B),
read "event A is at least as probable as event B".
Some comments
G studied from the axiomatic point of view [13, 20]
G few studies on their numerical aspects [17]
G interesting for qualitative uncertainty modeling/representation,
expert elicitation, . . .
43. A specific case: elementary comparisons [14]
Elementary comparisons
Comparative probability orderings of the states X = {x1,...,xn} in the
form of a subset L of {1,...,n}×{1,...,n}.
The set of probability measures compatible with this information is
P(L) = {p ∈ PX | ∀(i,j) ∈ L, p(xi) ≥ p(xj)}.
44. Why focus on this case?
Practical interest
G multinomial models (e.g., imprecise priors for the Dirichlet model), modal-value elicitation
G direct extension to define imprecise belief functions
Easy to represent/manipulate
G Through a graph G = (X, L) with the states as nodes and the relation L as edges
G Example: given X = {x1,...,x5} and L = {(1,3),(1,4),(2,5),(4,5)}, the associated graph G has arcs x1 → x3, x1 → x4, x2 → x5, x4 → x5
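For small n, the extreme points of P(L) can be found by generic vertex enumeration: saturate n−1 of the inequalities, solve together with the normalization, and keep the feasible solutions. A sketch on the slide's example (0-indexed; assumes numpy, and is brute force rather than the dedicated algorithm of [14]):

```python
# Sketch: brute-force vertex enumeration of P(L) for elementary comparisons.
import numpy as np
from itertools import combinations

n = 5
L = [(0, 2), (0, 3), (1, 4), (3, 4)]                  # slide's L, 0-indexed

rows = [np.eye(n)[i] for i in range(n)]               # p_i >= 0
rows += [np.eye(n)[i] - np.eye(n)[j] for i, j in L]   # p_i - p_j >= 0
rows = np.array(rows)

verts = set()
for idx in combinations(range(len(rows)), n - 1):
    A = np.vstack([rows[list(idx)], np.ones(n)])      # n-1 tight rows + sum=1
    b = np.zeros(n); b[-1] = 1.0
    try:
        p = np.linalg.solve(A, b)
    except np.linalg.LinAlgError:
        continue                                      # singular choice, skip
    if (rows @ p >= -1e-9).all():                     # keep feasible points
        verts.add(tuple(np.round(p, 9)))

print(len(verts))  # at most 2^(n-1) = 16 by the bound of [14]
```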
45. Some properties
Characteristics of P
G Requires at most n² values
G No guarantee that P is a 2-monotone measure
Extreme points [14]
G Algorithm identifying subsets of disconnected nodes
G The maximal number is 2^{n−1}
46. A concise list (acc. to my knowledge)
Name          | Monot. | Max. #constraints | Max. |E(P)|     | Algo. to get E(P)
Proba         | ∞      | n                 | 1               | Yes
Vacuous       | ∞      | 1                 | n               | Yes
2-mon         | 2      | 2^n               | n!              | Yes [3]
∞-mon         | ∞      | 2^n               | n!              | No
Lin-vac.      | ∞      | n+1               | n               | Yes
Pari-mutuel   | 2      | n+1               | ? (n)           | No
Possibility   | ∞      | n                 | 2^{n−1}         | Yes [19]
P-box (gen.)  | ∞      | 2n                | Kn (Pell)       | Yes [15]
Prob. int.    | 2      | 2n                | (n/2)·C(n,n/2)  | Yes [4]
Elem. Compa.  | ×      | n²                | 2^{n−1}         | Yes [14]
47. A concise final graph

[The full hierarchy graph of slide 24.]
48. Some open questions
G study the numerical aspects of comparative probabilities with numerical bounds/general events.
G study the potential link between possibilities and elementary comparative probabilities (they share the same maximal number of extreme points, and both induce an ordering between states).
G study restricted bounds/information over specific families of events other than nested/elementary ones (e.g., events of at most k states).
G look at the probability sets induced by bounding specific distances to p0, in particular the L1, L2, L∞ norms.
49. Plan
G Introduction
G Basics of imprecise probabilities
G A tour of practical representations
G Illustrative applications
H Numerical Signal processing [7]
H Camembert ripening [12]
50. Signal processing: introduction
G Impulse response µ of the filter
G Filtering: convolving the kernel µ with the observed signal f(x)
51. Link with probability
G If µ is positive and ∫ µ(x) dx = 1, µ is equivalent to a probability density
G Convolution: computing the mathematical expectation Eµ(f)
G Numerical filtering: discretize (sample) µ and f, with µ(xi) > 0 and ∑xi µ(xi) = 1
[Figure: continuous and discretized versions of f and µ.]
52. Which bandwidth?
[Figure: a kernel µ on the real line; which bandwidth ∆ to choose?]
→ use imprecise probabilistic models to represent sets of bandwidths
→ possibilities/p-boxes with sets centred around x
53. Example on simulated signal
[Figure: signal amplitude vs. time (msec), showing the original signal together with the maxitive upper/lower envelopes and the cloudy upper/lower envelopes.]
55. Results
[Figure: zoom on two parts of the filtered signal, comparing CWMF, ROAD and our approach.]
56. Motivations
A complex system
The Camembert-type cheese ripening process: cheese making, then ripening from day 1 to day 14 (∼13°C, ∼95% humidity), then warehouse (∼4°C)
G Multi-scale modelling, from microbial activities to sensory properties
G Dynamic probabilistic model
G Knowledge is fragmented, heterogeneous and incomplete
G Difficult to learn precise model parameters
Use of ε-contamination for a robustness analysis of the model
57. Experiments
The network
[Figure: a two-time-slice network over the variables T(t), Km(t), lo(t), with arcs from slice t to Km(t+1) and lo(t+1).]
Unrolled over 14 time steps (days): T(1), Km(1), lo(1), T(2), Km(2), lo(2), ..., Km(14), lo(14).
58. Propagation results
Forward propagation, with T(t) = 12°C for all t ∈ [1,τ] (average ripening-room temperature): compute
Ext{Km(t) | Km(1), lo(1), T(1),...,T(τ)}
Ext{lo(t) | Km(1), lo(1), T(1),...,T(τ)}
[Figures: results without physical constraints and with added physical constraints.]
59. Conclusions
Use of practical representations
G +: "easy" robustness analysis of precise methods, or approximation of imprecise ones
G +: allows experts to express imprecision or partial information
G +: often easier to explain/represent than general models
G −: usually focused on specific events
G −: their form may not be preserved by information processing
60. References I
[1] C. Baudrit and D. Dubois.
Practical representations of incomplete probabilistic knowledge.
Computational Statistics and Data Analysis, 51(1):86–108, 2006.
[2] M. Cattaneo.
Likelihood-based statistical decisions.
In Proc. 4th International Symposium on Imprecise Probabilities and
Their Applications, pages 107–116, 2005.
[3] A. Chateauneuf and J.-Y. Jaffray.
Some characterizations of lower probabilities and other monotone
capacities through the use of Möbius inversion.
Mathematical Social Sciences, 17(3):263–283, 1989.
61. References II
[4] L. de Campos, J. Huete, and S. Moral.
Probability intervals: a tool for uncertain reasoning.
Int. J. of Uncertainty, Fuzziness and Knowledge-Based Systems,
2:167–196, 1994.
[5] G. de Cooman and P. Walley.
A possibilistic hierarchical model for behaviour under uncertainty.
Theory and Decision, 52:327–374, 2002.
[6] S. Destercke, D. Dubois, and E. Chojnacki.
Unifying practical uncertainty representations: I. Generalized p-boxes.
Int. J. of Approximate Reasoning, 49:649–663, 2008.
62. References III
[7] S. Destercke and O. Strauss.
Filtering with clouds.
Soft Computing, 16(5):821–831, 2012.
[8] D. Dubois, L. Foulloy, G. Mauris, and H. Prade.
Probability-possibility transformations, triangular fuzzy sets, and
probabilistic inequalities.
Reliable Computing, 10:273–297, 2004.
[9] D. Dubois, S. Moral, and H. Prade.
A semantics for possibility theory based on likelihoods.
Journal of Mathematical Analysis and Applications, 205(2):359–380, 1997.
63. References IV
[10] D. Dubois and H. Prade.
Practical methods for constructing possibility distributions.
International Journal of Intelligent Systems, 31(3):215–239, 2016.
[11] S. Ferson, L. Ginzburg, V. Kreinovich, D. Myers, and K. Sentz.
Constructing probability boxes and Dempster-Shafer structures.
Technical report, Sandia National Laboratories, 2003.
[12] M. Hourbracq, C. Baudrit, P.-H. Wuillemin, and S. Destercke.
Dynamic credal networks: introduction and use in robustness
analysis.
In Proceedings of the Eighth International Symposium on Imprecise
Probability: Theories and Applications, pages 159–169, 2013.
64. References V
[13] B. O. Koopman.
The axioms and algebra of intuitive probability.
Annals of Mathematics, pages 269–292, 1940.
[14] E. Miranda and S. Destercke.
Extreme points of the credal sets generated by comparative
probabilities.
Journal of Mathematical Psychology, 64:44–57, 2015.
[15] I. Montes and S. Destercke.
On extreme points of p-boxes and belief functions.
In Int. Conf. on Soft Methods in Probability and Statistics (SMPS),
2016.
65. References VI
[16] R. Pelessoni, P. Vicig, and M. Zaffalon.
Inference and risk measurement with the pari-mutuel model.
International Journal of Approximate Reasoning, 51(9):1145–1158, 2010.
[17] G. Regoli.
Comparative probability orderings.
Technical report, Society for Imprecise Probabilities: Theories and
Applications, 1999.
[18] S. Sandri, D. Dubois, and H. Kalfsbeek.
Elicitation, assessment and pooling of expert judgments using
possibility theory.
IEEE Trans. on Fuzzy Systems, 3(3):313–335, August 1995.
66. References VII
[19] G. Schollmeyer.
On the number and characterization of the extreme points of the
core of necessity measures on finite spaces.
In Proc. ISIPTA, 2016.
[20] P. Suppes, G. Wright, and P. Ayton.
Qualitative theory of subjective probability.
Subjective probability, pages 17–38, 1994.
[21] M. C. M. Troffaes and S. Destercke.
Probability boxes on totally preordered spaces for multivariate
modelling.
Int. J. Approx. Reasoning, 52(6):767–791, 2011.