Equational axioms for probability calculus and
modelling of Likelihood ratio transfer mediated
reasoning
Jan Bergstra
Informatics Institute, Faculty of Science
University of Amsterdam
j.a.bergstra@uva.nl
ESTEC March 8, 2019
Jan Bergstra Informatics Institute ESTEC March 8, 2019 1 / 23
Commutative rings
(x + y) + z = x + (y + z) (1)
x + y = y + x (2)
x + 0 = x (3)
x + (−x) = 0 (4)
(x · y) · z = x · (y · z) (5)
x · y = y · x (6)
1 · x = x (7)
x · (y + z) = x · y + x · z (8)
Jan Bergstra Informatics Institute ESTEC March 8, 2019 2 / 23
Division by zero
Add a function symbol for inverse (x⁻¹) and obtain division (written x/y or as a fraction) as a derived operator.
Now what about 0⁻¹? A survey of the 8 known options:
0⁻¹ = 0, material inverse (material division): meadows,
0⁻¹ = 1 (inverse not involutive),
0⁻¹ = 17 (ad hoc value),
0⁻¹ = ⊥ (error value): common inverse,
0⁻¹ = ∞ (unsigned infinite) with 0 · 0⁻¹ = ⊥, natural inverse: wheels,
0⁻¹ = +∞ (positive signed infinite) with ∞ + (−∞) = ⊥: transrational numbers, transreal numbers,
0⁻¹ ↑ undefined (divergence): partial inverse,
0 · 0⁻¹ = 1: formal multiplicative inverse.
Jan Bergstra Informatics Institute ESTEC March 8, 2019 3 / 23
Meadows: Md = CR + (9) + (10)
(x⁻¹)⁻¹ = x (9)
x · (x · x⁻¹) = x (10)
We find: Md ⊢ 0⁻¹ = 0 · (0 · 0⁻¹) = 0.
x/y = x ÷ y = x · y⁻¹ (11)
Defining equations for the different operator symbols for division.
Jan Bergstra Informatics Institute ESTEC March 8, 2019 4 / 23
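To make the material inverse concrete, here is a minimal sketch (not part of the slides) over the rationals; the helper names inv and div are ours:

```python
from fractions import Fraction

def inv(x):
    """Material (zero-totalized) inverse of a meadow: 0^-1 = 0."""
    return Fraction(0) if x == 0 else 1 / x

def div(x, y):
    """Division as the derived operator x · y^-1 of equation (11)."""
    return x * inv(y)

samples = [Fraction(n, 2) for n in range(-4, 5)]  # includes 0
for x in samples:
    assert inv(inv(x)) == x       # axiom (9): (x^-1)^-1 = x
    assert x * (x * inv(x)) == x  # axiom (10): x · (x · x^-1) = x
assert inv(Fraction(0)) == 0 and div(1, 0) == 0   # Md ⊢ 0^-1 = 0
```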
Signed meadows
Idea for the sign function: s(0) = 0, x > 0 → s(x) = 1, x < 0 → s(x) = −1.
Axioms (without ordering):
s(x · x⁻¹) = x · x⁻¹ (12)
s(1 − x · x⁻¹) = 1 − x · x⁻¹ (13)
s(−1) = −1 (14)
s(x⁻¹) = s(x) (15)
s(x · y) = s(x) · s(y) (16)
0^(s(x)−s(y)) · (s(x + y) − s(x)) = 0 (17)
|x| = s(x) · x
Sign: axioms for the sign operator & absolute value. Completeness result:
R0(s) ⊨ t = r ⇐⇒ Md + Sign ⊢ t = r
Jan Bergstra Informatics Institute ESTEC March 8, 2019 5 / 23
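A small numeric sanity check (our sketch, not from the slides) of the Sign axioms over sample rationals, with 0^d read in the meadow sense (0^0 = 1, 0^(−1) = 0):

```python
from fractions import Fraction

def s(x):
    """Sign: s(0) = 0, s(x) = 1 for x > 0, s(x) = -1 for x < 0."""
    return Fraction((x > 0) - (x < 0))

def zero_pow(d):
    """0^d in a meadow: 1 when d = 0, else 0 (since 0^-1 = 0)."""
    return Fraction(1) if d == 0 else Fraction(0)

xs = [Fraction(n) for n in range(-3, 4)]
for x in xs:
    assert s(x) * x == abs(x)               # |x| = s(x) · x
    for y in xs:
        assert s(x * y) == s(x) * s(y)      # axiom (16)
        # axiom (17): whenever s(x) = s(y), also s(x + y) = s(x)
        assert zero_pow(s(x) - s(y)) * (s(x + y) - s(x)) == 0
```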
Event space
Events viewed as propositions about samples.
(x ∨ y) ∧ y = y (18)
(x ∧ y) ∨ y = y (19)
x ∧ (y ∨ z) = (y ∧ x) ∨ (z ∧ x) (20)
x ∨ (y ∧ z) = (y ∨ x) ∧ (z ∨ x) (21)
x ∧ ¬x = ⊥ (22)
x ∨ ¬x = ⊤ (23)
BA: a self-dual equational basis for Boolean algebras (Padmanabhan
1983)
Jan Bergstra Informatics Institute ESTEC March 8, 2019 6 / 23
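As a quick soundness check (ours, not from the slides), the six BA equations hold in the two-element Boolean algebra; brute force over all assignments:

```python
from itertools import product

B = (False, True)  # two-element Boolean algebra; ⊥ = False, ⊤ = True

for x, y, z in product(B, repeat=3):
    assert ((x or y) and y) == y                         # (18)
    assert ((x and y) or y) == y                         # (19)
    assert (x and (y or z)) == ((y and x) or (z and x))  # (20)
    assert (x or (y and z)) == ((y or x) and (z or x))   # (21)
    assert (x and not x) == False                        # (22)
    assert (x or not x) == True                          # (23)
```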
Probability functions
P(⊤) = 1 (24)
P(⊥) = 0 (25)
P(x) = |P(x)| (26)
P(x ∨ y) = P(x) + P(y) − P(x ∧ y) (27)
P(x ∧ y) · P(y) · P(y)⁻¹ = P(x ∧ y) (28)
P(x | y) = P(x ∧ y) · P(y)⁻¹ (definition of conditional probability)
PFP: a version of Kolmogorov’s axioms for a probability function.
Completeness:
Md + Sign + BA + PFP proves all equations t = r which hold in any
structure made from a Boolean algebra E and a probability function
P : E → R0.
Jan Bergstra Informatics Institute ESTEC March 8, 2019 7 / 23
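A sketch (our encoding, not from the slides) of one such structure: a three-sample space with events as subsets, P given by hypothetical sample weights, and the conditional defined via the material inverse:

```python
from fractions import Fraction
from itertools import combinations

samples = ("a", "b", "c")
weight = {"a": Fraction(1, 2), "b": Fraction(1, 3), "c": Fraction(1, 6)}

def P(event):                     # event: a frozenset of samples
    return sum((weight[u] for u in event), Fraction(0))

def inv(v):                       # material inverse: 0^-1 = 0
    return Fraction(0) if v == 0 else 1 / v

events = [frozenset(c) for r in range(4) for c in combinations(samples, r)]
TOP, BOT = frozenset(samples), frozenset()
assert P(TOP) == 1 and P(BOT) == 0                      # (24), (25)
for x in events:
    assert P(x) == abs(P(x))                            # (26)
    for y in events:
        assert P(x | y) == P(x) + P(y) - P(x & y)       # (27): | is join (∨)
        assert P(x & y) * P(y) * inv(P(y)) == P(x & y)  # (28): & is meet (∧)
```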
Formalizing Kolmogorov’s axioms & Bayes’ rule
Original presentation of Kolmogorov’s axioms: use set theory and real
numbers and define which P’s are probability functions.
Given this definition Md + Sign + BA + PFP is a formalisation of that
definition. In the completeness statement the definition is used and its
correspondence with the formalisation is stated.
Bayes’ rule is derivable from Md + Sign + BA + PFP (without using
P(x ∨ y) = P(x) + P(y) − P(x ∧ y)):
P(x | y) = P(y | x) · P(x) / P(y)
(with inverse instead of division: P(x | y) = P(y | x) · P(x) · P(y)⁻¹.)
Jan Bergstra Informatics Institute ESTEC March 8, 2019 8 / 23
Proof of Bayes’ rule (BR) from Md + Sign + BA + PFP
P(x | y) = P(x ∧ y) / P(y)
= P(y ∧ x) / P(y)
= (P(y ∧ x) · P(x) · P(x)⁻¹) / P(y)
= (P(y ∧ x) / P(x)) · (P(x) / P(y))
= P(y | x) · P(x) / P(y)
In the presence of Md + BA + “definition of conditional probability”,
BR follows from equation (28): P(x ∧ y) · P(y) · P(y)⁻¹ = P(x ∧ y).
In fact this works both ways: BR implies equation (28).
Jan Bergstra Informatics Institute ESTEC March 8, 2019 9 / 23
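A numeric check (our sketch) that BR holds with the material inverse, on a hypothetical joint distribution over the four atoms of x and y:

```python
from fractions import Fraction

def inv(v):
    return Fraction(0) if v == 0 else 1 / v

# Hypothetical atom weights for x∧y, x∧¬y, ¬x∧y, ¬x∧¬y (sum to 1).
p_xy, p_xny, p_nxy, p_nxny = (Fraction(k, 10) for k in (1, 2, 3, 4))

P_x, P_y = p_xy + p_xny, p_xy + p_nxy
P_x_given_y = p_xy * inv(P_y)     # P(x | y) = P(x ∧ y) · P(y)^-1
P_y_given_x = p_xy * inv(P_x)
# BR: P(x | y) = P(y | x) · P(x) · P(y)^-1
assert P_x_given_y == P_y_given_x * P_x * inv(P_y)
```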
Second form of Bayes’ rule
BR2, a second form of Bayes’ rule
P(x | y) = P(y | x) · P(x) / (P(y | z) · P(z) + P(y | ¬z) · P(¬z)).
BR2 is derivable from Md + Sign + BA + PFP and is equivalent to
P(x ∨ y) = P(x) + P(y) − P(x ∧ y).
BR2 is stronger than BR.
Jan Bergstra Informatics Institute ESTEC March 8, 2019 10 / 23
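BR2 can likewise be checked numerically (our sketch, with hypothetical weights); the denominator expands P(y) by the law of total probability over z:

```python
from fractions import Fraction
from itertools import product

def inv(v):
    return Fraction(0) if v == 0 else 1 / v

# Hypothetical joint distribution over the eight atoms of x, y, z.
atoms = list(product((False, True), repeat=3))
w = {a: Fraction(i + 1, 36) for i, a in enumerate(atoms)}   # sums to 1

def P(pred):
    return sum(w[a] for a in atoms if pred(*a))

def C(a, b):   # P(a | b) = P(a ∧ b) · P(b)^-1
    return P(lambda x, y, z: a(x, y, z) and b(x, y, z)) * inv(P(b))

X  = lambda x, y, z: x
Y  = lambda x, y, z: y
Z  = lambda x, y, z: z
NZ = lambda x, y, z: not z

# BR2: P(x | y) = P(y | x) · P(x) / (P(y | z) · P(z) + P(y | ¬z) · P(¬z))
assert C(X, Y) == C(Y, X) * P(X) * inv(C(Y, Z) * P(Z) + C(Y, NZ) * P(NZ))
```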
PFP: Alternative axioms for a probability function
P( ) = 1
P(⊥) = 0
P(x) = |P(x)|
P(x | y) =
P(x ∧ y)
P(y)
P(x | y) =
P(y | x) · P(x)
P(y | z) · P(z) + P(y | ¬z) · P(¬z)
Jan Bergstra Informatics Institute ESTEC March 8, 2019 11 / 23
Why make the inverse total? Four arguments!
1 Raising a run-time exception at division by 0 may create a system
risk (if other exceptions are also raised in a real-time context).
Proving software correctness (in advance) over a meadow
prevents such exceptions from being raised.
2 Several software verification tools use a total version of division,
because the (any) logic of partial functions is significantly more
complicated than the logic of total functions.
3 Limitation to classical two-valued logic. See next page.
4 Simplification of theoretical work: fewer cases to be distinguished,
fewer (negative) conditions occur.
Jan Bergstra Informatics Institute ESTEC March 8, 2019 12 / 23
Why make the inverse total? Limitation to classical
two-valued logic
It is a common idea that the following assertion Φ is valid:
Φ ≡ x ≠ 0 → x/x = 1
The idea is that the condition prevents one from having to divide by
zero and that one is comfortable with: ∀x.Φ(x). But how can that be?
Substitution of 0 for x must be allowed and must turn Φ(x) into a valid
assertion, so that also Φ(0) holds, i.e.
0 ≠ 0 → 0/0 = 1
or, in other words: 0 = 0 ∨ 0/0 = 1. But for the latter to hold (in
classical 2-valued logic) both parts of the disjunction must have a truth
value. Thus we must know either 0/0 = 1 or ¬(0/0 = 1). However,
when viewing 0/0 as undefined (or, even worse, as incorrectly typed)
neither of these assertions is plausible.
Jan Bergstra Informatics Institute ESTEC March 8, 2019 13 / 23
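The point can be made executable (our sketch, assuming meadow division): with a total division, 0/0 has a definite value, so both disjuncts of Φ(0) have classical truth values:

```python
from fractions import Fraction

def div(x, y):   # meadow division: x / 0 = 0
    return Fraction(0) if y == 0 else Fraction(x) / y

def phi(x):      # Φ(x) ≡ x ≠ 0 → x/x = 1, read as (x = 0) ∨ (x/x = 1)
    return (x == 0) or (div(x, x) == 1)

assert all(phi(Fraction(n)) for n in range(-3, 4))  # Φ(0) included
assert div(0, 0) == 0   # the disjunct 0/0 = 1 is simply false, not undefined
```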
Bayesian reasoning
CHALLENGE: understand courtroom Bayesian reasoning from first
principles (that is, principles found in basic papers).
Conclusion: not at all easy. It is an oversimplification to say that judges
should acquire the theoretical background, which consists of a few
formulae and their application. The whole subject is deeply puzzling.
Principal agents:
TOF (trier of fact: the judge or a jury),
MOE (moderator of evidence, i.e. an expert witness; Dutch: “getuige-deskundige”),
a defendant, a prosecutor, several lawyers.
Here focus on TOF and MOE.
Jan Bergstra Informatics Institute ESTEC March 8, 2019 14 / 23
Subjective probability: a crash course
Exam question to person X: what is the probability Pbom that there are
birds on the moon (not in a spacecraft)? Survey of answers by X with
a corresponding assessment (VERY LOW < LOW < DEFECTIVE <
ADEQUATE) of the “probability theory competence (ptc)” of X:
1 X replies that (s)he must visit the moon before answering the
question (ptc VERY LOW, because X does not understand the
concept of prior odds).
2 Pbom = 0: valid answer (ptc ADEQUATE).
3 Pbom = 10⁻⁵: valid answer (ptc ADEQUATE).
4 Pbom > 0: X has not understood how to work with (subjective)
probabilities, as these must be precise! (ptc DEFECTIVE).
5 10⁻²⁰ ≤ Pbom ≤ 10⁻¹⁰: X has not understood how to work with
(subjective) probabilities, no intervals! (ptc DEFECTIVE, though
NFI experts may produce such intervals for likelihood ratios)
6 I don’t know: X has not understood the concept of probability at
all. (Precisely by assigning a value to Pbom, X may
express his/her lack of knowledge. ptc LOW)
Jan Bergstra Informatics Institute ESTEC March 8, 2019 15 / 23
Credal state (partial belief state, most beliefs missed)
H (hypothesis: e.g. the defendant is guilty of criminal action C).
E some assertion about evidence of relevance for H.
H and E are propositions, also called events.
All agents at each moment maintain a proposition space with
probability function (credal state, subjective probability).
for TOF: proposition space (= event space) ETOF with probability
function PTOF on ETOF,
for MOE: EMOE with probability function PMOE on EMOE.
proposition kinetics: the event space changes (for instance E is
added to ETOF , or is removed from ETOF ).
conditioning: modification (update) of probability function on the
basis of newly acquired information.
Bayes conditioning (for processing the information that “L is true”),
Jeffrey conditioning (for processing the information that “P(L) = p”),
single likelihood Adams conditioning (for processing a new value
for a conditional probability, i.e. a likelihood),
double likelihood Adams conditioning (for processing a new value
for a likelihood ratio).
Jan Bergstra Informatics Institute ESTEC March 8, 2019 16 / 23
Probability function transformations: a survey
Bayes conditioning (without proposition kinetics). Suppose
SA = SA(L, M, N) and P⁰A(M) = p > 0. Then PA is
obtained by Bayes conditioning if it satisfies the following
equation:
PA = P⁰A(• | M)
Jeffrey notation: for all X ∈ SA, PA(X) = P⁰A(X | M).
Bayes conditioning with proposition kinetics. Now the resulting credal
state is (SA(L, N), PA): M has been removed from the
proposition space.
Bayes conditioning on a non-primitive proposition. SA = SA(L, M, N).
Φ is a closed propositional sentence making use of the
primitives L, N, and M. P⁰A(Φ) = p > 0. Then PA is
obtained by Bayes conditioning on Φ if it satisfies:
PA = P⁰A(• | Φ)
The proposition space is not modified.
Jan Bergstra Informatics Institute ESTEC March 8, 2019 17 / 23
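A sketch (our encoding: atoms as strings, capital letter = proposition true) of Bayes conditioning without proposition kinetics, using the material inverse:

```python
from fractions import Fraction

def inv(v):
    return Fraction(0) if v == 0 else 1 / v

def bayes_condition(P0, M):
    """Return PA = P0A(• | M); P0 maps atoms to weights, M is the
    set of atoms at which the conditioning proposition holds."""
    pM = sum(P0[a] for a in M)
    assert pM > 0
    return {a: (P0[a] * inv(pM) if a in M else Fraction(0)) for a in P0}

# Hypothetical uniform prior over the atoms generated by L and M.
space = ("LM", "Lm", "lM", "lm")   # e.g. "Lm" means L true, M false
P0 = {a: Fraction(1, 4) for a in space}
P1 = bayes_condition(P0, {"LM", "lM"})
assert sum(P1.values()) == 1 and P1["LM"] == Fraction(1, 2) and P1["Lm"] == 0
```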
Jeffrey conditioning. Let for example SA = SA(L, M, N). Suppose
P⁰A(M) > 0. Then PA is obtained by Jeffrey conditioning if
for some p ∈ [0, 1] it satisfies the following equation:
PA = p · P⁰A(• | M) + (1 − p) · P⁰A(• | ¬M)
Jeffrey conditioning involves no proposition kinetics.
Proposition space reduction. Consider SA = SA(L, M, N); one may
wish to forget about, say, M. Proposition kinetics now
leads to a reduced proposition space SA(L, N) in which
only the propositions generated by L and N are left.
Parametrized proposition space expansion. Let SA = SA(H). One may
wish to expand SA by introducing M
in such a manner that a subsequent reduct brings
one back to SA.
PA(H) is left unchanged, and PA(H ∧ M) and PA(¬H ∧ M)
are fixed with definite values serving as parameters for
the transformation.
Jan Bergstra Informatics Institute ESTEC March 8, 2019 18 / 23
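The same encoding gives a sketch of Jeffrey conditioning; the new probability of M becomes exactly p:

```python
from fractions import Fraction

def inv(v):
    return Fraction(0) if v == 0 else 1 / v

def jeffrey_condition(P0, M, p):
    """PA = p · P0A(• | M) + (1 − p) · P0A(• | ¬M)."""
    pM = sum(P0[a] for a in M)
    return {a: (p * P0[a] * inv(pM) if a in M
                else (1 - p) * P0[a] * inv(1 - pM)) for a in P0}

space = ("LM", "Lm", "lM", "lm")          # capital letter = proposition true
P0 = {a: Fraction(1, 4) for a in space}   # hypothetical uniform prior
P1 = jeffrey_condition(P0, {"LM", "lM"}, Fraction(2, 3))
assert sum(P1.values()) == 1
assert P1["LM"] + P1["lM"] == Fraction(2, 3)   # new P(M) = p
```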
Single likelihood Adams conditioning. Let 0 < l ≤ 1 be a rational
number. Assume that H and E are among the generators
of SA. Single likelihood Adams conditioning leaves the
proposition space unchanged and transforms the
probability function PA to Ql:
Ql = PA(H ∧ E ∧ •) · l / P⁰A(E | H) + PA(H ∧ ¬E ∧ •) · (1 − l) / P⁰A(¬E | H) + PA(¬H ∧ •)
Double likelihood Adams conditioning. Let 0 < l, l′ ≤ 1 be two rational
numbers. H and E are among the generators of SA.
Double likelihood Adams conditioning leaves the
proposition space SA of A unchanged and transforms the
probability function PA to
Ql,l′ = PA(H ∧ E ∧ •) · l / P⁰A(E | H) + PA(H ∧ ¬E ∧ •) · (1 − l) / P⁰A(¬E | H)
+ PA(¬H ∧ E ∧ •) · l′ / P⁰A(E | ¬H) + PA(¬H ∧ ¬E ∧ •) · (1 − l′) / P⁰A(¬E | ¬H)
Jan Bergstra Informatics Institute ESTEC March 8, 2019 19 / 23
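A sketch of double likelihood Adams conditioning in the same encoding; Ql,l′ rescales within H so that the new P(E | H) = l, and within ¬H so that the new P(E | ¬H) = l′, leaving P(H) untouched:

```python
from fractions import Fraction

def inv(v):
    return Fraction(0) if v == 0 else 1 / v

def double_adams(P0, H, E, l, lp):
    pH = sum(P0[a] for a in P0 if a in H)
    pEH  = sum(P0[a] for a in P0 if a in H and a in E) * inv(pH)
    pEnH = sum(P0[a] for a in P0 if a not in H and a in E) * inv(1 - pH)
    def factor(a):
        if a in H:
            return l * inv(pEH) if a in E else (1 - l) * inv(1 - pEH)
        return lp * inv(pEnH) if a in E else (1 - lp) * inv(1 - pEnH)
    return {a: P0[a] * factor(a) for a in P0}

space = ("HE", "He", "hE", "he")          # capital letter = proposition true
P0 = {a: Fraction(1, 4) for a in space}   # hypothetical uniform prior
H, E = {"HE", "He"}, {"HE", "hE"}
Q = double_adams(P0, H, E, Fraction(3, 4), Fraction(1, 4))
assert sum(Q.values()) == 1
assert Q["HE"] * inv(Q["HE"] + Q["He"]) == Fraction(3, 4)  # new P(E | H) = l
assert Q["hE"] * inv(Q["hE"] + Q["he"]) == Fraction(1, 4)  # new P(E | ¬H) = l′
assert Q["HE"] + Q["He"] == Fraction(1, 2)                 # P(H) unchanged
```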
LRTMR protocol (likelihood ratio transfer mediated
reasoning)
Often the term likelihood is used to denote a certain conditional
probability. We write Lα for a likelihood and LRα for a particular ratio of
likelihoods, commonly referred to as a likelihood ratio:
Lα(X, Y) = Pα(X | Y) and LRα(X, Y, ¬Y) = Lα(X, Y) / Lα(X, ¬Y)
It is now assumed that both E and H are among the generators of both
proposition spaces STOF and SMOE. Further, TOF and MOE have prior
credal states (STOF, PTOF) and (SMOE, PMOE). The reasoning protocol
LRTMR involves the following steps:
It is checked by MOE that 0 < PMOE(H) < 1 and
0 < PMOE(E) < 1; otherwise MOE raises an exception and the
protocol aborts.
MOE determines the value r of the likelihood ratio
LRMOE(E, H, ¬H) = LMOE(E, H) / LMOE(E, ¬H) = PMOE(E | H) / PMOE(E | ¬H)
with respect to its probability function PMOE.
Jan Bergstra Informatics Institute ESTEC March 8, 2019 20 / 23
MOE communicates to TOF the value r and a description of
LRMOE (E, H, ¬H), that is a description of what propositions r is a
likelihood ratio of.
MOE communicates to TOF its newly acquired information: MOE now
considers PMOE(E) = 1, i.e. E being true, to be an adequate
representation of the state of affairs. (Thus MOE has updated its
probability function.)
TOF trusts MOE to the extent that TOF prefers those of MOE’s
quantitative values that MOE communicates over its own values
for the same probabilities, likelihoods, and likelihood ratios.
TOF takes all information into account and applies various
conditioning operators to end up with its new (updated, posterior)
belief function PTOF .
TOF becomes aware of having updated its beliefs, with the effect
that the new PTOF(H) equals r · p / (1 + (r − 1) · p), where p denotes
the prior PTOF(H). TOF checks whether a threshold
is exceeded so that a sound judgement on H can be made.
Jan Bergstra Informatics Institute ESTEC March 8, 2019 21 / 23
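The posterior formula in the last step is just the odds form of Bayes' rule; a sketch with hypothetical numbers:

```python
from fractions import Fraction

def posterior(p, r):
    """New P(H) from prior p = P(H) and likelihood ratio r:
    r · p / (1 + (r − 1) · p), i.e. posterior odds = r · prior odds."""
    return r * p / (1 + (r - 1) * p)

p, r = Fraction(1, 100), Fraction(50)   # hypothetical prior and ratio
q = posterior(p, r)
assert q == Fraction(50, 149)
assert q / (1 - q) == r * p / (1 - p)   # odds form: q/(1−q) = r · p/(1−p)
```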
Some conclusions
1 Upon receiving the value of the likelihood ratio r from MOE, TOF
can (and must) update its probability function by means of the
double likelihood Adams conditioning. (DOGMA: agents must
always and immediately take all new information into account by
updating their credal states).
2 Upon subsequently receiving the information that E is true
(according to MOE) TOF applies Bayes conditioning (after Adams
conditioning).
3 MOE must first transfer the likelihood ratio. Only thereafter MOE
contemplates the truth of E. (If MOE first settles the truth of E
then the likelihood ratio equals 1 and the protocol collapses, or
MOE fails to communicate its proper beliefs).
4 MOE communicates the truth of E in a separate (second)
message, after having updated its own probability function.
5 After the first message of MOE, TOF must apply Adams
conditioning. This is missed by all accounts that I have read.
Jan Bergstra Informatics Institute ESTEC March 8, 2019 22 / 23
Further remarks
MOE may transfer both likelihoods in separate successive
messages. Then TOF can apply single likelihood Adams
conditioning after both messages, with the same effect as in the
protocol.
In principle TOF may receive likelihood ratios regarding different
pieces of evidence E, E′, E″, etc. But then E, E′, E″ must be
independent (this requires highly non-trivial bookkeeping by
TOF).
For TOF there is no way around subjective probability.
It is not clear from the forensic science literature whether MOE is
supposed to think in terms of subjective probability as well. (Not a
necessity, as TOF may freely turn MOE’s “objective” probabilities
into its own subjective probabilities, but opinions diverge.)
If MOE must adhere to subjective probability then (i) single
message reporting is not an option, and (ii) TOF must apply at
least two successive updates of its probability function (even in the
simplest case).
Jan Bergstra Informatics Institute ESTEC March 8, 2019 23 / 23