Tail Probabilities for Randomized
Program Runtimes via Martingales
for Higher Moments
Satoshi Kura¹,²  Natsuki Urabe¹  Ichiro Hasuo¹,²
¹ National Institute of Informatics, Tokyo, Japan
² The Graduate University for Advanced Studies (SOKENDAI), Kanagawa, Japan
April 10, 2019
1 / 36
Our question
“What is an upper bound of the tail probability?”
[Diagram: a random walk over states 0, 1, 2, 3, . . . with transition probabilities p and 1 − p between neighbouring states]
How likely is it to terminate within 100 steps?
(e.g. at least 90%)
How unlikely is it to not terminate within 100 steps?
(e.g. at most 10%)
[Plot: probability vs. step; the mass beyond step 100 is the tail probability Pr(T ≥ 100) ≤ ??]
2 / 36
Related work
Supermartingale-based approach
• Proving almost-sure termination
[Chakarov & Sankaranarayanan, CAV’13]
• Overapproximating tail probabilities:
Pr(T ≥ d) ≤ ??
[Chatterjee & Fu, arXiv preprint], [Chatterjee et al., TOPLAS’18]
• Azuma’s, Hoeffding’s and Bernstein’s
inequalities
• Markov’s inequality (wider applicability)
Pr(T ≥ d) ≤ E[T] / d
3 / 36
Our approach
• Aim: overapproximating tail probabilities:
Pr(T ≥ d) ≤ ??
• Corollary of Markov’s inequality
Pr(T ≥ d) ≤ E[T^k] / d^k
• Extends ranking supermartingales to higher moments E[T^k] (k = 1, 2, . . .)
4 / 36
Our workflow
randomized program
our supermartingales
upper bounds of higher moments
(E[T], . . . , E[T^K]) ≤ (u1, . . . , uK)
concentration inequality
upper bound of tail probability
Pr(T ≥ d) ≤ ?
deadline d
5 / 36
Our workflow
randomized program
our supermartingales
upper bounds of higher moments
(E[T], . . . , E[T^K]) ≤ (u1, . . . , uK)
concentration inequality
upper bound of tail probability
Pr(T ≥ d) ≤ ?
deadline d
6 / 36
Randomized program
• sampling
• (demonic/termination avoiding) nondeterminism
Given as a pCFG (probabilistic control flow graph).
x := 5;
while x > 0 do
  if prob(0.4) then
    x := x + 1
  else
    x := x - 1
  fi
od
[pCFG with locations l1–l6; edges labeled x := 5, guards x > 0 and ¬(x > 0), branch probabilities 0.4 / 0.6, and updates x := x + 1, x := x − 1]
7 / 36
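To get a feel for the runtime distribution of this example, here is a minimal Monte Carlo sketch (mine, not part of the slides); it counts loop iterations as time steps and is only meant for intuition, since the supermartingale method developed below gives sound bounds without any sampling.

import random

def run_example(max_steps=100_000):
    """One run of: x := 5; while x > 0: x := x + 1 w.p. 0.4 else x := x - 1.
    Returns the number of loop iterations taken (capped at max_steps)."""
    x, steps = 5, 0
    while x > 0 and steps < max_steps:
        x = x + 1 if random.random() < 0.4 else x - 1
        steps += 1
    return steps

# Empirical moments and a tail probability, for intuition only.
N = 100_000
samples = [run_example() for _ in range(N)]
for k in (1, 2):
    print(f"empirical E[T^{k}] ≈ {sum(s ** k for s in samples) / N:.1f}")
print("empirical Pr(T ≥ 100) ≈", sum(s >= 100 for s in samples) / N)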
Semantics
• Configuration: (l, x) ∈ L × R^V
• L: finite set of locations
• V : finite set of program variables
• Run: sequence of configurations
[pCFG diagram as on the previous slide]
Example run, starting from (l1, [x → 0]):
(l1, [x → 0]) →(prob. 1) (l2, [x → 5]) →(prob. 1) (l3, [x → 5]),
then →(prob. 0.4) (l4, [x → 5]) → · · · or →(prob. 0.6) (l5, [x → 5]) → · · ·
8 / 36
Our workflow
randomized program
our supermartingales
upper bounds of higher moments
(E[T], . . . , E[T^K]) ≤ (u1, . . . , uK)
concentration inequality
upper bound of tail probability
Pr(T ≥ d) ≤ ?
deadline d
9 / 36
Ranking function
[Floyd, ’67]
r : L × R^V → N ∪ {∞}
For each transition, r decreases by (at least) 1:
(l, x) → (l′, x′) =⇒ r(l′, x′) ≤ r(l, x) − 1
Theorem
If r(l, x) < ∞, then the program terminates
from (l, x) within r(l, x) steps.
x := 5;
while x > 0 do
  x := x - 1
od
[pCFG annotated with the ranking function: r = 2x + 1 at l1, 2x at l2, 0 at l3; edges labeled x > 0, x := x − 1, and x ≤ 0]
10 / 36
Ranking supermartingale
[Chakarov & Sankaranarayanan, CAV’13]
η : L × R^V → [0, ∞]
For each transition, η decreases by (at least) 1
“on average”:
(Xη)(l, x) ≤ η(l, x) − 1 for each (l, x)
where X is the next-time operator (the expected value
after one transition):
(Xη)(l, x) := E[η(l′, x′) | (l, x) → (l′, x′)].
11 / 36
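For a concrete sanity check (my own, not on the slides): treat one loop iteration of the example program as one time step and ignore the location component; then η(x) = 5x satisfies the condition on the guard region x > 0:

% eta(x) = 5x is a ranking supermartingale for the biased walk
% (x increases w.p. 0.4, decreases w.p. 0.6), checked for x > 0:
\[
  (X\eta)(x) = 0.4 \cdot 5(x+1) + 0.6 \cdot 5(x-1) = 5x - 1 \le \eta(x) - 1 .
\]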
Ranking supermartingale
Theorem
If η(l, x) < ∞, then the program is (positively)
almost surely terminating from (l, x), with
expected runtime ≤ η(l, x) steps.
This can be explained lattice-theoretically.
• Expected runtime is a lfp
• Ranking supermartingale is a prefixed point
12 / 36
Runtime before and after transition
Let T (l, x) be a random variable representing the
runtime from (l, x).
[Diagram: from (l0, x0) the program moves to (l1, x1) with probability p, or to (l2, x2) with probability 1 − p; each configuration (l, x) is annotated with its runtime T(l, x)]
Runtime from (l0, x0):
• T (l1, x1) + 1 with probability p
• T (l2, x2) + 1 with probability 1 − p
13 / 36
Expected runtime is a fixed point
[Diagram as before: from (l0, x0), to (l1, x1) with probability p or to (l2, x2) with probability 1 − p]
E[T](l0, x0)
= p E[T(l1, x1) + 1] + (1 − p) E[T(l2, x2) + 1]
= p (E[T(l1, x1)] + 1) + (1 − p) (E[T(l2, x2)] + 1)
= E[ E[T(l′, x′)] + 1 | (l0, x0) → (l′, x′) ]
= (X(E[T] + 1))(l0, x0)
where E[T] := λ(l, x). E[T(l, x)].
14 / 36
Expected runtime is lfp
E[T ] = X(E[T ] + 1)
In fact, E[T ] is the “least” fixed point of
F1(η) := X(η + 1).
• F1 is a monotone function on the complete lattice [0, ∞]^(L×R^V)
• F1 adds 1 unit of time and then calculates the expected value after one transition
15 / 36
Ranking supermartingale is a prefixed point
η is a ranking supermartingale
⇐⇒ η is a prefixed point of F1
F1η = X(η + 1) ≤ η
Theorem (Knaster–Tarski)
Let L be a complete lattice and F : L → L be a
monotone function. The least fixed point µF is the
least prefixed point. Therefore we have
F η ≤ η =⇒ µF ≤ η.
It follows that
η is a ranking supermartingale =⇒ E[T ] ≤ η.
16 / 36
Our supermartingale
                        [Chakarov & Sankaranarayanan, CAV’13]    Our supermartingale
lattice                 L × R^V → [0, ∞]                         L × R^V → [0, ∞]^K
monotone function F     F1                                       FK
lfp µF                  E[T]                                     (E[T], . . . , E[T^K])†
prefixed point Fη ≤ η   ranking supermartingale η                ranking supermartingale for higher moments η
Knaster–Tarski µF ≤ η   E[T] ≤ η                                 (E[T], . . . , E[T^K]) ≤ η
† for a pCFG without nondeterminism
17 / 36
Runtime before and after transition
Let T (l, x) be a random variable representing the
runtime from (l, x).
[Diagram as before: from (l0, x0), to (l1, x1) with probability p or to (l2, x2) with probability 1 − p; each configuration (l, x) is annotated with its runtime T(l, x)]
Runtime from (l0, x0):
• T (l1, x1) + 1 with probability p
• T (l2, x2) + 1 with probability 1 − p
18 / 36
Characterizing E[T^2] as lfp?
[Diagram as before: from (l0, x0), to (l1, x1) with probability p or to (l2, x2) with probability 1 − p]
E[T^2](l0, x0)
= p E[(T(l1, x1) + 1)^2] + (1 − p) E[(T(l2, x2) + 1)^2]
= (X(E[T^2] + 2E[T] + 1))(l0, x0)
Calculate E[T] and E[T^2] simultaneously
19 / 36
Characterizing E[T] and E[T^2] as lfp
(E[T], E[T^2])^T = X( (1, 1)^T + [[1, 0], [2, 1]] · (E[T], E[T^2])^T )
(componentwise: E[T] = X(E[T] + 1) and E[T^2] = X(E[T^2] + 2E[T] + 1))
In fact, (E[T], E[T^2]) is the “least” fixed point of
F2(η1, η2)^T := X( (1, 1)^T + [[1, 0], [2, 1]] · (η1, η2)^T )
where
• η1, η2 : L × R^V → [0, ∞]
• E[T] = λ(l, x). E[T(l, x)]
• E[T^2] = λ(l, x). E[T(l, x)^2]
20 / 36
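The lfp characterization can be made concrete by Kleene iteration. Below is a minimal sketch (mine, not the synthesis method of the paper): it iterates F2 for the biased-walk example, counting one loop iteration per step and truncating the state space at X_MAX (an ad-hoc assumption) so the iteration converges numerically.

# Kleene iteration for F2(eta1, eta2) = (X(eta1 + 1), X(eta2 + 2*eta1 + 1)) on the
# biased walk x -> x+1 w.p. 0.4, x -> x-1 w.p. 0.6, terminating when x <= 0.
X_MAX, P_UP = 200, 0.4

def next_time(eta):
    """(X eta)(x): expected value of eta after one transition from x."""
    out = [0.0] * (X_MAX + 1)
    for x in range(1, X_MAX + 1):
        out[x] = P_UP * eta[min(x + 1, X_MAX)] + (1 - P_UP) * eta[x - 1]
    return out

eta1 = [0.0] * (X_MAX + 1)   # approximates E[T]
eta2 = [0.0] * (X_MAX + 1)   # approximates E[T^2]
for _ in range(100_000):
    new1 = next_time([e + 1 for e in eta1])
    new2 = next_time([e2 + 2 * e1 + 1 for e1, e2 in zip(eta1, eta2)])
    done = max(abs(a - b) for a, b in zip(eta1 + eta2, new1 + new2)) < 1e-9
    eta1, eta2 = new1, new2
    if done:
        break

print("E[T]   from x = 5 ≈", eta1[5])   # 25 for the untruncated walk
print("E[T^2] from x = 5 ≈", eta2[5])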
Characterizing higher moments as lfp
In the same way, we can define
FK : (L × R^V → [0, ∞]^K) → (L × R^V → [0, ∞]^K)
that characterizes higher moments of the runtime.
Lemma
• For a pCFG without nondeterminism,
  (E[T], . . . , E[T^K]) = µFK.
• In general (with nondeterminism),
  (E[T], . . . , E[T^K]) ≤ µFK.
21 / 36
Supermartingale is a prefixed point
Definition
A ranking supermartingale for K-th moment is
a prefixed point η = (η1, . . . , ηK) of FK.
FKη ≤ η
By the Knaster–Tarski theorem, η gives an upper
bound (even with nondeterminism):
(E[T], . . . , E[T^K]) ≤ µFK ≤ (η1, . . . , ηK)
22 / 36
Our workflow
randomized program
our supermartingales
upper bounds of higher moments
(E[T], . . . , E[T^K]) ≤ (u1, . . . , uK)
concentration inequality
upper bound of tail probability
Pr(T ≥ d) ≤ ?
deadline d
23 / 36
Problem
Assume
• d > 0,
• T is a nonnegative random variable,
• (E[T], . . . , E[T^K]) ≤ (u1, . . . , uK),
• but we do not know the exact values of
  E[T], . . . , E[T^K].
How to obtain an upper bound of Pr(T ≥ d)?
24 / 36
If K = 1 ...
Theorem (Markov’s inequality)
If T is a nonnegative r.v. and d > 0,
Pr(T ≥ d) ≤ E[T] / d.
By E[T] ≤ u1,
Pr(T ≥ d) ≤ E[T] / d ≤ u1 / d.
25 / 36
General case
• For any k ∈ {1, . . . , K},
  Pr(T ≥ d) = Pr(T^k ≥ d^k) ≤ E[T^k] / d^k ≤ uk / d^k
• (“0-th” moment)
  Pr(T ≥ d) ≤ 1 = E[T^0] / d^0
26 / 36
Concentration inequality we used
Pr(T ≥ d) ≤ min_{k=0,...,K} uk / d^k
where
• d > 0
• T is a nonnegative random variable
• (E[T], . . . , E[T^K]) ≤ (u1, . . . , uK)
• u0 = 1
Moreover, this gives the “optimal” upper bound
under the above conditions.
27 / 36
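A direct transcription of this inequality into code (a sketch; the function and variable names are mine):

def tail_bound(us, d):
    """Upper bound on Pr(T >= d) given moment bounds us = [u1, ..., uK]
    with E[T^k] <= uk, via min_{k=0,...,K} uk / d^k (u0 = 1)."""
    assert d > 0
    return min([1.0] + [u / d ** k for k, u in enumerate(us, start=1)])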
Our workflow
randomized program
our supermartingales
upper bounds of higher moments
(E[T], . . . , E[T^K]) ≤ (u1, . . . , uK)
concentration inequality
upper bound of tail probability
Pr(T ≥ d) ≤ ?
deadline d
28 / 36
Synthesis (linear template)
Based on [Chakarov & Sankaranarayanan, CAV’13]
• Input: a pCFG with initial config (linit, xinit)
• Output: an upper bound of E[T^K](linit, xinit)
Assume that η = (η1, . . . , ηK) is linear:
ηk(l, x) = ak,l · x + bk,l (k = 1, . . . , K)
Determine ak,l, bk,l by solving the LP problem:
• minimize: ηK(linit, xinit)
• subject to: ranking supermartingale condition
  (using Farkas’ lemma)
Then we have
E[T^K](linit, xinit) ≤ min ηK(linit, xinit)
29 / 36
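As a toy illustration (my own instantiation, not the tool from the paper), the LP for K = 1 on the biased-walk example can be written down by hand: with template η(x) = a·x + b, counting one loop iteration per step, the condition X(η + 1) ≤ η on the guard x > 0 reduces to 1 − 0.2·a ≤ 0 rather than being generated automatically via Farkas’ lemma.

# Minimal LP sketch for the linear template, instantiated by hand for the loop
#   while x > 0: x := x + 1 w.p. 0.4 else x := x - 1   (one iteration = one step).
# Variables: eta(x) = a*x + b with a, b >= 0.
# Supermartingale condition on x > 0:  X(eta + 1) <= eta  <=>  1 - 0.2*a <= 0.
from scipy.optimize import linprog

c = [5, 1]                        # minimize eta(x_init) = a*5 + b
A_ub, b_ub = [[-1, 0]], [-5]      # -a <= -5, i.e. a >= 5
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
a, b = res.x
print(f"eta(x) = {a:g}*x + {b:g};  E[T] from x = 5 <= {res.fun:g}")   # expect 25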
Synthesis (polynomial template)
Based on [Chatterjee et al., CAV’16]
• Input: a pCFG with initial config (linit, xinit)
• Output: an upper bound of E[T^K](linit, xinit)
Assume that η = (η1, . . . , ηK) is polynomial.
Determine coefficients by solving the SDP problem:
• minimize: ηK(linit, xinit)
• subject to: ranking supermartingale condition
(using Positivstellensatz)
Then we have
E[T^K](linit, xinit) ≤ min ηK(linit, xinit)
30 / 36
Experiments
• Implementation based on linear/polynomial
templates
• Tested 7 example programs
• 2 coupon collector’s problems
• 5 random walks (some of them include
nondeterminism)
• (degree of polynomial template) ≤ 3
31 / 36
Experimental result (1)
A coupon collector’s problem (linear template)
upper bound              execution time
E[T]   ≤ 68              0.024 s
E[T^2] ≤ 3124            0.054 s
E[T^3] ≤ 171932          0.089 s
E[T^4] ≤ 12049876        0.126 s
E[T^5] ≤ 1048131068      0.191 s
32 / 36
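Plugging the moment bounds from this table into the tail_bound sketch given after the concentration-inequality slide, with deadline d = 100:

us = [68, 3124, 171932, 12049876, 1048131068]
print(tail_bound(us, d=100))   # ≈ 0.105, attained at k = 5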
Experimental result (1)
Pr(T ≥ d) ≤ min_{k=0,...,K} uk / d^k
[Plot: the resulting upper bounds on the tail probability Pr(T ≥ d) against the deadline d (0–140), one curve for each k = 1, . . . , 5]
33 / 36
Experimental result (2)
A random walk with nondeterminism
• Linear template
upper bound execution time
E[T ] ≤ 96 0.020 s
E[T 2
]: infeasible 0.029 s
• Polynomial template
upper bound execution time
E[T ] ≤ 95.95 157.748 s
E[T 2
] ≤ 10944.0 361.957 s
34 / 36
Experimental result (2)
[Plot: upper bounds on the tail probability Pr(T ≥ d) against the deadline d (0–400), curves for k = 1 and k = 2]
35 / 36
Conclusion & Future work
Conclusion
• New supermartingale for higher moments of runtime
• Applied to obtain upper bounds of tail probabilities
• Tested our method experimentally
Future work
• Improved treatment of nondeterminism
• Compositional reasoning (cf. [Kaminski et al., ESOP’16])
• Improve implementation (numerical error of SDP solver)
36 / 36