Probability and Uncertainty in Software Engineering (keynote talk at NASAC 2013)
Keynote talk presented at the 2013 National Software Application Conference (NASAC 2013), Tianjin, China, 9 November 2013

Presentation Transcript

    • Probability and Uncertainty in Software Engineering. David S. Rosenblum, Dean, School of Computing, National University of Singapore. NASAC 2013, Tianjin, 9 November 2013
    • Software Engineering at NUS: Hugh Anderson, Khoo Siau Cheng, Chin Wei Ngan, Damith Rajapakse, Dong Jin Song, Aquinas Hobor, David Rosenblum, Abhik Roychoudhury, Joxan Jaffar, Stan Jarzabek, Bimlesh Wadhwa, Yap Hock Chuan (Roland)
    • Certainty in Software Engineering. Engineering of software is centered around simplistic, “yes/no” characterizations of artifacts: the program is correct/incorrect, the program execution finished/crashed, the compilation completed/aborted, the test suite succeeded/failed, the specification is satisfied/violated.
    • Example: Model Checking. [Diagram: the System is abstracted into a State Machine Model, and the Requirements into a Temporal Property such as □((¬p → ◊q) ∧ …); the Model Checker answers yes (✓) or no (✕), producing a Counterexample Trace on failure.]
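The yes/no flavor of classical model checking can be illustrated with a toy explicit-state search: exhaustively explore the state machine and report either success or a counterexample trace. A sketch for safety properties only; the state machine, the `ok` predicate, and the function name `check` are invented for illustration and are not the slide’s model:

```python
from collections import deque

# Toy "classical" model checking for safety properties: exhaustively search a
# finite state machine for a reachable state violating a requirement, and
# return either a clean verdict or a counterexample trace. (Invented example;
# real model checkers handle full temporal logic, not just safety.)

def check(transitions, init, ok):
    """BFS over reachable states. Returns (True, None) if every reachable
    state satisfies ok(s), else (False, counterexample_trace)."""
    parent, seen, queue = {init: None}, {init}, deque([init])
    while queue:
        s = queue.popleft()
        if not ok(s):
            trace = []
            while s is not None:        # walk parents back to the initial state
                trace.append(s)
                s = parent[s]
            return False, trace[::-1]
        for t in transitions.get(s, ()):
            if t not in seen:
                seen.add(t)
                parent[t] = s
                queue.append(t)
    return True, None

# a three-state machine in which state "c" violates the requirement
machine = {"a": ["b"], "b": ["c"], "c": []}
verdict, trace = check(machine, "a", lambda s: s != "c")
# verdict is False and trace is ["a", "b", "c"]
```

The returned trace plays the role of the counterexample trace in the diagram: a concrete execution that witnesses the violation.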
    • Uncertainty in Software Engineering: nondeterminism; randomized algorithms; “good enough software”; test coverage metrics; probabilistic modeling and analysis.
    • Probabilistic Model Checking. [Diagram: the System is abstracted into a Probabilistic State Machine Model (e.g., branches taken with probabilities 0.6 and 0.4); the Requirements become a Probabilistic Temporal Property, P≥0.95 [□((¬p → ◊q) ∧ …)]; the Model Checker answers yes (✓) or no (✕), producing a Counterexample Trace on failure.]
    • Probabilistic Model Checking (quantitative). [Diagram: with the query P=? [□((¬p → ◊q) ∧ …)], the Model Checker returns a Quantitative Result, e.g. 0.9732, rather than a yes/no verdict.]
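For reachability properties, the quantitative query P=? reduces to solving a system of linear equations over the model’s states. A minimal sketch in Python, using a made-up three-state DTMC (not the model pictured on the slide):

```python
# Quantitative probabilistic model checking, reachability case: compute
# P=?[eventually "goal"] by iterating x(s) = sum_t P(s,t) * x(t) to a fixed
# point, with x(goal) = 1 and x(fail) = 0. The DTMC below is an invented
# illustration, not the slide's model.

dtmc = {
    "s0": {"goal": 0.6, "s1": 0.4},    # from s0: 0.6 to goal, 0.4 to s1
    "s1": {"goal": 0.5, "fail": 0.5},  # from s1: 0.5 to goal, 0.5 to fail
}

def reach_prob(dtmc, start, goal, fail, iters=1000):
    x = {s: 0.0 for s in dtmc}
    x[goal], x[fail] = 1.0, 0.0
    for _ in range(iters):
        for s, succ in dtmc.items():
            x[s] = sum(p * x[t] for t, p in succ.items())
    return x[start]

print(reach_prob(dtmc, "s0", "goal", "fail"))  # prints a value ≈ 0.8
```

For this chain the answer can be checked by hand: 0.6 + 0.4 × 0.5 = 0.8. Tools like PRISM solve the same kind of system, at far larger scale and for richer logics.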
    • Example: Die Tossing Simulated by Coin Flipping (Knuth-Yao algorithm, from the PRISM group, Kwiatkowska et al.). [Diagram: a Markov chain of fair coin flips, each branch taken with probability 0.5, whose terminal states are the die faces 1-6.] The behavior is governed by a theoretical probability distribution.
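The Knuth-Yao construction can be checked numerically. A sketch, assuming the standard transition structure of the PRISM die model (internal states s0-s6, terminal faces d1-d6, every branch a fair coin flip):

```python
# Knuth-Yao die from fair coin flips: compute the probability of each face by
# fixed-point iteration on x(s) = 0.5 * x(left) + 0.5 * x(right). All six
# probabilities converge to 1/6, matching the theoretical distribution.

succ = {
    "s0": ("s1", "s2"),
    "s1": ("s3", "s4"),
    "s2": ("s5", "s6"),
    "s3": ("s1", "d1"),  # loops back to s1 on one branch
    "s4": ("d2", "d3"),
    "s5": ("d4", "d5"),
    "s6": ("s2", "d6"),  # loops back to s2 on one branch
}

def face_prob(face, iters=200):
    """Probability of reaching terminal state `face` from s0."""
    x = {s: 0.0 for s in succ}
    for _ in range(iters):
        for s, (a, b) in succ.items():
            pa = 1.0 if a == face else x.get(a, 0.0)
            pb = 1.0 if b == face else x.get(b, 0.0)
            x[s] = 0.5 * (pa + pb)
    return x["s0"]

probs = [face_prob("d%d" % k) for k in range(1, 7)]
```

The loops (s3 back to s1, s6 back to s2) are what let three fair coins simulate a six-sided die exactly in the limit, rather than approximately.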
    • Probabilistic Model Checking (satisfied case). [Diagram: against the model with branch probabilities 0.6 and 0.4, the property P≥0.95 [□((¬p → ◊q) ∧ …)] is satisfied (✓), with quantitative result 0.9732.]
    • Probabilistic Model Checking (perturbed case). [Diagram: with the branch probabilities perturbed to 0.59 and 0.41, the same property P≥0.95 [□((¬p → ◊q) ∧ …)] is violated (✕), the quantitative result dropping to 0.6211.]
    • Example: Zeroconf Protocol (from the PRISM group, Kwiatkowska et al.). [Diagram: a DTMC over states s0-s8 running from {start} to {ok} or {error}, its transition probabilities parameterized by the packet-loss rate p and a probability q.] The behavior is governed by an empirically estimated probability distribution.
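Parameters like the packet-loss rate enter such models as empirical estimates. As a rough illustration only (a simplified collision model, not the PRISM Zeroconf DTMC; the parameter q and the probe count n are assumptions of this sketch):

```python
# Simplified Zeroconf-style analysis: a host picks an address that is already
# in use with probability q; the collision goes undetected only if all n probe
# replies are lost, each independently with the empirically estimated
# packet-loss rate p. So P(undetected collision) = q * p**n.
# This is an invented toy model, not the slide's exact DTMC.

def zeroconf_error_prob(p, q, n=4):
    return q * p ** n

# the result is highly sensitive to the empirical estimate of p:
low = zeroconf_error_prob(0.095, 0.5)   # p underestimated
high = zeroconf_error_prob(0.105, 0.5)  # p overestimated
```

Because p appears raised to the n-th power, even a small error in the measured packet-loss rate shifts the verified probability noticeably, which is exactly the concern the perturbation work below addresses.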
    • Perturbed Probabilistic Systems (Current Research). Starting points: discrete-time Markov chains (DTMCs) with one or more probability parameters, verified against reachability properties over S? ∪ S!. Guoxin Su and David S. Rosenblum, “Asymptotic Bounds for Quantitative Verification of Perturbed Probabilistic Systems”, Proc. ICFEM 2013.
    • Parametric Markov Chains. A distribution parameter in a DTMC is represented as a vector x of parameters xᵢ. The amount of perturbation is measured by the total variation norm ‖v‖ = Σᵢ |vᵢ|. The parameter is allowed a “sufficiently small” perturbation with respect to the ideal reference values r: ‖x − r‖ ≤ Δ. This generalizes to multiple parameters.
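The norm condition above is straightforward to operationalize. A minimal sketch (the function names are illustrative, not from the paper):

```python
# Perturbation distance between a perturbed distribution parameter x and its
# ideal reference r, measured by the norm ||v|| = sum_i |v_i| as on the slide,
# plus a check of the admissibility condition ||x - r|| <= Delta.

def norm(v):
    return sum(abs(vi) for vi in v)

def is_admissible(x, r, delta):
    """Is the perturbation within the allowed bound, i.e. ||x - r|| <= delta?"""
    return norm([xi - ri for xi, ri in zip(x, r)]) <= delta

# e.g. the slide's perturbed branch (0.59, 0.41) against reference (0.6, 0.4)
# has distance |0.59 - 0.6| + |0.41 - 0.4| = 0.02
```

Note that perturbing one branch of a two-way probabilistic choice necessarily perturbs the other, so the distance is twice the shift of a single entry.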
    • Perturbation Bounds. Perturbation function: ρ(x) = ι? ( Σᵢ₌₀^∞ A(x)ⁱ b(x) − Σᵢ₌₀^∞ Aⁱ b ), where A is the transition probability sub-matrix for S? and b is the vector of one-step probabilities from S? to S!. Condition number: κ = lim_{δ→0} sup { ρ(x − r)/δ : ‖x − r‖ ≤ δ, δ > 0 }.
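For a concrete feel for the condition number, here is a numeric sketch on an invented two-state parametric chain; the closed-form `reach` and the factor 2 in the norm are assumptions of this toy model, not the paper’s construction:

```python
# Toy parametric DTMC with S? = {s0, s1}: s0 goes to goal with probability x
# and to s1 with probability 1 - x; s1 goes to goal or fail with probability
# 0.5 each. The reachability probability sum_i A(x)^i b(x), i.e. the solution
# of (I - A(x)) y = b(x), has a simple closed form for this chain, and the
# condition number is estimated by a finite difference, mirroring the slide's
# lim sup definition of kappa.

def reach(x):
    return x + (1.0 - x) * 0.5  # closed-form solution for this toy chain

def condition_number(r, eps=1e-6):
    # perturbing x to r + eps changes the distribution at s0 by (eps, -eps),
    # whose norm ||v|| = sum_i |v_i| equals 2 * eps
    return abs(reach(r + eps) - reach(r)) / (2 * eps)
```

Here the condition number comes out to 0.25 regardless of r, so a perturbation of norm δ shifts the verified probability by roughly 0.25 δ, the same linear prediction the results table below compares against the actual values.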
    • Results: Noisy Zeroconf (35000 Hosts, PRISM)

      p       Actual Collision Probability   Predicted Collision Probability
      0.095   -19.8%                         -21.5%
      0.096   -16.9%                         -17.2%
      0.097   -12.3%                         -12.9%
      0.098   -8.33%                         -8.61%
      0.099   -4.23%                         -4.30%
      0.100   1.8567 × 10⁻⁴ (baseline)       —
      0.101   +4.38%                         +4.30%
      0.102   +8.91%                         +8.61%
      0.103   +13.6%                         +12.9%
      0.104   +18.4%                         +17.2%
      0.105   +23.4%                         +21.5%
    • Additional Aspects. Models: Markov decision processes (MDPs), continuous-time Markov chains (CTMCs). Verification: LTL model checking using deterministic Rabin automata; PCTL model checking with singular perturbations due to nested P[·] operators; reward properties; alternative norms and bounds (Kullback-Leibler divergence, quadratic bounds).
    • Other Forms of Uncertainty. “There are known knowns; there are things we know we know. We also know there are known unknowns; that is to say, we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know.” (Donald Rumsfeld)
    • Uncertainty in Testing (New Research). 1982, Weyuker: non-testable programs, for which it is impossible or too costly to check results efficiently (example: mathematical software). 2010, Garlan: intrinsic uncertainty; systems embody intrinsic uncertainty and imprecision, so bugs cannot easily be distinguished from “features” (example: ubiquitous computing).
    • Example: Google Latitude. When is an incorrect location a bug, and when is it a “feature”? And how do you know? [Diagram: reported location accuracy ranging from ~500 m to ~50 m to ~2 m.]
    • Example: Affective Computing. When is an incorrect classification a bug, and when is it a “feature”? And how do you know?
    • Sources of Uncertainty: output (results, characteristics of results); sensors (redundancy, reliability, resolution); context (sensing, inferring, fusing); machine learning (imprecision, user training). These create significant challenges for software engineering research and practice!
    • Conclusion: software engineering (certainly) suffers from excessive certainty; a probabilistic mindset offers greater insight; but significant challenges remain for probabilistic verification; and other forms of uncertainty are equally challenging to address.
    • Thank You! Probability and Uncertainty in Software Engineering. David S. Rosenblum, Dean, School of Computing, National University of Singapore.