This document discusses how formal methods can help certify dependable neural networks and ensure safety. It summarizes three in-house projects at fortiss GmbH that apply formal methods: 1) nn-verifier, a tool that uses constraint programming to formally verify properties of neural networks; 2) formal synthesis of runtime monitors from specifications to constrain neural network outputs; and 3) research towards understanding neural networks through formal verification and certification approaches analogous to standards such as DO-178C. The overall goal is to apply formal methods to analyze neural networks and guarantee properties such as safety.
1. fortiss GmbH
Formal Methods for Dependable Neural Networks
Towards Certifying Dependable Neural Networks and the Role of Formal Methods
ACM Chapters Computer Science in Cars Symposium (CSCS 2017), Munich
Chih-Hong Cheng, Georg Nührenberg
Content of this work is based on a project also contributed to by other researchers within fortiss:
Frederik Diehl, Gereon Hinz, Michael Truong Le, Markus Rickert, Harald Ruess
2. fortiss - An-Institut Technische Universität München
• Non-profit academic research institute
• Associated with TU Munich
• State institute of the Free State of Bavaria (Landesinstitut des Freistaats Bayern)
3. Data-driven engineering
Classical approach: from specification to implementation
• Costly to execute process-based certification (ISO 26262, DO-178C, IEC 61508)
Artificial Neural Network (ANN) approach: learning from data
• Fast to develop (quick win)
• Implementation behaves like a black box
• No certification method exists to deal with ANNs
4. Shall we just completely drop existing assurance approaches (ISO 26262) and embrace the new era?
What those approaches give us (ISO 26262, DO-178C):
• Requirement-to-code traceability
• Architecture design
• Testing and verification
5. Towards dependable ANN from a certification perspective
• Goal of process-based certification as in DO-178C, ISO 26262, IEC 61508:
– Specification:
• assume that the specification is correct, or
• provide evidence that the specification is correct
– Provide evidence that the implementation realizes the specification:
• via understandability guarantees (e.g., code block X realizes specification Y)
• via testing and coverage criteria, or static analysis (e.g., DO-333)
[Figure: a deep / convolutional / recurrent neural network computing a control-policy decision]
6. Towards dependable neural networks
From a certification perspective, we need to have:
• Understandability: associate each substructure of an ANN with a partial specification/functionality
– E.g., research on deconvolution or heat maps works towards this direction
• Correctness: provide (best-effort) correctness claims over partial classical specifications (e.g., road safety, traffic rules)
• Accountability: the infrastructure allows one, whenever undesired behavior occurs at run-time, to backtrack and understand whether
– the modern specification is correct and complete
– the implementation realizes the modern specification (given the limitations of best-effort approaches)
10. Formal verification via constraint programming
• Neural networks with piecewise-linear activation functions can be modeled as mixed-integer linear constraints (see the sketch below).
• By encoding the property of the network as constraints or objectives, we reduce the verification problem to a MILP problem.
– By solving the optimization problem, we compute the robustness of the neural network or prove a property about it.
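For intuition, here is the standard textbook big-M encoding of a single ReLU neuron y = max(0, wᵀx + b), offered as a sketch; the exact constraints generated inside nn-verifier may differ in detail.

```latex
% Big-M encoding of a ReLU neuron y = max(0, w^T x + b), using a binary
% phase variable v and a constant M that bounds |w^T x + b| on the
% (bounded) input domain:
\begin{align*}
  y &\ge w^\top x + b, & y &\ge 0,\\
  y &\le w^\top x + b + M(1 - v), & y &\le M v, & v &\in \{0, 1\}.
\end{align*}
% v = 1 forces y = w^T x + b (active phase); v = 0 forces y = 0.
```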
Examples of using this technique?
11. Example 1: Resilience bound for neural networks
How well can your neural network resist sensor noise?
A formal, computable, and comparable measure can act as an indicator or as a differentiator.
12. Defining Resilience
• We define a resilience metric that can be computed precisely
– the maximal perturbation under which the behavior is still guaranteed to be correct
• We go beyond a single image and a single noise instance
• Equivalently, we ask for the minimal perturbation that leads to bad behavior
• The tricky part is the output layer
– the classification decision (argmax over outputs) must be encoded into MILP
– softmax involves the computation of exponentials e^{x}, which are not linear; however, softmax preserves the ordering of its inputs, so the argmax can be decided on the pre-softmax values
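One way to phrase the metric as an optimization problem, given as a sketch (the precise definition is in the preprint below): for an input x classified as class c, search for the closest input that flips the decision on the pre-softmax outputs f_j.

```latex
% Minimal perturbation flipping the decision for an input x of class c;
% f_j is the j-th pre-softmax output, itself encoded as MILP constraints:
\begin{equation*}
  \min_{\tilde{x}} \ \|\tilde{x} - x\|_\infty
  \quad \text{s.t.} \quad
  f_j(\tilde{x}) \ge f_c(\tilde{x}) \ \text{for some } j \neq c .
\end{equation*}
% The optimal value is the resilience bound: any perturbation strictly
% below it provably cannot change the classification.
```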
Preprint available at
https://arxiv.org/abs/1705.01040
[Figure: workflow of nn-verifier: a neural network description and inputs are fed to the formal reasoning engine, which produces outputs. Example: an input image classified as '3' (95%) vs '8' (5%); the minimally perturbed image is classified as '8' (57%), '3' (21%), '2' (21%).]
13. Example 2: Safety of a highway motion predictor
Properties under consideration:
• [Problematic decision] Is it possible for the controller to suggest going left while there is already a car on the left?
• [Strange speed range] Is it possible, when all cars are between 100 and 110 km/h, that the controller suggests driving at 200 km/h?
• [Effect of output difference] Under sensor imprecision, what is the maximal speed difference suggested by the neural network?
Highway motion predictor, trained on the NGSIM dataset.
https://www.youtube.com/watch?v=C_Z2s-fauKY
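As an illustration of how such a query becomes a MILP instance, here is a minimal sketch checking the "strange speed range" property on a toy one-neuron ReLU network. The weights, the big-M bound, and the use of the open-source PuLP front end are assumptions for illustration only; nn-verifier itself generates such constraints for full networks and solves them with CPLEX.

```python
# Toy "strange speed range" check: can the suggested speed exceed 200 km/h
# when all observed speeds lie in [100, 110] km/h? (Illustrative sketch.)
from pulp import LpProblem, LpVariable, LpMaximize, PULP_CBC_CMD, value

M = 1e4                                          # valid bound on |pre-activation|
prob = LpProblem("speed_range_check", LpMaximize)

x = LpVariable("x", lowBound=100, upBound=110)   # input speed in [100, 110] km/h
z = 2 * x - 150                                  # affine pre-activation (made up)
h = LpVariable("h", lowBound=0)                  # post-ReLU value, h >= 0
v = LpVariable("v", cat="Binary")                # ReLU phase indicator

# Big-M encoding of h = max(0, z)
prob += h >= z
prob += h <= z + M * (1 - v)
prob += h <= M * v

out = h + 50                                     # network output: suggested speed
prob += out                                      # objective: maximize the output

prob.solve(PULP_CBC_CMD(msg=0))
print("max suggested speed:", value(out))        # property holds iff <= 200
```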
14. Demonstration: nn-verifier (research prototype from fortiss)
The verification technique is implemented in a tool called
nn-verifier, using IBM CPLEX as its underlying MILP solver.
http://www.youtube.com/watch?v=BK825-_ScCU
15. Formal methods for dependable neural networks: in-house projects
[Diagram: dependable neural network supported by three pillars: understandability, correctness, accountability.]
Formal synthesis of permissive controllers for run-time monitoring/enforcement
16. Overlaying a neural network with a monitor / regulator
You may not always want to trust the neural network.
• This monitor/regulator design is well suited to partial specifications, such as safety rules
– It constrains some of the output values created by the neural network
[Figure: sensor input feeds both a controller trained as a neural network and a controller synthesized from a formal, classical specification; the synthesized controller prohibits some actions before actuation. Example: allowed actions {speed-up, go-right}; network output speed-up 0%, go-right 45%, go-left 47%; go-left is prohibited, so go-right, the action with the 2nd-largest probability, is selected.]
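A minimal sketch of the selection step in the figure above; the function name and interfaces are illustrative assumptions, not the fortiss implementation.

```python
# Project the network's output onto the actions the synthesized
# controller allows, and pick the most probable remaining one.
def enforce(probabilities: dict, allowed: set) -> str:
    """Return the allowed action with the highest network probability."""
    candidates = {a: p for a, p in probabilities.items() if a in allowed}
    if not candidates:
        raise RuntimeError("specification leaves no allowed action")
    return max(candidates, key=candidates.get)

# Scenario from the slide: go-left has the largest probability but is
# prohibited, so go-right (2nd-largest probability) is selected.
probs = {"speed-up": 0.0, "go-right": 0.45, "go-left": 0.47}
print(enforce(probs, {"speed-up", "go-right"}))  # -> go-right
```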
17. Run-time monitors / enforcement
• Create components from formal specifications
– Runtime monitoring units
• for examining whether the current output has high confidence
• for examining whether the current output is consistent with the actions regulated by the partial specification
– Runtime enforcement units
• perform corrective actions
• We want to use formal specification, formal synthesis, and model checking to guarantee the highest safety requirements, such as SIL 4 or ASIL D (see the sketch below)
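The two monitoring checks above can be sketched as follows; the threshold tau and the interfaces are assumptions for illustration.

```python
# Illustrative runtime monitoring unit covering both checks:
# (1) does the current output have high confidence?
# (2) is it consistent with the actions allowed by the partial spec?
def monitor_ok(probabilities: dict, allowed: set, tau: float = 0.5) -> bool:
    action = max(probabilities, key=probabilities.get)
    confident = probabilities[action] >= tau   # check 1: high confidence
    consistent = action in allowed             # check 2: spec-consistent
    return confident and consistent

# When monitor_ok() returns False, a runtime enforcement unit performs a
# corrective action, e.g., substituting the best allowed action as
# sketched under the previous slide.
```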
18. Synthesizing monitors = finding the maximally permissive controller
• The basic concept is the maximally permissive controller in a safety game (a sketch of the computation follows below)
• It becomes more complicated when numerics are involved, i.e., when the specification goes beyond Boolean variables
[Figure: a safety game with its risk attractor (states which eventually lead to risk); removing all controller moves into the attractor yields the maximally permissive controller.]
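A minimal sketch of the underlying fixed-point computation, assuming an explicit finite game graph; the data structures are invented for illustration, and the fortiss prototype handles richer, numeric specifications.

```python
# Risk-attractor fixed point over a two-player safety game.
def risk_attractor(states, ctrl_states, trans, risk):
    """trans: state -> {action: successor}. Returns all states from which
    the environment can force the play into a risk state."""
    attr = set(risk)
    changed = True
    while changed:
        changed = False
        for s in states:
            if s in attr:
                continue
            succs = trans[s].values()
            if s in ctrl_states:
                trapped = all(t in attr for t in succs)  # no safe move left
            else:
                trapped = any(t in attr for t in succs)  # env forces risk
            if trapped:
                attr.add(s)
                changed = True
    return attr

def maximally_permissive(states, ctrl_states, trans, risk):
    """Allow, in every safe controller state, exactly those actions whose
    successor stays outside the risk attractor."""
    attr = risk_attractor(states, ctrl_states, trans, risk)
    return {s: {a for a, t in trans[s].items() if t not in attr}
            for s in ctrl_states if s not in attr}
```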
19. Demonstration: formal synthesis of permissive controllers (research prototype from fortiss)
http://www.youtube.com/watch?v=p26rfsl-ohk
The gamified simulator is modified from the highway overtaking simulator
from the MIT 6.S094 course http://selfdrivingcars.mit.edu/deeptrafficjs/
20. Outlook
• Formal methods can help introduce neural networks into critical environments
– From formal verification to run-time verification/enforcement
• Further research directions
– A certification roadmap, together with formal-methods supplements
• Analogous to DO-178C (safety for civil avionics) and DO-333 (its formal methods supplement)
– Understandability of neural networks by formal methods
– Scalability of verification by combining approaches (e.g., by exploiting knowledge from deconvolution)
21. Dr. Chih-Hong Cheng, Georg Nührenberg
fortiss GmbH
An-Institut Technische Universität München
tel +49 89 3603522 11 fax +49 89 3603522 50
info@fortiss.org
www.fortiss.org