1. Model-based Testing of Embedded Real-Time Systems in the Automotive Domain
Eng. Sc. D. candidate: Justyna Zander-Nowicka
Supervisor: Prof. Dr.–Ing. Ina Schieferdecker (TU Berlin)
Supervisor: Prof. Dr. rer. nat. Ingolf Krüger (UC San Diego)
Committee Chairman: Prof. Dr.–Ing. Clemens Gühmann (TU Berlin)
Doctoral Thesis Defense: December 19th, 2008
2. Motivation
Addressed characteristics of embedded systems in the automotive domain:
Hybrid: continuous signal flows and discrete events
Real-time
Reactive
(Figure: the embedded system and its software interacting with the environment via sensors and actuators.)
The complexity of car software increases dramatically
The functions are distributed
Demand to shorten time-to-market
Demand for quality assurance: safety standards IEC 61508 and ISO 26262
Testing:
Consumes 40-60% (EU, 2005) of development resources
A systematic and automatic test approach starting at the earliest model phase of software development is still missing
Different companies use different technologies, methods, and tools
3. Outline
I. Model-based Testing (MBT)
II. Signal-feature Test Paradigm
– Test Development Process
– Signal-feature Detection for Test Evaluation
– Signal-feature Application for Test Data Generation
III. The Test System
IV. Case Study
V. Summary and Outlook
4. Model-Based Testing
Model-based testing is testing in which the entire test specification is derived, in whole or in part, from both the system requirements and a model that describes selected functional aspects of the system under test (SUT).
In this context, the term entire test specification covers the abstract test
scenarios made concrete by the sets of test data and the expected SUT outputs.
The test specification is organized in a set of test cases.
Model-in-the-Loop for Embedded System Test (MiLEST) is the approach proposed in this thesis.
(Figure: the REQUIREMENTS yield test objectives; the SYSTEM MODEL and the TEST MODEL are linked through shared interfaces and test objectives.)
5. MBT Taxonomy
Acknowledgement: M. Utting et al. (2006)
(Figure: the MBT taxonomy with MiLEST's position marked in it.)
6. Related Work: Challenges and Thesis Contributions
Open issues and the corresponding solutions:
Open issue: test data specification is automatic, but only for structural tests or state-based models; it is systematic for functional tests, but still only manual. Solution: automatic and systematic test data generation for functional tests; signal-feature based generation of test data.
Open issue: automatic test evaluation exists, but is mainly based on reference signal flows. Solution: automatic and online test evaluation based on reference signal partition; signal-feature based test assessment by application of validation functions.
Open issue: the test process is established, but not efficient enough in terms of cost, effort, and reusability. Solution: automation of the MBT process; application of automated hierarchical test patterns and transformations.
Main goal: an automatically created test design, executable at the model level.
7. Signal Feature – Definition
A signal feature, also called signal property, is a formal description of certain
predefined attributes of a signal. It is an identifiable, descriptive property of a
signal.
A feature can be composed of or predicated by other features, logical
connectives, or timing expressions.
(Figure: a sampled signal f(kT) over time kT, partitioned into segments annotated with features: increase, local max, decrease, constant, and step response characteristics; an example of time partitioning.)
Acknowledgement: E. Lehmann (2003)
8. Proposed Test Development Process
SUT as a Model
→ step I: automatic generation
Test Harness Generation
→ step II: manual refinement
Test Specification
→ step III: automatic generation
Test Data & Test Control Generation
→ step IV: automatic execution
Verdicts Analysis
9. Test Harness
10. Signal-Feature Detection and Generation
(Figure: 'Generate Signal Feature' and 'Detect Signal Feature' blocks linked by transformations in both directions; both build on temporal expressions (e.g., after(5ms)) and logical connectives (e.g., and).)
11. Increase Detection and Generation
Detect Increase:
The increase feature can be detected by analyzing the signal's derivative. The derivative can be approximated (a simplified version of the algorithm) by the backward difference between the current signal value and the previous one:

feature(kT) = sign[signal(kT) − signal((k − 1)T)]

feature(kT) is positive if the signal increases.
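For illustration, a minimal Python sketch of this backward-difference detection (the function name and the NumPy realization are assumptions made here; in MiLEST the detection is realized as Simulink blocks):

```python
import numpy as np

def detect_increase(signal: np.ndarray) -> np.ndarray:
    """Simplified increase detection via the backward difference:
    feature(kT) = sign(signal(kT) - signal((k-1)T)).
    The feature is positive wherever the signal increases."""
    feature = np.zeros(len(signal))
    feature[1:] = np.sign(np.diff(signal))
    return feature

# Example: +1 while the sampled signal rises, -1 while it falls.
print(detect_increase(np.array([0.0, 0.5, 1.2, 1.2, 0.8])))
# [ 0.  1.  1.  0. -1.]
```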
Generate Increase:
(Figure: an example of a generated increasing signal.)
12. Classification of Signal-Feature Detection Mechanisms (1)
(Figure: time-independent signal-feature detection mechanisms.)
Acknowledgement: A. Marrero-Pérez (2007)
13. Classification of Signal-Feature Detection Mechanisms (2)
(Figure: triggered signal-feature detection mechanisms.)
Acknowledgement: A. Marrero-Pérez (2007)
14. Levels of Test Design in MiLEST
Two parallel hierarchies, connected level by level through transformations; refinement leads downwards and abstraction upwards:

Test Specification (designed manually) → Test Data Generation (obtained automatically)
Test Harness Level → Test Harness Level
Test Requirement Level → Test Requirement Level
Validation Function Level → Test Case Level
Feature Detection Level → Feature Generation Level

At the lowest level: IF preconditions set THEN assertions set (specification side) and IF preconditions set THEN generations set (generation side).
15. Test Patterns Classification
Test Specification Patterns
Test Requirement Level
Validation Function Level
- Signal-Feature Detection Level
Example (test activity | test pattern name | context | problem | solution instance):
Test specification | Detect step response features | Test of an electronic control unit (ECU) | Assessment of an ECU behavior in terms of a selected signal feature | (model instance shown on the slide)
16. Pedal Interpretation Example
Interpretation of the accelerator pedal position:
The normalized accelerator pedal position (phi_Acc) should be interpreted as the desired driving torque (T_des_Drive). The desired driving torque is scaled in the non-negative range in such a way that the higher the velocity (v), the lower the driving torque obtained (Conrad, 2004).
IF v=const AND phi_Acc increases
THEN T_des_Drive increases
IF v=const AND phi_Acc decreases
THEN T_des_Drive decreases
IF v=const AND phi_Acc=const
THEN T_des_Drive=const
IF v increases AND phi_Acc=const
AND T_des_Drive>=0
THEN T_des_Drive does not increase
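As a sketch of how such an IF-THEN requirement could be evaluated over sampled signals, reusing the detect_increase helper from the 'Increase Detection and Generation' slide (the function name and the array-based realization are assumptions; MiLEST expresses such checks as validation functions):

```python
import numpy as np

def check_first_rule(v, phi_acc, t_des_drive):
    """IF v=const AND phi_Acc increases THEN T_des_Drive increases.
    All arguments are equally sampled signal arrays of the same length."""
    precondition = (detect_increase(v) == 0) & (detect_increase(phi_acc) > 0)
    assertion = detect_increase(t_des_drive) > 0
    # The assertion must hold at every sample where the precondition holds.
    return bool(np.all(assertion[precondition]))
```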
17. MiLEST Quality Metrics – an Example
Variants coverage for a SigF = (# of variants for a selected SigF applied in a test design) / (# of all possible variants for the selected SigF)
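Sketched in Python with the interval variants that reappear on the 'Variants for the Increase Generation' slide (the function name is an assumption):

```python
def variants_coverage(applied_variants, all_variants):
    """Variants coverage for a signal feature (SigF), in percent."""
    return 100.0 * len(set(applied_variants)) / len(set(all_variants))

# Example: three of the four increase variants of v appear in the test design.
print(variants_coverage([(-10, -9), (-1, 0), (0, 7)],
                        [(-10, -9), (-1, 0), (0, 7), (63, 70)]))  # 75.0
```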
(Figure: bar charts of variants coverage [%] for the signals v, phi_Acc, and phi_Brake in the Pedal Interpretation example, comparing manual and automatic test data generation.)
Consider the MiLEST quality metrics in combination (weight-based approach), not in isolation!
18. MiLEST Summary
Contributions:
Automatic and systematic test data generation for functional tests based on the signal-feature concept
Automatic and online test evaluation based mainly on the signal-feature taxonomy
Reusable test patterns constituting a test framework
MBT methodology and automated test process

Achieved goals:
Systematic and consistent functional test specification
Automation of the test specification
Novel manner of signal assessment
Continuous observation of the SUT
Automation of the test process
Abstract and concrete views (i.e., abstract libraries concretized in the test system)
Requirements- & model-based testing
Test execution starting at the MiL level
System model and test model in a common execution environment
19. Future Work
Specific to the current version of MiLEST:
Test stimuli generation algorithms and their optimization
Transposition of conditional rules (i.e., automation of the test specification)
Further test patterns (e.g., incl. different numerical integration methods)
Combination of the methods for verification or failure analysis purposes
Automation for negative testing
More case studies (i.e., test approach scalability)
Further domains (e.g., aerospace, railway, space, earth, or military systems)
Further execution platforms
Mapping to TTCN-3
Further test quality metrics such as:
Cost/effort needed for constructing a test data set
Relative number of found errors in relation to the number of test cases
needed to find them
20. Thank you so much!
justyna.zander-nowicka@fokus.fraunhofer.de
22. Automotive Embedded System Test Dimensions
23. Related Work – Tools
Criteria compared: test specification (manual test case / test data specification; automatic test case / test data generation), formal verification, test patterns support, test transformation and automation facilities, test evaluation, and scenarios as driving force.

Selected methodologies, technologies, and tools, with the criteria they support marked '+':
EmbeddedValidator: +, + (15 patterns)
MTest with CTE/ES: +
Reactis Tester: +, +
Reactis Validator: +, +, –/+ (2 patterns)
Simulink® Verification and Validation™: +, +, + (12 patterns)
Simulink® Design Verifier™: +, +, –/+ (4 patterns)
SystemTest™: +
TPT: +, +
T-VEC: +, +
Transformations Approach (Dai, '06): +, +
Watchdogs (Conrad, '98): +
MiLEST: +, +, + (~50 patterns), +
24. MiLEST with respect to MBT Taxonomy
Test approach: MiLEST
Test generation (selection criteria and technology): data coverage, requirements coverage, test case specifications, automatic generation, offline generation
Test execution options: MiL, reactive
Test evaluation (specification and technology): reference signal-feature based, requirements coverage, test evaluation specifications, automatic, online evaluation
25. Signal Features – a Descriptive Approach
(Figure: a step input u1(time) and the corresponding step response q1(time), annotated with tr, max, ts, and ess.)
Step response characteristics: rise time (tr), maximum overshoot (max), settling time (ts), steady state error (ess)
29. From Signal Feature Detection to Signal Feature Generation
Signal feature detection (trigger-independent, immediately identifiable):
Detect signal value
Detect increase / decrease
…

Signal feature generation (trigger-independent, immediately identifiable):
Any curve coming through a given value within the permitted range of values, where the duration time is the default
Any increasing/decreasing function with a default/given slope or other characteristics in the permitted range of values, where the duration time is the default
…
30. Classification of Signal Features based on their Detection Type
Time-independent:
- Immediately identifiable: detect signal value; detect increase / decrease / constant; detect continuous signal / derivative; detect linearity (w.r.t. the 1st value); detect functional relation y = f(x); detect causal filter; detect max-to-date / min-to-date
- Identifiable with determinate delay: detect max / min / inflection; detect peak; detect impulse; detect step
- Identifiable with indeterminate delay: detect duration of every single delay

Triggered:
- Immediately identifiable: detect signal value @ time1; detect time stamp; detect any time-independent feature over a time interval (e.g., value @ time1, value @ [time1, time2])
- Identifiable with determinate delay: detect any time-independent feature over a time interval (e.g., value @ time, time of max)
- Identifiable with indeterminate delay: detect step response characteristics (rise time, settling time, overshoot); detect response delay; detect complete step
31. Note on Slide 30 (Justyna Zander-Nowicka, 12/12/2006):
Complete step detection: this is what the preconditions of the step response do; detect the step, then trigger once the signals 'step' and 'step response' have stabilized.
Step detection: detects only a step, i.e., triggers directly at the step.
max-to-date: always stores the maximum value seen so far; e.g., for the value sequence 0 1 2 3 5 6 10 9 54 6 7 3, max-to-date yields 0 1 2 3 5 6 10 10 54 54 54 54.
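A minimal Python sketch of the max-to-date behavior described in the note (illustrative only; the output reproduces the sequence above):

```python
def max_to_date(values):
    """Running maximum: each output sample is the largest value seen so far."""
    out, current = [], float("-inf")
    for v in values:
        current = max(current, v)
        out.append(current)
    return out

print(max_to_date([0, 1, 2, 3, 5, 6, 10, 9, 54, 6, 7, 3]))
# [0, 1, 2, 3, 5, 6, 10, 10, 54, 54, 54, 54]
```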
32. Signal-Features Classification (excerpt)
SigF: time-independent, immediately identifiable. Evaluation view vs. generation view:

Signal value detection → any curve crossing the value of interest in the permitted range of values, where duration time = default. Generation information: value of interest.

Basic mathematical operations (e.g., zero detection) → any curve described by a basic mathematical operation (e.g., crossing the zero value in the permitted range of values), where duration time = default. Generation information: time of zero crossing.

Increase detection → any ramp increasing with a default/given slope in the permitted range of values, where duration time = default. Generation information: slope, initial output, final output.

Decrease detection → any ramp decreasing with a default/given slope in the permitted range of values, where duration time = default. Generation information: slope, initial output, final output.

Constant detection → any constant in the permitted range of values, where duration time = default. Generation information: constant value.

Signal continuity detection → any continuous curve in the permitted range of values, where duration time = default.
33. IF – THEN Rules
Logical connectives, e.g.:
IF constrained_inputs_n AND constrained_outputs_m THEN constrained_inputs_n AND constrained_outputs_m
IF constrained_inputs_n AND constrained_outputs_m THEN constrained_outputs_m
IF constrained_inputs_n THEN constrained_inputs_n AND constrained_outputs_m
IF constrained_inputs_n THEN constrained_outputs_m
IF true ^ any constraints THEN constrained_outputs_m

Alternative, i.e.:
IF A THEN B OR C OR D

Temporal expressions, e.g.:
IF A THEN during(x) B AND after(y) C
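A sketch of how such a temporal expression could be checked over equally sampled signals (the helper names and the sample-based realization of after(y) are assumptions, not the MiLEST implementation):

```python
import numpy as np

def after(condition: np.ndarray, delay_samples: int) -> np.ndarray:
    """after(y): delay a boolean condition by a fixed number of samples,
    so an assertion can be checked y samples after its trigger."""
    shifted = np.zeros(len(condition), dtype=bool)
    if delay_samples < len(condition):
        shifted[delay_samples:] = condition[:len(condition) - delay_samples]
    return shifted

def check_if_a_then_after_c(a: np.ndarray, c: np.ndarray, y: int) -> bool:
    """IF A THEN after(y) C: wherever A held, C must hold y samples later.
    Triggers whose deadline falls beyond the recording are ignored."""
    return bool(np.all(c[after(a, y)]))
```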
34. Test Patterns Classification (2)
Test Data Structure Pattern
Test Requirement Level
Test Case Level
- Signal-Feature Generators

Example (test activity | test pattern name | context | problem | solution instance):
Test data generation | Generate signal feature | Evaluation of a step response function is intended | Generation of an appropriate signal to stimulate an SUT | (model instance shown on the slide)

Test Control Patterns (e.g., for reactive testing)
Example (test activity | test pattern name | context | problem | solution instance):
Test control | Automatic sequencing of test cases | Test of an electronic control unit | Establishing the starting point of the next test case | IF verdict=pass or verdict=fail or verdict=error of a test case THEN leave this test case at that time point & execute the next test case starting at that established time point (a minimal sketch follows below)

Test Harness Pattern
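A minimal sketch of the sequencing pattern above (the Verdict type and the callable test cases are illustrative assumptions):

```python
from enum import Enum
from typing import Callable, Iterable, List

class Verdict(Enum):
    PASS = "pass"
    FAIL = "fail"
    ERROR = "error"

def sequence_test_cases(test_cases: Iterable[Callable[[], Verdict]]) -> List[Verdict]:
    """Each test case runs until it delivers a verdict (pass, fail, or
    error); at that point it is left and the next test case is executed."""
    return [run_case() for run_case in test_cases]
```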
35. Combination of Variants
Combination techniques:
Minimal combination
One factor at a time (see the sketch after this list)
N-wise combination
Others:
Complete combination
Random combination
etc...
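A sketch of the one-factor-at-a-time technique referenced above (the function name and the dictionary representation of equivalence classes are assumptions):

```python
from typing import Dict, List

def one_factor_at_a_time(classes: Dict[str, List]) -> List[Dict]:
    """Start from the first class of every factor, then vary each factor
    through its remaining classes while all others keep their first class."""
    base = {name: values[0] for name, values in classes.items()}
    combinations = [dict(base)]
    for name, values in classes.items():
        for value in values[1:]:
            variant = dict(base)
            variant[name] = value
            combinations.append(variant)
    return combinations

# The classes used later for the Pedal Interpretation case study:
combos = one_factor_at_a_time({
    "phi_Acc": ["[0,10]", "[90,100]"],
    "v": [-10, -5, 0, 35, 70],
})
print(len(combos))  # 6 iterations, as on the 'Concrete Test Data Variants' slide
```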
36. Pedal Interpretation Component
(Figure: the System under Test with the velocity (v) and the accelerator pedal position (phi_Acc) among its inputs, and the desired driving torque (T_des_Drive) among its outputs.)
37. Test Data Patterns Derivation
Interpretation of the accelerator pedal position:
The normalized accelerator pedal position should be interpreted as the desired driving torque. The desired driving torque is scaled in the non-negative range in such a way that the higher the velocity, the lower the driving torque obtained.

Derived test data generation patterns:
Generate v=const AND Generate phi_Acc increases
Generate v=const AND Generate phi_Acc decreases
Generate v=const AND Generate phi_Acc=const
IF T_des_Drive>=0: Generate v increases AND Generate phi_Acc=const

(Figure: for each pattern, the corresponding sketch of v and phi_Acc over time.)
38. Concrete Test Data
(Figure: concrete test stimuli over time for phi_Acc, velocity, and phi_Brake, annotated with range constraints and temporal constraints.)
39. Variants for the Increase Generation – Concrete View
Consider the velocity of a car in the range <-10, 70> with a partition point at 0.
Then, using the classification tree method (Grochtmann & Grimm, 1993) and the formulas
<pn, pn + 10% * (pn+1 – pn)> and <pn – 10% * (pn – pn-1), pn>,
the increase variants are: <-10, -9>, <-1, 0>, (0, 7>, <63, 70>.
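The same computation as a Python sketch (the helper name is an assumption); it reproduces the four variants listed above:

```python
def boundary_variants(points):
    """Boundary intervals around each partition point pn, following
    <pn, pn + 10% * (pn+1 - pn)> and <pn - 10% * (pn - pn-1), pn>."""
    variants = []
    for i, p in enumerate(points):
        if i > 0:                                # left-side interval
            variants.append((p - 0.1 * (p - points[i - 1]), p))
        if i < len(points) - 1:                  # right-side interval
            variants.append((p, p + 0.1 * (points[i + 1] - p)))
    return variants

# Partition points for the velocity range <-10, 70> with an inner point at 0:
print(boundary_variants([-10, 0, 70]))
# [(-10, -9.0), (-1.0, 0), (0, 7.0), (63.0, 70)]
```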
40. Concrete Test Data Variants
Classes for the SUT inputs: v ∈ {-10}, {-5}, {0}, {35}, {70}; phi_Acc ∈ [0, 10], [90, 100].
A one-factor-at-a-time combination of phi_Acc (phi_Acc1, phi_Acc2) and v (v1 … v5) yields six iterations, applied at consecutive time points t0 … t5.
(Figure: the resulting concrete stimuli over about 12 s; phi_Acc alternates between the [0, 10] and [90, 100] classes while v steps through -10, -5, 0, 35, and 70.)
41. Set of Test Cases Sequenced in Test Suites
42. Test Cases Sequenced in Test Suite
43. MiLEST Test Quality Metrics
Test data related:
Signal range consistency
Constraint correctness
Variants coverage for a SigF
Variants coverage during test execution
Variants related preconditions coverage
Variants related assertions coverage
SUT output variants coverage
Minimal combination coverage
Coverage of signal variants combinations – CTCmax, CTCmin

Test specification related:
Test requirements coverage
VFs activation coverage
VF specification quality
Preconditions coverage
Effective assertions coverage

Test control related:
Test cases coverage

Others:
Service activation coverage
System model coverage
Cost/effort needed for constructing a test data set
Relative number of found errors in relation to the number of test cases needed to find them
44. Summary and Future Work
(Figure: the thesis themes interlocked – Signal-Feature Approach; Test Specification; Test Oracle, Arbitration, Evaluation; Abstraction; Test Automation; Test Quality.)
Three types of case studies:
component level test
component in the loop level test
integration level test
45. Example: Ariane 5
Ariane 5 Flight 501 failed on 4 June 1996 (weight: 740 t, payload: cluster satellites).
The rocket self-destructed 37 seconds after launch because of a malfunction in the control software.
Most expensive computer bug in history: $370 million.

Causes:
Reused software from Ariane 4
Data conversion from 64-bit float to 16-bit signed integer: overflow, exception not caught (the horizontal velocity exceeded the 16-bit range of 32767 internal units)
ADA software with 2 channels (redundancy), but identical implementation!
The 1st channel had the same problem 72 ms before
The software handler got exceptions from both channels; there was no plan B for such situations (an unclassified exception was caught and control was transferred to the 1st channel)
The main computer interpreted the horizontal velocity data and sent a strange control command
Self-destruction followed as a safety measure

ADA code of the 2nd channel:

```ada
...
declare
   vertical_veloc_sensor: float;
   horizontal_veloc_sensor: float;
   vertical_veloc_bias: integer;
   horizontal_veloc_bias: integer;
   ...
begin
   declare
      pragma suppress(numeric_error, horizontal_veloc_bias);
   begin
      sensor_get(vertical_veloc_sensor);
      sensor_get(horizontal_veloc_sensor);
      vertical_veloc_bias := integer(vertical_veloc_sensor);
      -- horizontal velocity beyond the 16-bit range: the conversion overflows
      horizontal_veloc_bias := integer(horizontal_veloc_sensor);
      ...
   exception
      when numeric_error => calculate_vertical_veloc();
      when others => use_irs1();  -- unclassified exception: control transfer to the 1st channel
   end;
end irs2;
```
* source: http://www-aix.gsi.de/~giese/swr/ariane5.html (retrieved 2008)
46. MiLEST Realization
MiLEST – Model-in-the-Loop for Embedded System Test
It is a Simulink® add-on built on top of the MATLAB® engine.
47. Integration of MiLEST in the Automotive-specific V-Modell®
Acknowledgement: J. Großmann et al. (2008)
48. Model- and Requirement-based Testing
(Figure: REQUIREMENTS yield test objectives; the SYSTEM MODEL and the TEST MODEL are linked through interfaces and test objectives; transformations lead from the system model to the SYSTEM IMPLEMENTATION and from the test model to the TEST IMPLEMENTATION, both running in a common execution environment.)

Questions addressed:
What is the role of a system model?
What is the role of a test model?
Is it possible to use a common language for both system and test specifications?
How can discrete and continuous signals be handled at the same time?
How should a test framework be realized?
How to automate the test process?
How to assure the quality of tests?
49. MiLEST Marketing
Features:
Systematic functional test specification
Signal-feature oriented paradigm
Graphical test design
Test process automation: systematic and automatic test data generation, online automatic test evaluation
Model-in-the-Loop test execution
Reusable test patterns
Abstract and concrete views

Benefits:
Testing in early design stages
Test of hybrid systems including temporal and logical dependencies
Traceability of test cases to the requirements
Traceability of verdicts to the root faults
Increased test coverage and test completeness
Assured quality of the tests
50. Discrete and Continuous Signal Interpretation in Simulink
Consider a second-order Runge-Kutta numerical integration:

a1 = hk * f(x(tk), tk)
a2 = hk * f(x(tk) + a1/2, tk + hk/2)
x(tk+1) = x(tk) + a2

where hk is the step size.
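A plain Python sketch of one integration step (the function name is an assumption):

```python
def rk2_step(f, x, t, h):
    """One second-order Runge-Kutta (midpoint) step for x'(t) = f(x, t):
    a1 = h*f(x, t); a2 = h*f(x + a1/2, t + h/2); x(t+h) = x(t) + a2."""
    a1 = h * f(x, t)
    a2 = h * f(x + a1 / 2.0, t + h / 2.0)
    return x + a2

# Example: integrate x' = -x from x(0) = 1 over 10 steps of h = 0.1.
x, t, h = 1.0, 0.0, 0.1
for _ in range(10):
    x, t = rk2_step(lambda x, t: -x, x, t, h), t + h
print(x)  # ~0.3685, close to exp(-1) ~ 0.3679
```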