2. MagicDraw (MD) facilitates the creation of system models. This
presentation proposes to extend this functionality by adding the
ability to monitor the model implementation:
• A test management plugin allows building tests that verify the
models and monitoring the test implementation progress. Our
ultimate goal is to deliver products with zero implementation
defects ("shift left").
• A project management plugin allows monitoring new feature
implementation. Our ultimate goal is to monitor the new
development-related KPIs.
MagicDraw - value added
3. Proposal
In the presence of a modeling tool, the "Model Driven Testing"
approach has to be used to ensure test completeness.
4. Model Driven Testing
[Diagram, centered on MagicDraw: Model management (structural &
behavioral models) describes the SYSTEM; Test management (abstract
tests) are derived from the models; Code management (executable test
scripts) are mapped to the abstract tests and are run against the
SYSTEM; Project management (feature backlog) is associated with
dashboards.]
5. "Enriched" model management
The author of the Perl Expect plugin once said:
"I took the 5% of the Expect language that is used 95%
of the time."
Similarly, my proposal is to augment MD with ~5% of traditional
test management and project management functionality, to ensure
test completeness against the models and to monitor new feature
implementation.
6. For knowing…
Nikola Tesla visited Henry Ford at his factory, which was
having some kind of difficulty. Ford asked Tesla if he
could help identify the problem area. Tesla walked up
to a wall of boilerplate and made a small X in chalk on
one of the plates. Ford was thrilled, and told him to
send an invoice. The bill arrived, for $10,000. Ford
asked for a breakdown. Tesla sent another invoice,
indicating a $1 charge for marking the wall with an X,
and $9,999 for knowing where to put it.
9. Model management
• The system can be represented as two integrated parts:
existing system features and features that are under
development.
• A feature is a subset of models and/or their elements.
• Thousands of features make up a system. Therefore, it is
essential to identify the features under current development;
only these features need to be monitored.
10. Model management - structural view
[Diagram, four layers: Application/Business layer (Application 1,
Application 2, Application 6); Middleware/Platform layer (Service 1,
Service 2, Service 3); HW abstraction layer (Component 1, Component 2,
Component 3); Interface layer (API 1, API 2, API 7).]
11. Model management - behavioral view
[Diagram: interconnected behavioral models — use case diagrams 1
and 2, activity diagrams 3, 5, and 9, sequence diagrams 2, 5, and 7,
and state machine 8.]
12. Model management - system view
[Diagram: the four structural layers — Application/Business layer
(Application 1, 2, 6), Middleware/Platform layer (Service 1, 2, 3),
HW abstraction layer (Component 1, 2, 3), Interface layer (API 1,
API 2, API 7) — presented together as one system view.]
13. Model management - feature view
[Diagram: the behavioral models (use case diagrams 1 and 2, activity
diagrams 3, 5, and 9, sequence diagrams 2, 5, and 7, state machine 8)
shown with two highlighted groupings — Feature WWW-111 and
Feature XYZ-321 — each covering a subset of the diagrams.]
A feature is represented by new and/or updated diagrams/diagram
elements.
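Since a feature is just a subset of model elements, the feature-to-model relationship can be kept as a simple mapping. A minimal sketch, with hypothetical diagram names modeled on the slide (`FEATURE_MODELS` and `models_under_development` are illustrative, not part of any MagicDraw API):

```python
# Hypothetical mapping: feature key -> diagrams it adds or updates.
FEATURE_MODELS = {
    "WWW-111": {"Use case diagram 1", "Activity diagram 9", "Sequence diagram 5"},
    "XYZ-321": {"Use case diagram 2", "Activity diagram 3", "State machine 8"},
}

def models_under_development(features):
    """Union of all diagrams touched by the features currently in development."""
    touched = set()
    for key in features:
        touched |= FEATURE_MODELS.get(key, set())
    return touched

# Only these models need to be monitored:
print(sorted(models_under_development(["WWW-111", "XYZ-321"])))
```

This reflects the point above: out of thousands of features, monitoring is restricted to the union of models touched by features in active development.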
15. Preamble
The coverage of requirements and acceptance criteria (REQs/AC) is
necessary, but not sufficient, to achieve a complete test.
16. Example: REQs/AC coverage
Requirements:
RPREQ_1500 — As Application_3 SW, I want Platform SW to control state
LEDs during early HW start-up, warm and cold reset, so that state HW
physical LEDs indicate when the module is in the resetting phase and
initialized.
RPREQ_1747 — As Application_3 SW, I want Platform SW to set color and
pattern on state and interface LEDs according to received requests, so
that I can see the state of a particular unit. (AC: RPAC_498, RPAC_523)
Acceptance criteria:
RPAC_498 — Service_5 notifies Component_5 about the new state LED color
and pattern; Component_5 requests Component_9 to set the state LED
according to Service_5's notification; Component_9 sets the state LED
accordingly. (REQ: RPREQ_1747)
RPAC_523 — Service_8 notifies Component_2 about the new state LED color
and pattern; Component_2 requests Component_9 to set the LED according
to Service_8's notification; Component_9 sets the particular state LED
color and pattern. (REQ: RPREQ_1747)
The coverage of REQs/AC by test is typically required by most
organizations. Often, the AC are just a rephrasing of the
respective requirements.
In this example, testers can get away with just ~15 test cases to
cover these REQs/AC. However…
17. Example: Specification coverage
• Use case diagram (1 diagram): start-up, cold/warm reset/OFF for
various boards
• Activity diagram (7 diagrams): algorithms/conditions of start-up,
cold/warm reset/OFF
• Sequence diagram (4 diagrams): message exchange for LED settings
The previous requirements are described by 12 UML models.
These models require ~200 test cases (as opposed to 15).
18. MD TMS vs traditional TMS
A traditional TMS:
• Deals with requirements coverage. In contrast, the MD TMS maps
tests to specification models, which makes test completeness
controllable.
• Is a release-oriented tool: all feature test plans exist only
temporarily and independently from the regression tests.
• Does not show how well the regression test covers the existing
system, because a traditional TMS is not linked to the overall
system architecture/behavior.
19. Principles of model-based tests
• The main purpose of a TMS within MD is to associate tests with
models, to ensure test completeness.
• Tests are built based on the model types.
• Tests are not generated automatically.
• Test completeness is verified during the review.
• Tests include a requirement ID, a model ID, and a unique tag for
traceability purposes.
• Executable test scripts are not intended to be stored in the TMS,
but they have to cover the tests in the TMS, using the test tag.
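The tag-based link between abstract tests and executable scripts can be sketched as follows. This is an illustrative sketch, not the plugin's actual data model; the field names (`req`, `model`, `tag`) and tag values are assumptions:

```python
# Each abstract test in the TMS carries a requirement ID, a model ID,
# and a unique tag (hypothetical field names and values).
abstract_tests = [
    {"req": "RPREQ_1747", "model": "Sequence diagram 5", "tag": "TAG-0001"},
    {"req": "RPREQ_1747", "model": "Sequence diagram 2", "tag": "TAG-0002"},
]

# Tags found in the source-controlled executable test scripts.
script_tags = {"TAG-0001"}

def uncovered_tags(tests, covered):
    """Abstract tests whose tag has no executable script yet."""
    return [t["tag"] for t in tests if t["tag"] not in covered]

print(uncovered_tags(abstract_tests, script_tags))  # tags still to automate
```

Scanning the scripts for these tags is what lets the plugin check, without storing the scripts in the TMS, that every abstract test is covered by automation.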
20. Test model
[Diagram, a four-level test hierarchy: Test Plan layer (TP 1) →
Test Suite layer (TS 1, TS 2, TS 3) → Test Scenario layer (UC 1 and
UC 2 under TS 1, UC 7 under TS 2, UC 5 under TS 3) → Test Case layer
(TC 1, TC 2, TC 3 under UC 1; TC 5, TC 6 under UC 2; TC 7 under UC 7;
TC 8, TC 9 under UC 5).]
21. Test hierarchy
• A Test Plan represents one of the traditional test types, such as
application, feature, sanity, regression, or performance.
• A Test Suite reflects the structural view of the system.
• A Test Scenario mirrors the behavioral view, such as end-to-end
scenarios or business functions.
• A Test Case is a set of actions, such as a message exchange, with
one compare statement.
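The four-level hierarchy can be represented as a nested structure. A minimal sketch using the illustrative IDs from the test model slide (the layout is an assumption, not the plugin's storage format):

```python
# Test Plan -> Test Suites -> Test Scenarios -> Test Cases.
test_plan = {
    "TP 1": {
        "TS 1": {"UC 1": ["TC 1", "TC 2", "TC 3"], "UC 2": ["TC 5", "TC 6"]},
        "TS 2": {"UC 7": ["TC 7"]},
        "TS 3": {"UC 5": ["TC 8", "TC 9"]},
    }
}

def count_test_cases(plan):
    """Total number of leaf test cases in a plan."""
    return sum(
        len(cases)
        for suites in plan.values()
        for scenarios in suites.values()
        for cases in scenarios.values()
    )

print(count_test_cases(test_plan))  # 8
```

Because suites map to the structural view and scenarios to the behavioral view, walking this tree top-down answers "which layer and which behavior does this test case exercise?"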
22. Model - Test
[Diagram: the behavioral models (use case diagrams 1 and 2, activity
diagrams 3, 5, and 9, sequence diagrams 2, 5, and 7, state machine 8)
overlaid on the structural layers — Application/Business layer
(Application 1, 2, 6), Middleware/Platform layer (Service 1, 2, 3),
HW abstraction layer (Component 1, 2, 3), Interface layer (API 1,
API 2, API 7).]
26. Project management
• The main purpose is to monitor the quality of new feature
development.
• The decomposition/refinement process produces the backlogs for
various system levels/components; these represent the initial data
for MD project management.
• MD project management uses only the data necessary to monitor the
development and verification of the models, such as the
relationships between features, models, and tests.
• Most common project management artifacts, such as implementation
tasks, schedules, builds, and definition of done, are not included
in MD project management.
27. Project management - KPI sources / filter sources
[Diagram, centered on a feature: it belongs to releases, is used in
products, is defined by requirements, is verified by AC, is
implemented through references to models, and is verified by
references to Test Plans; variants are applied to it.]
29. Solution: Report management
Reports, dashboards, search pages:
Select artifacts: release, product, component, feature,
architectural layer, test plan
Show KPIs:
• test plan coverage by automated tests
• test plan requirements/AC coverage by automated tests
• feature requirements/AC coverage by automated tests
• model coverage by abstract tests
Show relationships/traceability:
release <-> requirements/AC <-> models <-> test plans <->
test scripts <-> test cases
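A coverage KPI of this kind reduces to comparing two tag sets. A minimal sketch, assuming (hypothetically) that each test plan exposes the tags of its abstract tests and that the automated tag set is harvested from the scripts:

```python
def coverage(plan_tags, automated_tags):
    """Percentage of a plan's abstract-test tags covered by automated scripts."""
    if not plan_tags:
        return 100.0  # an empty plan is trivially covered
    return 100.0 * len(plan_tags & automated_tags) / len(plan_tags)

# Illustrative tag sets only.
plan = {"TAG-0001", "TAG-0002", "TAG-0003", "TAG-0004"}
automated = {"TAG-0001", "TAG-0003", "TAG-0009"}
print(f"{coverage(plan, automated):.0f}%")  # 50%
```

The same set intersection, applied per requirement, per feature, or per model instead of per plan, yields the other KPIs in the list above.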
31. E2E Process
Quality dashboards:
• system component coverage by test
• new feature coverage by test
[Diagram: a feature/reqs/models/tests repository fed by — JIRA
backlogs (releases and features; feature data extracted daily), a doc
repository (requirements and acceptance criteria), the modeling tool
(specifications and design), the test management system (abstract
testware), and the DevOps environment: a source control system (test
scripts) plus logs and reports, with testware tags/results extracted
daily.]
32. Conclusion
Test and project management, as additions to MagicDraw, make it
possible to verify the system development and to monitor the
implementation progress.
These "extensions" have the potential to make MagicDraw attractive
to a broader customer base interested in model implementation
aspects.
33. Further reading
• Requirements coverage - a false sense of security, Professional
Tester magazine, issue 42, 12-17, December 2017. The forerunner of
this presentation.
• Tower of Babel insights, Professional Tester magazine, issue 35,
15-18, December 2015. Proposes standards that make requirements
testable.
• From test techniques to test methods, Professional Tester magazine,
issue 29, 4-14, November 2014. Presents test design methods for all
UML software models.
• QA of testing, Professional Tester magazine, issue 28, 9-12, August
2014. Describes the process that guarantees test automation in
parallel with code development.
Defect Detection Efficiency (DDE) is the number of defects injected and detected during a phase, divided by the total number of defects injected during that phase. ALU PLTF data: ~75%, but it can reach 95%.
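The DDE definition above is a single ratio; a worked example with illustrative numbers (the counts are assumptions, not the ALU PLTF data):

```python
def dde(detected_in_phase, injected_in_phase):
    """Defect Detection Efficiency for one phase, as a percentage:
    defects injected AND detected in the phase over all defects
    injected in the phase."""
    return 100.0 * detected_in_phase / injected_in_phase

# Illustrative: 75 of 100 defects injected during a phase are also
# detected within that same phase.
print(f"{dde(75, 100):.0f}%")  # 75%
```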