Test Driven
Development of
Embedded Systems
Marcin Czenko
Martijn Gelens
Wim van de Goor
Agile Testing Days, 14 October 2009, Berlin
• Test Consultant at VIIQ.
• Agile Test Team leader at Philips CareServant.
• 10 years of experience in the Testing Field (7 years
V-Model, 3 years Agile).
• Some experience with coding.
• Bachelor's degree in Information Technology.
• Married to Jeanne, has a dog (Sasha) and 2 cats.
Martijn Gelens
• Currently, Philips Software Engineering Services,
MiPlaza, Software Designer.
• Ph.D. in Computer Security from University of
Twente, Enschede, The Netherlands.
• M.Sc. in Software Engineering at Warsaw University
of Technology.
• 7 years of experience as an embedded software
developer (SterKom (Poland) and freelancer).
• Modelling Languages: my master's thesis plus a
one-year internship at the Institut National des
Télécommunications (INT), Évry, France.
• Agile and test-infected for one year.
• Married to Beata and also has two cats (Mufasa and
Ursus).
Marcin Czenko, Ph.D.
Wim van de Goor
• Agile Software Team Leader at Philips Software
Engineering Services, Philips MiPlaza.
• 8 years of experience with eXtreme Programming.
• Agile mentor and Coach.
• Advised us on Agile Principles.
Contents
• Agile Testing
What do we want to do?
• Constraints
What can we do?
• Agile Embedded Testing
How do we test?
• Conclusions
What do we want?
• Prerequisites
• Test Strategy
• Acceptance Criteria
• Continuous Integration
• Testing
Prerequisites
• Testing is integrated in the team's Way of
Working.
• Acceptance Criteria are defined, discussed and
explored beforehand.
• Test Driven Development.
• Continuous Build and Integration.
• Testing is structured (specify, verify, report).
Test Strategy
• Product risks and mitigation.
• Tester-role.
• Definition of Done.
• Quality gates.
• Testing is structured (specify,
verify, report).
Test Strategy (cont.)
• Deliverables from outside (also HW).
• Issue procedure and attendees.
• Reporting (test reports, PMI, ...).
• Emergency scenarios.
Test Strategy (cont.)
The test strategy is built around the iterations and
hardware deliveries.
Acceptance Criteria
• Defined upfront
• Based upon Quality Attributes: functionality,
usability, efficiency, maintainability, reliability,
portability, etc...
• Definition of Done:
Tested and accepted, Code reviewed, documents
written, yellow sticker on the note, ...
• Quality Gate:
Acceptance tests passed, smoke test passed, ...
Continuous Integration
Automate your:
• Build procedure.
• Release package creation.
• Deployment to simulator.
• Deployment to Board Support Package.
Test Driven Development
• Means: “Write unit tests
before code”.
• Integrate with your
Continuous Integration
environment.
• Automate the Acceptance
Tests.
This is your framework
Constraints
Light to Heavy
Light
• No cross-compilation required.
• Usually mainstream OS (Windows/Linux).
• Wide range of testing/mocking
frameworks available.
• Standard hardware - no or very limited
hardware level programming required.
• No dependence on a particular vendor
(supplier).
Heavy
• Small memory (no OS), limited performance,
limited debugging possibilities.
• Limited cross-compilers: often only C and
Embedded (Extended) C++ available.
• Vendor specific.
• Difficult to find a testing/mocking framework.
• Custom hardware (ramp-up).
Hardware design
challenges
Using Evaluation Boards
Getting There (1)
Getting There (2)
Getting There (3)
Getting There (4)
Getting There (5)
Getting There (6)
Getting There (7)
Development Board
• Selecting the right board can be
challenging (and expensive).
• Chip selection driven by the availability
of the right board.
• Board selection driven by the
availability of the BSP and OS.
Is there a leaner solution?
Are we lean?
(diagram: Queue 1, Queue 2, Queue 3)
The effect on Agile
Principles
• Longer ramp-up time.
• Resistance to modify hardware
(introduces up-front design).
• Limited response to changes.
• Higher risk.
How to proceed?
• Often we cannot remove queues and
batches in HW development (will we be
able to do so in the foreseeable
future?).
• Reducing the queue size is also often not
an option.
• Is there a lean solution ?
Getting There (7)
Fix in between...
How do we test?
Our experiences
What do we need
• Testing Strategy.
• Hardware to work with.
• Tool chain (compiler).
• Testing Framework.
• Mocking Framework.
Compiler
• ISO C++
• ANSI C
Testing Framework
Keep it simple!
Do-It-Yourself!
Testing Framework (C)
• Many frameworks are simple ports of
the frameworks for PC-based
development.
• Increased stack consumption.
• Dynamic memory allocation.
CMock
• Easy to understand. Easy to customise.
Lightweight.
• Comes with Supporting Ruby-based
Mocking Framework.
• Ready for “Heavy Embedded”: tests
executed in batches.
Testing Frameworks (C++)
• Run-Time Type
Information
(RTTI).
• Exceptions.
• ISO C++ compiler
needed.
• GNU or Microsoft compilers
are preferred.
Testing Frameworks (C++)
• We could not find a framework that
compiles on GreenHills and WindRiver
C++ compilers (forget IAR Extended
Embedded C++).
• It was cheaper and more effective to
write our own simple testing
framework.
yaffut
• Our choice for unit testing in C++.
• Just one header file.
• Not meant for embedded: needs RTTI
and C++ exceptions.
• Easy to understand and customise. We
made an RTTI and Exception-free
version.
Mocking
• We did not succeed in using existing
frameworks. Our best candidate,
GoogleMock, did not even compile, and
it is quite complex.
• Does that mean no one is doing TDD
on embedded? Probably not.
Mocking - way to go
• Think of your own framework.
• We needed three “evenings” to create
our own mocking framework.
Educate your customer
Conclusions
• Agile in Heavy Embedded is a challenge: we cannot change it, but
we can understand it and try to reduce its impact on agile software
development.
• The tester should be experienced in working with hardware,
perhaps even more than a developer.
• There is no one way: what you can do depends on the constraints
you have (e.g. light to heavy).
• Hardware development is far from being lean - and there is not
that much we can change.
• Development and support tools are far behind the needs of the
agile teams.
• Let your agile testing framework grow with your code.
Be agile.
Questions
