Mapping the attributes of a test activity
▪ Brian Marick first developed the agile testing matrix 
▪ Lisa Crispin then used this in her book “Agile Testing” 
▪ There have been many interesting developments of the model
▪ The purpose of the agile testing matrix is to categorize test
activities in four distinct quadrants to help plan the necessary test effort
▪ Categorizing test activities is all about granularity – sometimes
it is enough to have 2 categories, sometimes you have to have more
Introducing Test Activity Attributes
▪ To be able to categorize test activities we need to know
what distinguishes different test activities from each other
▪ We need to identify the different types of attributes that a
test activity can have
▪ We also need to identify the different values that the
different attributes can have
▪ Once we have done this, we can create any categorization
model we want, one that meets our specific granularity needs
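The mapping above can be sketched as a simple data structure. This is a minimal illustration in Python; the attribute names and example values are assumptions chosen for this sketch, not part of any standard model.

```python
from dataclasses import dataclass

# A sketch of one test activity mapped to attribute values.
# All attribute names and values below are illustrative assumptions.
@dataclass(frozen=True)
class TestActivity:
    name: str
    value_generated: str      # e.g. "finding defects", "certification"
    stakeholder: str          # e.g. "developer", "test leader"
    predictability: str       # e.g. "high", "low"
    reporting_level: str      # e.g. "detailed", "general feedback"
    scope_flexibility: str    # e.g. "fixed", "semi-flexible", "free"
    tool_support: str         # e.g. "specific equipment", "none"
    executor: str             # e.g. "dedicated tester", "external user"
    done_criterion: str       # e.g. "all tests executed", "time-boxed"

unit_tests = TestActivity(
    name="unit tests",
    value_generated="finding defects",
    stakeholder="developer",
    predictability="high",
    reporting_level="general feedback",
    scope_flexibility="free",
    tool_support="none",
    executor="developer",
    done_criterion="all tests executed",
)
print(unit_tests.predictability)  # high
```

Once every activity is captured this way, the attribute values become data you can sort, filter, and compare.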
Test Activity Attributes Overview
▪ What value does the test activity generate?
▪ Finding defects?
▪ Passing certifications and standards?
▪ Meeting customer requirements?
▪ Generating decision material and other information?
▪ Supporting developers in some other way?
▪ Providing start criteria for other test activities?
▪ Who are the stakeholders of the test activity?
▪ The project leader?
▪ The developer?
▪ The system architect?
▪ The line manager?
▪ The test leader?
▪ Other testers?
▪ How predictable is the (sub-)system under test?
▪ A small unit is often more or less predictable if it is tested in isolation
▪ A large system is often unpredictable, even if you have system
requirements and the system is made up of many small, well-understood parts
▪ Sub-systems can be more or less predictable
▪ On what level is reporting necessary?
▪ Does every test have to be recorded in detail?
▪ What measurements do the stakeholders need?
▪ Is general quality feedback enough?
▪ What will the information in the report be used for?
▪ What possibilities does the tester have to affect the scope?
▪ Is the scope completely fixed?
▪ Certification / Standard
▪ Customer requirements
▪ Is it semi-flexible?
▪ Could be that priority 1 test cases have to be executed, but the rest is flexible
▪ Is it completely up to the tester?
▪ Can you run whichever test sessions you want, without any pre-set scope?
Required Tool Support
▪ Does the activity require certain tools?
▪ Bluetooth testing, power consumption tests, 3GPP tests, all
require specific equipment to run the tests
▪ Activities such as integration tests which are run in a
continuous integration system need to be automated
▪ User-focused tests are an example where no specific tools are needed
▪ Who executes the tests?
▪ Dedicated tester?
▪ External User?
▪ Internal User?
▪ External test house?
Definition of Done
▪ When is the test activity over?
▪ When all tests are executed?
▪ When a time period has passed?
▪ When the tester says so?
▪ When the first defect is found?
▪ When the stakeholder says so?
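The "definition of done" alternatives above can be expressed as predicates over the activity's state, so that done-ness becomes checkable rather than a matter of opinion. This is a sketch under assumed state fields and thresholds; the criteria names are made up for illustration.

```python
# Sketch: "definition of done" criteria as predicates over activity state.
# The state fields and thresholds are illustrative assumptions.
def all_tests_executed(state):
    return state["executed"] >= state["planned"]

def time_box_expired(state):
    return state["hours_spent"] >= state["time_box_hours"]

def activity_done(state, criteria):
    """The activity is done when any agreed criterion is met."""
    return any(criterion(state) for criterion in criteria)

state = {"executed": 40, "planned": 50,
         "hours_spent": 16, "time_box_hours": 16}
print(activity_done(state, [all_tests_executed, time_box_expired]))  # True
```

Whether "done" means *any* criterion or *all* criteria is itself a per-activity decision; swapping `any` for `all` encodes the stricter reading.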
▪ Once you have all activities mapped with attributes and
values you can start comparing and evaluating them
▪ This can give you insight into for example if two activities
are very similar and perhaps redundant
▪ It can also show that there are gaps in some areas, if many
activities have similar attribute values and parts of the
value spectrum are not covered
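The comparison step above can be sketched mechanically: score how many attribute values two activities share, and check which values of an attribute no activity covers. The activities, attribute names, and values here are hypothetical examples.

```python
# Sketch: compare mapped activities to spot redundancy and gaps.
# Activities and attribute values are illustrative assumptions.
def similarity(a, b):
    """Fraction of attributes on which two activities share the same value."""
    shared = sum(1 for key in a if a[key] == b[key])
    return shared / len(a)

smoke = {"value": "finding defects", "executor": "dedicated tester",
         "scope": "fixed", "done": "all tests executed"}
regression = {"value": "finding defects", "executor": "dedicated tester",
              "scope": "fixed", "done": "all tests executed"}
exploratory = {"value": "generating information", "executor": "dedicated tester",
               "scope": "free", "done": "time-boxed"}

# High similarity hints at redundancy between two activities.
print(similarity(smoke, regression))   # 1.0
print(similarity(smoke, exploratory))  # 0.25

# An attribute value no activity covers hints at a gap.
covered = {activity["value"] for activity in (smoke, regression, exploratory)}
print("passing certifications" in covered)  # False -> a possible gap
```

In this toy data, smoke and regression testing score identically on every attribute, which is exactly the kind of redundancy signal the mapping is meant to surface.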
How attributes affect test method
▪ The test activities themselves do not force a specific test method
▪ Scripted testing / Session-based testing / Ad-hoc testing
▪ Manual / Automated / Tool supported
▪ But often if you look at the attributes, you will get hints as to
what is more or less suitable as a method for that activity
▪ The reason why there are 8 test activity attributes described
here is totally arbitrary and only because I wanted to use the
word octagon in the title – which attributes are relevant is
completely context dependent
▪ By having all relevant attributes mapped out, it becomes
much easier to plan, and find gaps and redundant activities
▪ How many attributes you choose to use is based on what
granularity you need for your planning (and if you want to
have a cool sounding model name)
 Brian Marick
 Lisa Crispin
 Gojko Adzic
 Markus Gärtner