EuroSTAR webinar
Creating Agile Test Strategies for Larger Enterprises
Derk-Jan de Grood/ Valori
February 2017
1
Questions this webinar
addresses
2
3
Testing is declining
Business Agility
4
Testing is on the rebound
5
Testing and business agility
Effective testing ensures that the
organization can extend and
change software products whilst
retaining confidence in the quality
and correct operation of what is
delivered
[ING orange book on testing and Quality]
6
[http://www.slideshare.net/janetgregoryca]
7
Some statements from Janet’s talk
Agile wants small co-located teams, but in large
organizations this is not always possible.
Culture: a blame culture kills innovation.
Testing is a team problem, but in large organizations there
are a lot of people outside the team (note the keynote by
Diana Larsen and her Boundary Interaction map).
In large organizations, many projects are concurrently active,
and not seldom people are managed as resources (instead of
as humans).
Typical testing issues in larger enterprises are: throwing work
over the wall, not knowing who to ask, integration, testers
working in more than one team, and failing to look at the
release as a whole.
8
9
Gojko Adzic on ATD 2016
10
Development Testing
11
12
Case 1: Focus on acceptance
13
14
It ain’t done until it’s accepted
15
Witness (diagram): Development and Testing → Tested solution → Witness report → Customer
Witness report: 2 outcomes
Advice to accept solution
• Professional testing
• Good process
• Good test results
Advice to not accept solution
• No professional testing
• No good process
• No good test results
16
Witness process
17
Case 2: the need for status info
18
19
It ain’t done until it’s tested
Testing a Central Topic
Has the train arrived at the station?
20
Case 3: It ain’t done until it’s
integrated
21
Multi-team setting
22
Architecture
• What are the
business
processes?
• What are the
components?
• What are the
interfaces?
Acceptance
criteria
• What is the
Minimum Viable
Product?
• What integrations
are needed to
make it work?
Requirements
traceability
• When are we
complete?
• How do test
results add up to
acceptance?
23
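The traceability questions above (when are we complete, and how do test results add up to acceptance?) can be made concrete with a traceability matrix. A minimal sketch in Python; the requirement and test names are hypothetical illustrations, not from the talk:

```python
# Minimal requirements-traceability sketch (hypothetical data):
# an acceptance criterion counts as accepted only when every
# test that traces to it has passed.

def acceptance_status(traceability, results):
    """Map each requirement to True (accepted) or False (still open)."""
    return {
        req: all(results.get(test) == "pass" for test in tests)
        for req, tests in traceability.items()
    }

traceability = {
    "REQ-1 login":    ["test_login_ok", "test_login_bad_password"],
    "REQ-2 checkout": ["test_checkout_happy_path"],
}
results = {
    "test_login_ok": "pass",
    "test_login_bad_password": "pass",
    "test_checkout_happy_path": "fail",
}

print(acceptance_status(traceability, results))
# {'REQ-1 login': True, 'REQ-2 checkout': False}
```

A report like this makes "complete" a yes/no question per requirement instead of a gut feeling.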
Missing
What should a car minimally do?
24
Planned Integration Tests
25
(timeline diagram: planned integration tests leading up to the release date)
Ensuring Integration (rough sketch)
26
Levels: Organization, Component, System, Service
Frequency: Continuously (in the sprint) to Occasionally (e.g. prior to a release)
Annotation: increasing the scope (e.g. from units to systems)
results in less frequent integration, because the integration
becomes harder to test. This has an impact on the
time-to-market.
27
Witness (diagram): Development and Testing → Tested solution → Witness report → Customer
Agile Test Strategy
28
What needs to be tested?
Feedback, steering and progress
Auditing the test work
Coaching the testing team members
Organizing tests that do not fit the sprint
29
What needs to be tested
Fears
Does it
work as a
whole?
Acceptance driven Test Report
30
Lot 1
Lot 3
Lot 5
Lot 4
Lot 6
Lot 2
Test reporting
31
Lot 1
Lot 3
Lot 5
Lot 4
Lot 6
Lot 2
Tell the testing story
• Do you know what is tested
in the various teams?
• Do you know how well they
did it?
• Do you have proof?
• How do you rate the quality?
• What concerns do you have
(combine the bugs and
translate to business
impact)?
Janet’s test matrix
32
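Telling the testing story across teams amounts to aggregating per-team results into one release-level view, as in a test matrix. A minimal sketch in Python; the team names, areas, and traffic-light verdicts are illustrative assumptions:

```python
# Sketch of a release-level test report: combine per-team verdicts
# into one overall rating per functional area, where the worst
# verdict wins (teams, areas, and verdicts are hypothetical).

from collections import defaultdict

def release_report(team_results):
    """Aggregate per-team area verdicts; the worst verdict per area wins."""
    severity = {"green": 0, "yellow": 1, "red": 2}
    report = defaultdict(lambda: "green")
    for verdicts in team_results.values():
        for area, verdict in verdicts.items():
            if severity[verdict] > severity[report[area]]:
                report[area] = verdict
    return dict(report)

team_results = {
    "team-payments": {"checkout": "green", "refunds": "yellow"},
    "team-frontend": {"checkout": "red",   "search":  "green"},
}
print(release_report(team_results))
# {'checkout': 'red', 'refunds': 'yellow', 'search': 'green'}
```

The point of the worst-verdict rule is that one team's red result degrades the release view, which answers "do you know how well they did it?" at release level rather than team level.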
Testing is a team responsibility
33
Do you know how well
they did it?
Levels of strategy
Policy
Strategic
Operational
34
What could be in an operational
test strategy?
Environments
Tools
Risk analysis
Organisation and roles
How you use CI/CD
Interpretation of (A)TDD
Relation between throwaway & regression tests
How to do test automation
Findings (defect handling) procedure
Management processes
Etc…
35
Test Driven Development (TDD)
3
1. Business is actively
involved with
defining acceptance
criteria, Examples
and Scenarios
2. Development
process is adapted
to fit TDD
3. Release and
deployment cycles
are established
4. Backlog of
(automated)
regression tests is
minimal
5. Teams have
sufficient test, tool
and automation
knowledge
6. Tools are
configured and
running in a controlled
environment
7. It is defined what
tests need to be run
at what level in the
pyramid
8. Management
commits to the solution
and stimulates the
team to fail forward
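Point 1 above, involving the business in defining acceptance criteria and examples, typically means turning such an example into an executable test before the code exists. A minimal (A)TDD-style sketch; the discount rule is a hypothetical example, not from the talk:

```python
# TDD-style sketch: the business's acceptance examples are written
# as executable tests first, then just enough code is added to make
# them pass. (The discount rule itself is a hypothetical example.)

def order_total(amount, loyal_customer):
    """Loyal customers get 10% off orders of 100 or more."""
    if loyal_customer and amount >= 100:
        return amount * 0.9
    return amount

# Acceptance examples, as the business formulated them:
assert order_total(100, loyal_customer=True) == 90.0
assert order_total(100, loyal_customer=False) == 100
assert order_total(50, loyal_customer=True) == 50
print("acceptance examples pass")
```

Because the examples exist before the implementation, they double as the automated regression suite that keeps the backlog of point 4 minimal.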
CI/CD requisites
3
Teams collaborate
Integration is continuous
Tests are automated
Deployment is a hands-off process
No automation backlog
Clear acceptance criteria
Feedback loop to improve testing
Frequent product launches
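A hands-off deployment implies that the pipeline, not a person, decides whether to ship. A minimal sketch of such a gate in Python; the stage names and the automation-backlog check are illustrative assumptions:

```python
# Sketch of a hands-off deployment gate: deploy only when every
# pipeline stage is green and no test-automation backlog remains.
# (Stage names and the backlog check are hypothetical.)

def may_deploy(stage_results, automation_backlog):
    """True only if all stages passed and the automation backlog is empty."""
    return all(stage_results.values()) and not automation_backlog

stage_results = {"unit": True, "integration": True, "acceptance": True}
print(may_deploy(stage_results, automation_backlog=[]))                  # True
print(may_deploy(stage_results, automation_backlog=["ui regression"]))   # False
```

Encoding the rule this way makes the requisites testable: a growing automation backlog visibly blocks the release instead of being quietly worked around.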
Who owns the Quality?
44
45
A job for the…
• Test manager
• Quality master
• Or for you?
WRAP-UP
46
Derk-Jan
Valori
Coltbaan 4a
3439 NG NIEUWEGEIN
The Netherlands
• derkjandegrood@valori.nl
• +31(0)651807878
• www.valori.nl
• @DerkJanDeGrood
• http://djdegrood.wordpress.com
47


Editor's Notes

  • #4 A few years ago (2011-2013) a lot of talks were on the topic "Testing is Dead". I believe James Whittaker stated in his EuroSTAR keynote that all new innovations came from Dev and that test did not contribute much. Many organizations reduced their test departments and fired the test managers. Recently one of my clients (a Dutch bank) told me that testers could choose to become a BA or a developer. And on TestHuddle a discussion thread on this is lively today…
  • #5 Business agility: the lifespan of organisations is shrinking (see the American stock market). Start-ups lose momentum and fail to adapt their services. So business agility is key… What makes you flexible…?
  • #8 I was quite thrilled by the talk of Janet Gregory on agile testing in the enterprise. I have been publishing and talking about the same topic, so it is good to hear someone else emphasize this story. Janet stated that agile wants small co-located teams, but in large organizations this is not always possible. Organization culture is defined by values, norms, etc., but also by structure, history, etc. A blame culture might lead to reluctance to try new stuff; when testers are blamed for errors, it might be difficult to make quick changes. In large organizations, many projects are concurrently active, and not seldom people are managed as resources (instead of as humans). Typical testing issues in larger enterprises are: throwing work over the wall, not knowing who to ask, integration, testers working in more than one team, etc. Very often organizations fail to look at the release as a whole. Tips Janet gave were: make a mind map with items to test (to validate the release); use a test matrix to share results; keep a focus on dependencies and discuss them. Janet believes testing is part of the team, but in large organizations sometimes testing is too large to fit in the team. Maybe there is room for a product team or a separate "test" team, e.g. with some post-development testing. But beware: they should be testing in the next sprint, not the whole release. Testing is a team problem, but in large organizations there are a lot of people outside the team (note the keynote by Diana Larsen and her Boundary Interaction map).
  • #10 https://www.techwell.com/techwell-insights/2016/12/reviving-master-test-plan-age-agile
  • #11 The final keynote was by Gojko Adzic. Gojko made some bold predictions to shake us awake. In short, he claimed that in the near future most software will run on 3rd-party infrastructure, e.g. Amazon servers. As an example, he told us that Amazon is selling login and user-settings functionality as a service. For a company this means that using this saves you 3 months of development time, and you also save maintenance costs since the functions are maintained by the OPS provider. He states that the OPS in DevOps is outsourced this way, and the line between production and test will blur when multi-versioning becomes mainstream. Since services are charged per request, multiple instances become quite cheap. Therefore, testing explodes: we need to test multiple instances. Another prediction he made was that apps will be distributed; we are running services rather than apps. This makes the architecture complex, but flexible: we can fix bugs quickly by replacing small parts of the system. Most software will glue together 3rd-party services. The risks are in the integration, not in the units; the risk is that a party changes a service without telling you. The testing pyramid will invert: unit testing will become less important than integration testing or PID. Testing after deployment will be critical, and approval testing becomes more relevant than unit testing. In such an environment we will work with SLAs, and Gojko suggests that failure budgets become the norm. Maybe it does not matter to have bugs in the system, he suggests, as long as we comply with the SLA.

  • #13 So what is needed? Three cases from projects that I did in the recent past!
  • #17 The scenario "advice to not accept": nobody wants this, and it shouldn't come as a surprise.
  • #18 The way to avoid the second scenario is by putting a witness process into place: a set of goals that need to be reached through four activities. Witnessing is an ongoing phase with extensive reviewing, and it is all about transparency and working together: avoid surprises, mitigate risk, reach a clear understanding of quality. Audit: the supplier demonstrates, with a checklist of activities to be performed, that the paper trail is correct; the burden of proof lies with the supplier. Demo: periodic insight into the quality of the product; the supplier shows the test work of the past period, test scenarios are replayed, and insight is given into the most important findings (what does not work yet) -> demo report. Meetings: strategic test meeting, findings meeting, demo -> quality overview. Review: test plans, test cases, test results, non-functional tests.
  • #22 We do not want a screen, we want a feature
  • #23 In large organisations there are many people outside the team
  • #27 How to ensure integration. We do: CI/CD, MBT, UT, automated system tests, automated e2e tests, interface testing, manual regression testing, integration sprints, other. General trend: increasing the scope (e.g. from units to systems) results in less frequent integration, because the integration becomes harder to test. This has an impact on time-to-market, and this insight might lead to targeted improvements.
  • #28 @Split: Technical tests could be: exploratory tests, bug hunting, unit tests, system tests, performance testing, interface testing, business-rule testing. Business tests could be: AO, e2e chain, testing in the large. Note the Pi-shaped tester: more than one specialism; which one do you choose?
  • #37 Teams collaborate to deliver a working product each sprint. Integration is continuous. Tests are automated and run from the build server. Deployment is a hands-off process. TDD ensures no backlog in test automation. Acceptance criteria are clear. A feedback loop to create better tests is in place. The product goes live regularly.
  • #42 The team was best at traditional test design, but this had limited value.