'Operational Testing – Walking A Mile In The Users Boots' by Gitte Ottosen
 



When developing software, we base our work on requirements, whether they are line items, use cases or scenarios. But there is often a long way from the requirements to what the user really wants, and the user is often far away from where the requirements specification is being defined. When testing is based on the requirements and focuses solely on the features being implemented, we tend to forget an important perspective – whether the system is fit for its purpose. We need to take testing one step further – to walk a mile in the user's boots – to understand and to test based on intensive domain knowledge.



At Systematic we discovered this need especially when developing our product, a command and control system for the army, and we therefore decided to take testing to the trenches – to implement operational testing of the system. The company already used domain advisors extensively when designing the system in order to ensure the user's voice was heard, but now we wanted to take it yet another step towards REAL use and REAL users, designing and executing large operational scenarios with REAL operational users. In this presentation you will be introduced to the process we went through and the results it led to.



In the presentation I will describe the process we went through to implement the concept, covering both successes and challenges. I will also include reactions from the users involved as well as feedback from development.


Speaker Notes

  • The SitaWare product suite is a command and control system covering the needs of an army organisation at all organisational levels, from division to the individual soldier. The system handles internal communication and command/control, but also interoperability with external units, e.g. other nations, through our MIP gateway. Our track server connects land, maritime and air – where HQ, BM and dismounted focus on land operations, including an operational picture, the track server then adds maritime and air tracks so that we can define the complete joint combined operational picture.
  • The test we have introduced, which I will be talking about today, covers our HQ application. This is where we started the principle of operational testing, but it will be extended to all products in the suite.
  • We have always done a lot of testing on our project, from unit test to system test, plus of course ability tests depending on the customer's requirements. So shouldn't that be enough? So we do a lot of testing during the lifecycle – CLICK
  • The process for developing this type of testing has been an iterative process – and we are still not done. Let's dive into the iterations.
  • We had a very broad program – no limitations, no strict procedures, no checklists... maybe too broad? Participants were a combination of domain advisors, testers, external testers, developers (support) and the test manager (coffee support). Two full days of test – one weekend.

Presentation Transcript

  • Operational Testing: Walking a Mile in the User's Boots
  • © Sogeti Systematic • Established in 1985 • Approx. 500 employees; 58% hold an MSc or PhD in software engineering • Microsoft, ESRI and SAP partner • 97% of our customers would recommend Systematic to other customers • Customers in 35 countries • CMMI Level 5 certified • For more info visit: www.systematic.com • Offices: Aarhus (headquarters, software and systems development), Tampere (products and services), London (products and services), Washington (products and services)
  • © Sogeti Systematic's Business Areas • Vision: A leading international company in delivering reliable and simple solutions to people who make critical decisions every day • Mission: Simplifying Critical Decision Making • Business areas: Interoperability, Command and Control, Situation Awareness, Electronic Warfare Support, Clinical Information Systems, Management Information Systems, Border Control, Integration Services • Sectors: Defence, Health Care, Intelligence and National Security, Government, Finance, Transport, Agriculture, various
  • © Sogeti The Sitaware Product Suite
  • © Sogeti OVERVIEW – SitaWare Headquarters: usable at multiple organisational levels; improving situational awareness; speeding up and supporting the Military Decision Making Process; reducing planning preparation time; swift and improved ability to adjust on-going operations
  • © Sogeti Product vs Project – diagram: the SitaWare COTS product, a number of SitaWare product add-ins, and a number of customer specific extensions
  • © Sogeti The Background – How we Work – diagram of the system: GUI, communication layer, database, business logic, Feature 1, Feature 2
  • © Sogeti How we Test – around the SUT: unit test, unit integration test, system test (scripted and exploratory), system integration test, operational test, and ability tests
  • © Sogeti Test in the Whole Lifecycle – diagram of an increment spanning Sprint 1, Sprint 2 and Sprint 3: feature kick-off (Feature A, Feature B, Feature C), testing stories, "Qualify Feature" tests, bug hunts, unit/integration/system testing, "Sprint test", code cut-off, system test and system integration test
  • © Sogeti The Challenges • The domain • The complexity • The workflows • The environment – interoperability: SitaWare Headquarters works with many different communication products (WiFi, HF, UHF, VHF) and integrates with a variety of sensor products (gyro, GPS, laser range finder)
  • © Sogeti Operational Test
  • © Sogeti Moving the Focus – test that the concept holds; test the SYSTEM as a whole; attack the system as a user would; transfer fresh domain knowledge to the project (a minimal code sketch of this shift in test style follows the transcript)
  • © Sogeti An Iterative Process – diagram: over time, from project to product, repeated loops of defining the concept, defining the TSP and environment, getting the scenario and environment ready, executing the test, learning, and refining the concept
  • © Sogeti Defining the Concept – Take One • A project context • The operational test specification • The test environment • Internal test execution
  • © Sogeti The Project Environment
  • © Sogeti The Result and the Challenges – Plus: new bugs found; the system used in a whole new way/context; domain knowledge transferred to the developer/tester. Minus: used as a regression test, or misused; testers stopped reading and executed by memory; execution went faster and faster; the same scenario was run every time
  • © Sogeti Defining the Concept – Take Two • The real concept – product • Defining the operational basis • Defining the organization • OPORD, Exercise Lightning Fist: 1. Situation 2. Mission 3. Execution 4. … 5. … (a sketch of such a scenario captured as structured test data follows the transcript)
  • © Sogeti The First Execution – Getting Ready: establishing the environment, defining the scenario, getting the right people, getting the training
  • © Sogeti The Execution
  • © Sogeti The Result of First Execution • The concept was validated • New defects were identified • Knowledge was transferred into the organization • Testers were trained (Internal as well as external)
  • © Sogeti What did we learn? • Need for structure • Needs a lot of organisation • Clear division of responsibilities • Communication - communication – communication 
  • © Sogeti Following Executions • Getting more people with the right profile • A better environment – a ”real organization” • Inviting UX to participate as observers - get to know the users • Inviting test to participate as assistants - get to know the domain • What else: – Usability test with the end users – Conceptual discussions
  • © Sogeti Experiences – Best for systems with larger work flows – Needs a lot of preparation – Need a lot of resources • People • Environment • Test data setup • Scenarios • ….. And time
  • © Sogeti The Way Ahead • Extending the scenarios – Special operations – Including more applications from the product suite • Finding new profiles – Fit the changing operational profiles – Other nationalities? • More active involvement of operational testers in the planning phase • The focus – from concept validation to bug hunting • The schedule – Split up into shorter scenarios – max 2-4 hours – Feedback more often
  • © Sogeti So What’s in it for us? • Value on many levels – Verification of the concept – Finding new defects – More operational knowledge gathered – Testers get a better domain knowledge – Operational testers used for sparring as well, input for the roadmap
  • © Sogeti
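The shift in focus described on the "Moving the Focus" slide can be illustrated with a small code sketch. This is not Systematic's test code; the Headquarters class and every function name below are invented stand-ins, written in Python, to contrast a requirement-scripted feature check with an operational-style test that walks a slice of a user workflow.

# A minimal sketch, assuming a toy in-memory model of an HQ client.
# Nothing here is SitaWare code; Headquarters, create_unit, move_unit and
# operational_picture are hypothetical names used only for illustration.

class Headquarters:
    """Toy stand-in for the system under test."""

    def __init__(self):
        self.units = {}

    def create_unit(self, callsign, position):
        self.units[callsign] = {"position": position, "status": "active"}

    def move_unit(self, callsign, position):
        self.units[callsign]["position"] = position

    def operational_picture(self):
        # The picture a staff officer would look at: callsign -> last known position.
        return {callsign: unit["position"] for callsign, unit in self.units.items()}


def test_create_unit_feature():
    # Scripted system test: one feature checked against one requirement line item.
    hq = Headquarters()
    hq.create_unit("A-1", (56.10, 10.20))
    assert hq.units["A-1"]["status"] == "active"


def test_recon_patrol_workflow():
    # Operational-style test: a small slice of a user workflow rather than a single
    # feature - create a patrol, move it along a route, and check the resulting picture.
    hq = Headquarters()
    hq.create_unit("RECON-1", (56.10, 10.20))
    route = [(56.12, 10.22), (56.15, 10.25), (56.18, 10.27)]
    for waypoint in route:
        hq.move_unit("RECON-1", waypoint)
    assert hq.operational_picture()["RECON-1"] == route[-1]


if __name__ == "__main__":
    test_create_unit_feature()
    test_recon_patrol_workflow()
    print("Both styles pass against the toy model.")

The operational tests on the slides are of course executed by people against the live system; the point of the sketch is only the change in what a test asserts: a workflow outcome rather than a single feature.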
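The OPORD structure on the "Defining the Concept – Take Two" slide, together with the 2-4 hour timebox mentioned under "The Way Ahead", suggests how a scenario could be captured as structured test data. The sketch below is an assumption, not the project's actual format; the Inject and OperationalScenario classes and every concrete value are invented for illustration.

# A minimal sketch, assuming a scenario is described as timed injects handed to
# operational testers in specific roles. All names and values are hypothetical.

from dataclasses import dataclass, field
from datetime import timedelta


@dataclass
class Inject:
    offset: timedelta   # when the event is fed into the session, relative to start-ex
    role: str           # which operational tester acts on it (e.g. S2, S3)
    action: str         # what the user is asked to do in the system


@dataclass
class OperationalScenario:
    name: str
    situation: str                                   # OPORD section 1
    mission: str                                     # OPORD section 2
    execution: list = field(default_factory=list)    # OPORD section 3, as injects
    timebox: timedelta = timedelta(hours=4)          # the "max 2-4 hours" guideline

    def schedule(self):
        # Only the injects that fit inside the timebox, in chronological order.
        return sorted(
            (inject for inject in self.execution if inject.offset <= self.timebox),
            key=lambda inject: inject.offset,
        )


exercise = OperationalScenario(
    name="Exercise Lightning Fist (illustrative values only)",
    situation="Two friendly battalions holding a river line.",
    mission="Plan and monitor a company-level crossing.",
    execution=[
        Inject(timedelta(minutes=0), "S2", "Plot the enemy picture received at start-ex."),
        Inject(timedelta(minutes=30), "S3", "Issue a fragmentary order to the lead company."),
        Inject(timedelta(hours=2), "Battle captain", "Adjust the plan after a bridge is reported blocked."),
    ],
)

if __name__ == "__main__":
    for inject in exercise.schedule():
        print(f"+{inject.offset}  {inject.role}: {inject.action}")

Splitting an exercise into several such timeboxed scenario definitions would also make the more frequent feedback mentioned on "The Way Ahead" easier to organise.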