Software Testing – Present Problems and
Future Solutions
June 24, 2008
Science Fiction – Fiction Today, Reality Tomorrow
• Jules Verne's "Twenty Thousand Leagues Under the Sea" (1869) – Simon Lake built the "Argonaut," the first submarine to operate in the open seas (1897)
• Arthur C. Clarke proposed geosynchronous satellites (1945) – actual satellites followed (1963)
• Cloning – Dolly the sheep (1996)
What About These?
• What about Androids? Time travel? Teleportation?
• How about self testing code?
☺ May that never become a reality as long as I live…
Assumptions
• All future solutions are based on today's technologies
• Disruptive innovations, such as a new class of operating systems, development paradigms, or languages, may significantly alter the landscape
• I have a few ideas on
• New class of Operating Systems
• New class of programming languages and paradigms to support
that operating system
• Ultimately supported by new class of hardware (I have no clue
about it)
Problems in Testing?
• Are there problems in testing? We all seem to be doing fine!
• This is probably the biggest problem in testing – "We do not have a problem"
• It afflicts organizations that do not do testing, and even more so, organizations that DO INVEST in testing
• Why is this a problem? Is there a solution in the future?
• Only two things are infinite, the universe and human stupidity, and I'm
not sure about the former. -- Albert Einstein
Challenges in Testing
• What does testing mean for different stakeholders?
• Have we tested enough?
• How much time do I need to test it?
• How many test cases should I write?
• Oracle problem
• How do I choose the right set of regression tests?
• How do I get developers to do better testing?
• Do I need coding skills? What will I do with those?
• How do I teach testing to my testers?
What does testing mean for different stakeholders?
Let us face it: nobody wants testing…
If I assume testing is necessary, I am likely to try to improve how I do testing.
If I believe testing is an unnecessary expenditure, I am likely to find ways to minimize or eliminate it.
Mindset Matters
THE TESTING WORLD'S BIGGEST PROBLEM
• Where is the innovation in testing?
• Innovation: incremental vs. breakthrough?
• Can you name 3 breakthrough innovations in testing? At least 1?
• Where does the future of testing lie?
HITE
• HITE (Have I Tested Enough?) – do you have a bulletproof answer?
• First, define "ENOUGH":
• Today – no catastrophic, critical, or major failures for users
• Tomorrow – ?
• Integrated software status monitor: Explained later (Patent
Pending)
• New software technology – Self monitoring (Patent
Pending), Self healing systems
• Software testing as a distributed, automated service – you
pay software companies to install software on your system
and test it (Patent Pending)
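The "self monitoring" idea can be sketched in miniature: a hypothetical in-process status monitor (the `monitored` decorator and `health` counter are invented for illustration, not the patented mechanism) that counts successes and failures per function, giving a live, if crude, answer to "is the software healthy right now?"

```python
from collections import Counter
from functools import wraps

# Hypothetical in-process "status monitor": every wrapped function
# reports its success/failure counts into a shared health counter.
health = Counter()

def monitored(fn):
    @wraps(fn)
    def wrapper(*args, **kwargs):
        try:
            result = fn(*args, **kwargs)
            health[f"{fn.__name__}.ok"] += 1
            return result
        except Exception:
            health[f"{fn.__name__}.fail"] += 1
            raise
    return wrapper

@monitored
def divide(a, b):
    return a / b

divide(4, 2)
try:
    divide(1, 0)
except ZeroDivisionError:
    pass
print(dict(health))  # {'divide.ok': 1, 'divide.fail': 1}
```

A real monitor would also track latency, resource use, and error rates over time; the point here is only that the software itself can report its status continuously instead of waiting for a test cycle.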
How much time do I need to test it?
• What are the problems you face in answering this question?
• What are the solutions today?
• Estimation techniques? Work Breakdown Structure? Function
Points? Test Points? Use Case Points?
• Historical Data? Rule of thumb? Gut feel?
• Are these accurate? What about
• Different levels of quality, the workmanship of both developers and testers, project constraints such as budget and time, and subsequent regression cycles?
• Future solution
• A mechanism to define quality level and a mechanism to measure
that level (Patent Pending)
• Automated collection and analysis of data, with predictions based on the current state of the project (Patent Pending)
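As a sketch of what such an estimation mechanism might compute, here is a toy, non-standard test-point model; the weighting scheme, the `estimate_test_days` helper, and every number below are invented for illustration only:

```python
# Toy test-point estimate: size each feature, weight it by complexity
# and risk, scale by team productivity, then add an allowance for
# subsequent regression cycles. All weights are illustrative.
def estimate_test_days(features, productivity_points_per_day=10,
                       regression_cycles=2, regression_factor=0.3):
    points = sum(f["size"] * f["complexity"] * f["risk"] for f in features)
    first_pass = points / productivity_points_per_day
    return first_pass * (1 + regression_cycles * regression_factor)

features = [
    {"size": 8, "complexity": 1.5, "risk": 1.2},   # e.g. a payment flow
    {"size": 5, "complexity": 1.0, "risk": 1.0},   # e.g. a report screen
]
print(round(estimate_test_days(features), 1))  # → 3.1
```

The interesting part is not the arithmetic but the inputs: quality level, workmanship, and regression cycles appear as explicit parameters instead of being buried in gut feel.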
Bug – The Only Perfect Being
Most bugs are because imperfect requirements are
imperfectly translated into imperfect design which is
imperfectly translated into imperfect code
Whole Is More Than The Sum Of Parts
• An integrated environment where
• Requirements are captured at a granular level, risk-prioritized, tracked, and automatically checked for consistency, ambiguity, and incompleteness
• Automated/assisted impact analysis is done on other
requirements/design/code
• Requirements are mapped to individual elements of design/code
• Design is also assisted along with code generation and reverse
engineering
• Unit and integration tests are automatically generated for code
• System tests of various granularity are automatically generated and
assisted based on some new and old test design techniques
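The "automatically generated tests" bullet can be illustrated with a property-style sketch: random inputs are generated and checked against an invariant rather than against hand-written expected values. The `auto_test` helper is hypothetical; real tools in this family include property-based testing frameworks.

```python
import random

# Minimal sketch of auto-generated unit tests: random inputs are
# checked against an invariant (here, that sorting preserves length
# and produces an ordered result).
def auto_test(fn, invariant, trials=100, seed=0):
    rng = random.Random(seed)
    for _ in range(trials):
        data = [rng.randint(-100, 100) for _ in range(rng.randint(0, 20))]
        assert invariant(data, fn(data)), f"invariant failed for {data}"
    return True

print(auto_test(sorted, lambda inp, out: len(out) == len(inp)
                and out == sorted(out)))  # → True
```

The generation is cheap; the hard part, as the slide implies, is deriving good invariants from requirements and design, which is where the tool assistance would come in.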
Whole Is More Than The Sum Of Parts
• The tests are executed in an automated manner and results
compared.
• The whole stack trace is maintained and the entire application can
be replayed line by source line
• All log/user messages are bubbled up by the tool, tests are written to exercise those conditions, and all un-displayed error messages are flagged as holes
• Performance and coverage data is maintained
• The map of code executed has statistics on requirements related to
that piece of code, tests, testers, who wrote that code, bugs,
modules, risks, performance etc.
• Each code fix is bubbled up to impact on other related modules,
features, GUI elements, performance etc.
• Regression tests are suggested and run and coverage,
performance, security etc. are all checked again
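A sketch of one way to record an execution trace for later replay, using Python's trace hook; this only illustrates the concept of line-by-line replay, not the mechanism the slides describe:

```python
import sys

# Record every executed (function, line) pair so that a failing run
# can be stepped through afterwards.
trace_log = []

def tracer(frame, event, arg):
    if event == "line":
        trace_log.append((frame.f_code.co_name, frame.f_lineno))
    return tracer

def buggy(x):
    y = x * 2
    return 10 // y  # crashes when x == 0

sys.settrace(tracer)
try:
    buggy(0)
except ZeroDivisionError:
    pass
finally:
    sys.settrace(None)

print(len(trace_log) > 0)  # the recorded trace can now be "replayed"
```

Production-grade replay (time-travel debugging) additionally records variable state and I/O, which is far more expensive; the trace above captures only control flow.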
Integrated Environment
• Requirements: visual, prioritized, self-checking, testable, constrained
• Design – mapped to requirements
• Code – mapped to requirements/design
• Tests – mapped to requirements/design/code
Regression Test Selection Tool
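One common approach to regression test selection maps each test to the source files it covers and picks only the tests affected by a change set. A minimal sketch; the coverage map and file names are invented:

```python
# Which source files does each test touch? (Illustrative data; in
# practice this map comes from a coverage tool.)
coverage_map = {
    "test_login":   {"auth.py", "session.py"},
    "test_report":  {"report.py", "db.py"},
    "test_payment": {"payment.py", "db.py"},
}

def select_regression_tests(changed_files, coverage_map):
    # A test is selected if it covers any changed file.
    return sorted(t for t, files in coverage_map.items()
                  if files & changed_files)

print(select_regression_tests({"db.py"}, coverage_map))
# → ['test_payment', 'test_report']
```

File-level selection is coarse but safe; finer-grained tools map tests to functions or lines to shrink the selected set further.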
Tool-Chain
• Requirements → requirement-based models
• Design → design-based models (UML and future notations)
• Code → code-based models (UML and future notations)
• Defect database
• Test design tools for automated/assisted creation of test cases – Q-Patterns, unified test design method, Noun and Verb
• Execution tools – CASE, CAST, automation
User Messages and API-Based Test Cases
• Code and a repository of "user" messages – where is this message called? How? Has it been called?
• Log messages generated – has this log been generated?
• API tests – have all APIs been covered? With what data?
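A sketch of the message-repository idea: harvest message strings from the source with a regex, compare them against messages actually seen during tests, and flag the rest as coverage holes. The `log("…")` call convention and the messages are assumed for illustration:

```python
import re

# Toy source code using an assumed log("...") convention.
source = '''
log("disk full")
log("connection lost")
log("retry limit reached")
'''

def message_repository(src):
    # Harvest every user/log message string from the source.
    return set(re.findall(r'log\("([^"]+)"\)', src))

# Messages actually observed while the tests ran (illustrative).
seen_in_tests = {"disk full", "connection lost"}

holes = message_repository(source) - seen_in_tests
print(sorted(holes))  # → ['retry limit reached']
```

Each hole is a condition the code can report but no test has ever triggered; in the slide's terms, an un-displayed message flagged automatically.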
Coverage & Data – Integral Part of the Tool-Chain
• Executable requirement coverage
• Risk coverage
• Test coverage
• Non-functional coverage
• Code coverage
• Code profiling
• Code security checks
• External interface coverage
• Standards conformance
• Defect database, with data mining for a bug taxonomy
• Customer care data – database, emails, blogs
• Internet security advisories
• Data from application monitoring in production
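These coverage dimensions could be rolled up into a single report; a sketch with invented numbers, flagging any dimension that falls below a chosen threshold:

```python
# Each dimension reports (covered, total); all figures are invented.
coverage = {
    "requirements": (45, 50),
    "risks":        (9, 12),
    "code_lines":   (8200, 10000),
    "interfaces":   (14, 14),
}

def coverage_report(coverage, threshold=0.8):
    report = {}
    for dim, (covered, total) in coverage.items():
        ratio = covered / total
        report[dim] = (round(ratio, 2), "OK" if ratio >= threshold else "LOW")
    return report

for dim, (ratio, status) in coverage_report(coverage).items():
    print(f"{dim:13s} {ratio:5.2f} {status}")
```

The value of the integrated tool-chain is that all of these numbers come from one place, so a "LOW" in risk coverage can be traced straight back to the requirements and tests involved.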
New format for writing test cases: visual format
Ability to replay code
(Diagram: replaying recorded execution up to a crash)
Far Future – Where Is Testing Headed?
• Tool driven
• Integral part of development
• Better ways to tie all development artifacts together
• Integration with various sources of information
• Better analysis and action
Q&A
Feel free to contact – vipul@puretesting.com
