Software Testing: A Framework-Based Approach
Vipul Kocher
www.puretesting.com
Dhanasekaren R
(C) Vipul Kocher www.PureTesting.com
Roadmap
 Premise of frameworks
 Introduction to Frameworks
 Applying Frameworks
 Caveats
Deja Vu
 Have you ever looked at a new system to be tested and thought “I have done this before”?
 What are the things that you had “done before”?
How do I reuse my testing experience?
 Can I leverage this
 For similar applications
 For different applications with similar features
 For the same or different organization
 How do we capture the knowledge and information acquired during the course of a project?
 Can we propagate that knowledge to enable faster, better testing?
 Can one leverage it to comprehensively guide any project through the SDLC, for a variety of product/project contexts, across
 Testers
 Test Managers
 Test Process
Introduction to Framework
Introduction to Framework
 The Free Dictionary[1] defines a framework as:
1. A structure for supporting or enclosing something else, especially a skeletal
support used as the basis for something being constructed.
2. An external work platform; a scaffold.
3. A fundamental structure, as for a written work.
4. A set of assumptions, concepts, values, and practices that constitutes a way of
viewing reality.
 Wikipedia[2] defines a framework as
 “a real or conceptual structure intended to serve as a support or guide for the
building of something that expands the structure into something useful.”
 Thus a framework is
 an “external” structure that supports some activity. It consists of assumptions, practices, concepts, tools and other elements that can be used to build models, making it useful for whatever activity it is applied to.
[1] http://www.thefreedictionary.com/framework [2] http://en.wikipedia.org/wiki/Framework
Introduction to Framework
 Tester’s questions
 How do I take advantage of my learning in the next project?
 Can I compare metrics, and manage/improve across subsequent phases/regression cycles?
 How can I reuse a method I added, a tool I adapted, or a type of testing, to reveal a class of defects?
 How can I sharpen my skills? <Problem solving frees up time to think about the problem and strategize; track my knowledge gap>
 How can I synthesize my learning and choose to track what is relevant?
 How do I learn from others?
 How do I explain my practice as a transfer of testing technology, or mentor someone?
Introduction to Framework
 Tester’s outcomes
 My diary of events – weblog, new tests, transient knowledge – is made explicit
 Helps me reflect on a new method or adapted tool that helped me reveal a class of defects
 Enables me to compile a tool box that I have a relationship with, and an inclination to pick up and use in a given situation
 At each phase of the SDLC – Agile/Iterative/Spiral/Waterfall – one can apply the framework to stages/parts
 I can start identifying patterns and categorizing – a taxonomy
 Leads to creating my own database, which becomes powerful when I start sharing with another tester <helps validate use of my method in another context; can enable adding new features, like open-source feedback>
 Promotes systematic problem solving and innovation
Introduction to Framework
 Lead/Manager’s point of view
 How can I help my testing team view testing as a cohesive set of activities?
 Can I give them a set of flexible tools and processes?
 Guidelines as a starting point; the team experiments and improves as it goes
 How do I bring in points of view, leverage team members’ strengths, and collect data on risks within and across projects?
 How do we evolve processes and metrics within the team?
 Sharing of lessons learnt
 How do we manage transitions in the leadership of test leads/managers with minimal impact on team and culture?
 How can my team learn from other parts of the organization?
Introduction to Framework
 Lead/Manager’s outcomes
 Motivates teams to learn from each other’s experiences, so answers emerge
 Improves effectiveness through communication and collaboration, process improvements, and learning
 Enables collective thinking on context-specific
 Risks
 Models
 Metrics
 Dashboards
 Empowerment, trust and open feedback
Introduction to Framework
 Process point of view
 How can we evolve the right test strategy?
 Can we tailor the depth of testing?
 Can we report metrics based on goals?
 Process point of view – outcomes
 Higher test quality
 Tailoring the depth of testing lets teams choose the right mix of testing
 Alignment to goals and real data as the project unfolds
Simple
Expanded
Applying Framework: Requirements, Design
Applying Framework: Requirements
 Tool box
 Story boards
 Context-free questions (reference: Exploring Requirements, Gause and Weinberg)
 Who is the client?
 What is a highly successful solution worth to this client?
 What is the real reason for wanting to solve this problem?
 How much time do we have for this project?
 What problems does this product solve/create?
 What environment is this product likely to encounter?
 Review requirements
 Stressing words
o “Mary had a little lamb” <stress on “Mary”: it was Mary’s lamb and not someone else’s>
o “Mary had a little lamb” <stress on “had”: she no longer has the lamb>
 Interpretation against various contexts by substituting synonyms <examine dictionary meanings>
 Noun, Verb technique
o Look for properties of each noun; ask What, Why, When, Where, Who, Which, How, How much/many
o Look for properties of each verb; ask What, Why, When, Where, Who, Which, How, How much/many
 Modeling
 State transitions, equivalence partitions
 Requirement formalisms that enable test case generation: UML, Specification and Description Language, Entity Relationship Diagrams
 Categorization, risk prioritization
 Q-Patterns: user-centric views
 Understand underlying and impacting technology
 Complexities due to shifts such as: external failures lead to an increase in data protection/replication; vulnerabilities and faults lead to increased coverage for security testing
 Understand technology trends and adapt to those shifts ahead of time, or in parallel. Some examples:
 SaaS, SOA
 Rich embedded devices
 Wireless/Security
 Processes
 Development models
 RUP – Use cases to derive tests
 Agile – Story boards, Wiki
 V model – Acceptance tests first
 Metrics
 Traceability – every requirement is mapped to one or more test cases
 IEEE standard on coverage – the degree to which a given test or set of tests addresses all specified requirements for a given system or component
 Your own Requirement Coverage Index – Rt/SR, where Rt = number of requirements or use cases for which test cases have been written and SR = number of specified requirements or use cases within scope
 Improvements
 Defect escapes in the requirements phase
 Requirements changes that lead to risk, in spite of change management
Applying Framework: Requirements
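The Requirement Coverage Index above (Rt/SR) is straightforward to compute from a traceability map. A minimal sketch in Python, using a hypothetical dictionary that maps requirement IDs to the test cases written against them:

```python
def requirement_coverage_index(trace):
    """Rt/SR: Rt = requirements (or use cases) with at least one
    test case written; SR = all specified requirements in scope."""
    sr = len(trace)
    rt = sum(1 for tests in trace.values() if tests)
    return rt / sr if sr else 0.0

# Hypothetical traceability map: requirement ID -> test case IDs
trace = {
    "REQ-1": ["TC-1", "TC-2"],
    "REQ-2": ["TC-3"],
    "REQ-3": [],          # no test cases written yet
    "REQ-4": ["TC-4"],
}
print(requirement_coverage_index(trace))  # 0.75
```

Note that an index of 1.0 only says every requirement has some test mapped to it; it says nothing about the depth or quality of those tests.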
 Skills/knowledge
 Skills inventory
o That will enable me for Planning, Analyzing, Implementing, Designing, Executing
 Skills required
o Elicitation, interviewing, visual methods; ability to review
o Understand constraints and tradeoffs
 Sources for acquiring skills
o SPIN groups
o IEEE Requirements Engineering conferences
 Contextual
– New idea or product
– A tester who comes in when most of the requirements are laid out will need to
o Clarify the solution, even though the designers will have gone through a certain level of exploring the solution and defining its scope
o Begin again, since the tester will have missed the initial thought process
o Understand who the users are and their perceived needs
– Existing solution
– The tester may not have the benefit of an updated document, and could take the following approaches
o Get hands-on with the working product or an equivalent solution, studying every possible action that can be exercised and seeing which of them map to the new solution
o Traverse through every function
o Study existing design documents to understand the various interfaces between the system modules
– Enhancement versions
o Impact of new requirements
o How to manage continuous new requirements
o Bugs as a source of enhancements/new requirements
Applying Framework: Requirements
 Tool box
 Resources
 Bookmarks
o http://www.testingfaqs.org/t-design.html
o http://www.testingeducation.org/BBST/ (BBST course from Cem Kaner)
o http://www.satisfice.com/tools/satisfice-tsm-4p.pdf
o Grochtmann, M., and Wegener, J. "Test Case Design Using Classification-Trees and the Classification-
Tree Editor CTE"
 Articles: Practitioners sharing on stickyminds.com, Testing experts blogs
 Books
o Lee Copeland: A practitioner's guide to software test design
o Testing Computer Software, by C. Kaner, J. Falk, and H. Nguyen (1999)
o Boris Beizer: Test Design techniques
o Robert Binder. Testing Object-oriented Systems
 Documents
 Checklists
 Test target checklist
 Pradeep Soundararajan’s screen saver
 Templates
 IEEE 829 test design, test case and test procedure templates
 Rex Black’s Excel template
 Tabular template
 Q-Patterns; many Q-Patterns exist for various domains
Applying Framework: Design
 Tool box
 Techniques
o Boundary Value Analysis, Equivalence Partitioning
o Cause-effect diagram, Decision tables, Orthogonal arrays and all-pairs
o State-transition tables and finite state machines with node/edge coverage
o Extension to Noun-Verb Technique
o Heuristic or exploratory tests
o Domain based tests, Syntax testing
o Bug taxonomy based test design
o Fault/attack models
 Lessons learnt
o "Lessons Learnt", examples, stories
o Using bugs to find gaps in written tests
o Uselessness/Usefulness of: Detailed test scripts, Conversion of bugs to test cases, Group review of test
cases
o Gray-box test design
o Stories: Reduction in bug count for Mobile Notes
 Tools: Open-Source/Freeware/COTS
o All-Pairs, Jenny, Multi, Dadada
o COTS: CaseMaker, BenderRBT
o Test data generators
o Visualization tools
Applying Framework: Design
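Two of the techniques listed above, boundary value analysis and equivalence partitioning, can be derived mechanically for a numeric range. A minimal Python sketch, using a hypothetical input field that accepts values 18–65:

```python
def boundary_values(lo, hi):
    """Boundary value analysis for a closed range [lo, hi]:
    test just outside, on, and just inside each boundary."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def equivalence_partitions(lo, hi):
    """One representative value per partition: below the range
    (invalid), inside it (valid), above it (invalid)."""
    return {"invalid_low": lo - 1, "valid": (lo + hi) // 2, "invalid_high": hi + 1}

print(boundary_values(18, 65))         # [17, 18, 19, 64, 65, 66]
print(equivalence_partitions(18, 65))  # {'invalid_low': 17, 'valid': 41, 'invalid_high': 66}
```

Real fields rarely have a single clean range; the same idea extends partition by partition once the valid/invalid classes have been identified.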
 Processes
 Test design processes
o FSM models can give test cases early
 Models
o Agile/Test Driven – focus on automated unit testing
o V model – Early test design
o RUP – Use case driven test design
 Metrics
o Test case design productivity
o Test case per function
o Test case to bug ratio
o Missed test cases percentage
 Process improvements
o Bug taxonomy based test design
o Improving coverage by bug analysis
o Improving coverage by reliability analysis
 Skills/knowledge
 Skills inventory
o That will enable me for Planning, Analyzing, Implementing, Designing, Executing
 Knowledge map
o Domain, Product, Technology
 Skills required
o Estimation, Test design techniques
 Sources for acquiring skills
o Certification – ISTQB, CSTE, CSTP
o Education – BBST course
o Practice – volunteer to do test design for projects, including open-source projects; read test cases from various open-source projects
Applying Framework: Design
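The slide above notes that FSM models can give test cases early. A rough sketch of deriving edge (transition) coverage from a state-transition table; the defect-lifecycle machine here is a made-up example, and the greedy walk is only illustrative (it assumes every edge is reachable from the start state):

```python
# Hypothetical state machine: a defect's lifecycle
transitions = {
    ("new", "assign"): "open",
    ("open", "fix"): "fixed",
    ("fixed", "verify"): "closed",
    ("fixed", "reopen"): "open",
}

def edge_coverage_paths(transitions, start):
    """Greedily walk from `start` until every transition (edge) has
    been exercised at least once; each walk becomes one test case."""
    paths, covered = [], set()
    while len(covered) < len(transitions):
        state, path = start, []
        while len(path) <= 2 * len(transitions):
            # Prefer an uncovered outgoing edge; otherwise reuse any
            # covered edge just to keep moving toward uncovered ones.
            out = [(s, e, t) for (s, e), t in transitions.items() if s == state]
            fresh = [edge for edge in out if (edge[0], edge[1]) not in covered]
            if not out:
                break  # terminal state reached
            s, e, t = (fresh or out)[0]
            covered.add((s, e))
            path.append(e)
            state = t
            if len(covered) == len(transitions):
                break
        paths.append(path)
    return paths

for p in edge_coverage_paths(transitions, "new"):
    print(" -> ".join(p))
```

Each printed event sequence is a candidate test case; node coverage follows for free, since every state is touched once all its incoming edges are.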
Framework: Caveats
 Possibility of
 unwieldy nodes and contradictions
 It is not just a simple application to help in testing, nor is it an automation-tool-based framework. It is also not a new method of testing or a new process model!
 The framework is not yet available as an IDE or a tool; the authors consider that work in progress.
More information
 http://www.whatistesting.com/qpatterns.htm
 Mail: vipul at PureTesting.com
Thanks
