Test Analyst 
ISTQB® CTAL TAv1.0 
Samer Desouky
ISTQB® – CTAL - TAv1.0 
01- Test Process
Activities 
•Perform 
•Prioritize 
Techniques 
•Select & Apply 
•Confidence & Coverage 
Types 
•Functionality 
•Usability 
Reviews 
•Knowledge of typical mistakes 
Who is the Test Analyst?
Documentation 
•Needs 
Defect 
•Classification Schema 
Tools 
•Support the test process
Confidence 
•RISK 
•Prioritization 
•Dependency 
•Coverage 
Who is the Test Analyst?
1.2 Testing in the 
Software Development Lifecycle
Planning, monitoring and control 
Analysis and design 
Implementation and execution 
Evaluating exit criteria and reporting 
Test closure activities 
Testing in the Software Development Lifecycle 
Fundamental test process
•The fundamental test process is tailored to better fit the software development lifecycle and to facilitate effective test monitoring and control: 
•Planning, monitoring and control 
•ANALYSIS 
•DESIGN 
•IMPLEMENTATION 
•EXECUTION 
•Evaluating exit criteria and reporting 
•Test closure activities 
Testing in the Software Development Lifecycle
Activities implemented sequentially or in parallel 
Test case design and execution are the primary areas of concentration 
Testing Approach Influenced by 
•SDLC 
•System Type 
The test strategy should consider the long-term lifecycle approach to testing 
The moment of involvement is different for the various lifecycles: 
•Amount of involvement 
•Time required 
•Information available 
Testing activities must be aligned with the chosen software development lifecycle model. 
Testing in the Software Development Lifecycle 
Keynotes
Test Planning 
•Project Planning 
Monitoring & Control 
•Until Execution & Closure 
Test Analysis & Design (TAD) 
•Requirements elicitation 
•System design 
Test Environment Implementation 
•System Design 
•Coding 
Test Execution 
•Entry Criteria 
•Component testing (CT) has been done 
Evaluation & Reporting 
•Test Execution 
Test Closure 
•Exit Criteria 
•Test Execution 
•After acceptance testing 
Testing in the Software Development Lifecycle 
Sequential Development:
Testing in the Software Development Lifecycle 
Requirements 
Analysis 
Design 
Code 
Component Test 
Integration Test 
System Test 
Acceptance Test
•Iterative-Incremental Development: 
Testing in the Software Development Lifecycle 
Planning 
Closure 
Analysis 
Design 
Implementation 
& 
Execution 
Evaluation 
Standard test processes for each iteration
Testing in the Software Development Lifecycle 
Analysis 
& 
Design 
Testing 
Implementation 
Build 
Demonstrate 
Refine
Iterative 
Incremental 
Agile 
Testing in the Software Development Lifecycle
Don’t depend on the model’s name to indicate the proper moment of involvement; instead: 
•Understand the expectations from involvement 
•Moment of involvement 
•Amount of involvement 
•Time required 
•Information available 
Testing in the Software Development Lifecycle
1.3 Test Planning, 
Monitoring and Control
•Test planning for the most part occurs at the initiation of the test effort 
•Identify and plan the activities and resources required to meet the mission and objectives identified in the test strategy 
Test Planning, Monitoring and Control 
Test Analyst working with the Test Manager
Plan 
•The needed test types 
•Estimates and test environment needs 
•Configuration testing 
•Techniques and coverage 
•Documentation testing 
•Installation testing 
•Alignment with the SDLC 
•Confirmation and regression testing 
•Risk analysis 
Test Planning, Monitoring and Control
Monitor & Control 
Compiling and reporting the summarized metric information is the Test Manager’s job 
Gathering the data and ensuring it is accurate, timely and objective is the Test Analyst’s job 
Test Planning, Monitoring and Control
1.4 Test Analysis
Entry Criteria 
Test object documentation is available 
Documents have been reviewed and updated 
Budget and schedule are available 
Test Analysis – Entry Criteria
•When identifying and documenting test conditions, it is important to choose the level of detail and the structure of the documentation, and to keep them aligned with the test strategy. 
•Using a hierarchical approach to defining test conditions can help ensure that coverage is sufficient for the high-level items (a minimal sketch follows below). 
Test Analysis - Documentation
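As a hedged illustration (the condition names below are invented, not from the syllabus), a hierarchy of test conditions can be captured as a simple nested structure, which makes gaps in high-level coverage easy to spot:

```python
# Minimal sketch of hierarchical test conditions. High-level conditions
# branch into detailed sub-conditions. All names are illustrative.
test_conditions = {
    "TC-01 Login": [
        "TC-01.1 Valid credentials are accepted",
        "TC-01.2 Invalid password is rejected",
        "TC-01.3 Account locks after repeated failures",
    ],
    "TC-02 Checkout": [
        "TC-02.1 Order total is calculated correctly",
        "TC-02.2 Declined payment is handled gracefully",
    ],
}

for condition, sub_conditions in test_conditions.items():
    print(condition)
    for sub in sub_conditions:
        print("  -", sub)
```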
Test Analysis - Mind Mapping
How to create mind map 
•Center your page with your idea/objective 
•Don’t be too serious; branch into whatever you think about 
•Associate without evaluating 
•Think as fast as you can 
•Think outside the box, with no boundaries 
•Don’t judge, just brainstorm 
•Go, go, go: keep your hand moving, draw empty boxes and let your mind find ideas; get the ideas out of your head 
•Connect ideas, create themes and affinities 
Test Analysis - Mind Mapping
Laws 
•Start in the center 
•Connecting lines 
•Coloring 
•Correlation 
•Keep it radiant 
Test Analysis - Mind Mapping
Test Analysis - Mind Mapping 
From the mind map we can design: 
-The test data 
-The test cases 
-The test environment
1.5 Test Design
•Using the scope determined during test planning, the test process continues as the Test Analyst designs the tests that will be implemented and executed. The process of test design includes the following activities: 
Test Design 
Determine the level of detail 
Select the design technique(s) for the best coverage 
Create test cases
Some test items are better addressed by defining only the test conditions rather than scripted tests 
Test conditions guide the unscripted testing 
The pass/fail criteria should be clearly identified 
Understandable by other testers 
Understandable by other stakeholders who review, audit, and approve the tests 
Test Design
Concrete Test Case 
•A test case with implementation-level inputs/outputs (I/O) 
•Logical operators are replaced by actual values 
Logical Test Case 
•A test case without implementation-level inputs/outputs (I/O) 
•Logical operators are used 
Test Design – Types
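A minimal sketch of the distinction, assuming a simple dictionary representation (the field names and values are invented for illustration):

```python
# Logical test case: uses logical operators and ranges instead of values.
logical_test_case = {
    "id": "LTC-7",
    "input": "age >= 18 AND country IN {EU members}",
    "expected": "registration accepted",
}

# Concrete test case: the same case with the operators replaced by
# actual input/output values, ready for execution.
concrete_test_case = {
    "id": "CTC-7a",
    "derived_from": "LTC-7",
    "input": {"age": 18, "country": "DE"},
    "expected": "registration accepted",
}

print(logical_test_case)
print(concrete_test_case)
```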
Test Design - Test Case Types
•If logical test cases will be used to develop concrete test cases, wait until the requirements become more defined and stable. Creation is then done sequentially, flowing from logical to concrete, with only the concrete test cases used for execution. 
Test Design
Test Design - Process 
Test Strategy / Test Plan 
Test Conditions 
Test Design Techniques 
Test Case Creation 
Refinements
Characteristics of good test case 
•Repeatable 
•Verifiable 
•Traceable 
Test Design - Characteristics
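As a hedged sketch of these characteristics in practice (the IDs and field names are invented for the example):

```python
# A test case record showing the three characteristics:
# repeatable (fixed preconditions), verifiable (explicit expected
# result), traceable (linked to a requirement). All values illustrative.
test_case = {
    "id": "TC-101",
    "traces_to": ["REQ-23"],                                    # traceable
    "preconditions": ["user account exists", "cart is empty"],  # repeatable
    "steps": ["add item SKU-9 to the cart", "open the cart"],
    "expected": "cart shows 1 item, total 19.99",               # verifiable
}

print(test_case["id"], "covers", ", ".join(test_case["traces_to"]))
```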
Challenges 
•Tedious Activity 
•Error-prone Activity 
•Complex interactions 
•Test basis unclear or contradictory 
•Outputs, data, post-conditions 
Test Design – Expected Results 
Needed: test oracles, SMEs 
Without reliable expected results: low added value, spurious results
•A test oracle is a source we use to determine the expected results of a test. 
•We compare these expected results with the actual results when we run a test. 
•Examples: 
•Specifications and Documentation 
•User manual 
•Existing system 
•Previous release 
•Other products 
•Individual's specialized knowledge 
Test Design – Test Oracle
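For example, a previous release can serve as the oracle for a regression check. The sketch below assumes hypothetical functions; the syllabus names the oracle sources but prescribes no API:

```python
# Using the previous release as a test oracle (all functions hypothetical).
def previous_release_tax(amount: float) -> float:
    """Stands in for the existing system: the oracle."""
    return round(amount * 0.14, 2)

def new_release_tax(amount: float) -> float:
    """Stands in for the system under test."""
    return round(amount * 0.14, 2)

for amount in (0.0, 99.99, 1000.0):
    expected = previous_release_tax(amount)  # expected result from the oracle
    actual = new_release_tax(amount)         # actual result from the test run
    assert actual == expected, f"{amount}: expected {expected}, got {actual}"
print("All oracle comparisons passed")
```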
•What is not an oracle? 
•Never use the code itself as an oracle, even for structural testing, because that would simply test whether the compiler works. 
Test Design – Test Oracle
•Oracle Problem: 
•In reality, documents: 
•Contradict each other 
•Have gaps 
•Omit important characteristics of the product 
•Are sometimes missing entirely 
•Exist but are useless 
•Are delivered late 
Test Design – Test Oracle
•Oracle Problem: 
•The Test Analyst must solve the Oracle Problem. 
•How do we know the “correct results”? 
•The difficulty lies in knowing them. 
Test Design – Test Oracle
Project risks 
•What must/must not be documented 
The “value added” 
•That the documentation brings to the project 
Standards 
•Followed and/or regulations to be met 
Lifecycle model used 
•Traditional or Agile 
The requirement for traceability 
TAD – How Much Documentation
•Depending on the scope of the testing, test analysis and design address the Quality Characteristics for the test object(s). 
TAD – and Quality Characteristics
•A user story should include a definition of the acceptance criteria. 
•If the acceptance criteria are fulfilled, the system is considered ready for integration with the other completed functionality. 
TAD – and Agile
•The test infrastructure may be defined early, but may not be finalized until test implementation. 
•Test infrastructure includes more than test objects and testware: 
•All items required to run the tests. 
TAD – and Test Infrastructure 
Rooms 
Equipment 
Personnel 
Software 
Tools 
Peripherals 
Communications equipment 
User authorizations
•The exit criteria will vary depending on the project parameters. 
•Exit criteria must be measurable and must ensure that all the information and preparation required for the subsequent steps have been provided. 
TAD – and Exit Criteria
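As a hedged sketch, "measurable" exit criteria can be expressed as explicit checks over collected metrics; the metric names and thresholds below are invented for illustration:

```python
# Exit criteria as measurable checks (illustrative metrics/thresholds).
metrics = {
    "requirements_covered_pct": 98.0,
    "open_critical_defects": 0,
    "tests_passed_pct": 96.5,
}

exit_criteria = [
    ("requirements_covered_pct", lambda v: v >= 95.0),
    ("open_critical_defects",    lambda v: v == 0),
    ("tests_passed_pct",         lambda v: v >= 95.0),
]

met = all(check(metrics[name]) for name, check in exit_criteria)
print("Exit criteria met" if met else "Exit criteria NOT met")
```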
1.6 Test Implementation
•Test Suites 
•Risk priority order 
•Right people 
•Equipment 
•Data 
•Functionality 
Execution Order 
Test Implementation - EO
•Test Analysts should finalize and confirm the order in which manual and automated tests are to be run, carefully checking for constraints that might require tests to be run in a particular order. 
•Dependencies must be documented and checked, as in the matrix and ordering sketch below. 
Test Implementation - DR 
[Dependency matrix: Tests 1–8 as both rows and columns; an X in a cell marks that the test in that row depends on the test in that column (e.g., Test 7 depends on two other tests).]
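A minimal sketch of deriving a legal execution order from documented dependencies, using Python's standard library; the dependency pairs are invented to mirror the matrix idea above:

```python
# Topological ordering of tests from a dependency map
# (each test maps to the set of tests it depends on).
from graphlib import TopologicalSorter

dependencies = {
    "Test 2": {"Test 1"},
    "Test 3": {"Test 2"},
    "Test 7": {"Test 4", "Test 6"},
    "Test 8": {"Test 7"},
}

order = list(TopologicalSorter(dependencies).static_order())
print(order)  # dependencies always come before the tests that need them
```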
•The level of detail and associated complexity of the work done during test implementation may be influenced by the detail of the test cases and test conditions. 
•In some cases regulatory rules apply, and tests should provide evidence of compliance to applicable standards, such as the United States Federal Aviation Administration’s DO-178B/ED-12B [RTCA DO-178B/ED-12B], the standard for Software Considerations in Airborne Systems and Equipment Certification, which deals with the safety of software used in certain airborne systems; the SWIFT standards book is another example. 
Test Implementation - DR 
http://www.davi.ws/avionics/TheAvionicsHandbook_Cap_27.pdf http://www.swift.com/resources/documents/standards_inventory_of_messages.pdf
•Test data is needed for testing, and in some cases these sets of data can be quite large. 
•Test Analysts create input and environment data to load into databases and other such repositories. 
Test Implementation - Data 
•Test Analysts also create data to be used with data-driven automation tests as well as for manual testing. 
http://www.generatedata.com/
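A minimal data-generation sketch using only the standard library (field names are invented; tools like the generator linked above scale this up):

```python
# Generate a repeatable set of customer records for database loading.
import csv
import random
import string

random.seed(42)  # fixed seed => the same data set on every run

def random_name(length: int = 8) -> str:
    return "".join(random.choices(string.ascii_lowercase, k=length)).title()

with open("customers.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "name", "credit_limit"])
    for i in range(1, 1001):
        writer.writerow([i, random_name(), random.choice([500, 1000, 5000])])
```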
•Test implementation is also concerned with the test environment(s). 
•During this stage the environment(s) should be fully set up and verified prior to test execution. 
Test Implementation - TE
A "fit for purpose" Attributes 
•Capable of enabling the exposure of the defects present during controlled testing 
•Operate normally when failures are not occurring 
•Replicate, if required 
•Production or end-user environment for higher levels of testing 
Test Implementation - TE
•Testers must ensure that those responsible for the creation and maintenance of the test environment are known and available, and that all the testware and test support tools and associated processes are ready for use. 
•This includes configuration management, defect management, and test logging and management. 
Test Implementation - TE 
•In addition, Test Analysts must verify the procedures that gather data for exit criteria evaluation and test results reporting.
•Dynamic Test Strategies: 
•Risk-based analytical test strategies are often blended with dynamic test strategies. 
•In this case, some percentage of the test implementation effort is allocated to testing which does not follow predetermined scripts (unscripted) 
•Unscripted testing should not be ad hoc or aimless; unless it is time-boxed and chartered, its duration and coverage can be unpredictable. 
Test Implementation
1.7 Test Execution
Testing Errors 
False-negative 
•A missed failure (the test passed although the software failed) 
False-positive 
•A correct behavior misclassified as a failure 
Test Execution
Uniquely identified 
Specific versions 
Tests, activities, events 
Strategy dependence 
Support test processes 
Influenced by regulatory requirements 
Address coverage & repeatability 
Test Execution - Logging
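As a hedged illustration of these logging points (the JSON shape is an assumption, not a prescribed format):

```python
# A test log entry with unique IDs, specific versions, and the events
# needed for coverage analysis and repeatability. All values illustrative.
import datetime
import json

log_entry = {
    "test_id": "TC-042",
    "test_version": "1.3",
    "build_under_test": "2.8.1-rc2",
    "environment": "staging-eu",
    "started": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    "result": "fail",
    "events": ["login ok", "checkout returned HTTP 500"],
}
print(json.dumps(log_entry, indent=2))
```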
Considerations 
•Observations 
•Misbehaving software 
•Test suite development is a continuous process 
•Gaps 
•Rerun manual tests 
•Investigate 
•Mine the defect data 
•Scripted and unscripted testing 
•Regression capabilities 
Test Execution – Tips
Software Testing is our Profession. 
www.testproeg.com 
Contact Us 
+2 0222756841 
+2 01021902447 
+2 01000190709 
30 Ahmad El-Zomor St., Nasr City, Cairo, Egypt 
E-Mail: info@testproeg.com
