12 Automated Testing Tools

Transcript

  • 1. Automated Testing Tool
  • 2. Introduction
    • The tool provides facilities for:
    • Dynamic Testing: executing the software under test to verify its compliance with its specification
    • Coverage Analysis: measuring the proportion of the software exercised by dynamic testing
    • Static Analysis: examining the source code to assess its complexity
  • 3.  
  • 4. Functional Overview
    • Dynamic Testing facilities support testing on both host and target platforms.
    • Coverage Analysis facilities allow users to assess the coverage of individual software units, programs/tasks, and entire software systems.
    • Static Analysis: software that has been thoroughly dynamically tested can still have problems.
    • Static analysis provides information on qualities such as maintainability and compliance with coding standards.
    • The number of code statements and McCabe's cyclomatic complexity metric are calculated.
  • 5.
    • Users can define their own coverage analysis and static analysis criteria.
    • Dynamic Testing
    • The Testing Problem
  • 6.  
  • 7.
    • There must be some form of specification for the software.
    • The testing tool must handle test cases generated using:
      • functional (black-box) testing
      • structural (white-box) testing
      • isolation testing
      • integration testing
      • positive testing
      • negative testing
  • 8.
    • The Solution
  • 9.
    • A test script controls tests.
    • The test script is compiled and linked with the Test Harness (TH) and the software under test (a minimal sketch of this pattern follows this slide).
    • This produces an executable program which, when run, generates a test Results file.
    • The benefits of this approach are:
        • Documentation: the tool provides documented testing.
        • Repeatability: tests can be easily repeated.
        • Maintainability: test scripts are easy to understand and update.
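    • As an illustration of this structure only (the CHECK_INT macro and every other name below are invented for the example, not the tool's actual directives):

      /* test_script.c - hand-rolled equivalent of a test script linked
       * with a harness; all names are illustrative. */
      #include <stdio.h>

      static int tests_run = 0, tests_failed = 0;

      /* Minimal stand-in for a harness CHECK directive. */
      #define CHECK_INT(name, actual, expected)                        \
          do {                                                         \
              tests_run++;                                             \
              if ((actual) == (expected)) {                            \
                  printf("Check PASSED : %s\n", (name));               \
              } else {                                                 \
                  tests_failed++;                                      \
                  printf("Check FAILED : %s\n", (name));               \
                  printf("  Expected %d\n  Item     %d\n",             \
                         (expected), (actual));                        \
              }                                                        \
          } while (0)

      /* Software under test, linked into the same executable. */
      static int add(int a, int b) { return a + b; }

      int main(void)
      {
          printf("Start Test 001\n");
          CHECK_INT("add_result", add(2, 2), 4);
          printf("End Test 001\n");
          printf("Tests run: %d, failed: %d\n", tests_run, tests_failed);
          return tests_failed ? 1 : 0;  /* overall pass/fail for the shell */
      }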
  • 10. The Test Harness
    • Test Harness (TH) provides facilities to run dynamic tests, verify their results, document them, and repeat them.
    • TH consists of a set of library directives accessed from the test script.
    • The test script calls the software under test, and the TH directives embedded in the script check the effects of the call on the environment.
  • 11.  
  • 12.
    • When a test is run, a Results file is produced detailing every step of the test and highlighting any failures.
    • A Results table is displayed summarizing the results for each test case and providing total figures.
    • An overall statement of test pass or fail is provided and returned to the command shell.  
    • Checking Values
    • The most important aspect of dynamic testing is checking that the outputs from the software under test are as expected.
  • 13.
    • TH verifies data using CHECK directives, which compare a data item with its expected value.
    • Simulating External Software
    • Isolation testing implies that calls to external units and accesses to external data must be simulated.
    • External calls can be simulated in a way that verifies they are made in the expected order and that input parameters have the correct values at each call (see the sketch after this slide).
    • Return values can be individually set on different calls to the same simulated function.
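    • A minimal hand-written stub illustrating these ideas (all names invented; this is not the tool's stub mechanism):

      #include <assert.h>
      #include <stdio.h>

      /* Per-call return values for the simulated external function. */
      static const int stub_returns[] = { 10, 20 };
      static int stub_calls = 0;

      /* Stub standing in for an external function the unit under test calls. */
      static int read_sensor(int channel)
      {
          assert(stub_calls < 2);   /* not called more often than expected */
          assert(channel == 3);     /* input parameter checked on each call */
          return stub_returns[stub_calls++];
      }

      /* Unit under test, isolated from the real external function. */
      static int average_two_readings(void)
      {
          return (read_sensor(3) + read_sensor(3)) / 2;
      }

      int main(void)
      {
          printf("average = %d\n", average_two_readings());  /* prints 15 */
          assert(stub_calls == 2);  /* exactly the expected number of calls */
          return 0;
      }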
  • 14.
    • Similarly, external data areas may be simulated and checked.
    • Timing Analysis
    • Correct functioning is not the only factor that determines the acceptability of software.
    • Some applications need to perform certain activities both correctly and within defined time constraints.
    • The tool permits execution times to be recorded and tests passed or failed depending on the performance of the software (a sketch follows this slide).
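    • A sketch of such a timing check using standard C (the 50 ms budget is an assumed example value, not from the source):

      #include <stdio.h>
      #include <time.h>

      /* Stand-in for the activity whose execution time is under test. */
      static long work_under_test(void)
      {
          long sum = 0;
          for (long i = 0; i < 1000000L; i++)
              sum += i;
          return sum;
      }

      int main(void)
      {
          clock_t start = clock();
          work_under_test();
          double elapsed = (double)(clock() - start) / CLOCKS_PER_SEC;

          if (elapsed > 0.050) {  /* assumed 50 ms time constraint */
              printf("Timing check FAILED: %.4f s\n", elapsed);
              return 1;
          }
          printf("Timing check PASSED: %.4f s\n", elapsed);
          return 0;
      }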
  • 15. Results
    • TH directives mark the progress of the test run in the output.
    • Any unexpected event is highlighted by a message.
    • For example, a failed check is marked FAILED, and a diagnostic gives both the actual and expected values of the item being checked.
    • The output below contains a typical section of a TH output file:
  • 16.
    • Test Results For : example 
    • Results File : example.ctr
    • Tests Run At : Feb 17 09:38:32
    • Start Test 001
    •    EXECUTE: my_function,
    •      Expected calls = 1
    •      START_STUB : my_stub
    •          CALL_REF/ACTION: Action 1, Call 1
    •          Check PASSED : my_stub_string
    •          Item = "Hello, world"
    •      END_STUB : my_stub
    •    DONE : my_function
    • Check FAILED : my_external
    • Expected 0x0000712B 28971 
    • Item 0x000080E8 33000 
    • End Test 001
  • 17.  
  • 18. Test Script Generation
    • The Test Script Generator (TS) prepares dynamic test scripts for execution with TH.
    • TS takes information from a test case definition file and generates a test script.
    • The test case definition file specifies test cases, establishing initial conditions and expected results.
    • TS can produce comprehensive test script templates, which can form the basis of manually coded test scripts.
    • TS works by scanning the test case definition file and the software under test to produce the test script.
  • 19.  
  • 20.
    • Test scripts produced by TS feature positive and negative data checking.
    • TS automatically codes check routines for user-defined types and stubs for external functions/classes (a sketch of such a check routine follows this slide).
    • Analysis
    • Test Analysis (TA) provides the user with Coverage Analysis and Static Analysis features.
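    • A sketch of the kind of field-by-field check routine a generator might emit for a user-defined type (all names here are invented for the example):

      #include <stdio.h>
      #include <string.h>

      typedef struct { int id; char name[16]; } record_t;

      /* Compare every field of two record_t values, reporting each mismatch. */
      static int check_record(const char *label, const record_t *actual,
                              const record_t *expected)
      {
          int ok = 1;
          if (actual->id != expected->id) {
              printf("Check FAILED : %s.id (expected %d, got %d)\n",
                     label, expected->id, actual->id);
              ok = 0;
          }
          if (strcmp(actual->name, expected->name) != 0) {
              printf("Check FAILED : %s.name (expected \"%s\", got \"%s\")\n",
                     label, expected->name, actual->name);
              ok = 0;
          }
          if (ok)
              printf("Check PASSED : %s\n", label);
          return ok;
      }

      int main(void)
      {
          record_t got = { 7, "widget" }, want = { 7, "widget" };
          return check_record("record", &got, &want) ? 0 : 1;
      }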
  • 21. Coverage Analysis
    • Coverage analysis measures the proportion of software executed during dynamic testing. The following measures are supported (a small example follows this list):
      • statement coverage
      • decision (or branch) coverage
      • Boolean expression coverage
      • call pair coverage: measures the proportion of calls to other functions that have been exercised
      • call coverage: checks that the expected functions have been called
      • data value coverage: checks that program variables have held a series of (user-defined) values during the testing process
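    • To illustrate the difference between the first two measures (example code is ours, not from the source):

      /* A single test with a = 1 executes every statement, giving 100%
       * statement coverage, but only exercises the true outcome of the
       * decision, so decision coverage is 50%. A second test with a = 0
       * is needed to reach 100% decision coverage. */
      int clamp_negative_to_zero(int a)
      {
          int r = 0;
          if (a > 0)
              r = a;
          return r;
      }

      int main(void)
      {
          clamp_negative_to_zero(1);  /* test 1: 100% statement coverage */
          clamp_negative_to_zero(0);  /* test 2: completes decision coverage */
          return 0;
      }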
  • 22. Static Analysis
    • Static Analysis provides an assessment of various non-functional qualities of the software:
    • Enforcement of coding standards
    • Measurement of code complexity and structure.
    • Coding Standards
    • The metrics examined are incorporated into overall test pass/fail criteria.
      • For example, users may check that no goto statements (or labels) are used, that only one return statement is present, and that there are no switch statements without a default (see the fragment after this slide).
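    • An illustrative fragment (ours, not from the source) containing exactly the violations such rules would flag:

      /* goto, multiple returns, and a switch without a default: each one
       * breaks one of the example coding standards above. */
      int classify(int v)
      {
          if (v < 0)
              goto error;           /* goto used: flagged */
          switch (v) {              /* no default case: flagged */
          case 0:
              return 0;             /* more than one return: flagged */
          }
          return 1;
      error:
          return -1;
      }

      int main(void) { return classify(0); }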
  • 23.
    • Data areas and types that are declared but not used are highlighted.
    • Users can define their own static analysis metrics to check on the use of code.
      • For instance, these facilities can be used to restrict access to certain library routine calls (such as malloc), as sketched after this slide.
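    • The kind of code these checks would flag (an illustrative fragment, not tool output):

      #include <stdlib.h>

      typedef struct { int x; } spare_t;  /* declared but never used: highlighted */

      int allocate_buffer(size_t n, char **out)
      {
          int unused_count = 0;           /* declared but never used: highlighted */
          *out = malloc(n);               /* banned under a user-defined rule */
          return *out != NULL;
      }

      int main(void)
      {
          char *buf;
          if (allocate_buffer(16, &buf))
              free(buf);
          return 0;
      }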
  • 24. Code Complexity
    • Many metrics are supported (a worked example follows this slide):
    • McCabe's measure and Myers' extension
    • Essential McCabe's measure
    • Hansen's pair measure of software complexity
    • Halstead's software science metrics
    • Harrison's scope ratio
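    • A worked example of McCabe's measure, using the common counting rule of one plus the number of decision points, with each case label counted as a decision (the function is ours):

      /* Decisions: if (1) + while (1) + two case labels (2) = 4,
       * so cyclomatic complexity V(G) = 4 + 1 = 5. */
      int grade(int score, int bonus)
      {
          if (bonus > 0)            /* decision 1 */
              score += bonus;
          while (score > 100)       /* decision 2 */
              score -= 1;
          switch (score / 50) {
          case 0:                   /* decision 3 */
              return 0;
          case 1:                   /* decision 4 */
              return 1;
          default:
              return 2;
          }
      }

      int main(void) { return grade(40, 5); }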
  • 25. Using Analysis
    • Test Analysis may be used in many different ways and, if required, extended by the user to meet client-specific requirements.
    • The Analysis comprises:
      • a special C pre-processor (TP);
      • an Instrumenter program, which analyses source code files and inserts coverage 'probes' (TI);
      • an additional library of test directives, which may be incorporated into a test script (the TA library).
    • The Analysis can be used in two main ways.
  • 26.
      • The first is as an extension to TH, allowing the user to fully integrate dynamic testing with coverage analysis and static analysis.
    • This diagram illustrates the use of Cantata Analysis in this way:
  • 27.  
  • 28.
    • The pre-processor and instrumenter are used to produce instrumented source code, that is, source code containing probes that facilitate the collection of coverage data (a sketch follows this slide).
    • The instrumenter produces an annotated code listing containing the source code and a static analysis report.
    • The instrumented source code is compiled and linked with a test script and the TA library.
    • When run, the resultant executable produces test results that include static analysis information, coverage analysis information, and dynamic test results.
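    • A sketch of what probe insertion might look like (probe names and output form are invented; the real instrumenter's output is not shown in the source):

      /* Before instrumentation: */
      /*   int sign(int v) { if (v < 0) return -1; return 1; } */

      /* After instrumentation (illustrative): */
      static unsigned probe_hits[3];
      static void coverage_probe(int id) { probe_hits[id]++; }

      int sign(int v)
      {
          coverage_probe(0);        /* function entry */
          if (v < 0) {
              coverage_probe(1);    /* decision: true outcome */
              return -1;
          }
          coverage_probe(2);        /* decision: false outcome */
          return 1;
      }

      int main(void)
      {
          sign(-5);
          sign(5);                  /* both decision outcomes now exercised */
          return 0;
      }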
  • 29. Stand-Alone Analysis
    • Test Analysis can also be used stand-alone.
    • Developers can use their own or a third-party tool to test the software, while generating analysis reports using Test Analysis:
  • 30.  
  • 31. Coverage Analysis Results
    • Coverage Analysis Within a Test Script
    • Users have access to coverage and static analysis results from within the test script.
    • The next example shows a simple decision coverage report, indicating execution of each decision:  
  • 32.  
  • 33.
    • >> WARNING: Switch executed with unknown case value
    • Total of decision outcomes = 14
    • Total outcomes exercised at least once = 6
    • Decision coverage = 42%
    • >> WARNING: DECISION COVERAGE INCOMPLETE
    • Stand-Alone Coverage Analysis
    • In stand-alone mode, coverage analysis can be used to provide coverage reporting for units, modules, or complete application programs.
  • 34. Static Analysis Results
    • Static analysis results can be accessed in a number of ways: 
      • as part of the instrumenter list file 
      • as a comma separated value ( .CSV ) file, which can be exported to spreadsheet or database packages 
      • from within the TH test script, where they may be checked 
    • An example of each of these formats is shown next.  
  • 35.
    • Static Analysis Measures for "control.c a03m00"
    • SOURCE_LINES                     557
    • CODE_LINES                       198
    • COMMENT_LINES                    353
    • BLANK_LINES                      45
    • EXPRESSION_STATEMENTS            53
    • FOR_LOOP_STATEMENTS              3
    • WHILE_LOOP_STATEMENTS            0
    • DO_LOOP_STATEMENTS               0
    • IF_STATEMENTS                    13
    • SWITCH_STATEMENTS                1
    • RETURN_STATEMENTS                1
    • GOTO_STATEMENTS                  0
    • STATEMENTS                       74
    • DECLARATIONS                     13
    • COMMENTS                         215
  • 36.
    • .
    • .
    • MAXIMUM_NESTING_LEVEL            4
    • AVERAGE_NESTING_LEVEL            0.58
    • .
    • .
    • .
    • MCCABE                           19
    • ESSENTIAL_MCCABE                 1
    • MYERS_MCCABE_LOWER               19
    • MYERS_MCCABE_UPPER               21
    • HANSEN_CYCLOMATIC_NUM            18
    • HANSEN_OPERATOR_COUNT            98
    • HARRISON_SCOPE_RATIO             0.45
    • HALSTEAD_NUM_UNIQUE_OPRS         21
    • HALSTEAD_TOTAL_NUM_OPERATORS     194
  • 37.
    • HALSTEAD_NUM_UNIQUE_OPERANDS     75
    • HALSTEAD_TOTAL_NUM_OPERANDS      221
    • .
    • .
    • .
    • CLASSES                          0
    • NEW                              0
    • DELETE                           0
    • THROW                            0
    • TRY_CATCH                        0
    • ANONYMOUS_UNIONS                 0
    • PARAMETERS                       2
    • UNUSED_PARAMETERS                0
    • AUTOMATICS                       13
    • STATICS                          0
    • UNUSED_DATA                      0
    • LOCAL_TYPES                      0
  • 38.
    • UNUSED_LOCAL_TYPES               0
    • DATA ANALYSIS
    • -------------------------------------------------
    • Definitions and Declarations Outside of Any Function
    • -------------------------------------------------
    • Name                     Flags              No. of References
    • ----                     -----              -----------------
    • a03m01                   function           9
    • a03m02                   function           5
    • a03m03                   function           2
    • .
    • .
    • .
  • 39.
    • ct02_dirsep_ca           extern             1
    • ct02_hdrftr_pca          extern             6
    • ct02_notmdte_ca          extern             0 UNUSED
    • .
    • .
    • Definitions and Declarations Within Function : a03m00
    • --------------------------------------------
    • Name                     Flags                  No. of References
    • ----                     -----                  -----------------
    • cr_argv_pca              parameter              1
    • cr_numargc_n             parameter              0 UNUSED
    • vl_baselen_i                                    2
  • 40.
    • vl_error_b                                      10
    • vl_error_n                                      4
    • vl_exit_z                                       0 UNUSED
    • vl_fileloop_i                                   6
    • vl_format_pca                                   4 
    • ============== Overall Preprocessor Measures For The File ============= 
    • TOTAL_MACROS                     1272
    • MACROS_TO_BE_EXPANDED            1270
    • MACROS_NOT_TO_BE_EXPANDED        2
    • SUBSTITUTED_MACROS               111
    • UNSUBSTITUTED_MACROS             0
    • DIRECT_INCLUDE_FILES             10
    • INDIRECT_INCLUDE_FILES           1
    • MAX_INCLUDE_NESTING              2
  • 41.
    • MAX_CONDITION_COMP_NESTING       3
    • ==================== Overall Measures For The File ====================
    • FILE_SOURCE_LINES                767
    • FILE_CODE_LINES                  238
    • FILE_COMMENT_LINES               465
    • FILE_BLANK_LINES                 105
    • FILE_STATEMENTS                  74
    • FILE_DECLARATIONS                50
    • FILE_COMMENTS                    309
    • FILE_FUNCTIONS                   1
    • FILE_CHECKSUM                    846162015
    • FILE_CLASSES                     0
    • FUNCTION_CLASSES                 0
    • FILE_FRIENDS                     0
    • FUNCTIONS_IN_CLASSES             0
