Transcript

  • 1. Software Analysis – Call Trees, Statement Coverage and Dynamic Analysis
      Chris Greenough and David J Worth
      Software Engineering Group, Computational Science & Engineering Department,
      Rutherford Appleton Laboratory
      [email_address]
  • 2. What is software analysis?
      - Software analysis is wide ranging. It has elements in:
          - Software quality
          - Testing
          - Performance
          - Software metrics
          - ...and many others
      - We will consider only a very small subset of these:
          - Testing – data coverage analysis
          - Documentation – call tree generation
          - Build/Configuration – dependency analysis
  • 3. Software Testing!
      - You can never do enough!
      - Testing takes many forms and takes place in all parts of the software life cycle.
      - The implementation and testing phases are where most of the testing effort is concentrated.
      - Software testing is the process of executing a program or system with the intent of finding errors.
      - It involves any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results.
      - Detecting all of the different failure modes of a piece of software is generally infeasible.
  • 4. Testing classifications
      - There is a plethora of testing methods and techniques, serving multiple purposes in different life-cycle phases.
      - Classified by purpose, software testing can be divided into:
          - correctness testing
          - performance testing
          - reliability testing
          - security testing
      - Classified by life-cycle phase, software testing falls into the following categories:
          - requirements phase testing
          - design phase testing
          - program phase testing
          - evaluating test results
          - installation phase testing
          - acceptance testing
          - maintenance testing
      - By scope, software testing can be categorised as:
          - unit testing
          - component testing
          - integration testing
          - system testing
  • 5. Summary on Testing (Pan 1999)
      - Software testing is an art. Most testing methods and practices are not very different from those of 20 years ago. The field is nowhere near maturity, although there are many tools and techniques available to use.
      - Good testing also requires a tester's creativity, experience and intuition, together with proper techniques.
      - Testing is more than just debugging. Testing is not only used to locate defects and correct them; it is also used in validation, verification and reliability measurement.
      - Testing is expensive. Automation is a good way to cut down cost and time. Testing efficiency and effectiveness are the criteria for coverage-based testing techniques.
      - Complete testing is infeasible. Complexity is the root of the problem. At some point testing has to be stopped and the product has to be shipped. The stopping point can be decided by the trade-off between time and budget, or by whether the reliability estimate of the software product meets its requirement.
      - Testing may not be the most effective method of improving software quality. Alternative methods, such as inspection and clean-room engineering, may be even better.
  • 6. plusFORT spag for Coverage and Dynamic Analysis
  • 7. Coverage Analysis
      - Use coverage analysis for:
          - Identifying untested code
          - Profiling execution count
          - Detecting hotspots
      - Items in spag.fig:
          - item 6 set to 2
          - item 234 set to directory for analysis files
          - (items 230 and 232 for other output files)
  • 8. Process for Coverage Analysis
      - Run spag
          - creates new source...
          - ...with calls to probe routines added
      - Compile new source
          - can your compiler handle names like PR$ENT?
      - Run test cases
          - with instrumented executable
      - Run cvranal
          - cvranal [FIG=file] [TO=file] cvr_files
  • 9. The CVRANAL Tool
      - cvranal [FIG=file] [TO=file] cvr_files
      - cvranal.fig
          - Item 501: set to 0 for normal run, 1 to zero all execution counts
          - Item 513: set to column where counts added to source (>72)
      - Outputs
          - Log file
              - Coverage data per routine:
                NO (  0%) Untested code blocks in subprogram CPMIX in file cap.f
                 3 ( 50%) Untested code blocks in subprogram HEATCAP in file cap.f
                          at lines 74 75 77
              - Hotspots:
                807780 : tot = tot+comp(i)
                       : in s/prog MWHT at line 1816 of singch.f
          - Execution count per line added to source
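
      A minimal, hand-written Fortran 77 sketch of the kind of untested block the
      report above points at: an error-handling branch that the test data never
      drives. This is not from the slides, the routine name and values are
      invented, and the real probe calls are inserted by spag rather than written
      by hand.

C     Illustrative only (not generated by spag/cvranal): the driver
C     never supplies a non-positive temperature, so the error branch
C     in CAPVAL would be reported as an untested block (zero count).
      PROGRAM COVTST
      DOUBLE PRECISION CP
      INTEGER IERR
      CALL CAPVAL(3.0D2, CP, IERR)
      WRITE (*,*) 'CP =', CP, ' IERR =', IERR
      END

      SUBROUTINE CAPVAL(T, CP, IERR)
      DOUBLE PRECISION T, CP
      INTEGER IERR
      IERR = 0
      IF (T .LE. 0.0D0) THEN
C        This block is never executed by the test above.
         IERR = 1
         CP = 0.0D0
      ELSE
         CP = 1.0D3 + 0.5D0*T
      END IF
      END
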
  • 10. Dynamic Analysis
      - AKA run-time analysis
      - Use dynamic analysis for:
          - Identifying variables unset when used
          - Identifying “incorrect” values in a variable
      - Items in spag.fig:
          - item 1 set to 4
          - item 36 set to 1 to assume local variables are static
          - item 234 set to directory for analysis files
          - (items 230 and 232 for output files)
  • 11. Process for Dynamic Analysis
      - Run spag
          - creates new source...
          - ...with calls to probe routines added
      - Compile new source
          - can your compiler handle names like QD$D8?
      - Run test cases
          - with instrumented executable
      - View file PROBES.LOG
  • 12. Dynamic Analysis Output
      - File PROBES.LOG:
          GAMCT value is undefined
            subprogram AA0001 line 155 file
            /home/wksh10/examples/plusFORT/model/singch.f
          V(I) value is undefined
            subprogram DDANRM line 1688 file
            /home/wksh10/examples/plusFORT/numerical/dassl.f
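
      The PROBES.LOG entries above come from variables read before any value has
      been assigned to them. A minimal, hand-written Fortran 77 sketch of that
      kind of fault (not from the slides; the names and values are invented, and
      the detection itself is done by the probe routines spag inserts):

C     Illustrative only: SCALE is assigned only when N > 0, so the
C     call with N = 0 reads it while it is still undefined - the
C     sort of use the dynamic-analysis probes flag at run time.
      PROGRAM DYNTST
      DOUBLE PRECISION X(1), XBAR
      CALL NORM(0, X, XBAR)
      WRITE (*,*) 'XBAR =', XBAR
      END

      SUBROUTINE NORM(N, X, XBAR)
      INTEGER N, I
      DOUBLE PRECISION X(*), XBAR, SCALE
      IF (N .GT. 0) THEN
         SCALE = 1.0D0/DBLE(N)
      END IF
      XBAR = 0.0D0
      DO 10 I = 1, N
         XBAR = XBAR + X(I)
   10 CONTINUE
C     SCALE is undefined here when N .LE. 0.
      XBAR = XBAR*SCALE
      END
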
  • 13. NAGWare Tools for Coverage & Dynamic Analysis
  • 14. NAGWare Transformation Tools
      - The NAGWare Fortran Tools provide users with a set of analysis tools for Fortran 77 and Fortran 95 code.
      - The NAGWare f95 Tools accept as input Fortran 77 and fixed- or free-format Fortran 95.
      - Output from the transformational tools is always free format, so these tools are effectively fixed-to-free format translators (see the sketch below).
      - These tools can be used in a range of ways:
          - Standardisation
          - Enforcing coding standards
          - Converting from fixed-format Fortran 77 to free-format Fortran 95
          - Re-engineering code
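
      A hand-written sketch of what the fixed-to-free conversion amounts to; this
      is not actual tool output, and the layout the NAGWare tools emit will
      differ. The same routine first in Fortran 77 fixed format:

C     Fixed-format Fortran 77: statements live in columns 7-72.
      SUBROUTINE AXPY(N, A, X, Y)
      INTEGER N, I
      DOUBLE PRECISION A, X(N), Y(N)
      DO 10 I = 1, N
         Y(I) = Y(I) + A*X(I)
   10 CONTINUE
      END

      and then as free-format Fortran 95:

! Free-format Fortran 95: layout is no longer column-sensitive.
subroutine axpy(n, a, x, y)
  integer :: n, i
  double precision :: a, x(n), y(n)
  do i = 1, n
    y(i) = y(i) + a*x(i)
  end do
end subroutine axpy
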
  • 15. f95 Tools Contents Summary
      - Analysers:
          - nag_coverage95 - coverage analysis
          - nag_depend95 - dependency analysis
          - nag_fcalls95 - call tree generation
          - nag_modules95 - module generation and language checking
          - nag_xref95 - cross-reference generation
  • 16. f77 Tools Contents Summary
      - Analysers:
          - nag_fcalls - call tree generator
          - nag_fxref - cross-reference generator
          - nag_libdoc - library documentor
          - nag_metrics - software metrics analysis
          - nag_pfort - portability analyser
  • 17. Coverage with nag_coverage95
      - nag_coverage95 is the NAGWare f95 coverage tool. It instruments Fortran 95 source file(s), producing code that has the same effect but with extra monitoring code added.
      - The tool breaks the input Fortran code down into segments of straight-line code, and each segment is instrumented with a counter. When compiled and executed, the instrumented program produces a trace detailing how many times each segment of code was executed (see the sketch below).
      - The segments that are executed most may be the best places to concentrate optimisation effort if speed of execution is required. If some segments are not executed at all when running with test data, the testing strategy may need to be improved.
      - The tool also reports segments that contain dead code (code that cannot be reached).
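
      A hand-written Fortran 95 sketch (not tool output; the routine and names
      are invented) of how code divides into segments of straight-line code,
      together with an unreachable statement of the kind reported as dead code:

! Illustrative only: each comment marks one straight-line segment
! that coverage instrumentation of this kind would count.
program seg_demo
  double precision :: v
  integer :: n
  v = 3.5d0
  n = 0
  call clip(v, 0.0d0, 1.0d0, n)   ! this call drives segments 1, 3 and 4
  print *, v, n
end program seg_demo

subroutine clip(x, lo, hi, nclip)
  double precision, intent(inout) :: x
  double precision, intent(in)    :: lo, hi
  integer, intent(inout)          :: nclip
  if (x < lo) then          ! end of segment 1 (the entry block)
    x = lo                  ! segment 2: only when x < lo
    nclip = nclip + 1
  else if (x > hi) then
    x = hi                  ! segment 3: only when x > hi
    nclip = nclip + 1
  end if
  return                    ! segment 4: the common exit
  nclip = -1                ! unreachable: reported as dead code
end subroutine clip
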
  • 18. Coverage with nag_coverage95
      - nag_coverage95 [ options ] infile ... [-first ...] [-last ...]
      - -single_run filename - The default operation of the tool is to produce an instrumented program that writes single-run statistics to the file `base.out' (see `FILES' below). This option allows the output file to be named. Only one of -cumulative and -single_run may be specified.
      - -cumulative filename - This option causes the cumulative trace output across multiple executions to be output to filename. It enables the instrumented program to be invoked with different input data so that a more general view of its behaviour can be obtained. The file `filename.hist' is created, if it does not already exist, when the executable program is run. It contains the internal tables of cumulative execution frequencies and is not intended to be viewed. To reset the cumulative counts to zero, simply remove this file. Only one of -cumulative and -single_run may be specified.
      - -first obj1 - Instruct the tool to load with obj1 ... before the instrumented object files. This option may appear only after the list of source files to be instrumented (see also -last).
      - -L dir - Add dir to the list of directories to be searched by the loader when producing an executable.
  • 19. Coverage with nag_coverage95
      - -l lib - Instruct the loader to load with lib when producing an executable. The space between the option and its argument may be omitted provided that the result does not clash with other options (such as -load).
      - -last obj1 - Instruct the tool to load with obj1 ... after the instrumented object files. This option may appear only after the list of source files to be instrumented (see also -first).
      - -load - Instruct the tool to load all object files into an executable. This is the default if the -source and -objects options are omitted.
      - -objects - Compile the instrumented source to produce object files, but do not load them into an executable unless -load is also specified. Without this option, -load will cause intermediate object files to be deleted after loading. This option may be useful if the instrumented object files are to be placed in a library archive.
      - -source - Create the instrumented source, but do not compile it unless -load or -objects is also specified. Without this option, -load and -objects will delete the instrumented files after compilation.
  • 20. nag_coverage95 example
      - All files must be compiled with nag_modules95 or the NAGWare Fortran 95 compiler to produce the module (.mod) files.
      - seg3p1.f90 is a simple finite element program requiring the library felib90.a and a definitions file def3p1.f90 (a module).
      - Steps:
          - Ensure the whole library is compiled with f95 or nag_modules95
          - Compile the local files: def3p1.f90 -> def3p1.mod and def3p1.o
          - Locate the necessary libraries and module files
          - nag_coverage95 -I../modules90 -L../lib90 -I../felib90 seg3p1.f90 -last def3p1.o
      - This prints many messages about the source files, but links an executable called seg3p1.inst.exe which can be run.
      - Execution produces seg3p1.out, which contains the coverage data.
      - Run coverage95 to view the combined source listing and coverage results.
