Software Quality Metrics Methodology - Tanmi Kiran
Transcript

  • 1. Kiran Lata Gangwar and Tanmi Kapoor, M.Tech (S.E.)
  • 2. IEEE Standard for a Software Quality Metrics Methodology
       - Sponsor: Software Engineering Standards Committee of the IEEE Computer Society
       - Approved 8 December 1998 by the IEEE-SA Standards Board
  • 3. Use:
       - organizational experience
       - required standards
       - regulations
       - laws
     Consider:
       - contractual requirements
       - cost or schedule constraints
       - warranties
       - customer metrics requirements
       - organizational self-interest
  • 4. 1.2 Determine the list of quality requirements:
       - Survey all involved parties
       - Create the list of quality requirements
     1.3 Quantify each quality factor:
       - For each quality factor, assign one or more direct metrics to represent the quality factor
       - Assign direct metric values to serve as quantitative requirements for that quality factor
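A minimal sketch of how the output of steps 1.2 and 1.3 above might be recorded, assuming a simple in-memory Python structure; the quality factor "reliability" and the metric "mean_time_to_failure" are illustrative examples, not names taken from the standard.

    from dataclasses import dataclass, field

    @dataclass
    class DirectMetric:
        """A direct metric assigned to represent a quality factor (step 1.3)."""
        name: str
        target_value: float   # direct metric value serving as the quantitative requirement
        unit: str

    @dataclass
    class QualityRequirement:
        """One entry in the list of quality requirements (step 1.2)."""
        factor: str                                  # e.g. "reliability" (illustrative)
        metrics: list = field(default_factory=list)  # list of DirectMetric

    # Illustrative: quantify the "reliability" factor with one direct metric.
    requirements = [
        QualityRequirement(
            factor="reliability",
            metrics=[DirectMetric(name="mean_time_to_failure",
                                  target_value=100.0, unit="hours")],
        )
    ]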
  • 5.
       - Identify tools
       - Describe data storage procedures
       - Establish a traceability matrix
       - Identify the organizational entities that participate in data collection and are responsible for monitoring it
       - Describe the training and experience required for data collection and the training process for the personnel involved
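One possible shape for the traceability matrix mentioned on this slide, assuming it is kept as a plain table keyed by quality factor; every entry shown (metric, data items, tool, responsible entity) is an illustrative placeholder.

    # Hypothetical traceability matrix: quality factor -> direct metric -> the data
    # items needed to compute it, plus the tool and entity responsible for collection.
    traceability_matrix = {
        "reliability": {
            "direct_metric": "mean_time_to_failure",
            "data_items": ["failure_timestamps", "operating_hours"],
            "collection_tool": "defect tracking system",   # illustrative
            "responsible_entity": "QA team",               # illustrative
        },
    }

    for factor, row in traceability_matrix.items():
        print(factor, "->", row["direct_metric"],
              "computed from", ", ".join(row["data_items"]))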
  • 6.
       - Test the data collection and metrics computation procedures on selected software that will act as a prototype
       - Select samples that are similar to the project(s) on which the metrics will be used
       - Examine the cost of the measurement process for the prototype to verify or improve the cost analysis
       - Use the results collected from the prototype to improve the metric descriptions and the descriptions of data items
  • 7.
       - Using the formats in the table, collect and store data in the project metrics database at the appropriate time in the life cycle
       - Check the data for accuracy and proper unit of measure
       - Monitor the data collection
       - Check for uniformity of data if more than one person is collecting it
       - Compute the metric values from the collected data
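A small sketch of the "compute the metric values" step above, with a basic accuracy and unit check on the collected data; the MTTF formula (operating hours divided by failure count) is only an illustrative direct metric, not one prescribed by the standard.

    def compute_mttf(operating_hours: float, failure_count: int) -> float:
        """Illustrative direct metric: mean time to failure, in hours."""
        if operating_hours < 0 or failure_count < 0:
            raise ValueError("collected data failed the accuracy check")
        if failure_count == 0:
            raise ValueError("no failures recorded; MTTF undefined for this sample")
        return operating_hours / failure_count

    # Store the computed value together with its unit of measure so later
    # checks can confirm that the proper unit was used.
    record = {"metric": "mean_time_to_failure",
              "value": compute_mttf(1200.0, 10),
              "unit": "hours"}
    print(record)   # {'metric': 'mean_time_to_failure', 'value': 120.0, 'unit': 'hours'}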
  • 8.
       - Interpret and record the results
       - Analyze the differences between the collected metric data and the target values
       - Investigate significant differences
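One way the comparison between collected metric values and target values might be coded, assuming a simple relative-deviation rule for what counts as a "significant" difference; the 10% tolerance is an assumption, not a value from the standard.

    def analyze_against_target(measured: float, target: float, tolerance: float = 0.10):
        """Flag a metric value whose relative deviation from the target exceeds the tolerance."""
        deviation = (measured - target) / target
        return {"measured": measured, "target": target,
                "deviation": deviation,
                "investigate": abs(deviation) > tolerance}

    result = analyze_against_target(measured=82.0, target=100.0)
    print(result)   # deviation of -18% -> flagged for investigation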
  • 9.
       - Interpret and record the results
       - Unacceptable quality may be manifested as excessive complexity, inadequate documentation, lack of traceability, or other undesirable attributes
  • 10.
       - Use validated metrics during development to make predictions of direct metric values
       - Make predictions for software components and process steps
       - Analyze in detail the software components and process steps whose predicted direct metric values deviate from the target values
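A sketch of using a validated predictive metric during development, assuming the validated relationship is a plain linear fit between an illustrative process/product metric (component complexity) and the direct metric; numpy is used for the fit, and the component names, sample data, and target value are all hypothetical.

    import numpy as np

    # Historical data behind the (already validated) predictive relationship:
    # an illustrative predictive metric (component complexity) versus the
    # direct metric value observed later for the same components.
    complexity = np.array([5.0, 12.0, 20.0, 31.0, 45.0])
    direct_metric = np.array([150.0, 120.0, 95.0, 70.0, 40.0])

    # Fit the validated linear relationship once, then apply it to new components.
    slope, intercept = np.polyfit(complexity, direct_metric, deg=1)

    def predict_direct_metric(component_complexity: float) -> float:
        return slope * component_complexity + intercept

    target = 100.0   # illustrative target value for the direct metric
    for name, c in [("parser", 8.0), ("scheduler", 40.0)]:   # hypothetical components
        predicted = predict_direct_metric(c)
        if predicted < target:
            print(f"{name}: predicted {predicted:.1f} < target {target} -> analyze in detail")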
  • 11.
       - Use direct metrics to ensure compliance of software products with quality requirements during system and acceptance testing
       - Use direct metrics for software components and process steps, and compare these metric values with the target values of the direct metrics
       - Classify software components and process steps whose measurements deviate from the target values as noncompliant
  • 12.
       - The purpose of metrics validation is to identify both product and process metrics that can predict specified quality factor values, which are quantitative representations of quality requirements
       - Metrics shall indicate whether quality requirements have been achieved or are likely to be achieved in the future
  • 13. For the purpose of assessing whether a metric is valid, the following thresholds shall be designated:
       - V: square of the linear correlation coefficient
       - B: rank correlation coefficient
       - A: prediction error
       - α: confidence level
       - P: success rate
  • 14. Validity criteria:
       a) Correlation
       b) Tracking
       c) Consistency
       d) Predictability
       e) Discriminative power
       f) Reliability
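A sketch of three of the tests above (correlation, consistency, predictability), checked against the thresholds V, B, and A from slide 13; scipy.stats supplies the correlation coefficients, the sample values and threshold numbers are illustrative, and the exact test formulas in the standard may differ from these simplified versions.

    import numpy as np
    from scipy import stats

    # Paired samples drawn from the metrics database (illustrative values):
    # predictive metric values and the corresponding quality factor values.
    metric_values = np.array([5.0, 12.0, 20.0, 31.0, 45.0])
    factor_values = np.array([150.0, 120.0, 95.0, 70.0, 40.0])

    # Illustrative thresholds; the standard requires them to be designated,
    # it does not fix these particular numbers.
    V, B, A = 0.7, 0.7, 0.25

    # Correlation: square of the linear (Pearson) correlation coefficient vs. V.
    r, _ = stats.pearsonr(metric_values, factor_values)
    correlation_ok = r ** 2 > V

    # Consistency: magnitude of the rank (Spearman) correlation coefficient vs. B.
    rho, _ = stats.spearmanr(metric_values, factor_values)
    consistency_ok = abs(rho) > B

    # Predictability: relative prediction error of a simple fitted model vs. A.
    slope, intercept = np.polyfit(metric_values, factor_values, deg=1)
    predicted = slope * metric_values + intercept
    prediction_error = np.max(np.abs(predicted - factor_values) / factor_values)
    predictability_ok = prediction_error < A

    print(correlation_ok, consistency_ok, predictability_ok)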
  • 15. 5.3.1 Identify the quality factors sample
       - A sample of quality factors shall be drawn from the metrics database
     5.3.2 Identify the metrics sample
       - A sample from the same domain (e.g., the same software components) as used in 5.3.1 shall be drawn from the metrics database
  • 16. 5.3.3 Perform a statistical analysis
       - The analysis described in 5.2 shall be performed
       - Before a metric is used to evaluate the quality of a product or process, it shall be validated against the criteria described in 5.2. If a metric does not pass all of the validity tests, it shall only be used according to the criteria prescribed by those tests
     5.3.4 Document the results
       - Documented results shall include the direct metric, predictive metric, validation criteria, and numerical results, as a minimum
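A minimal sketch of a documented validation result covering the items listed in 5.3.4; all field names and values are illustrative placeholders.

    # Minimum content of a documented validation result (5.3.4); values illustrative.
    validation_result = {
        "direct_metric": "mean_time_to_failure",
        "predictive_metric": "component_complexity",
        "validation_criteria": ["correlation", "consistency", "predictability"],
        "numerical_results": {"r_squared": 0.96,
                              "spearman_rho": -1.0,
                              "prediction_error": 0.08},
        "validated": True,
    }
    print(validation_result)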
  • 17. 5.3.5 Revalidate the metrics
       - A validated metric may not necessarily be valid in other environments or future applications. Therefore, a predictive metric shall be revalidated before it is used for another environment or application
     5.3.6 Evaluate the stability of the environment
       - Metrics validation shall be undertaken in a stable development environment (i.e., where the design language, implementation language, or project development tools do not change over the life of the project in which validation is performed)