Quality in the Software Industry

Quality in the software industry: how important is it?

  • One of the managers, while discussing a certain topic, said, "We should release our product with utmost quality." Another manager responded by asking, "How do you measure quality?" The first manager replied, "My definition of quality is a satisfied customer."
  • Quality in software industry

    1. 1. 1 Quality in Software Industry – Richa Goel
    2. 2. 2 Contents • Quality: Meaning • Quality in the Software Industry • Implementation of Quality • Measuring Quality • Various Types of Metrics • Case Studies • Software Quality Assurance
    3. 3. 3 What is quality? • A product should meet its specification. • This is problematical for software systems ▫ Some quality requirements are difficult to specify in an unambiguous way; ▫ Software specifications are usually incomplete and often inconsistent.
    4. 4. 4 Quality is… invisible when GOOD, impossible to ignore when BAD
    5. 5. 5 Total Quality Is… • Meeting Our Customer’s Requirements • Doing Things Right the First Time; Freedom from Failure (Defects) • Consistency (Reduction in Variation) • Continuous Improvement • Quality in Everything We Do
    6. 6. 6 Quality of a product is a satisfied customer
    7. 7. 7 Software Quality ??? • Software Quality refers to measurable characteristics such as correctness, maintainability, portability, testability, usability, reliability, efficiency, integrity, reusability and interoperability.
    8. 8. 8 Software quality management • Concerned with ensuring that the required level of quality is achieved in a software product. • Involves defining appropriate quality standards and procedures and ensuring that these are followed. • Should aim to develop a ‘quality culture’ where quality is seen as everyone’s responsibility.
    9. 9. 9 Absence of Quality…
    10. 10. 10 Definition: Software Quality • What is software quality? • What are the attributes of quality for software? This is high quality software because... ?
    11. 11. 11 Software Quality Attributes • Portability • Efficiency • Reliability • Usability • Testability • Understandability • Modifiability
    12. 12. 12 Common problems in software processes • Cost overruns • Schedule delays • Low productivity rate • Poor quality - in software, maintenance or fixes
    13. 13. 13 Quality Concepts  Quality of Design refers to the characteristics that designers specify for an item.  Quality of Conformance is the degree to which the design specifications are followed during manufacturing.  Quality Control is the series of inspections, reviews and tests used throughout the development cycle to ensure that each work product meets the requirements placed upon it.
    14. 14. 14 Quality Concepts • Quality of Design – Design phase • Quality of Conformance – Development phase • Quality Control – Testing phase
    15. 15. 15 Software Quality Measurements We best manage what we can measure.
    16. 16. 16 Quality Measurement • Measurement enables the organization to improve the software process; it assists in planning, tracking and controlling the software project and in assessing the quality of the software thus produced. • Metrics are analyzed and they provide a dashboard to the management on the overall health of the process, project and product. • The validation of the metrics is a continuous process spanning multiple projects.
    17. 17. 17 What to measure? Why measure? How to measure?
    18. 18. 18 What to check? Should we track: • Number of tests? • Pass rate? • Plan versus actual? • Number of defects? • Code coverage? • Functional coverage? • Performance figures?
    19. 19. 19 What to measure? • Resource • Defects • Process • Code • Product / Project
    20. 20. 20 Metric Classification • Products ▫ Explicit results of software development activities ▫ Deliverables, documentation, by-products • Processes ▫ Activities related to production of software • Resources ▫ Inputs into the software development activities ▫ hardware, knowledge, people
    21. 21. 21 Types of Software Metrics • Product metrics – e.g., size, complexity, design features, performance, quality level • Process metrics – e.g., effectiveness of defect removal, response time of the fix process • Project metrics – e.g., number of software developers, cost, schedule, productivity
    22. 22. 22 Why measure?  Are we meeting our business objectives?  Are our customers satisfied with our products and services?  Are we earning a fair return on our investments?  Can we reduce the cost of producing the product or service?  How can we improve the response to our customers’ needs or increase the functionality of our products?  How can we improve our competitive position?  Are we achieving the growth required for survival? Without the right information, you are just another person with an opinion
    23. 23. 23 Why Measure Software? • Determine the quality of the current product or process • Predict qualities of a product/process • Improve quality of a product/process
    24. 24. 24 Why measure? • By analyzing the metrics the organization can take corrective action to fix those areas in the process, project or product which are the cause of the software defects.
    25. 25. 25 How to measure?
    26. 26. 26 Purpose of Measuring  Cost saving  Employee satisfaction  Customer satisfaction  Quality
    27. 27. 27 Purpose of Measuring • Cost saving ▫ Quality of Product: Defects per KLOC or FP ▫ Project Status: Tracking against estimated schedule, budget, size • Employee satisfaction ▫ Work Effort: Each team member utilized • Customer satisfaction ▫ Satisfaction: Survey (Six Months) • Quality ▫ Quality of Process: COQ, ROI on QA, amount of rework, quality of team time and teamwork
    28. 28. 28 Goal of metrics • To improve product quality and development-team productivity • Concerned with productivity and quality measures ▫ measures of SW development output as a function of effort and time ▫ measures of usability
    29. 29. 29 Help in… • Allows manager to (1) assess status of ongoing project (2) track project risks (3) uncover problem areas (4) adjust tasks or workflow (5) evaluate team’s ability to control quality
    30. 30. 30 Metrics flow: Define metrics to be collected – project & support group metrics => Establish data collection mechanism and send the metrics data to the QA group => Analyze metrics => Arrive at organizational process capability baseline => Store data for future use in the metrics database
    31. 31. 31 Metrics Data Capture – Frequency & Sources  What should be the frequency of collecting the data?  Sources for the data capture of the metrics ▫ Timesheets ▫ Project plan, schedule (MPP, XLS) ▫ Defect tracking system ▫ Ticket tracking sheet
    32. 32. 32 Goal 1: Improve software project planning  How good is the software effort planning?  How good is the software scheduling?
    33. 33. 33 Effort Variance Metric Effort Variance (in %) = [Actual Effort – Planned (Estimated) Effort] * 100 / [Planned (Estimated) Effort]  Actual Effort (in hrs) = effort spent on Planning, Tracking, Configuration, Defect Prevention, Requirements, Design, Coding, Review, Rework & Testing  Planned Effort (in hrs) = estimated effort for Planning, Tracking, Configuration, Defect Prevention, Requirements, Design, Coding, Review, Rework & Testing This measures the variance of the actual effort against the estimated effort, expressed in person-hours (based on daily working hours).
    34. 34. 34 Schedule Variance Metric Schedule Variance (in %) = [Actual Duration - Planned Duration] * 100 / [Planned Duration]  Actual Duration (in days) = Actual end date - Planned start date  Planned Duration (in days) = Planned end date - Planned start date This measures the variance of the actual schedule against the estimated schedule, in calendar days.
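    Taken together, the two variance formulas reduce to simple percentage ratios. A minimal Python sketch (the function names and sample figures below are illustrative, not taken from the deck):

        def effort_variance_pct(actual_effort_hrs, planned_effort_hrs):
            # Effort Variance (%) = (Actual Effort - Planned Effort) * 100 / Planned Effort
            return (actual_effort_hrs - planned_effort_hrs) * 100.0 / planned_effort_hrs

        def schedule_variance_pct(actual_duration_days, planned_duration_days):
            # Schedule Variance (%) = (Actual Duration - Planned Duration) * 100 / Planned Duration
            return (actual_duration_days - planned_duration_days) * 100.0 / planned_duration_days

        # Example: 1150 actual vs 1000 planned hours, 46 actual vs 40 planned days
        print(effort_variance_pct(1150, 1000))   # 15.0 -> effort overran the estimate by 15 %
        print(schedule_variance_pct(46, 40))     # 15.0 -> schedule overran the estimate by 15 %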
    35. 35. 35 Schedule Metrics - Sample [Tracking-sheet template: Project Code and Name; Planned Schedule vs. Actual Schedule recorded per phase (Requirement, Design, Code/Unit Test, Testing, Delivery / Release); sign-off fields for Prepared By, Reviewed By (MR) and Data Given By, each with Sign/Date]
    36. 36. 36 Goal 2 : Improve Productivity What is the productivity in different types of projects?
    37. 37. 37 Productivity Metrics  Development productivity = Total project effort / Project size in LOC or FP  Testing productivity = Total effort spent in execution of test cases / Total number of test cases executed  Maintenance productivity = Cumulative actual effort for a category of CRs completed / Number of CRs completed in the category
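    The three ratios above can be computed directly; a small Python sketch with illustrative numbers (not from the deck):

        total_project_effort_hrs = 2000.0
        project_size_loc = 10000                # or function points
        development_productivity = total_project_effort_hrs / project_size_loc
        print(development_productivity)         # 0.2 hours of effort per LOC

        test_execution_effort_hrs = 120.0
        test_cases_executed = 300
        testing_productivity = test_execution_effort_hrs / test_cases_executed
        print(testing_productivity)             # 0.4 hours per executed test case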
    38. 38. 38 Goal 3 : Reduce number of defects  How many defects are we getting with respect to the effort spent?  How efficient and Effective are our reviews?
    39. 39. 39 Defect Density Metric A "Defect" is a:  Deviation from a standard  Deviation from a requirement  Anything that causes customer dissatisfaction Defect Density (DD) = Total no. of defects / Total effort (in hrs)  Total no. of defects = Pre-delivery defects (reviews & testing) + Post-delivery defects (reported after delivery to the customer)  Total effort (in hrs) = Planning, Tracking, Configuration Management, Defect Prevention, Requirements, Design, Coding, Review, Rework & Testing
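    A quick illustration of the defect density formula above (all figures are made up for the example):

        pre_delivery_defects = 40     # found in reviews and testing
        post_delivery_defects = 5     # reported by the customer after delivery
        total_effort_hrs = 900.0      # effort across all the phases listed above

        defect_density = (pre_delivery_defects + post_delivery_defects) / total_effort_hrs
        print(round(defect_density, 3))   # 0.05 defects per hour of effort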
    40. 40. 40 Review Efficiency Metric Review efficiency = Review defects (including unit testing defects) / Review effort in hrs (including unit testing effort)  Review defects = No. of defects found in Requirements, Design and CUT reviews  Review effort (in hrs) = Requirements, Design and CUT review efforts Example:  SRS review defects = 1  Design review defects = 1  Code review defects = 1  Unit testing defects = 0  Review effort = 15 hrs Review efficiency will be 3 / 15 = 0.2 defects per review hour
    41. 41. 41 Review Effectiveness Metric Review effectiveness = Total no. of review defects * 100 / Total no. of defects (Review + Testing) DEFECT DATA  SRS review defects = 2  Design review defects = 4  Code review defects = 2  Testing defects = 2 Review effectiveness will be (2 + 4 + 2) * 100 / 10 = 80%
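    Both review metrics can be checked against the sample data on the two slides above; a short Python sketch using those figures:

        # Review efficiency (slide 40): 3 review/unit-test defects found in 15 review hours
        review_defects = 1 + 1 + 1 + 0        # SRS + design + code reviews + unit testing
        review_effort_hrs = 15.0
        print(review_defects / review_effort_hrs)          # 0.2 defects per review hour

        # Review effectiveness (slide 41): 8 review defects out of 10 total defects
        review_defect_count = 2 + 4 + 2       # SRS + design + code review defects
        testing_defect_count = 2
        print(review_defect_count * 100.0 / (review_defect_count + testing_defect_count))   # 80.0 %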
    42. 42. 42 Goal 4 : Reduce Cost of Quality  What is the cost of detection (Appraisal Costs)?  What is the cost of Correction (Failure Costs)?  What is the cost of Prevention (Prevention Costs)?
    43. 43. 43 Cost Of Quality (COQ) Metric COQ (in %) = (AC + PRC + FC) * 100 / (AC + PRC + FC + PDC)  Appraisal Cost (AC) = Review + Audit + Testing  Prevention Cost (PRC) = Project Management + Training + CM + Defect Prevention + RFC / Change Request  Failure Cost (FC) = Rework + Idle Time + Complaints + Post-sales Defects  Production Cost (PDC) = Requirements, Design and Coding + Tools / Scripts Development + Manuals
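    A minimal sketch of the COQ calculation; the effort split in person-hours is assumed for illustration:

        def cost_of_quality_pct(appraisal, prevention, failure, production):
            # COQ (%) = (AC + PRC + FC) * 100 / (AC + PRC + FC + PDC)
            quality_cost = appraisal + prevention + failure
            return quality_cost * 100.0 / (quality_cost + production)

        print(cost_of_quality_pct(appraisal=30, prevention=70, failure=120, production=780))   # 22.0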
    44. 44. 44 Goal 5 : Improve customer satisfaction  Are we meeting our commitments to the customer?  How satisfied are customers with our services and products?
    45. 45. 45 Customer Satisfaction Metrics  Customer Satisfaction Index / Rating ▫ Rating on a scale of 1 – 5 through Customer Satisfaction Surveys  SLA Compliance ▫ [(Actual incidents where the SLA was met) * 100] / [Total no. of incidents resolved]  SLA Variance ▫ [(Actual Resolution Time – SLA) * 100] / [SLA]
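    The two SLA metrics as code; a small Python sketch in which the incident counts and the 24-hour SLA are assumed for illustration:

        def sla_compliance_pct(incidents_meeting_sla, total_incidents_resolved):
            # share of resolved incidents that met the agreed SLA
            return incidents_meeting_sla * 100.0 / total_incidents_resolved

        def sla_variance_pct(actual_resolution_time_hrs, sla_hrs):
            # how far the actual resolution time deviates from the SLA
            return (actual_resolution_time_hrs - sla_hrs) * 100.0 / sla_hrs

        print(sla_compliance_pct(47, 50))   # 94.0 -> 94 % of resolved incidents met the SLA
        print(sla_variance_pct(30, 24))     # 25.0 -> 25 % over the assumed 24-hour SLA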
    46. 46. 46 Tell me now…  Find Cost Of Quality (COQ)? ▫ Appraisal Cost = 3 % ▫ Prevention Cost = 7 % ▫ Production Cost = 78 % ▫ Failure Cost = 12 % ▫ COQ = 22 %  Review efficiency in a project is 0.1, what does it mean? One review defect has been detected after spending 10 hours of review effort  Defect Density helps in… reducing rework and controlling effort variance and schedule variance
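    As a quick check, applying the COQ formula from slide 43 to the figures above: COQ = (3 + 7 + 12) * 100 / (3 + 7 + 12 + 78) = 22 * 100 / 100 = 22 %.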
    47. 47. 47 Case Studies
    48. 48. 48 Management review report • Progress reports • Periodic performance reports • Customer satisfaction feedback • Follow-up reports • Review of significant findings
    49. 49. 49 Software Development Projects
    50. 50. 50 Development metrics • In software development projects, we capture measures of the correctness and robustness of the code. • We track them via: ▫ LOC (Lines of Code) ▫ Function Points ▫ Cyclomatic Complexity
    51. 51. 51 Lines of Code (LOC) analysis • Derived by normalizing (dividing) any direct measure (e.g. defects or human effort) associated with the product or project by LOC. • Size-oriented metrics are widely used, but their validity and applicability are widely debated.
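    For example, normalizing a defect count and an effort figure by size, with made-up numbers:

        kloc = 12.4                   # thousands of lines of code (illustrative)
        defects = 31
        effort_person_months = 20.0

        print(round(defects / kloc, 1))                # 2.5 defects per KLOC
        print(round(effort_person_months / kloc, 2))   # 1.61 person-months of effort per KLOC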
    52. 52. 52 Function-oriented Metrics • Function points are computed from direct measures of the information domain of a business software application and an assessment of its complexity. • Once computed, function points are used like LOC to normalize measures for software productivity, quality, and other attributes. • The relationship between LOC and function points depends on the language used to implement the software.
    53. 53. 53 Software Testing Projects
    54. 54. 54 Check Points • Main objectives of a project: High Quality & High Productivity (Q&P) • Quality has many dimensions ▫ reliability, maintainability, interoperability, etc. • More defects => more chances of failure => lower reliability • Hence the quality goal: have as few defects as possible in the delivered software!
    55. 55. 55 Metrics for Software Testing • Defect Removal Effectiveness ▫ DRE = (Defects removed during the development phase * 100%) / (Defects latent in the product) ▫ Defects latent in the product = Defects removed during the development phase + defects found later by the user • Efficiency of Testing Process (define size in KLOC, FP or requirements) ▫ Testing Efficiency = Size of software tested / Resources used
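    A small Python sketch of DRE and testing efficiency, using assumed defect counts and size figures:

        defects_removed_in_development = 90
        defects_found_later_by_user = 10
        defects_latent = defects_removed_in_development + defects_found_later_by_user

        dre_pct = defects_removed_in_development * 100.0 / defects_latent
        print(dre_pct)   # 90.0 -> 90 % of the latent defects were removed before release

        size_tested_kloc = 25.0
        testing_resources_person_months = 5.0
        print(size_tested_kloc / testing_resources_person_months)   # 5.0 KLOC tested per person-month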
    56. 56. 56 Support Groups Metrics
    57. 57. 57 SEPG & SQA Metrics  Person / Process Trainings  PI Index - Process Improvement suggestions per quarter  SEPG Efforts  Total effort spent in SEPG Activities  Total effort spent in SQA Activities  No. of NCs per project
    58. 58. 58 ISS Metrics  % Calls resolved same day = Number of calls (24-hr category) resolved within one day / Number of user calls (24-hr category)  % Calls resolved in two days = Number of calls (48-hr category) resolved within two days / Number of user calls (48-hr category)  % Network uptime = Network uptime (for the month) / Total available time
    59. 59. 59 RMG Metrics  Average hiring cost per hire = Total hiring cost for the month / Total no. of joinees  Selection ratio = Total no. of offers / Total no. of candidates interviewed  Offer to joinee ratio = Total no. of joinees / Total no. of offers  Offer to decline ratio = Total no. of declines / Total no. of offers
    60. 60. 60 Some important metrics
        Indicator | Metric (Unit/s) | Formula | Base Measure/s | Source/s
        Effort | Effort Variance (%) | (Actual Effort – Planned Effort) * 100 / (Planned Effort) | Actual Effort, Planned Effort | Timesheet
        Schedule | Schedule Variance (%) | (Actual End Date – Planned End Date) * 100 / (Planned Project Duration) | Planned project duration, Planned End Date, Actual End Date | Project schedule
        Size | Size Variance (%) | (Actual Software Size – Estimated Software Size) * 100 / (Estimated Software Size) | Actual size, Estimated size | Estimation sheet
        Productivity | Productivity (KLOC / manday or FP / manday) | (Size) / (Effort) | Effort, Size | Timesheet, Estimation Sheet
    61. 61. 61 Some important metrics
        Indicator | Metric (Unit/s) | Formula | Base Measure/s | Source/s
        Quality | Defect Density (Defects / KLOC or Defects / FP) | (Total No. of Defects) / (Size) | No. of total defects (Pre-delivery + Post-delivery), Size | Review & testing defect log, Estimation Sheet
        Quality | Review Efficiency (Defects / Review Manhour) | (Total no. of review defects) / (Review effort in hours) | Total no. of review defects, Review effort in hours | Defect log
        Quality | Cost of Quality (%) | (Appraisal efforts + Failure efforts + Prevention efforts) * 100 / (Total Project Effort) | Review efforts, Testing efforts, Rework efforts | Timesheet, Project schedule
    62. 62. 62 Sample Tracking Sheet
    63. 63. 63 Software Quality Assurance
    64. 64. 64 What is Software Quality Assurance? • Used to Monitor and Improve the Software Development Process • Making Sure That Standards and Procedures are Followed • Ensures that Problems are Found and Dealt with • Orientated to ‘Prevention’
    65. 65. 65 Standards and Procedures • Framework within which Software Evolves • Standards ▫ Established Criteria to which Software Products are Compared • Procedures ▫ Established Criteria to which Development and Control Processes are Compared • SQA is based on the Following of Standards and Procedures
    66. 66. 66 Techniques • Audit ▫ The Major Technique used in SQA ▫ Perform Product Evaluation and Process Monitoring ▫ Performed Routinely throughout the Software Development Process ▫ Look at a Process and/or Product in depth and compare to Established Standards and Procedures ▫ Provide an indication of the Quality and Status of the Software Product
    67. 67. 67 Benefit of Software Quality Assurance in Projects • Without SQA, many Software Groups would not reach their release goals/deadlines on time • Lowers time spent on mundane areas and lets more time be focused on important areas • Decreases the time from Development to Deployment • Can help catch errors before they are too costly to fix
    68. 68. 68 SQA Activities • Document Auditing • Document Reviews • Metrics Calculation and Analysis • Meetings and Discussions ▫ Final Reports Submission
    69. 69. 69 Reviews
    70. 70. 70 Types of Review • Formal Review • Informal Review
    71. 71. 71 Types of Review Formal Review • Planning • Documented • Thorough • Focused on a certain purpose
    72. 72. 72 Types of Review Informal Review • Undocumented • Fast • Few defined procedures • Useful to check that the author is on track
    73. 73. 73 Why do peer reviews? • To improve quality. • Catches 80% of all errors if done properly. • Catches both coding errors and design errors. • Training and insurance.
    74. 74. 74 Thank You!!
