"Automated Defect Prevention for Embedded System Software ...
  • Increasing Software Quality Through Automated Error Prevention (AEP): Automated Error Prevention provides a non-intrusive, cost-effective way for software development organizations to begin increasing software quality, reliability, and maintainability. AEP does this by focusing on preventing errors through automated code inspection, automated unit testing, and the gathering of software quality metrics across the software lifecycle. Many development organizations are torn between increasing software quality and keeping their development schedules cost-effective. Required to raise quality within compressed development cycles, many are at a loss as to how to apply a true focus on "software quality" early in the lifecycle. This presentation outlines how to quickly begin increasing the quality of your software and reducing the costly defects found during integration and beyond, while aligning with your organization's focus on process improvement.
  • Although organizations are beginning to change, many still approach quality and test as they have for the last 25 years:
    Bullet 1: When the decision is made to move forward on the development of a product (or even a project), the deliverable is described in terms of its functionality via requirements. This is, of course, the only way to know exactly what must be delivered as an end product. Such functional requirements are generally easy to state, for example: "The user shall enter his/her account number in the appropriate form field and press Enter to retrieve his/her account information." But there are no requirements stating that "the product shall be delivered in a high-quality state, thereby ensuring minimal customer aggravation." The reason this can't be done is that "high quality" is a nebulous, subjective term. Quality is typically described as the result of customer dissatisfaction, or as a quantifiable measure after the fact (e.g., the number of defect issues passing through a CCB). When "quality requirements" do exist, they are put on the back burner to be taken care of during integration/test; that's where you resolve all of the bugs anyway, right?
    Bullet 2: It is striking how many organizations, many of them CMMI Level 3 and above, still apply a development process, albeit now "iterative" rather than "waterfall," that focuses on Design, Code, then "Integration and Test". I&T are lumped together: testing begins only once you have integrated, which implies no real focus on test (i.e., quality) before the integration phase.
    Bullet 3: When testing is done earlier than integration, it is typically minimal and very short-lived. We will discuss this a bit later.
    Bullet 4: The whole focus on quality is really "testing for and fixing bugs": a very reactive approach to quality.
    Bullet 5: Most organizations have a "failure to communicate" between groups. The developers don't talk to Test, and the testers don't talk to QA; or when they do talk, they don't speak the same "language". It's an eye-rolling, finger-pointing culture.
    Bullet 6: For most organizations, "quality" is driven primarily by the Test and/or QA organization.
  • This slide shows how important preventing errors is in both the short and long term. Notice that 85% of defects are introduced in the coding phase of software development, yet if these defects are not found and fixed, they can have an enormous impact on the cost and stability of the code in later phases. What would cost $25 to fix today could cost thousands of dollars once fielded. What this slide is really telling you is to put a major focus on coding and unit testing (next slide).
  • If most defects are introduced in these early phases, and the cost of addressing them there is lower, then why don't organizations do more early testing? Because "early testing is difficult".

Transcript

  • 1. Parasoft ADP Solutions: Automated Defect Prevention for Embedded Systems Software Development, by Wiktor Grodowski
  • 2.
    • Mostly focuses on Functional Testing tied back to Requirements
    • Design – Code – “Integration and Test”
    • Early Testing is minimal and short-lived
    • Testing efforts are focused on the isolation and subsequent fixing of “bugs” – Error Detection
    • "Failure to Communicate" between Development, Test and Quality Assurance (QA)
    Observations on Software Testing: not much has changed in 25 years.
  • 3. The Cost of Waiting
  • 4.
    • ADP = Automated Defect Prevention
    • Defect Prevention != Error Detection
    • ADP = Defect Prevention + Automation
    • ADP is not a replacement for CMM/CMMI and the like; it is a change in the software development approach.
    • ADP is practical, flexible, and down-to-earth, based on over 20 years of experience
    • http://www.adpqb.org
    What is ADP all about?
  • 5.
    • ADP originates from the work of Deming and Juran
    • It attempts to introduce Deming's methodology into the SDLC
    • Follow the basic steps:
      • Identify a defect
      • Find its cause
      • Locate where and why it was introduced
      • Implement preventive methods
      • Monitor the process
    • Start preventing errors instead of detecting them!
    A brief history
  • 6.
    • Plan – establish objectives and processes
    • Do – implement new process
    • Check – measure the results and compare against expected outcome
    • Act – analyze the differences, determine where to apply changes to include improvement
    Deming's wheel
  • 7. ADP principles
    • Principle 1: Establishment of Infrastructure
      • Build a strong foundation through integration of people and technology
    • Principle 2: Application of General Best Practices
      • Learn from others’ mistakes
      • MISRA, JSF, other coding standards
    • Principle 3: Customization of Best Practices
      • Learn from your own mistakes
    • Principle 4: Measurement and Tracking of Project Status
      • Understand the past and present to make decisions about the future
    • Principle 5: Automation
      • Let the computer do it!
    • Principle 6: Incremental Implementation of ADP’s Practices and Policies
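Principle 2's coding standards (MISRA, JSF and the like) mostly exist to rule out constructs that compile fine but cause field defects. As a rough illustration of the kind of rule such standards enforce (the exact rule texts are licensed and not quoted here), the sketch below contrasts a risky idiom with a compliant one; all names are made up for the example:

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

#define BUF_LEN 8u  /* named constant instead of a magic number */

/* Non-compliant style (the kind a static checker flags automatically):
 *
 *   if (status = read_sensor()) { ... }   -- assignment where a
 *   comparison was intended, plus an unchecked array write.
 */

/* Compliant style: explicit bounds check, single exit point,
 * fixed-width types. Returns false instead of overflowing the buffer. */
bool store_sample(uint8_t buf[], uint32_t idx, uint8_t value)
{
    bool ok = false;
    if (idx < BUF_LEN) {   /* explicit bounds check before the write */
        buf[idx] = value;
        ok = true;
    }
    return ok;
}
```

The point is that every one of these style constraints is mechanically checkable, which is what makes automating the inspection practical.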
  • 8. Establishment of infrastructure
    • People – extending traditional roles
      • Must promote communication, efficiency, productivity and job satisfaction
      • Each role must include defect prevention tasks
    • Understanding roles
      • All must understand how to adhere to the roles they are given
      • Define group behavior
    • Technology – minimum and more
      • SCMS
      • Build system
      • BTS
      • Reporting system
  • 9. Application of best practices
    • Most software projects share the same characteristics
    • Best practices can be used to limit:
      • Human errors
      • Common software defects
    • Where do the best practices come from?
    • Different granularity of best practices:
      • Organizational-level (EVMS – Earned Value Management System, general guidelines for use of CMS)
      • Design-level (suggestions for usage of specific technologies or techniques, such as design patterns, Ajax or SOA)
      • Code-construction level (procedures for peer programming, writing unit tests, code review etc.)
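At the code-construction level, "writing unit tests" can be made concrete with a minimal example. The function and its cases below are illustrative, not from the deck; they show what a generated or hand-written unit-test harness boils down to, a pattern especially common in embedded code where silent wrap-around would be a defect:

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical unit under test: saturating 16-bit addition. */
uint16_t sat_add_u16(uint16_t a, uint16_t b)
{
    uint32_t sum = (uint32_t)a + (uint32_t)b;  /* widen to avoid wrap */
    return (sum > UINT16_MAX) ? UINT16_MAX : (uint16_t)sum;
}

/* Minimal unit test: nominal, boundary, and identity cases. Tools can
 * generate this scaffolding; the cases are what a reviewer checks. */
void test_sat_add_u16(void)
{
    assert(sat_add_u16(1u, 2u) == 3u);                  /* nominal    */
    assert(sat_add_u16(UINT16_MAX, 1u) == UINT16_MAX);  /* saturation */
    assert(sat_add_u16(0u, 0u) == 0u);                  /* identity   */
}
```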
  • 10. Customization of best practices
    • Must address project-specific problems
    • It is based on Deming's principles
      • Monitor process
      • Identify a defect
      • Find cause of a defect
      • Locate part of a process that let the defect slip through
      • Modify process to prevent the defect from reappearing
      • Monitor process (again)
    • A static analysis tool should be used to monitor both practices automatically
  • 11. Measurement and tracking
    • Measures must be quantitative
      • Absolute values (number of defects)
      • Metrics (possession of a given attribute, confidence factor)
    • Measurement:
      • What and how can be measured?
      • What does the measure or metric indicate?
      • What decisions can be based on this information?
    • Tracking:
      • What data can be tracked?
      • How is the data tracked?
      • What do we gain from tracking this data?
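As a small worked example of an "absolute value" measure from the slide, a reporting system might track defect density per build. The function below is an illustrative sketch (the name and the per-KLOC convention are assumptions, not from the deck):

```c
#include <assert.h>
#include <stdint.h>

/* Defect density: defects per thousand lines of code (KLOC), a common
 * absolute measure a reporting system can track per module per build. */
double defect_density(uint32_t defects, uint32_t lines_of_code)
{
    if (lines_of_code == 0u) {
        return 0.0;  /* guard: an empty module has no meaningful density */
    }
    return ((double)defects * 1000.0) / (double)lines_of_code;
}
```

Tracking this number over successive builds is what turns a one-off measurement into a trend a manager can act on.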
  • 12. Automation
    • What is automation?
      • "An activity that can be performed completely without human intervention"
    • "Automation of mundane and repetitive tasks improves people's satisfaction and effectiveness"
    • The goal is not to substitute for people but to improve their working conditions
    • Automation improves product quality
    • Automation facilitates human communication
    • Automation helps to implement and enforce best practices and organizational standards
    • Automation improves people's productivity
    • Automation helps to collect measurement data
  • 13.
    • Code reviews are time-intensive and human-intensive
      • Review process can be done with a “static analysis” tool
    • A quality focus is typically NOT built into product schedules, so there is little time to do proper testing.
      • Automation is required to leverage these precious cycles
    • Developers write more testing procedures/code than the code they are developing for delivery!
      • ADP "automates" much of the driver, stub, and infrastructure creation, as well as providing metrics like code coverage
    • Developers typically do NOT like testing.
      • Some problems will never be solved!
    Why Automation is Important !
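The "drivers and stubs" point is easiest to see with an example. In production the unit under test reads hardware; for host-side testing, the hardware call is replaced by a stub so the test can force any reading it wants. This is exactly the scaffolding the slide says tools can generate; the sketch below is hand-written and all names are illustrative:

```c
#include <assert.h>
#include <stdbool.h>
#include <stdint.h>

/* Stub for the hardware-facing call: instead of reading an ADC
 * register, it returns a value the test driver has set. */
static int16_t stub_temp = 0;
int16_t read_temperature_raw(void) { return stub_temp; }
void stub_set_temperature(int16_t t) { stub_temp = t; }

/* Unit under test: checks the reading against a safe operating range.
 * On the host it sees the stub; on the target it would see hardware. */
bool temperature_ok(void)
{
    int16_t t = read_temperature_raw();
    return (t >= -40) && (t <= 125);
}
```

A test driver simply calls `stub_set_temperature()` to push the unit through in-range and out-of-range cases that would be awkward to reproduce on real hardware.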
  • 14. Incremental implementation of ADP
    • Continuous change through:
      • Learning
      • Adaptation
      • Innovation
    • Change, however, can be overwhelming
    • ADP needs to be implemented gradually, on a group-by-group basis
      • Pilot group(s)
      • Division(s)
      • Organization
    • Best practices should also be introduced one-by-one
  • 15. Incremental implementation of ADP
    • ADP is both iterative and incremental – the phases do not have to follow sequentially
  • 16. Exemplary ADP Deployment [diagram: developer machines (Professional Editions); architect / technical lead (Architect Edition); scheduled build-and-test server (Server Edition); target device; SCM source control holding code & tests; TCM (Team Configuration Manager); test-results reporting viewed by the program manager and architect; nightly results and company-standards deployment]
  • 17.
    • High costs of defects
      • The number of defects in the final product must be kept to a minimum because of high recall costs
      • Human life or safety may be at stake, depending on the stability of the embedded device
    • Important time to market
      • Late product = lost money
    • Mandatory compliance
      • MISRA, JSF, DO-178B, others
      • Contractor's internal standards
    Relation to Embedded Systems
  • 18.
    • Exemplary ROI
      • Unit testing reduced residual errors by almost a factor of 2.
    Is it worth it? [chart: revenues ($) vs. time (months) for on-time vs. delayed market entry; the delayed entry reaches a lower peak revenue within the market's rise-and-fall window (width 2W, delay D)]
  • 19. Is it really worth it?
    • ADP
    • Non-ADP
  • 20.
    • C++Test
      • Coding Standards analysis
      • Flow Analysis
      • Unit Testing - Coverage analysis
      • Code Review
      • Authoring and Automation
    • Concerto
      • Work Planning and Tracking
      • Reporting and Analytics
      • Acceptance testing
      • Data correlation
      • Process Visibility
    • Insure++
      • Runtime memory analysis
    Parasoft Tools supporting ADP
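To make "runtime memory analysis" concrete: the classic defect such tools catch is a heap write one byte out of bounds, which plain testing often passes by luck. The example below is a generic illustration, not tied to any particular tool; the buggy variant is shown in a comment and the fixed variant is what the code does:

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Duplicate a string on the heap.
 *
 * Buggy variant a runtime memory checker flags:
 *     char *dst = malloc(strlen(src));   -- no room for '\0',
 *     strcpy(dst, src);                  -- writes 1 byte past the block
 *
 * The test suite may still pass, because the overwritten byte often
 * belongs to allocator padding; only instrumentation sees the error. */
char *dup_string(const char *src)
{
    size_t n = strlen(src) + 1u;   /* fixed: +1 for the terminator */
    char *dst = malloc(n);
    if (dst != NULL) {
        memcpy(dst, src, n);       /* copies the terminator too */
    }
    return dst;
}
```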
  • 21. Benefits of ADP (X = provides the benefit; column assignment reconstructed from the flattened slide table)
    Benefit                                                                     Testing  ADP
    Find errors                                                                    X      X
    Uses testing results to fix errors                                             X      X
    Understands and anticipates common coding mistakes that can lead to
    poor software quality                                                                 X
    Uses testing as a process measurement technique                                       X
    Learns from test results, improves processes and prevents future errors               X
    Reduces the probability of errors reoccurring                                         X
    Addresses the critical roles in creating software                                     X
    Provides tools and processes to keep team members' code at the same level             X
    Uses testing results to fix the development process                                   X
  • 22. Questions & Answers www.parasoft-embedded.com