Most Common Validation Errors
Identify Frequent Deficiencies to Accelerate Your Validation Projects (based on Frank Houston's research)
QPack™ Application Lifecycle Management
May 2010
About Orcanos
Develops and implements the QPack Medical ALM system
Expert in application lifecycle management
Leading ALM 2.0 and CFR 21 Part 11 forums
Over 50 installations
Industry-based solutions: Software / Medical Device / Semiconductors
Siemens business and technology partner
Oracle partner (OEM)
Offices: Israel, UK, Germany
Customers
Working Without an ALM System (Market → Definition → Develop → Test)

9 Most Common Validation Errors:
1. Missing Information
2. Inconsistency
3. Lack of Needed Detail
4. Traceability
5. Vague Wording
6. Unverifiable Test Results
7. GDP
8. Incomplete Testing
9. Ambiguity
QPack Application Lifecycle Management
[Diagram: lifecycle flow covering Market Requirement Gathering, Requirement Definition, Design, Test Plan Definition, Risk Management, Test Execution, Delivery, and Defect Reporting]
QPack ALM Solution
[Diagram: QPack modules — Marketing Requirements, Product Requirements, Test Plan, Test Execution, Risk Management, Defects Tracking, Tasks, Project Management — with ALM Analytics, SOA integration, and Teamcenter PLM]
Lessons Learned from 1,720 Observations
The goal was to bring the company's software validation evidence up to the level of U.S. FDA expectations.
Identified which documents most frequently contain errors
The results are typical of the problems most companies face
Applying Pareto analysis finds 9 types of deficiencies representing about 41% of the categories
Missing Information
Documents or records omitted fundamental information or content that should have been included:
Compliance with specific standards
Missing other engineering information (HW & mechanics)
Inconsistency
Documents contained statements inconsistent with other statements about the same topic in the same document or in the same validation package:
Jargon
Varying terminology
Contradictions in logic
Lack of Needed Detail
This deficiency applied mostly to requirements documents. The requirements in the validation package did not adequately describe the software:
Poor requirements documents
Data
User interaction
Key processes
Traceability
3 frequent traceability problems:
The traceability matrix did not account for a traceable specification or an observation step in a test script
Broken traces (barren requirements or orphan tests)
Requirement details were not explicitly numbered and traced to associated test steps
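The broken-trace problems above can be caught mechanically. A minimal sketch, assuming requirements and test steps are held in plain dicts (the field names here are illustrative, not QPack's API):

```python
def check_traceability(requirements, test_steps):
    """Flag barren requirements (no test coverage) and orphan test
    steps (tracing to a requirement that does not exist)."""
    req_ids = set(requirements)
    covered = {r for step in test_steps.values() for r in step["traces_to"]}

    barren = req_ids - covered  # requirements never exercised by any test
    orphans = {sid for sid, step in test_steps.items()
               if any(r not in req_ids for r in step["traces_to"])}
    return sorted(barren), sorted(orphans)

# Hypothetical data: REQ-2 and REQ-3 are untested, TS-2 points at a
# requirement that was deleted or never existed.
requirements = {"REQ-1": "...", "REQ-2": "...", "REQ-3": "..."}
test_steps = {
    "TS-1": {"traces_to": ["REQ-1"]},
    "TS-2": {"traces_to": ["REQ-9"]},
}
barren, orphans = check_traceability(requirements, test_steps)
print(barren)   # ['REQ-2', 'REQ-3']
print(orphans)  # ['TS-2']
```

Running a check like this on every build, rather than auditing the matrix by hand at the end, is precisely the gap an ALM tool closes.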
Vague Wording
Documents used generalities such as "in accordance to an approved procedure", "applicable regulatory requirements", or "all associated GxP and business processes"
Vague words such as "may", "possibly", "more or less", and "approximately"
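Vague wording is easy to screen for before a reviewer ever sees the document. A rough sketch of such a scanner, where the word list is an assumption drawn from the examples above rather than an official checklist:

```python
import re

# Illustrative (not exhaustive) list of terms that usually signal
# wording a validation reviewer will reject.
VAGUE_TERMS = ["may", "possibly", "more or less", "approximately",
               "as appropriate", "applicable regulatory requirements"]
PATTERN = re.compile(r"\b(" + "|".join(map(re.escape, VAGUE_TERMS)) + r")\b",
                     re.IGNORECASE)

def find_vague_wording(lines):
    """Return (line_number, matched_term) pairs for human review."""
    return [(n, m.group(1)) for n, line in enumerate(lines, start=1)
            for m in PATTERN.finditer(line)]

doc = ["Calibrate the sensor per SOP-123, section 4.",
       "The reading shall be approximately 5 V."]
print(find_vague_wording(doc))  # [(2, 'approximately')]
```

The matches are candidates, not verdicts; "may" is sometimes legitimate, so the output is a review queue rather than an automatic failure.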
Unverifiable Test Results
Expected results were not described in enough detail for an independent reviewer to compare and verify actual results. The IEEE Standard for Software Test Documentation, Std. 829-1988, Clause 6.2.4, says you should "...provide the exact value (with tolerances where appropriate) for each required output or feature"
For executed scripts, actual results were not recorded or captured in a way that allowed an independent reviewer to compare them to expected results
Actual-result column with no reference to a screenshot
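The IEEE 829 advice quoted above amounts to stating "5.0 V ± 0.1 V" instead of "voltage is correct". A minimal sketch of what that looks like when a test record carries an explicit expected value and tolerance (the record layout is illustrative):

```python
def verify_result(expected, actual, tolerance=0.0):
    """Pass/fail is derived from an explicit expected value and
    tolerance, so an independent reviewer can re-check the verdict."""
    return abs(actual - expected) <= tolerance

# Hypothetical executed-test record: "Output voltage shall be 5.0 V ± 0.1 V"
record = {"expected": 5.0, "tolerance": 0.1, "actual": 5.04}
passed = verify_result(record["expected"], record["actual"],
                       record["tolerance"])
print("PASS" if passed else "FAIL")  # PASS
```

Because the expected value, tolerance, and actual value are all captured in the record, the comparison is reproducible by anyone, which is exactly what "unverifiable" results lack.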
GDP
3 frequent Good Documentation Practice problems:
Hand-recorded data and testing evidence, such as test results, were presented in a way that could cast doubt on their authenticity (e.g., cross-outs)
Data that confirmed a specific requirement was hard to find in the evidence provided (for example, a busy screenshot crammed with data)
Handwritten corrections changed the sense of a requirement or an expected test result
