
VOLUME 21, ISSUE 5 | SEPTEMBER | OCTOBER 2015
ASSET INTEGRITY INTELLIGENCE

SAFER INSPECTIONEERING
Featured Article
PAUL J. RAMIREZ, Quality Assurance Engineer at NASA - Jet Propulsion Laboratory (JPL)
INTRODUCTION

Whether a hardware assembly is being constructed for use nearby on the ground or is to be launched into deep space, the Quality Assurance Engineer (QAE) provides a safety net for mission-critical hardware. The QAE inspects structural welds, micro-electronic installations on antennas, parachute soft goods, and a myriad of other assemblies in order to identify risks and avert hardware failures that may occur millions of miles away or thousands of hours after startup. The variety and complexity of hardware, and the many dimensions of the QA inspection process, combine in such a manner that even the most experienced QAE may miss problems. Even worse, and a major theme of this article, the QAE's very experience and skill set may contribute in unexpected ways to the risk of error.

There are no absolute guarantees of successful quality assurance. For example, cross-training QAEs in multiple facets of QA, or deploying QAEs who are Subject Matter Experts (SMEs), reduces but does not eliminate the risk of faulty QA. Although it does not offer a foolproof method of QA, this article proposes a new perspective on the methodology of QA that may afford modest reductions in the risk of error.

BACKGROUND

An internet search of "Quality Assurance Algorithms" discloses the surprising result that there currently exist no widely used algorithmic approaches guiding the process of inspection. Inspectors and QAEs have undoubtedly built their own informal and undocumented protocols over time, likely based on their field experience, prior knowledge, and disciplinary backgrounds. This informal approach to the inspection process can promote efficient and reliable inspections as long as QAEs encounter essentially familiar challenges in their work. But this very familiarity can impair the QAE's ability to perceive new issues in the structure or assembly under inspection.
QAEs, like everyone else, often see what they expect to see. As the cliché goes: "if all you have is a hammer, everything looks like a nail." For example, the QAE with a materials science background observes small flaws in an assembly and sees evidence of corrosion. Another inspector, trained differently, attributes the same flaws to the effects of mechanical stress. This article aims to make the inspection process less familiar to the QAE and thereby mitigate the risk of errors stemming from familiarity, habit, and unconscious discipline-linked bias.

The QA algorithm set out below focuses on visual inspection of typical hardware assemblies and structures, although occasions arise when auditory, tactile, or even olfactory cues may be significant. Inspection of an assembly or system component proceeds through three phases:

1. Detection: identification of visible or concealed defects;
2. Recognition: recalling elements from the QAE's disciplinary and experiential knowledge base that validate attribution of a visible defect to an underlying cause or source;
3. Interpretation: explaining the meaning of the identified defect and causal nexus in such a way as to ensure appropriate remediation.

These processes usually occur sequentially, although interdependent and recursive pathways inevitably emerge among their elements. For example, attribution of causality often points up possible remediation, but failed efforts at remediation may counter previous attributions of causality. If the fix doesn't work, the bug or defect may have arisen from a different, unidentified source.

Emphasis in this discussion centers on visual inspection. Human vision and cognition remain an enigma: how the QAE sees, what they see, and what they infer from what they see can challenge even the most experienced inspector. As noted, previous experience can often serve as an obstacle to successful ongoing QA.
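The three phases above, including the recursive loop from a failed remediation back to a fresh causal attribution, can be sketched in code. This is a hypothetical illustration only; the function names, the knowledge base, and the corrosion-versus-stress scenario are invented for this sketch and are not part of any actual JPL tooling.

```python
# Minimal sketch of the detection -> recognition -> interpretation cycle.
# A failed remediation falsifies the causal attribution, so the loop
# falls back to the next candidate cause from the knowledge base.

def inspect(defects, knowledge_base, verify):
    """Return (defect, cause, remediation) findings for detected defects."""
    findings = []
    for defect in defects:                          # Phase 1: Detection (given)
        candidates = list(knowledge_base.get(defect, []))
        while candidates:
            cause, remediation = candidates.pop(0)  # Phase 2: Recognition
            if verify(defect, remediation):         # Phase 3: Interpretation
                findings.append((defect, cause, remediation))
                break
            # The fix didn't work: the defect likely has a different,
            # previously unidentified source -- try the next attribution.
    return findings

# Illustrative scenario: small surface flaws may be read as corrosion by
# one discipline and as mechanical stress by another.
kb = {"small surface flaws": [("corrosion", "replace coating"),
                              ("mechanical stress", "add strain relief")]}
# Suppose re-test shows that only the strain-relief fix holds:
verify = lambda defect, fix: fix == "add strain relief"
print(inspect(["small surface flaws"], kb, verify))
# -> [('small surface flaws', 'mechanical stress', 'add strain relief')]
```

The point of the sketch is the `while` loop: interpretation feeds back into recognition, mirroring the article's observation that a failed remediation counters the earlier attribution of causality.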
The challenge addressed in this article is not to develop new ways to document or communicate what the QAE sees; it is instead to reflect on how visual inputs are processed upstream in order to help the QAE see things in new ways. Given that hardware systems vary in size and complexity, the problem of conducting a thorough and accurate visual inspection to identify noticeable risks and defects is non-trivial. An assurance algorithm that reflects the three phases of the inspection process (detection, recognition, and interpretation), and thereby encourages awareness of inspection alternatives, promises to deepen the inspection process and improve its outcomes.

THE SAFER FLOWCHART

Visual inspection performed by any QAE on any structure or assembly should detect and determine all risks and defects exhibited by the hardware assembly. Successful visual inspection can be hindered by ill-posed problems created by assumptions (often from the QAE's own discipline background) about what is and isn't seen. The diagram below posits three (3) phases of the QA process:
[Flowchart: triangles O-D-I representing the three phases: 1. Observation/Detection, 2. Diagnosis/Recognition, and 3. Interpretation/Inspection Report]

and five (5) foci of QA:

1. Structure
2. Assembly
3. Function
4. Energy
5. Risk

The proposed algorithm is divided into five elements: Structure, Assembly, Function, Energy, and Risk. Structure addresses the building, framework, and the way parts of an assembly are arranged. Assembly addresses the collection, putting together, and manufacturing processes of the structure which are intended to allow it to perform its intended purpose or function. Function refers to the specific duty of a single component in the assembly. Energy can refer to the energy that is required for the hardware to function, as well as any harnessed, transferred, or converted energy made by the hardware or its working environment. Finally, Risk targets the possible conflicts that each of these elements may have with one another.

• Structure – the building, framework, and the way parts of an assembly are arranged.
• Assembly – the collection, putting together, and manufacturing processes of the structure to perform its special purpose and/or function.
• Function – the special, specific job, use, or duty of the hardware.
• Energy – the type of energy used to operate the hardware.
• Risk – recognizing the summation of risk by revising in reverse.

PERMUTATION OF SAFER COMPONENTS

The entirety of the hardware assembly can be addressed diagnostically from the point of view of each of the SAFER components. For example, Structure guides inspection of the assembly in terms of how appropriate its dimensions and its potential are to encompass the assembly's operations. Assembly itself focuses attention on the interrelations of components and examines issues of efficiency and economy of operation. Function introduces a time dimension.
For example, functional QA would attempt to envisage the dynamics of the working assembly over time (i.e., does the currently inspected packaging of a parachute make it feasible for successful deployment?). Energy addresses the impact of energy types, quantities, and circuits as the hardware assembly is powered during operation. Risk poses more abstract and, again, time-dependent issues relating to the projected lifetime operation of the assembly within different environments and when exposed to different stresses.

SAFER INVENTORY OF QA SUB-ELEMENTS

LATERAL THINKING AND THE DOWNSIDE OF SUBJECT MATTER EXPERTISE

Most QAEs possess subject matter expertise in one area or another. Such expertise allows the QAE to perform a more efficient, and often more effective, inspection. But the area of expertise itself also handicaps the QAE. Their background can sometimes predispose them to look at the assembly from a distinctive and limited point of view. Questions that might arise to someone with a different discipline background might elude them.

Although it is theoretically possible to imagine a given assembly being inspected by a team of QAEs from different SME backgrounds, this is clearly impractical. The alternative is to utilize lateral thinking approaches. Just as effective proofreading of a written text is facilitated by reading the text from right to left, the SAFER algorithm forces the QAE to grapple with each SAFER element and retrieve the assumptions underlying that element to aid in the formation of a distinct perspective for viewing the assembly. The purpose of right-to-left proofreading is to slow the proofreader down and to interfere with their expectations concerning what will arise from the text next. In the same way, the SAFER algorithm is intended to slow the QAE down by consistently reminding them of discipline perspectives that are not naturally their own.
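The pairwise interrogation implied by "Permutation of SAFER Components" can be made concrete: taking every ordered pair of distinct SAFER elements yields a checklist of cross-discipline risk queries. The sketch below is a hypothetical illustration; the question wording is invented, not drawn from the article.

```python
from itertools import permutations

# The five SAFER foci, as listed in the article.
SAFER = ["Structure", "Assembly", "Function", "Energy", "Risk"]

def risk_queries(elements):
    """One hypothetical query per ordered pair of distinct elements,
    so each element gets to 'interrogate' every other element."""
    return [f"Does {a} constrain or conflict with {b}?"
            for a, b in permutations(elements, 2)]

queries = risk_queries(SAFER)
print(len(queries))  # 5 * 4 = 20 ordered pairs
print(queries[0])    # Does Structure constrain or conflict with Assembly?
```

Ordered pairs (permutations rather than combinations) matter here: "does Energy stress the Structure?" and "does the Structure confine the Energy?" are different questions, which is exactly the slowing-down effect the lateral-thinking discussion aims for.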
ALGORITHMIC MODELING OF SAFER

SAFER can be represented in the form of a transitive closure algorithm. This makes it possible to solve reachability questions efficiently by creating a binary relation between two elements. These QA elements are then allowed to interrogate one another, thereby creating more risk queries to be answered.

THE PURPOSES OF THE SAFER ALGORITHM

The narrow purpose of the SAFER algorithm is to standardize the qualitative assessment process of highly technical, one-off, or low-production-rate products. But there are larger benefits that stem from such standardization. For example, QA inevitably works in partial opposition to other parties within the engineering organization. The priorities of hard-charging engineers may include a stepped-up time schedule and expedited development of projects. Therefore, the QAE may often be seen as someone who slows down work and imposes needless constraints on developers' free-flowing ideas. Sometimes that perspective on QA is undoubtedly accurate.

At other times, however, it may very well be the case that effective QA greatly benefits project development and ultimately shortens times to completion of a successful project. One way to improve the contribution of QA to overall project success is to make the purposes and methods of QA clearer to other stakeholders. Sharing the SAFER algorithm with engineers could facilitate better communication by illustrating the QAE's common goal in ensuring the successful production and operation of quality hardware.

For more information on this subject or the author, please email us at
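The transitive-closure model mentioned under "Algorithmic Modeling of SAFER" can be sketched with Warshall's classic algorithm: represent "element A raises a question for element B" as a binary relation, then close it so indirect dependencies surface as reachability answers. The relation shown below is purely illustrative; the article does not specify which SAFER elements interrogate which.

```python
# Warshall's transitive closure over a binary relation between QA elements.
def transitive_closure(nodes, edges):
    """Return the set of all (a, b) pairs where b is reachable from a."""
    reach = set(edges)
    for k in nodes:            # allow paths routed through intermediate k
        for i in nodes:
            for j in nodes:
                if (i, k) in reach and (k, j) in reach:
                    reach.add((i, j))
    return reach

elements = ["Structure", "Assembly", "Function", "Energy", "Risk"]
# Hypothetical relation: each element raises questions for the next.
edges = [("Structure", "Assembly"), ("Assembly", "Function"),
         ("Function", "Energy"), ("Energy", "Risk")]
closure = transitive_closure(elements, edges)
print(("Structure", "Risk") in closure)  # True: an indirect risk query
```

Once the closure is computed, any reachability question ("can a structural assumption ultimately create a risk query?") is a constant-time set lookup, which is the efficiency the article alludes to.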
PAUL RAMIREZ

Paul J. Ramirez holds a degree in Materials Science and Engineering Technology from Don Bosco College, Rosemead, CA, and a Self-Designed B.A. degree in Chinese from Chapman University, Orange, CA. He is a recipient of the prestigious George Kehl Award from the American Society of Materials and the International Metallographic Society for research on explosively formed projectiles. Paul is also a recipient of the Horatio Alger Military Scholar Award. He is a former Combat Infantryman, Paratrooper, and Unexploded Ordnance Technician. Currently, Paul serves as a Quality Assurance Engineer at NASA/JPL in Pasadena, California, assisting the Deep Space Network and project INSIGHT.