Introduction to Design for Six Sigma

Speaker Notes
  • During Concept Generation & Selection, we gather the VOC and, using QFD, translate it into requirements. We generate concepts to meet those requirements and then move into design to evaluate design alternatives. We optimize the design during Optimize and verify it during Verify. CDOV is simply another way of looking at Design for Six Sigma: the same Define, Measure, and Analyze tasks are included in Concept Generation & Selection; functional reliability testing is used in the Design, Optimize, and Verify phases; and we also demonstrate reliability during Verify.
  • As an engineering specialty, DFM is quite a discipline in its own right. Design to existing capabilities, not just existing equipment. There is a lot of good design practice behind these points.
  • Transcript

    • 1. 2nd Annual Design for Six Sigma Conference. Introduction to Design for Six Sigma. James M. Wasiloff, MBB. September 14, 2006.
    • 2. Driving Success: Using DFSS in the Development of Battleships at Raytheon. Jon McKenzie, Director of Six Sigma at Raytheon, says, "In early phases of product development we use DFSS in modeling and simulation of how these products will work. After the contract is awarded and we have firmed up exactly what we are going to build and what the system is going to look like, then we use DFSS to derive requirements from the customer, and all the way to the critical design elements that a design engineer will need to put the parts together and make it work." McKenzie explained that DFSS is embedded in Raytheon's Integrated Product Development System (IPDS), which he said "governs everything we do in the company. If you follow IPDS, you are going to get DFSS along the way."
    • 3. Driving Success. "Boeing Picks McNerney as Chief" - Wall Street Journal, July 1, 2005. Jim McNerney has left his post at 3M Company, where he had been the top executive for over four years, to take the job as chairman, president and chief executive of the world's largest aerospace company. McNerney is globally recognized as a strong advocate for the deployment of Design for Six Sigma. The following is a sampling of quotes from McNerney while serving at 3M: "Six Sigma is totally changing 3M. Many of the things that had driven the success of our company for the past eighty years no longer apply." "Six Sigma is not a program. It's our game plan. It will challenge all of us, individually and collectively, to be the very best we can be." "Major goal is to have, for the first time, a common approach to problem solving, new product development, and measurement across the entire company." McNerney preaches Six Sigma to clients: "It changes everybody's lives in the first year. We're betting our performance on Six Sigma. This is something that, if Six Sigma doesn't succeed, the company doesn't succeed." "At 3M, Six Sigma is driven by our executive management teams, who are fully engaged in critical business processes and actively deploying Six Sigma methodologies throughout the organization."
    • 4. Driving Success. Subir Chowdhury, author of the book "The Power of Design for Six Sigma," states in his book: "Most companies spend only 5% of their budget on design, when design typically would determine 70% of the cost of the product, partly because 80% of all quality problems are unwittingly designed into the product itself. In fact, in government contracts, 30 to 40% of the budget is set aside for testing and correcting the product. Imagine! So they are admitting in advance that one-third of the budget must be devoted to correcting the problems they plan to create with the first two-thirds of the budget. My gut says, any time testing and fixing are planned for up front, it is a virtual certainty that testing and fixing will be performed. Plan for failure and you'll get it."
    • 5. Six Sigma: High Level Perspective. [Diagram] Strategy: starting from a defined problem or issue, existing products follow the DMAIC path (Define, Measure, Analyze, Improve, Control) to eliminate defects, e.g., a Black Belt project to improve manufacturing capability; new products, or cases where the manufacturing process cannot provide sufficient improvement, follow the DFSS path (Define, Characterize, Optimize, Verify) to prevent defects by reducing product sensitivity to manufacturing noise.
    • 6. Introduction to Design For Six Sigma
      • Across private industry and government/defense sectors, Design for Six Sigma is a product development process that:
        • Effectively translates the Voice of the Customer into a design
        • Models and quantifies the design’s performance and risk
        • Applies statistical tools to understand, optimize, and control key factors (or develop countermeasures to…) that deliver critical customer attributes robustly in the presence of noise
        • Quantifies risk and facilitates business discussions regarding product delivery quality and reliability early in the Product Development process
    • 7. A Historical Perspective on DFSS
      • Six Sigma developed at Motorola and adopted by GE and others (Steps: DMAIC)
      • DFSS concept originally developed at GE in the late 1990s
      • GE approach requires enhancement in the DFSS concept for successful application to automotive industry product development:
        • More focus on achieving Customer Satisfaction by improving Robustness and High Time in Service performance
        • Aligned to defense product development practices
        • (e.g., DVP&R, Kano Model, Robust Engineering, Reliability & Robustness checklist, etc.)
    • 8. Key Elements of DFSS
      • Should be viewed as an enhancement to the current design process:
        • Bundles existing product development tools
        • Teaches tools just-in-time at appropriate development phases
        • Provides common 6-sigma based language for PD
        • Not a “locked-in” process that requires the use of specified tools at every milestone
        • Each program may select tools according to the ability to fulfill key DFSS elements
      • Is a stage-gate process
      • Is a team- and project-driven process
      • Is a scorecard-driven process
      • Is an integral part of IPS and CMMI
    • 9. Gate Reviews and Project Reviews. [Diagram] Each phase (Define, Characterize, Optimize, Verify) is punctuated by Project Reviews and concludes with a Gate Review.
    • 10. DFSS Scorecard. Summary: a gate review tool and project summary that separates how well the team has done from the answer they got. The project team/Black Belts rate the deliverables, the gate reviewers rate them as well, and the gatekeepers make the gate decision based on the confidence score, performance score, and gate review score. The key phase schedule spans Concept Design, Design Development, Optimize, and Verify Capability. Phase-gate deliverables (each tracked with owner, date, and risks/issues) include: Critical Parameter Database; Subsystem & Subassembly Reliability Test Plans and Current Data/Results Summary; Say/Do Contract Book Update; Marketing Plan Update/Final; Issue Management; Updated Risk Mitigation Plan; Updated Integrated Schedule; Customer Requirements Validation; SA Checklist Complete/Approved; Business Case Update/Final; Post-Launch Product Control Plan; System/Sub-system Robustness Verified; Optimized Design Performance Verified; and Mfg. & Supply Chain Capability Assessment, rolling up to a total score and gate approval.
    • 11. Project Categories
      • New programs or technologies with large design space and some constraints (not usually a “clean sheet design”, but hopefully allows concept selection)
      • New product applications (beyond conceptual phase) for which design does not meet customer wants/functional requirements; usually, limited design space
      • Current model applications with very small design space and many constraints (high degree of optimization “tuning”)
      • New model applications with very aggressive Reliability Requirements
      • Project examples: new products with 10x the reliability of the legacy design
    • 12. Project Prioritization Scheme. [Matrix] Candidate projects are plotted on a grid of importance Critical to GDLS (low/high) versus importance Critical to the Customer (low/high); projects rated high on both axes receive the highest priority.
    • 13. Proposed Project Selection Criteria
      • Impact on customer satisfaction
      • Impact on reliability
      • Design “degrees of freedom”
      • Estimated cost avoidance
      • Impact on maintainability
      • Project duration
      • Project complexity/scope
      • Manufacturing location
      • CAE model availability
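The criteria above can be combined into a simple weighted scoring model to rank candidate projects. The sketch below is illustrative only: the criterion weights, candidate projects, and 1-5 scores are hypothetical, not GDLS values.

```python
# Hypothetical weighted scoring of candidate DFSS projects against the
# selection criteria above (weights and scores are illustrative only).
CRITERIA_WEIGHTS = {
    "customer_satisfaction": 5, "reliability": 5, "design_freedom": 4,
    "cost_avoidance": 3, "maintainability": 3, "duration": 2,
    "complexity": 2, "mfg_location": 1, "cae_model_availability": 2,
}

candidates = {
    "Suspension redesign": {"customer_satisfaction": 4, "reliability": 5, "design_freedom": 3,
                            "cost_avoidance": 4, "maintainability": 3, "duration": 2,
                            "complexity": 2, "mfg_location": 3, "cae_model_availability": 5},
    "Wiring harness update": {"customer_satisfaction": 2, "reliability": 3, "design_freedom": 4,
                              "cost_avoidance": 2, "maintainability": 4, "duration": 4,
                              "complexity": 4, "mfg_location": 3, "cae_model_availability": 2},
}

def weighted_score(scores):
    """Sum of criterion score (1-5) times criterion weight."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

# Rank candidates from highest to lowest weighted score.
for name, scores in sorted(candidates.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores)}")
```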
    • 14. DFSS Key Focus
      • Developing a QFD or other rigorous identification of customer needs to greater depth than is current practice
      • Defining or enhancing a transfer function, y = f(x), that mathematically describes "critical to satisfaction" metrics in terms of design variables (a small sketch follows this list)
      • Better leveraging analytical means to identify & optimize critical design, manufacturing, and assembly elements
      • Summarizing design risk in a scorecard that captures design & manufacturing capability and enforces process discipline
      • Assessing field robustness and using that assessment to guide verification planning & implementation
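As a concrete illustration of the transfer function idea, the minimal sketch below evaluates a hypothetical y = f(x) and estimates the sensitivity of a critical-to-satisfaction metric to each design variable by finite differences. The function, variable names, and nominal values are invented for illustration; they are not from the presentation.

```python
import numpy as np

# Hypothetical transfer function: a critical-to-satisfaction metric y
# expressed in terms of design variables x (all values illustrative).
def y(x):
    clearance, stiffness, mass = x
    return 10.0 * clearance + 0.5 * stiffness - 2.0 * np.sqrt(mass)

x_nominal = np.array([1.2, 40.0, 9.0])

# Finite-difference sensitivities dy/dx_i around the nominal design point.
eps = 1e-6
sensitivities = [
    (y(x_nominal + eps * np.eye(3)[i]) - y(x_nominal)) / eps
    for i in range(3)
]
print("y at nominal:", y(x_nominal))
print("sensitivities (dy/dx):", np.round(sensitivities, 3))
```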
    • 15. Design for Six Sigma: DCOV
      • Define CTS's: understand the customer and understand history; identify Critical to Satisfaction drivers (CTS's) and related functional targets.
      • Characterize System: understand the system and select concepts; flow CTS's down to lower-level y's; relate the CTS's (y's) to CTQ design parameters (x's); characterize robustness opportunities, including high mileage; characterize capability/stability and select a robustness strategy.
      • Optimize Product/Process: design for robust performance (minimize product sensitivity to manufacturing and usage conditions) and design for producibility (minimize process sensitivity to product and manufacturing variations); perform tradeoffs to ensure all CTS's are met.
      • Verify Results: test and verify; assess performance, reliability, and manufacturing; estimate σ for process capability and for product function over time; capture data in scorecards; if results are not OK, loop back through the DFSS assessment.
    • 16. DFSS (DCOV) Flow of Analysis & Tools. [Diagram] VOC → KJ (customer need statements to customer requirements) → QFD (technical requirements) → Function Modeling (functions) → Concept Generation & Selection (concepts) → FMEA → DoE → Robust Design & Tolerance Design → Reliability/Robustness Demonstration, mapped across the D, C, O, and V phases.
    • 17. DFSS (DCOV) Process Map. [Diagram] Define: What are the customer requirements? What measurable system requirements support the customer needs? Characterize: What approach/system architecture would best meet the customer's requirements? How can we design to meet the customer needs? Optimize: How can we optimize for robustness? How can we ensure that the customer needs will be consistently met in various environments and situations? Verify: How will we demonstrate robustness/reliability? How can we verify that this product is reliable and robust?
    • 18. DFSS (DCOV) Flow of Analysis & Tools. [Diagram] The same flow of analysis and tools as slide 16, with the Define (D) phase highlighted.
    • 19. Define Phase
      • The Define phase begins the process with a formal tie of design to Voice of the Customer. This phase involves developing a team and team charter, gathering VOC, performing competitive analysis, and developing CTQs.
      • Crucial Steps:
        • Identify and cascade customer and product requirements
        • Establish the business case
        • Identify technical requirements (CTQ variables and specification limits)
        • Define roles and responsibilities
        • Establish project milestones
      • Key Tools:
        • VOC (Voice of Customer)
        • QFD (Quality Function Deployment)
        • KJ (Kawakita, Jiro)
        • Kano Model
        • Benchmarking
    • 20. Gather, Process & Validate the Voice of the Customer (VOC)
      • VOC: who our customer is, and the wants or needs of the customer (Voice of the Customer)
      • KJ: an affinity diagram that groups similar items of a qualitative nature
      • QFD: the customer's wants or needs translated into system-level Technical Requirements
    • 21. KJ (Kawakita, Jiro) Analysis
      Summary: integrate "images and visualized needs" from the customers with written Voice of the Customer inputs to obtain prioritized Customer Requirements.
      Conducting KJ analysis: create affinity diagrams that gather and group VOC wants/needs and rank the needs based on what is New, Unique and/or Difficult (NUDs).
      • Outputs:
        • Grouped, ranked & prioritized Customer Requirements
        • Customer feedback: ranked importance for each requirement, and ranked strength of competitors' ability to fulfill the Customer Requirements
      • KJ Questions/Guidelines:
        • How are customers being selected?
        • How will you interview the customers?
        • How are you facilitating selection of the NUDs?
        • How are the requirements being prioritized?
    • 22. KJ Results – Voice of Consumers. [Affinity diagram] Customer statements for a cell phone grouped under headers such as: Easy to use; Few or no dropped calls; Good imaging; Can go 3 days without recharging; Provides secure m-Commerce; Supports high-speed internet connectivity; Fits lightly into a pocket or purse; Good entertainment; Optimized display size; Can email quality photos easily and quickly; Camera as good as a stand-alone. Supporting statements include "Device must have easy to use buttons," "Dependable connection in various environments and terrains," "Device must have ability to expand memory," "Must support banking solution," "The device must be able to handle data-intensive service," "Small size / lightweight," "Device must have a good multiplayer gaming solution," "High resolution / maximize screen to phone," and "The device must have a push email solution."
    • 23. Cell Phone VOCs. [Diagram] Cell Phone Project VOCs: few or no dropped calls; fits lightly into a pocket or purse; can go 3 days without recharging; can email quality photos easily & quickly; supports high-speed internet connectivity; provides secure m-Commerce.
    • 24. Quality Function Deployment (QFD)
      • NUDs from KJ Analysis are translated to System level Technical Requirements through Quality Function Deployment (QFD).
      [Diagram] VOC (customer need statements) → KJ (Customer Requirements, NUDs) → QFD (system-level Technical Requirements)
    • 25. QFD/House of Quality. Summary: translates the Voice of the Customer (VOC) into New, Unique and/or Difficult Technical Requirements; identifies conflicts/tradeoffs among Technical Requirements; links system Technical Requirements to subsystems, subassemblies, and components. [House of Quality regions] VOC NUDs (from KJ analysis); Technical Requirements; conflicts/tradeoffs among Technical Requirements; ranked importance for Technical Requirements; and targets and tolerances for Technical Requirements.
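The ranked importance of the Technical Requirements is conventionally computed by weighting each relationship strength by the customer importance rating. A minimal sketch follows; the relationship matrix, importance ratings, and requirement names are invented for illustration, not taken from the presentation's QFD.

```python
import numpy as np

# Hypothetical QFD relationship matrix: rows = customer requirements (NUDs),
# columns = technical requirements; entries use the usual 9/3/1/0 strengths.
relationships = np.array([
    [9, 3, 0, 1],   # "Few or no dropped calls"
    [0, 9, 3, 0],   # "Can go 3 days without recharging"
    [3, 0, 9, 3],   # "Fits lightly into a pocket or purse"
])
customer_importance = np.array([5, 4, 3])          # 1-5 ratings from the KJ/VOC work
tech_requirements = ["RF sensitivity", "Battery capacity", "Mass", "Volume"]

# Technical importance = sum over customer needs of (importance x relationship strength).
tech_importance = customer_importance @ relationships
for name, score in sorted(zip(tech_requirements, tech_importance), key=lambda t: -t[1]):
    print(f"{name}: {score}")
```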
    • 26. DFSS: DCOV Process Map. [Diagram] The same process map as slide 17, with the Characterize (C) phase highlighted.
    • 27. DFSS (DCOV) Flow of Analysis & Tools. [Diagram] The same flow of analysis and tools as slide 16, with the Characterize (C) phase highlighted.
    • 28. Characterize Phase
      • The Characterize phase emphasizes CTQs and consists of identifying functional requirements, developing alternative concepts, evaluating alternatives and selecting a best-fit concept, deploying CTQs and predicting sigma capability.
      • Crucial Steps: Formulate and select concept design
        • Identify potential risks using FMEA
        • For each technical requirement, identify design parameters (CTQs) using engineering analysis such as simulation
        • Define the robustness strategy
        • Use DOE (design of experiments) and other analysis tools to determine CTQs and their influence on the technical requirements (transfer functions)
      • Key Tools:
        • Risk assessment and FMEA
        • Engineering analysis and simulation
        • DOE (Design of Experiments); a small factorial sketch follows this list
        • Critical Parameter Management (CPM)
        • Design tools like TRIZ, Axiomatic Design, Functional Modeling, Pugh
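To show how a designed experiment yields a transfer function relating a CTQ response to design parameters, here is a minimal two-level full-factorial sketch. The factor names and response values are invented; a real study would use replicated, randomized runs and a proper DOE package.

```python
import itertools
import numpy as np

# Hypothetical 2^3 full factorial: factors coded -1/+1, one response per run.
factors = ["temperature", "pressure", "dwell_time"]
design = np.array(list(itertools.product([-1, 1], repeat=3)))
response = np.array([12.1, 13.4, 11.8, 14.9, 12.5, 13.0, 12.2, 15.6])  # invented data

# Fit a main-effects linear model y = b0 + b1*x1 + b2*x2 + b3*x3 by least squares.
X = np.column_stack([np.ones(len(design)), design])
coeffs, *_ = np.linalg.lstsq(X, response, rcond=None)
print("intercept:", round(coeffs[0], 3))
for name, b in zip(factors, coeffs[1:]):
    print(f"main effect of {name}: {round(2 * b, 3)}")  # effect = 2 x coded coefficient
```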
    • 29. CPM - a Disciplined Methodology
      • CPM is a disciplined methodology for capturing product performance in a structured repository.
      [Diagram] Inputs to Critical Parameter Management (CPM) come from Marketing, Design, Systems Engineering, Subject Matter Experts, Manufacturing, Suppliers, Standards, and Partners.
    • 30. CPM – The Strategy
      Summary: creates a formal link between the product's design and customer needs.
      • Outputs:
        • Capability Growth Index (CGI)
        • Predicted capability (Cp, Cpk) for meeting Critical Customer Requirements
      [Diagram] CPM flow-down: VOC → KJ → QFD → system requirements → subsystem requirements → design requirements, moving through concept, modeling, optimize, and verify; transfer functions link lower-layer Cp/Cpk to higher-layer Cp/Cpk and to system capability Cp/Cpk at product launch.
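Predicted capability for a critical parameter can be estimated directly from its mean, standard deviation, and specification limits using the standard Cp/Cpk formulas. A minimal sketch with invented numbers:

```python
# Hypothetical capability calculation for one critical parameter (numbers invented).
usl, lsl = 10.50, 9.50          # specification limits
mean, sigma = 10.08, 0.12       # predicted (or measured) process mean and std dev

cp = (usl - lsl) / (6 * sigma)                      # potential capability
cpk = min(usl - mean, mean - lsl) / (3 * sigma)     # capability, accounting for centering
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")            # Cp = 1.39, Cpk = 1.17
```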
    • 31. Function Modeling Summary: Develop an architecture-independent (unbiased regarding possible solutions) hierarchy of functions through functional diagrams that illustrate the flow of functions within and across the technical requirements. The results are used during system architecting and design.
      • Output:
      • Functional Flow Diagrams
      • Function Modeling Guidelines
      • Be sure to consider all of the key functions.
      • Be sure to capture all the key flows.
    • 32. Theory of Inventive Problem Solving (TRIZ) Summary: Generate alternative Concepts to fulfill the Technical Requirements, using innovative approaches such as the TRIZ (Theory of Inventive Problem Solving) to overcome tradeoffs among the technical requirements. The candidate concepts are evaluated using the Pugh approach.
    • 33. Pugh Concept Selection Matrix. Summary: compares and selects the best ideas and concepts using a simple "better than", "worse than", and "same" scoring system; identifies the best features from each concept and creates hybridized solutions. [Matrix layout] Concepts are listed along the top of the matrix; customer and GDLS desired criteria (from the top of the QFD House of Quality) are listed along the side; each concept is compared against a "best in class" datum in terms of how well it fulfills the criteria.
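The Pugh scoring can be tallied with a short script: each concept receives +1 ("better than" the datum), -1 ("worse than"), or 0 ("same") against each criterion, and the pluses and minuses are summed. The concepts, criteria, and scores below are hypothetical.

```python
# Hypothetical Pugh matrix: +1 better than datum, -1 worse, 0 same (illustrative data).
criteria = ["reliability", "weight", "cost", "ease of assembly"]
concepts = {
    "Concept A": [+1, 0, -1, +1],
    "Concept B": [0, +1, +1, -1],
    "Concept C": [-1, -1, 0, 0],
}

for name, scores in concepts.items():
    plus, minus = scores.count(+1), scores.count(-1)
    print(f"{name}: +{plus} / -{minus} / net {plus - minus}")
```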
    • 34. FMEA (Failure Modes & Effects Analysis)
      • Outputs:
      • Ranked group of failure modes
      • Impacts of failures on customer, product or process
      • Risk Priority Numbers (RPN) before and after corrective action
      • Corrective action to remove or reduce the risk or impact of a failure mode
      Summary: Failure Modes and Effects Analysis is a structured method for identifying and ranking the significance of various failure modes and their effects on the product or customer. An FMEA worksheet is used to identify the failure modes with the highest RPN (a small RPN ranking sketch follows this slide's guidelines).
      • FMEA Guidelines
      • The FMEA session should include adequate representation of key people with input on risks.
      • When will you implement corrective actions for the highest RPN failure modes?
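RPN is conventionally the product of severity, occurrence, and detection ratings, each on a 1-10 scale. The sketch below ranks hypothetical failure modes by RPN; the modes and ratings are invented for illustration.

```python
# Hypothetical FMEA entries: (failure mode, severity, occurrence, detection), each rated 1-10.
fmea = [
    ("Connector corrosion", 7, 5, 6),
    ("Seal extrusion under pressure", 9, 3, 4),
    ("Fastener loosening from vibration", 6, 6, 3),
]

def rpn(entry):
    """Risk Priority Number = severity x occurrence x detection."""
    _, severity, occurrence, detection = entry
    return severity * occurrence * detection

# Rank failure modes from highest to lowest RPN to focus corrective action.
for mode, s, o, d in sorted(fmea, key=rpn, reverse=True):
    print(f"RPN {s * o * d:4d}  S={s} O={o} D={d}  {mode}")
```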
    • 35. Design for Manufacturability & Assembly
      • Identifies part consolidation opportunities
      • Exposes manufacturing, assembly, quality, service and cost problems early in the design process
      • Objectively assesses design simplification opportunities
      • Drives optimization of product costs
      • Allows setting of target costs
    • 36. Optimizing for Manufacturing & Assembly Reduces Overhead Costs
      • Fewer parts
        • Less material to inventory
        • Fewer assembly stations
        • Less automatic assembly equipment
        • Less dedicated fabrication tooling
        • Less paperwork and drawings
      • Fewer subassemblies and operations
        • Fewer work holders
        • Fewer assembly tools and fixtures
    • 37. Design for Manufacturability & Assembly (DFM) – Key Points to Remember
      • Design parts for ease of fabrication
      • Simple geometry
      • Few process steps
      • Make with existing equipment & tooling or can be easily contracted out
      • Geometry allows easy machine tool access to machined surfaces
      • Not too small or too large for existing processes
      • Usual tolerances (not too tight)
    • 38. Eliminate Parts by Integration. [Diagram] Integrating two welded (or caulked) parts into one eliminates the welding operation.
    • 39. Reduce the Number of Parts
      • Mold message onto surface
      • Eliminate parts by integration
      • Reduce assembly time
      [Example] A separate label is replaced by letters molded into the surface. Assembly cost reduction: 100%.
    • 40. Reduce Fastener Count
      • Note: NCR study estimated that each screw eliminated saves $12K in life cycle costs!
    • 41. DFSS: DCOV Process Map. [Diagram] The same process map as slide 17, with the Optimize (O) phase highlighted.
    • 42. DFSS (DCOV) Flow of Analysis & Tools. [Diagram] The same flow of analysis and tools as slide 16, with the Optimize (O) phase highlighted.
    • 43. Optimize Phase
      • The Optimize phase requires the use of process capability information and a statistical approach to tolerancing. Developing detailed design elements, predicting performance, and optimizing the design all take place within this phase (a Monte Carlo tolerance stack-up is sketched after the tool list below).
      • Crucial Steps:
        • Assess process capabilities to achieve critical design parameters and meet CTQ limits
        • Optimize design to minimize sensitivity of CTQs to process parameters
        • Design for robust performance and reliability
        • Establish statistical tolerancing
        • Optimize sigma and cost
      • Key Tools:
        • Design for Manufacturability and Assembly
        • Process capability models
        • Robust design
        • Monte Carlo Methods
        • Tolerancing and Tolerance Design
        • Optimization tools like DOE, Response Surface Methodology, Multiple Y
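Statistical tolerancing with Monte Carlo methods can be illustrated with a simple stack-up: sample each component dimension from an assumed distribution, push the samples through the transfer function, and compare the resulting spread to the CTQ limits. Everything in this sketch (dimensions, tolerances, limits) is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Hypothetical three-part stack-up: gap = housing - (part_a + part_b).
# Tolerances treated as +/- 3 sigma, so sigma = tol / 3.
housing = rng.normal(50.00, 0.10 / 3, N)
part_a = rng.normal(24.90, 0.05 / 3, N)
part_b = rng.normal(24.90, 0.05 / 3, N)
gap = housing - (part_a + part_b)

lsl, usl = 0.05, 0.35                          # hypothetical CTQ limits on the gap
fraction_out = np.mean((gap < lsl) | (gap > usl))
print(f"gap mean = {gap.mean():.3f}, std = {gap.std():.4f}")
print(f"estimated fraction outside limits = {fraction_out:.4%}")
```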
    • 44. Response Surface Methods Summary: Designed Experiment for optimizing the performance of a Response (Y).
      • Output:
      • Set points for the vital x’s to optimize the mean value of the Response (Y)
      • Set points for the vital x’s to optimize the standard deviation of the Response
      • Mathematical Model (transfer function) of the Response as a function of the vital x’s.
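A response surface study fits a low-order polynomial model to designed-experiment data and then locates the set points that optimize the response. A minimal single-factor sketch with an invented data set (a real study would use a multi-factor design and a statistics package):

```python
import numpy as np

# Hypothetical single-factor response surface: fit y = b0 + b1*x + b2*x^2
# to designed-experiment data (invented), then find the set point that maximizes y.
x = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])        # coded factor levels
y = np.array([6.1, 7.8, 8.9, 8.7, 7.4])          # measured response (invented)

b2, b1, b0 = np.polyfit(x, y, deg=2)             # polyfit returns highest degree first
x_opt = -b1 / (2 * b2)                           # stationary point of the quadratic
print(f"model: y = {b0:.2f} + {b1:.2f}*x + {b2:.2f}*x^2")
print(f"optimum coded set point: x = {x_opt:.2f}")
```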
    • 45. Goals of Robust Engineering
      • 1 - Identify the ideal function for the product or process design
      • 2 - Through multi-variable experimentation, choose the nominal design parameters that optimize performance, at the lowest cost, in the presence of factors that cause variability. This is done through a two-step optimization process:
        • Step 1: Minimize Variability
        • Step 2: Shift mean to Target
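One common way to drive the two-step optimization is a Taguchi-style signal-to-noise ratio: step 1 picks the control-factor setting with the highest S/N (least sensitive to noise), and step 2 uses an adjustment factor to move the mean onto the target. The sketch below uses invented data and two hypothetical settings.

```python
import numpy as np

# Hypothetical nominal-the-best S/N ratio: for each control-factor setting,
# the response is measured under several noise conditions (data invented).
runs = {
    "setting A": np.array([10.2, 10.6, 9.8, 10.4]),
    "setting B": np.array([10.1, 10.2, 10.0, 10.1]),
}

def sn_nominal_the_best(y):
    """Nominal-the-best S/N = 10*log10(mean^2 / variance); larger means less variation."""
    return 10 * np.log10(y.mean() ** 2 / y.var(ddof=1))

# Step 1: pick the setting with the highest S/N (most robust to noise).
best = max(runs, key=lambda k: sn_nominal_the_best(runs[k]))
print({k: round(sn_nominal_the_best(v), 1) for k, v in runs.items()})
print("most robust setting:", best)
# Step 2 (not shown): use an adjustment factor to shift the mean onto the target.
```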
    • 46. Robust Design – P-Diagram. [Diagram] The P-diagram relates the signal entering the item (customer intent) to the ideal function (intended result) and the error states (unintended result), as shaped by the control factors and five classes of noise factors: (1) piece-to-piece variation, (2) wear-out, (3) customer usage and duty cycle, (4) external environment conditions, and (5) internal environment (system interaction).
    • 47. Step 1: Minimize Variability. [Plot] Probability density of performance versus the requirement and target: the spread of the distribution is reduced first.
    • 48. Step 2: Shift Mean to Target. [Plot] With variability minimized, the mean of the performance distribution is then shifted onto the target.
    • 49. DFSS: CDOV Process Map. [Diagram] The same process map expressed as CDOV (Concept Generation & Selection, Design, Optimize, Verify), with the Verify (V) phase highlighted.
    • 50. DFSS (DCOV) Flow of Analysis & Tools. [Diagram] The same flow of analysis and tools as slide 16, with the Verify (V) phase highlighted.
    • 51. Verify Phase
      • The Verify phase consists of testing and verifying the design. As testing with formal tools increases, requirements feedback should be shared with manufacturing and sourcing, and future manufacturing and design improvements should be noted.
      • Crucial Steps:
        • Prototype test and validation
        • Assess performance, failure modes, reliability, and risks
        • Design iteration
        • Final phase review
      • Key Tools:
        • Accelerated testing
        • Reliability/Robustness Demonstration and Growth
        • FMEA
        • Disciplined New Product Introduction (NPI)
    • 52. Design Verification
      • Product Design Verification
        • Functional performance verification
        • Operational environment requirements verification
        • Reliability requirements verification
        • Usage requirements verification
        • Safety requirements verification
      • Manufacturing Process Verification
        • Process capability verification
        • Production throughput verification
        • Production cost verification
    • 53. Reliability Assessment
      • Measure reliability performance of Robust subsystems, subassemblies, and components
        • Predicted reliability vs. Measured Reliability
        • Reliability Growth Activities
      • Measure Reliability, Availability and Maintainability (RAM) of equipment, manufacturing processes and assemblies.
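A common Verify-phase calculation is the success-run sample size for a reliability demonstration: how many units must pass a test with zero failures to demonstrate reliability R at confidence level C. A minimal sketch; the target values are illustrative, not from the presentation.

```python
import math

def success_run_sample_size(reliability, confidence):
    """Zero-failure (success-run) demonstration: n = ln(1 - C) / ln(R), rounded up."""
    return math.ceil(math.log(1 - confidence) / math.log(reliability))

# Illustrative target: demonstrate 95% reliability at 90% confidence.
print(success_run_sample_size(0.95, 0.90))   # 45 units with zero failures
```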
    • 54. Questions to Ask to Ensure I Have Selected an Appropriate Individual as a Lean Six Sigma Black Belt Candidate (J. Wasiloff, 4 Jan 2006). Score each question Low (0), Medium (1), or High (2); strong candidates will have scores of 15 and higher.
      • Propensity to both learn and teach?
      • Follows program management methodologies?
      • Aptitude for LSS and statistics?
      • Understands business and customer needs?
      • Thinks strategically and acts tactically?
      • Possesses executive presence?
      • Thinks logically, connects Y to projects?
      • Remains on task, not easily sidetracked?
      • Demonstrated change agent?
      • Executes both effectively and efficiently?
    • 55. Strategic Project Selection
        • Typically, four to five ideas are needed to generate one good Design for Six Sigma project
        • Some projects may be cancelled during one of the phases of the process
        • Employ a methodology for consistent ranking of projects
        • Ensure you have a plan for gathering ideas.
          • Top down
            • Best impact on customer or ROI
            • Aligns with strategic goals
          • Bottom up
            • Knows the problem processes
            • Knows the customer best
            • Influences cultural change
    • 56. Training References (External)
      • American Supplier Institute (ASI)
      • American Society for Quality (ASQ)
      • Eastern Michigan University
      • University of Michigan
      • Moresteam University (on-line)
      • PDSS
      • Six Sigma Management Institute
      • Sigma Pro
      • SBTI
    • 57. Literature References
        • Creveling, Clyde M., Jeffrey Lee Slutsky, and D. Antis. Design for Six Sigma in Technology and Product Development. Prentice Hall, 2002.
        • Yang, Kai, and Basem El-Haik. Design for Six Sigma: A Roadmap for Product Development.
        • Fowlkes, William Y., and Clyde M. Creveling. Engineering Methods for Robust Product Design. Addison-Wesley, 1995.
        • O'Connor, Patrick D. T. Practical Reliability Engineering. John Wiley and Sons, 1991.
        • Reklaitis, G. V., A. Ravindran, and K. M. Ragsdell. Engineering Optimization. John Wiley and Sons, 1983.
        • Suh, Nam P. The Principles of Design. Oxford University Press, 1990.
    • 58. Q&A
      • Thanks for your participation in this technical interchange session