Measuring Six Sigma with an Emphasis on DFSS


  • Barracuda 7200.8 SATA/PATA drives ranked #1 by PC World. An unprecedented announcement of 12 new products in June 2004, covering all major market segments, enabled Seagate to address 97% of the market. New products target applications ranging from MP3 players to DVRs and other consumer electronics, notebook computers, PCs, servers and corporate data centers: the world's first 1-inch 5GB hard drive; 2.5-inch and 3.5-inch form factor solutions; Serial ATA, Serial Attached SCSI and Fibre Channel interfaces; speeds of up to 15,000 RPM and storage capacities of up to 500GB. New product refresh info: first perpendicular drive, first automotive drive, first embedded encryption systems.
  • Background information was also collected (functional area, customers, suppliers, deliverables). We also asked about project success rate (launched/total) and suggestions to improve the commercialization process.
  • Remind interviewees that they personally are not being assessed, but that the purpose is to assess the level of design excellence rigor used within their area through their feedback.

    1. Erika J. Wyckoff, Master Black Belt
       Measuring Six Sigma with an Emphasis on DFSS
       June 10, 2005
    2. Agenda
       • Seagate Background and Six Sigma
       • Measuring and Scoring Six Sigma/DFSS Integration
       • Assessment Logistics
       • Analyzing Quantitative and Qualitative Data
       • Lessons Learned
    3. Seagate: Disc Drive Category Leader
       • Seagate is the world’s leading provider of hard disc drives
         • Q3 FY2005*: 24.9M drives shipped; revenue of $1.97B
       • Provides drives for Enterprise, Desktop, Mobile Computing and Consumer Electronics applications
         • Share leader in Desktop, Consumer Electronics and Enterprise
         • 28% overall market share: highest in the industry
         • Broadest product offering in the industry; largest customer base
       • Ownership and vertical integration of critical technologies: heads, media, motors, and printed circuit boards
       • Approximately 43,500 employees worldwide
       • Major operations and sales offices in 15 countries
    4. The Market of the Future Requires Change in Many Areas
       • Greater customer focus on quality, reliability, and ease-of-use than on capacity and speed
       • Demands of the business require a more complex balancing act (pricing, scaling, distribution, etc.)
       • Must reconcile the need to leverage technology platforms while responding to market fragmentation in growth areas
       • To achieve best-in-class (BIC) productivity, we must do MORE with LESS…and do it faster
    5. Seagate Uses Six Sigma Philosophy to Strengthen its Competitive Advantage
       • Bulk training waves deployed for DMAIC Black and Green Belts
         • 160 hrs of class training + project
         • Process owner, master black belt, and hands-on champion roles
         • Rigorous project reviews by MBBs
       • Bulk DFSS training waves (Foundations and Project classes)
         • Shorter training classes
         • Fewer MBBs
         • Informal roles (PO, HOC)
       • Seagate earned the Six Sigma Excellence award from sixsigmaIQ in February 2004
         • Best DFSS Project category
         • Other finalists included Ford Motor Company and Raytheon
         • Evaluated by an expert panel comprised of Motorola, Johnson and Johnson, ABB, Honeywell and Lockheed Martin
    6. In Order to Leverage Best Practices, Seagate Began to Measure Six Sigma/DFSS Integration
       Purpose:
       (1) To ensure enterprise-level capture and sharing of best practices
       (2) To understand the extent of Six Sigma tools/methodology application
       (3) To identify gaps and opportunities in the DFSS deployment
       Scope: All drive design and component design centers
       Critical question: How do you define and measure “integration”?
       * Given our rapidly evolving product portfolio, the Design Center DFSS assessment is our focus today; similar assessments were conducted for DMAIC methodology in the manufacturing and transactional areas of the business
    7. The DFSS Assessment Needed to be Structured to Deliver Key Information…
       Representation:
       • We wanted to assess not only design engineers, but collaborative partners in the overall design process
         • Design engineering, process engineering
         • Core team leads, advanced design, quality/reliability, equipment engineering
         • Functional managers and practitioners
       Quality of information:
       • We took the approach that quality is more important than quantity. In order to capture best practices and context data, we chose a format that would be more difficult to score and analyze than a scaled survey format
         • Face-to-face interviews
         • One hour in length
         • Open-ended questions
       * Due to time and resource constraints, the first-level assessment team used interviewee responses and did not attempt to inspect the quality of the DFSS tool usage
    8. A Team of Experts Used a Robust Process to Develop the Assessment Tool
       • The team used project review lists, external benchmarking, brainstorming, and affinitizing
         • MBBs
         • Business Unit Leaders
         • Six Sigma Executive Staff
         • External Six Sigma/DFSS industry expert
       • Avoided a ‘checkbox’ mentality; conducted as VOC-type interviews to gain understanding of tools and methods
       The assessments looked at seven key areas: Requirements Management, Understanding Variation, Process Capability, Modeling, Tradeoff Analysis, Verification, Gap Closure
    9. A Comprehension Scale Was Adopted to Evaluate Open-Ended Questions…
       Scoring was based on a pre-defined scale of 0-3:
       • 3 = Applies DFSS tools and methodology where appropriate
       • 2 = Understands tools and methodology but does not apply them where appropriate
       • 1 = Understands DFSS concepts
       • 0 = Does not understand
       Partial scores were awarded; for example, good understanding and partial implementation of tools would equal a score of 2.5
       * The assessment scale is leveraged from the Black Belt certification process, but the assessment scoring is directed at the functional area, not the interviewee.
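The 0-3 comprehension scale with partial credit rolls up naturally to an area-level average per DFSS dimension. A minimal sketch of that rollup, using invented dimension names and scores (not Seagate data):

```python
# Sketch: roll interview item scores (0-3 scale, halves allowed) up to an
# area-level average per DFSS dimension. All data below is illustrative.

from statistics import mean

# (dimension, item score) pairs captured during one area's interviews
item_scores = [
    ("Requirements Management", 2.5),   # good understanding, partial use
    ("Requirements Management", 3.0),
    ("Understanding Variation", 1.0),
    ("Understanding Variation", 2.0),
]

def area_rollup(scores):
    """Average item scores by DFSS dimension for one functional area."""
    by_dim = {}
    for dim, score in scores:
        by_dim.setdefault(dim, []).append(score)
    return {dim: round(mean(vals), 2) for dim, vals in by_dim.items()}

print(area_rollup(item_scores))
# {'Requirements Management': 2.75, 'Understanding Variation': 1.5}
```

Averaging is only one reasonable rollup; a deployment might also report the minimum score per dimension to keep a single weak item from being masked.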
    10. The Protocol Was Ordered by DFSS Dimension…
       • Questions grouped by design excellence dimension
       • ‘Item Score’ column for assessor scoring (0, 1, 2, 3):
         • 0 = area does not understand how this concept/tool relates to successful design
         • 1 = area understands the concept/tool, but does not recognize how it relates to successful design
         • 2 = familiar with concept/tool and recognizes how it relates to successful design
         • 3 = familiar with concept/tool and has made rigorous use of concept/tool in the context of product design work
       • Gray fill indicates a ‘show me’ question
    11. …and the ‘Measurement System’ Was Calibrated with a Pilot Run
       Master Black Belt team (2-3 assessors):
       • Senior-level MBBs with prior interviewing and teaching experience
       • One person asked questions; the others captured additional comments by the interviewee
       • All three MBBs generated independent scores
       • MBBs alternated through all roles to ensure multiple perspectives were experienced
       • MBB team took short breaks after about 2-3 interviews to ‘calibrate’ individual scores and collate comments onto a master score sheet
       • Scores were highly reproducible* between MBBs; the additional comments captured were used to clarify final scores if deltas existed between MBBs on a particular question
       * Formal attribute MSA not performed.
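A lightweight way to support those calibration breaks is to flag any question where independent assessor scores diverge beyond a tolerance, so only the disagreements get discussed. A sketch with made-up assessor names, question IDs, and a half-point tolerance (all assumptions, not the team's actual procedure):

```python
# Sketch: flag interview questions where independent assessor scores
# diverge, so the MBB team can calibrate during breaks. Illustrative data.

def calibration_flags(scores_by_assessor, tolerance=0.5):
    """scores_by_assessor: {assessor: {question_id: score}}.
    Returns question ids whose max-min score spread exceeds tolerance."""
    questions = next(iter(scores_by_assessor.values())).keys()
    flagged = []
    for q in questions:
        vals = [scores[q] for scores in scores_by_assessor.values()]
        if max(vals) - min(vals) > tolerance:
            flagged.append(q)
    return flagged

scores = {
    "MBB_A": {"Q1": 2.5, "Q2": 1.0, "Q3": 3.0},
    "MBB_B": {"Q1": 2.5, "Q2": 2.0, "Q3": 3.0},
    "MBB_C": {"Q1": 2.0, "Q2": 1.5, "Q3": 3.0},
}

print(calibration_flags(scores))
# ['Q2']: a full point of spread, worth discussing before the master sheet
```

Since no formal attribute MSA was performed, a spread check like this is a stand-in, not a substitute for one.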
    12. Interviewees Were Then Sampled Across All Design Sites…

       Design Center   People Interviewed   Questions Asked
       Pilot Site      18                   612
       Site 1          23                   851
       Site 2          13                   444
       Site 3          14                   452
       Site 4          10                   340
       Total           59                   2086

       • Interviewees were selected by the site Six Sigma contact (either a Business Unit leader or a Senior Master Black Belt)
       • Interviewees selected as a stratified random sample – “no cherry picking”
       * Pilot assessment protocol had 34 questions. Two modifications were made to the protocol during the first cycle: added VOC process questions and combined several tradeoff questions.
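The "stratified random sample, no cherry picking" rule can be made mechanical by drawing randomly within each functional-area stratum. A sketch with an invented roster (names, areas, and the 2-per-area quota are illustrative assumptions):

```python
# Sketch: stratified random selection of interviewees, a fixed number per
# functional area, so no area can be cherry-picked. Roster is invented.

import random

def stratified_sample(roster, per_stratum=2, seed=None):
    """roster: {area: [people]}. Randomly pick per_stratum people per area
    (all of them if the area has fewer than per_stratum people)."""
    rng = random.Random(seed)  # seeded for a reproducible, auditable draw
    picks = {}
    for area, people in roster.items():
        k = min(per_stratum, len(people))
        picks[area] = rng.sample(people, k)
    return picks

roster = {
    "Design Engineering": ["Ana", "Ben", "Cho", "Dev"],
    "Process Engineering": ["Eli", "Fay", "Gus"],
    "Quality/Reliability": ["Hoa", "Ira"],
}
print(stratified_sample(roster, per_stratum=2, seed=7))
```

Publishing the seed alongside the draw is one way to show interviewees the selection really was random.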
    13. Logistics Were Planned to Ensure Detailed and Open Feedback…
       Who?
       • Practitioners, Managers, and support functions
       • 2-3 per area
       How long?
       • 1 hour per person
       • Leave 10 minutes between interviews to calibrate and consolidate
       How to conduct?
       • Informal setting
       • Look for success stories
       • Break the ice (introductions and process)
       • Ask about their part of the process
       • Business casual dress and round tables
       • White board or flipcharts provided
    14. Following the Last Visit, the Team Compiled Data for Analysis by Several Factors…
       • Overview by Site
       • Overview by Area
       • Global Site Comparison by Area
       • Issues Summary: table of lowest-scoring business-critical issues, KJ analysis
       • Global Question Comparison by Dimension: total ranked score (highest to lowest) for each question, grouped by DFSS area
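The last of those views, total question scores ranked highest-to-lowest within each DFSS dimension, is a simple group-and-sort. A sketch over invented per-question totals (dimension names from the deck; questions and scores are hypothetical):

```python
# Sketch: rank total question scores highest-to-lowest within each DFSS
# dimension, as in the global question comparison. Scores are invented.

from collections import defaultdict

# (dimension, question, total score summed across all interviews)
totals = [
    ("Verification", "Uses DOE in test planning", 41.5),
    ("Verification", "Validates measurement systems", 55.0),
    ("Gap Closure", "Tracks gaps to closure", 38.0),
    ("Gap Closure", "Root-causes design misses", 47.5),
]

def ranked_by_dimension(rows):
    """Group questions by dimension, each sorted by total score descending."""
    grouped = defaultdict(list)
    for dim, question, score in rows:
        grouped[dim].append((score, question))
    return {dim: sorted(items, reverse=True) for dim, items in grouped.items()}

for dim, items in ranked_by_dimension(totals).items():
    print(dim)
    for score, question in items:
        print(f"  {score:5.1f}  {question}")
```

The lowest-ranked questions in each dimension are natural candidates for the business-critical issues summary.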
    15. Results Reporting was Rolled Out to Top Management First…then Sites
       • All site data was summarized in a global rollup for the VP heading all of the design organizations
       • Assessment tool/process overview delivered
       • Results walkthrough with Corporate Six Sigma and Design Center management teams
       • Additional requests for detailed drill-downs in some areas
       • Site reports distributed for action planning
    16. We Then Collaborated with Design Center Mgmt to Develop an Action Plan…
       • Communicate best practices
       • Establish cause-effect linkage of issues
       • Prioritize business-critical root causes
       • Strengthen management systems to support projects
       • Identify common themes for key projects
       • Identify skills/resources needed by area
       • Identify improvements to goals & metrics
       • Identify tools/systems to drive deeper integration
    17. …to Build a DFSS Ecosystem: “Design Excellence”
       Design Excellence: developing the product the customer wants in the most efficient way
    18. Lessons Learned
       • Don’t try to cut the time needed for interviews!
       • Tool development should be more collaborative with design centers
       • Minimize reporting turnaround time
       • Assessors need excellent interviewing skills
       • Use tape recording to capture comments
       • Train-the-Trainer or internal audit for maintenance
       • Sampling and scoring can be political!
    19. What Questions Do You Have?