Measuring Six Sigma with an Emphasis on DFSS
  • Barracuda 7200.8 SATA/PATA drives ranked #1 by PC World. An unprecedented announcement of 12 new products in June 2004 for all major market segments enabled Seagate to address 97% of the market. New products target applications ranging from MP3 players to DVRs and other consumer electronics, notebook computers, PCs, servers, and corporate data centers: the world's first 1-inch 5GB hard drive; 2.5-inch and 3.5-inch form factor solutions; Serial ATA, Serial Attached SCSI, and Fibre Channel interfaces; speeds of up to 15,000 RPM and storage capacities of up to 500GB. New product refresh info: first perpendicular drive, first automotive drive, first embedded encryption systems.
  • Background information was also collected (functional area, customers, suppliers, deliverables). We also asked about project success rate (launched/total) and for suggestions to improve the commercialization process.
  • Remind interviewees that they personally are not being assessed; rather, their feedback is used to assess the level of design excellence rigor within their area.

Measuring Six Sigma with an Emphasis on DFSS: Presentation Transcript

  • Measuring Six Sigma with an Emphasis on DFSS. Erika J. Wyckoff, Master Black Belt. June 10, 2005
  • Agenda
    • Seagate Background and Six Sigma
    • Measuring and Scoring Six Sigma/DFSS Integration
    • Assessment Logistics
    • Analyzing Quantitative and Qualitative Data
    • Lessons Learned
  • Seagate: Disc Drive Category Leader
    • Seagate is the world’s leading provider of hard disc drives
      • Q3 FY2005*: 24.9M drives shipped; revenue of $1.97B
    • Provides drives for Enterprise, Desktop, Mobile Computing and Consumer Electronics applications
      • Share leader in Desktop, Consumer Electronics and Enterprise
      • 28% overall market share: highest in the industry
      • Broadest product offering in the industry – Largest customer base
    • Ownership and vertical integration of critical technologies: heads, media, motors, and printed circuit boards
    • Approximately 43,500 employees worldwide
    • Major operations and sales offices in 15 countries
  • The Market of the Future Requires Change in Many Areas
    • Greater customer focus on quality, reliability, and ease-of-use than on capacity and speed
    • Demands of the business require a more complex balancing act (pricing, scaling, distribution, etc.)
    • Have to reconcile the need to leverage technology platforms while responding to market fragmentation in growth areas
    • To achieve best-in-class (BIC) productivity, we must do MORE with LESS… and do it faster
  • Seagate Uses Six Sigma Philosophy to Strengthen its Competitive Advantage
    • Bulk training waves deployed for DMAIC Black and Green Belts
      • 160 hrs of class training + project
      • Process owner, master black belt, and hands-on champion roles
      • Rigorous project reviews by MBBs
    • Bulk DFSS training waves (Foundations and Project classes)
      • Shorter training classes
      • Fewer MBBs
      • Informal roles (PO, HOC)
    • Seagate earned the Six Sigma Excellence award from sixsigmaIQ in February 2004
      • Best DFSS Project category
      • Other finalists included Ford Motor Company and Raytheon
      • Evaluated by an expert panel comprising Motorola, Johnson and Johnson, ABB, Honeywell, and Lockheed Martin
  • In Order to Leverage Best Practices, Seagate Began to Measure Six Sigma/DFSS Integration
    • Purpose:
    • (1) To ensure enterprise level capture and sharing of best practices
    • (2) To understand the extent of Six Sigma tools/methodology application
    • (3) To identify gaps and opportunities in the DFSS deployment
    • *Given our rapidly evolving product portfolio, the Design Center DFSS assessment is our focus today - similar assessments were conducted for DMAIC methodology in the manufacturing and transactional areas of the business
    • Scope: All drive design and component design centers
    Critical question: How do you define and measure “integration”?
  • The DFSS Assessment Needed to be Structured to Deliver Key Information… * Due to time and resource constraints, the first level assessment team used interviewee responses and did not attempt to inspect the quality of the DFSS tool usage
    • Representation:
    • We wanted to assess not only design engineers, but collaborative partners in the overall design process
      • Design engineering, process engineering
      • Core team leads, advanced design, quality/reliability, equip. engineering
      • Functional managers and practitioners
    • Quality of information:
    • We took the approach that quality is more important than quantity. In order to capture best practices and context data, we chose a format that would be more difficult to score and analyze than a scaled survey format
      • Face-to-face interviews
      • One hour in length
      • Open-ended questions
  • A Team of Experts Used a Robust Process to Develop the Assessment Tool
    • The assessments looked at seven key areas: Requirements Management, Understanding Variation, Process Capability, Modeling, Tradeoff Analysis, Verification, and Gap Closure
    • The team used project review lists, external benchmarking, brainstorming, and affinitizing
      • MBBs
      • Business Unit Leaders
      • Six Sigma Executive Staff
      • External Six Sigma/DFSS industry expert
    • Avoided ‘checkbox’ mentality - conducted as VOC-type interviews to gain understanding of tools and methods
  • A Comprehension Scale Was Adopted to Evaluate Open-Ended Questions… * The assessment scale is leveraged from the Black Belt certification process, but the assessment scoring is directed at the functional area, not the interviewee.
    • Scoring was based on a pre-defined scale of 0-3:
    • 3 = Applies DFSS tools and methodology where appropriate
    • 2 = Understanding of tools and methodology but does not apply where appropriate
    • 1 = Understands DFSS concepts
    • 0 = Does not understand
    • Partial scoring was awarded; for example, good understanding and partial implementation of tools would equal a score of 2.5 (a minimal scoring sketch follows below)
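    As a rough illustration of how a 0-3 scale with partial credit can be rolled up into an area-level result, here is a minimal Python sketch. The example scores, the simple averaging rule, and the function name are assumptions for illustration only, not Seagate's actual scoring implementation.

      # Minimal sketch of the 0-3 comprehension scale with partial credit.
      # The averaging rule and example scores are illustrative assumptions.
      SCALE = {
          0: "Does not understand",
          1: "Understands DFSS concepts",
          2: "Understands tools/methodology but does not apply where appropriate",
          3: "Applies DFSS tools and methodology where appropriate",
      }

      def area_score(item_scores):
          """Average per-question item scores (floats in [0, 3]) for one functional area."""
          if not item_scores:
              raise ValueError("no item scores recorded")
          for s in item_scores:
              if not 0 <= s <= 3:
                  raise ValueError(f"score {s} is outside the 0-3 scale")
          return sum(item_scores) / len(item_scores)

      # e.g. good understanding with partial implementation on one item -> 2.5
      print(area_score([3.0, 2.5, 2.0]))  # -> 2.5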
  • The Protocol Was Ordered by DFSS Dimension…
    • ‘Item Score’ column for assessor scoring (0, 1, 2, 3)
    • SCORING: 0 = area does not understand how this concept/tool relates to successful design; 1 = area understands the concept/tool, but does not recognize how it relates to successful design; 2 = familiar with concept/tool and recognizes how it relates to successful design; 3 = familiar with concept/tool and has made rigorous use of concept/tool in the context of product design work
    • Gray fill indicates a ‘show me’ question
    • Questions are grouped by design excellence dimension
  • … and the ‘Measurement System’ Was Calibrated with a Pilot Run * Formal attribute MSA not performed.
    • Master Black Belt team (2-3 assessors)
    • Senior level MBBs with prior interviewing and teaching experience
    • One person asked questions - others captured additional comments by interviewee
    • All three MBBs generated independent scores
    • MBBs alternated through all roles to ensure multiple perspectives experienced
    • MBB team took short breaks after about 2-3 interviews to ‘calibrate’ individual scores and collate comments onto the master score sheet
    • Scores were highly reproducible* between MBBs – the additional comments captured were used to clarify final scores when deltas existed between MBBs on a particular question (a minimal delta-flagging sketch follows below)
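    The calibration breaks described above amount to comparing the assessors' independent item scores and discussing the questions where they diverge. Below is a minimal sketch of that comparison; the data layout, assessor labels, and the 0.5-point disagreement threshold are illustrative assumptions, not the team's actual procedure.

      # Flag questions where independent MBB scores diverge by more than a threshold.
      def flag_disagreements(scores_by_assessor, threshold=0.5):
          """scores_by_assessor maps assessor -> {question_id: score in [0, 3]}."""
          questions = set()
          for per_q in scores_by_assessor.values():
              questions.update(per_q)
          flagged = []
          for q in sorted(questions):
              vals = [per_q[q] for per_q in scores_by_assessor.values() if q in per_q]
              if len(vals) > 1 and max(vals) - min(vals) > threshold:
                  flagged.append(q)
          return flagged

      scores = {
          "MBB-A": {"req-01": 3.0, "var-02": 2.0},
          "MBB-B": {"req-01": 2.5, "var-02": 2.0},
          "MBB-C": {"req-01": 1.5, "var-02": 2.5},
      }
      print(flag_disagreements(scores))  # -> ['req-01']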
  • Interviewees Were Then Sampled Across All Design Sites… * Pilot assessment protocol had 34 questions. Two modifications were made to the protocol during the first cycle: VOC process questions were added and several tradeoff questions were combined.
    • Interviewees were selected by the site Six Sigma contact (either a Business Unit leader or a Senior Master Black Belt)
    • Interviewees were selected as a stratified random sample – “no cherry picking” (a minimal sampling sketch follows below)
    • Interview counts by design center:
      Design Center    People Interviewed    Questions Asked
      Pilot Site       18                    612
      Site 1           23                    851
      Site 2           13                    444
      Site 3           14                    452
      Site 4           10                    340
      Total            59                    2086
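    A minimal sketch of the “no cherry picking” selection rule: a stratified random sample drawn per functional area. The roster, the two-per-area quota, and the fixed seed are hypothetical; only the stratified-random idea comes from the slide.

      # Stratified random sampling of interviewees, drawn per functional area.
      import random

      def stratified_sample(roster, per_stratum=2, seed=None):
          """roster maps functional area -> list of candidates; returns a per-area sample."""
          rng = random.Random(seed)
          return {
              area: rng.sample(people, min(per_stratum, len(people)))
              for area, people in roster.items()
          }

      roster = {
          "design engineering": ["A", "B", "C", "D"],
          "process engineering": ["E", "F", "G"],
          "quality/reliability": ["H", "I"],
      }
      print(stratified_sample(roster, per_stratum=2, seed=42))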
  • Logistics Were Planned to Ensure Detailed and Open Feedback…
    • Who?
    • Practitioners, Managers, and support functions
    • 2-3 per area
    • How long?
    • 1 hour per person
    • Leave 10 minutes between interviews to calibrate and consolidate
    • How to conduct?
    • Informal setting
    • Look for success stories
    • Break the ice (introductions and process)
    • Ask about their part of the process
    • Business casual dress and round tables
    • White board or flipcharts provided
  • Following the Last Visit, the Team Compiled Data for Analysis by Several Factors… (a minimal roll-up sketch follows below)
    • Overview by Site
    • Overview by Area
    • Global Site Comparison by Area
    • Issues Summary: table of lowest-scoring business-critical issues, KJ Analysis
    • Global Question Comparison by Dimension: total ranked score (highest to lowest) for each question, grouped by DFSS area
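    To make the roll-up views concrete, here is a minimal sketch that aggregates item scores by site and by area and ranks questions within each DFSS dimension. The record fields, example values, and use of plain mean scores are assumptions; only the groupings themselves come from the slide.

      # Roll up (site, area, dimension, question, score) records into the summary views.
      from collections import defaultdict
      from statistics import mean

      records = [
          ("Site 1", "design eng", "Requirements Management", "req-01", 2.5),
          ("Site 1", "process eng", "Understanding Variation", "var-02", 1.5),
          ("Site 2", "design eng", "Requirements Management", "req-01", 3.0),
          ("Site 2", "quality", "Verification", "ver-03", 2.0),
      ]

      def rollup(records, key_index):
          """Mean score per group, where the group key is one record field."""
          groups = defaultdict(list)
          for rec in records:
              groups[rec[key_index]].append(rec[4])
          return {k: round(mean(v), 2) for k, v in groups.items()}

      print(rollup(records, 0))  # overview by site
      print(rollup(records, 1))  # global comparison by area

      # total ranked score (highest to lowest) for each question, grouped by dimension
      by_dim = defaultdict(list)
      for _site, _area, dim, q, score in records:
          by_dim[dim].append((q, score))
      for dim, items in by_dim.items():
          print(dim, sorted(items, key=lambda t: t[1], reverse=True))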
  • Results Reporting was Rolled Out to Top Management First…then Sites
    • All site data was summarized in a global rollup for the VP over all of the design organizations
    • Assessment tool/process overview delivered
    • Results walkthrough with Corporate Six Sigma and Design Center management teams
    • Additional requests for detailed drill-downs in some areas
    • Site reports distributed for action planning
  • We Then Collaborated with Design Center Mgmt to Develop an Action Plan…
    • Communicate best practices
    • Establish cause-effect linkage of issues
    • Prioritize business critical root-causes
    • Strengthen management systems to support projects
    • Identify common themes for key projects
    • Identify skills/resources needed by area
    • Identify improvements to goals & metrics
    • Identify tools/systems to drive deeper integration
  • … to Build a DFSS Ecosystem: “Design Excellence”
    • Design Excellence: developing the product the customer wants in the most efficient way
  • Lessons Learned:
    • Don’t try to cut the time needed for interviews!
    • Tool development should be more collaborative with design centers
    • Minimize reporting turnaround time
    • Assessors need excellent interviewing skills
    • Tape recording to capture comments
    • Train-the-Trainer or internal audit for maintenance
    • Sampling and scoring can be political!
  • What Questions Do You Have?