Assure 97 - What Drives our ECAD Strategy? A Programme of Change
A public presentation that I gave to the ASSURE conference in 1997. The presentation asks the question "What Drives our ECAD (Electronic Computer Aided Design) Strategy?" of an audience of Electronic Design Automation (EDA) engineers, and explains the programme of change to the Electronic Design Process of which our approach to ECAD is just one part.

  • Good morning ! My name is Neil Whitehall, and I am the Electronics Process Manager at GMAv RSD in Edinburgh. <br /> The aim of my presentation this morning is to take you through some of the steps towards creating a formal strategy for improvement to the Electronics Design Process. <br /> The path we will be taking is ‘bottom-up’. In reality, a formal approach to process improvement requires that rare thing - management commitment. Given such a rare commodity, the process of process improvement is much easier accomplished ‘top-down’. <br /> Slide = 40s. <br /> RT = 40s. <br />
  • Without the existence of a formally stated strategy what would guide the development of processes inside our companies ? Intuition ? Relying on intuition alone tends to result in some of these symptoms. <br /> * You never get enough capital to spend. <br /> * You have a wish list but no written plan. <br /> * You have no agreed budget. <br /> * You have all the catalogues on the latest ECAD ‘gizmos’. <br /> * You attend all of the seminars on the latest ‘hot’ products. <br /> * You ask for quotes for every tool you see. <br /> Relying on your ECAD vendors is another unconscious choice. However, your ECAD vendors have their own strategy. Your supplier’s strategy is not necessarily your own ! <br /> We need to add focus to intuition. Elements from the vendor’s strategy may or may not be adopted, at the appropriate time. The process itself cannot be based solely on intuition. <br /> Slide = 55s. <br /> RT = 1:35s <br />
  • This morning we will work through six topics, and see how they can be combined with the last one, to add the necessary focus that will allow us to formulate a basic improvement plan. <br /> Slide = 15s. <br /> RT = 1:50s. <br />
  • Our first area is a theory which teaches us how to think about business. It’s called the Theory Of Constraints. TOC teaches us that businesses are complex, multi-dimensional, multi-disciplined and political. To help us cope with this complexity a ‘thinking process’ is needed to let us focus on those few areas that are vitally important. TOC was developed by Eli Goldratt to analyse production plants. However, it can also be applied to Project Management and the Supply Chain. It is all about optimising systems. <br /> Slide = 45s. <br /> RT = 2:35s. <br />
  • Fundamentally, the history of science followed three phases. The Ancient Greeks looked at the stars and drew star signs in the dirt in an attempt to understand what they could not comprehend. Being compared to that level of thinking is not flattering for degree qualified engineers. But, how many of our engineers formally classify a problem ? Ptolemy at least tried to put 2 + 2 together to try to add to his understanding. But, how many of our engineers formally use correlation to add the missing links when problem solving ? Newton became the first scientist when he developed E-C-E postulation. Do we use a scientific approach to the development of our design process ? <br /> Slide = 45s. <br /> RT = 3:35s. <br />
  • Goldratt takes the 3 phase approach further in seeking to optimise systems. Look at step 2, and ask yourself honestly what we are doing to develop tactics in our design process to exploit the constraints that we all know are present within our companies. <br /> Slide = 30s. <br /> RT = 3:50s. <br />
  • Some basics. Simply Read the Slide ! <br /> Slide = 45s. <br /> RT = 4:35s. <br />
  • Some examples of constraints : Not enough capacity, or capacity that has been compromised by poor deployment or, perhaps a policy constraint that hampers progress and is no longer relevant. <br /> It might be in Test, Assembly, Purchasing, or in Design. <br /> In design this might be things like .... <br /> Computers - Performance, Availability, Deployment <br /> Design Tools - Inter-operability, Deployment, Skills, Standards <br /> Simulation Models - Availability, Accuracy, Standards. <br /> Engineers - Deployment, Education, Experience, Skills. <br /> Project Management - Co-ordinating team working. <br /> Systems Engineering - Inadequate or Changing Specifications. <br /> Slide = 60s. <br /> RT = 5:35s. <br />
  • Our Financial and Managing Directors are measured. How they are measured affects how they behave. It is supposed to affect how we behave, too ! <br /> However, understanding how our actions affect these bottom line measurements is not always clear. <br /> The bridge describes how these operational measures can be linked to some readily understandable concepts. <br /> Slide = 30s. <br /> RT = 7:05s. <br />
  • The bridge gives us a way of understanding how Throughput, Inventory and Operating Expense affect the bottom-line operational measures. <br /> The ‘traditional’ approach inside many companies is to seek to minimise operating expense. <br /> JIT started as a means of minimising Inventory costs. However, it has developed into much more than that. <br /> Slide = 30s. <br /> RT = 7:35s. <br />
  • When Western Companies started to look at Japanese JIT Systems, called ‘Kanbans’, they realised the true importance of Inventory to Manufacturing Systems. <br /> The realisation that reducing inventory levels could be used to raise responsiveness to the market and future throughput hit home because of the ‘newly’ discovered link to Net Profit, and the stronger than realised link to Cash Flow. <br /> So, if JIT is so important to modern Manufacturing Systems, what are we, as design engineers, doing to contribute to the success story ? <br /> Slide = 30s. <br /> RT = 8:05s. <br />
  • Some TOC resources. <br /> Slide = 10s. <br /> RT = 8:15s. <br />
  • JIT is pretty simple at face value ... <br /> Slide = 10s. <br /> RT = 8:25s. <br />
  • By cutting the work up into batches we shorten the time between paying for materials and being paid by our own customers. This is better for cash flow. <br /> By cutting the work up into batches Engineering Changes to incorporate new product features can be supplied to manufacturing, and then to the market, much quicker. <br /> Good news so far. <br /> Slide = 30s. <br /> RT = 8:55s. <br />
  • Even better news is found when we look at Due Date Performance. Because the batch durations are shorter, when the shorter duration is multiplied by the process variance, a much smaller ‘delta’ emerges. Essentially, the ability to forecast delivery dates accurately rises. <br /> In summary, successful adoption of JIT is ‘good news’. <br /> Slide = 30s. <br /> RT = 9:25s. <br />
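The arithmetic in that note (delivery-date 'delta' ≈ batch duration × process variance) can be sketched as a minimal illustration; the durations and variance figure below are invented, not from the presentation:

```python
def delivery_delta(batch_duration_weeks, process_variance):
    """The speaker's rule of thumb: the uncertainty ('delta') on a delivery
    date scales with batch duration times process variance."""
    return batch_duration_weeks * process_variance

variance = 0.2  # assumed +/- 20% schedule variability, for illustration only
for duration in (26, 4):  # a large batch versus a JIT-sized batch, in weeks
    delta = delivery_delta(duration, variance)
    print(f"{duration:2d}-week batch: delta = +/- {delta:.1f} weeks")
```

The same variance applied to the shorter batch gives a much smaller spread, which is the slide's point about forecast accuracy rising under JIT.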
  • We get all of these good points, but at a price ! <br /> The Manufacturing System has to be tuned to ensure that the key Critical Constraining Resources (CCRs) in the production plant are not starved of inventory. To do this the production managers incorporate time buffers before the critical resources. <br /> Why does this affect us ? <br /> 1. What if the CCR was Printed Circuit Assembly (PCA) and it was discovered that ‘design’ has not optimised the surface mount footprints ? <br /> 2. Or, if the CCR was ‘test’ and it was discovered that ‘design’ had not provided enough diagnostic test capability ? <br /> 3. Or if the design had not been centred about the specification limits, and toleranced to avoid excess scrap ? <br /> What does the bridge tell us about the linkage between these situations and the operational measurements ? <br /> Slide = 60s. <br /> RT = 10:25s. <br />
  • Just for completeness, this added complexity of JIT means that we will need to think about a supporting structure of Integrated Business Systems (IBS) that will act as a framework for the support of Synchronised Manufacturing. <br /> For the Electronic Design Process, this means that our Computing Environment, Tools and Tasks will need to inter-operate with the IBSs. <br /> Slide = 25s. <br /> RT = 10:50s. <br />
  • So, we can help ‘Manufacturing’, and maybe we can learn from them, too. The JIT community introduced the concept of time buffers. Time buffers can be applied to the design process, too. <br /> The diagram shows a PERT, where there is an obvious Critical Path which is worked on by our engineer, ‘DE’. However, many of the non critical path activities are performed by ‘DE’, too. <br /> The correct definition for a Critical Path is ‘the longest combination of dependent events’. If you do not schedule the resources, a second path emerges - the Critical Chain. <br /> To regain control of the project, feeding buffers need to be inserted before DE’s activities to ensure that the activities are completed in good time, allowing the Critical Path to be de-sensitised. <br /> What if we changed the PERT so that the ‘secondary’ activities were no longer performed by DE, but DE had to communicate their existence to the person who would perform it ? What if these activities were to be performed by ‘other’ disciplines ? <br /> Slide = 80s, RT = 12:10s. <br />
  • Why should quality be optimised ? Well, Quality is not free. In terms of electronic products we need a better definition of Product Quality. <br /> Once the features of a product are defined, the quality of the product is determined by its dependability. <br /> If a product has the features the customer wants, and the product is dependable, a potential customer becomes interested in the value of the item. Value is linked to the issue of price and cost. <br /> If the goal is to ‘make more money, now and in the future’, then we need to continually optimise our products for features and cost. We need to carry this out in a way that allows the manufacturing process to be optimised, too. <br /> Slide = 45s. <br /> RT = 12:55s. <br />
  • Six Sigma is becoming a widely known buzz-word. But, why Six ? If our aim is to optimise quality why would we force ourselves into a corner that affects our product cost, price, profit margin and net profit ? Six Sigma programmes also tend to focus on notional defects rather than product dependability. <br /> But, let’s put this argument to one side. <br /> What do parametric failures do to time buffers in Production ? Loss of yield is loss of throughput, not just for the ‘value’ of that assembly but for the whole product value. Throughput - or ‘Rate of Sales’ - is lowered. It’s not just the scrap material costs that feed through to the bottom line ! <br /> So, if parametric failures are so bad, what can we as design engineers do to prevent this loss of throughput from occurring ? <br /> Slide = 50s, RT = 13:45s. <br />
  • Phase 1 : Apply yield prediction. <br /> Phase 2 : Maximise Yield. <br /> Phase 3 : Centre the design to maximise Cpk. Make it ‘Robust’. <br /> But, of course we know all this. Analogy have been telling us this for years ! <br /> Slide = 30s. <br /> RT = 14:15s. <br />
  • But, just for reference - here it is again. <br /> Slide = 20s. <br /> RT = 14:35s. <br />
  • If we were to look at a hypothetical Pareto Analysis of failures during Printed Circuit Assembly and Test, a typical scenario might look like this. <br /> Conventional thinking would say, eliminate the Soldering and Component Failures, as they represent the majority of the failure causes. But, if you were trying to apply Statistical Process Control to the Assembly Process, what would the Catastrophic Design Errors and Parametric Yield Loss do to your Control Chart ? <br /> Applying Design Centering would help us to eliminate Parametric failures, and make the job of Optimising the Manufacturing Process easier. <br /> And prior to that, applying functional simulation would help to reduce the number of Catastrophic Design Errors, eliminating another unwanted variable from the analysis. <br /> Slide = 50s. <br /> RT = 15:25s. <br />
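A hypothetical Pareto Analysis like the one described can be sketched as follows; the failure counts are invented for illustration, with only the category names taken from the slide:

```python
def pareto(failures):
    """Sort failure causes by count (largest first) and return
    (cause, percent of total, cumulative percent) for each."""
    total = sum(failures.values())
    rows, cum = [], 0.0
    for cause, count in sorted(failures.items(), key=lambda kv: -kv[1]):
        pct = 100.0 * count / total
        cum += pct
        rows.append((cause, pct, cum))
    return rows

# Invented counts for the five categories named on the slide.
failures = {
    "Soldering Failures": 35,
    "Component Failures": 30,
    "Catastrophic Design Errors": 15,
    "Parametric Yield Loss": 12,
    "Build Errors": 8,
}
for cause, pct, cum in pareto(failures):
    print(f"{cause:28s} {pct:5.1f}% (cumulative {cum:5.1f}%)")
```

The cumulative column is what drives the 'conventional thinking' in the note: the top two causes already account for most failures, yet the design-related categories are the ones that disturb Statistical Process Control.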
  • We all think that we are ‘doing’ Concurrent Engineering, but how many of us think that we are getting the most out of the resources at our disposal ? Starting with a definition might be appropriate. <br /> Slide = 20s. <br /> RT = 15:45s. <br />
  • The first definition is from Sammy Shina. Shina takes a Cultural focus in his definition. The words that leap out of the page are ‘earliest’ and ‘overall’. <br /> The definition is from the very first page of his book, and was probably written and re-written countless times. Shina didn’t choose timely, early or very early. He deliberately chose ‘earliest’. <br /> Note too that ‘overall’ means that we are trying to find a global optimum, not just the sum of many local optima. The whole system is to be optimised. <br /> Slide = 30s. <br /> RT = 16:15s. <br />
  • The second definition is from Carter & Stillwell Baker. It takes a logistical approach that is data centric. <br /> To me, this begs the question. ‘ Can data be managed in the same way as inventory is in production ?’ <br /> and then from Shina’s definition ... <br /> ‘Can people be disciplined enough to change their own working cultures ?’ <br /> Slide = 30s. <br /> RT = 16:45s. <br />
  • Carter also gives us focus on the subject of process support. He takes a layered approach to the support of processes. <br /> The most basic, layer 1, focuses on forming an inter-operable network computing environment. <br /> Having accomplished this, we move onto Layer 2. I have split layer 2 between 2a, Inter-operable Tools, and 2b, Inter-operable Tasks. At layer 2a, you might want to think about a standard, corporate CAD flow that links the disciplines. At layer 2b, you might want to concentrate on creating Process Maps of the Tasks that surround that CAD flow. <br /> At Layer 3 we get to Product Data Management. This is where we introduce corporate Data Management Systems for Engineering Data, Component and Supplier Information and Enterprise Documentation. <br /> When we, eventually, get to layer 4 the focus changes to workflow management and metrics. <br /> Layer 5, is all about how we support decision making across the organisation. <br /> Slide = 1:00s, RT = 17:45s. <br />
  • The Carnegie Mellon Software Engineering Institute (SEI) develops Capability Maturity Models for Software and Systems Engineering. <br /> The model describes the essential elements of an organisation’s processes that are required to ensure successful Systems Engineering. <br /> http://www.sei.cmu.edu/# Web Site. <br /> http://ftp.sei.cmu.edu/# Document Download. <br /> Slide = 25s. <br /> RT = 18:10s. <br />
  • The Capability Maturity Models ask you to believe in this model. <br /> It asks you to trust that People, Process and Technology will combine to define an organisation’s capability. <br /> And that this Organisational Capability in turn defines the level of product or service quality delivered to the customer. <br /> Slide = 30s. <br /> RT = 18:40s. <br />
  • But the Capability Maturity Models do not supply the whole picture. Organisational and Business Factors have to be added to the guidance that CMM provides on ... Process Domains and Areas, Capability and Support. <br /> When you put the three together, you are supposed to have the necessary focus to plan improvement. <br /> Slide = 25s. <br /> RT = 19:05s. <br />
  • This diagram is both hideous and beautiful ! <br /> In constructing the Maturity Models they have split the Enterprise up into Domains. These Domains have been split into Process Areas. The Process Areas have been analysed and the base practices extracted. <br /> On the Capability side of the diagram they have defined levels of capability. For each Capability Level the Common Features and Generic Practices have been extracted. <br /> An appraisal of an Enterprise against a Maturity Model quizzes your staff on the existence of these base and generic practices. <br /> The result is a plot of the maturity level of each of the process areas. <br /> Slide = 45s. <br /> RT = 19:50s. <br />
  • The idea is that you are supposed to target your Process Improvement efforts at the areas most in need. This levelling-up approach ensures that the end-point is a global optimum of the overall organisation’s capability. The management are then supposed to synchronise the process improvement plans of the process areas, so that the organisation ‘runs’ up the staircase of improvement together. <br /> Slide = 25s. <br /> RT = 20:15s. <br />
  • So, how does Benchmarking fit into our improvement efforts ? <br /> In his book Robert Camp said ... <br /> “Benchmarking is the search for industry best practices that lead to superior performance”. <br /> “Benchmarking” <br /> ISBN 0-87389-058-2 & 0-527-91635-8. <br /> This is interesting, as CMM provides us with a great deal of insight into the missing best practices. <br /> The Japanese, as usual, have a word for Benchmarking - ‘Dantotsu’ - which is translated as ‘striving to be the best of the best’. <br /> Slide = 30s. <br /> RT = 20:45s. <br />
  • In Benchmarking we are seeking to find out more about the Gap in performance between our companies and our competitors. We want to find out ... <br /> How much the gap is. Where it is, and When it opened up. <br /> We also want to find out how to close the gap through improved knowledge, practices and processes. <br /> Slide = 25s. <br /> RT = 21:10s. <br />
  • In striving to achieve Superior Performance, if we never Classify the problem we will never be able to add understanding through correlation to make the changes that are required. <br /> Slide = 20s. <br /> RT = 21:30s. <br />
  • For completeness, here are the Benchmarking Steps. <br /> Slide = 20s. <br /> RT = 21:50s. <br />
  • And here is a Benchmarking Appraisal showing the Maturity Score of the Process Areas against an Industry Sector Database. <br /> Slide = 15s. <br /> RT = 22:05s. <br />
  • And here is a correlation between two divisions of the same enterprise. It highlights where inter-divisional co-operation on shared process improvement effort needs to be focused. <br /> Slide = 15s. <br /> RT = 22:20s. <br />
  • And here we have a trend analysis of the improvement efforts of a division over time, relative to an industry sector average and best practice scores from the database in each process area. <br /> Slide = 15s. <br /> RT = 22:35s. <br />
  • This slide shows how annual Benchmarking Investigations are supposed to feed into the Long Range, Medium Range and Short Range Planning Process. <br /> Note that if anyone in the audience is looking for a Benchmarking Partner I would be very interested in speaking to them ! <br /> Slide = 25s. <br /> RT = 23:00s. <br />
  • Long term plans will involve several phases, as the organisation progresses through the maturity levels. Each phase can be formally identified to provide a focus for the improvement activity. <br /> The phases are supposed to be completed in order ! <br /> The activities carried out in each phase are those actions that will take the company to the next step on the staircase of improvement. <br /> So, what could the phases be ? <br /> Slide = 30s. <br /> RT = 23:30s. <br />
  • Here are four phases that take the organisation through from focusing on ... <br /> i) Delivery and Process Planning. <br /> ii) Quality and Process Control. <br /> iii) Cost and Process Improvement, and <br /> iv) Variety and System Integration. <br /> Notice the position of the CMM levels on the boundary between the phases. <br /> Notice that the organisational culture and structure might need to change between the phases, too. <br /> So, what can we add to this that would let us fill in the actions we need to perform to move the organisation from one phase to the next ? <br /> Slide = 45s. <br /> RT = 24:15s. <br />
  • Down the Y-Axis add in ... <br /> The 13 Process Areas from the CMM Appraisal. <br /> Then add the ‘Organisational Capability Contributors’, <br /> i) Process <br /> ii) People <br /> iii) Technology <br /> Under Process, add Carter’s Process Support Layers, <br /> Layer 1 - Interoperable Network Computing Environment. <br /> Layer 2a - Interoperable Tools. <br /> Layer 2b - Interoperable Tasks. <br /> Layer 3 - Product Data Management. <br /> Layer 4 - Workflow Management & Metrics. <br /> Layer 5 - Decision Support. <br /> Then add rows for ... <br /> i) People Issues, Training and Staff Development. <br /> ii) Technology Support Issues. <br /> iii) Quality Optimisation. <br /> Slide = 45s, RT = 25:00s. <br />
  • Then, across the X-Axis add in the changes you want to ‘phase-in’. <br /> The overall matrix becomes the outline Long Range Plan for Process Improvement. <br /> The Items in each vertical phase become the focus for the Medium Range Plan. <br /> The Medium Range Plan then needs to be decomposed into a one year programme of actions that are funded and staffed. <br /> Filling in the boxes is relatively easy. Read the references given in the presentation. Perform the investigations and analyses. <br /> Slide = 40s. <br /> RT = 25:40s. <br />
  • I realise that has been a lot to take in all in one go ! That’s why the presentation hand-outs include the oral content of the presentation, to allow you to take away a complete record for reference. <br /> Do you have any questions that you would like to ask ? <br /> Slide = 4:20s. <br /> RT = 30:00s. <br />

Presentation Transcript

  • What Drives our ECAD Strategy ? Neil Whitehall Electronics Process Manager GEC Marconi Avionics Radar Systems Division neil.whitehall@gecm.com #1 19/11/97 N.Whitehall
  • What Drives our ECAD Strategy ?   2. The ECAD Marketplace ?  3. What can you do to focus your intuition ?  4. What elements of your ECAD Vendor’s Strategy should be built into your own company’s design process ?  #2 1. Intuition ? 5. How do you answer Q3 without relying completely on intuition ? 19/11/97 N.Whitehall
  • Alternatives to ‘Intuition Alone’  What could add the necessary focus ? • Theory of Constraints (TOC). • Just In Time (JIT). • Quality Optimisation. • Concurrent Engineering (CE). • SEI Capability Maturity Model (CMM). • Benchmarking. • Phases of Development. #3
  • Theory of Constraints (TOC) #4
  • TOC  The history of science is divided into three phases of increasing maturity of approach. (Ancient Greeks) • Phase 1 : Classification (Ptolemy) • Phase 2 : Correlation (Newton) • Phase 3 : Effect-Cause-Effect  TOC uses this three phase approach as the basis of all problem solving activities. #5 19/11/97 N.Whitehall
  • TOC  #6 Goldratt’s work adds a 5 stage focusing technique. • Step 1 : Identify the System’s Constraint(s). • Step 2 : Decide how to exploit the System’s Constraint(s). • Step 3 : Subordinate everything else to the above decision. • Step 4 : Elevate the System’s Constraint(s). • Step 5 : If, in the previous steps, a constraint has been broken, go back to Step 1. • NB : Don’t allow inertia to cause a System Constraint. 19/11/97 N.Whitehall
  • TOC   What is the goal of the company ? • To make more money, now and in the future.  #7 What is a constraint ? • An obstacle that prevents the company from achieving it’s goal. How do we get nearer to our goal ? • Any step that simultaneously .... • Increases Throughput (Rate of Sales) • Unlimited, except by the market. • Decreases Cost of Inventory • Limited by Zero, Eventually nothing left to minimise. • Decreases Cost of Material • Limited by Zero, Eventually nothing left to minimise. 19/11/97 N.Whitehall
  • TOC   How do you identify the constraints ? • Find a “Level 1 Effect”.  What is a “Level 1 Effect” ? • At a “High level” within organisation it attracts Visibility. • Linked to throughput at a business/divisional level.  #8 Examples of Constraints ? How do you relate Level 1 Effects to Constraints within the Design Process ? • Start at the output of the process. • Back-trace using the 5 step focusing technique. • Link underlying problems causing “Level 1 Effect” to the (possible) constraint within the Design Process. 19/11/97 N.Whitehall
  • TOC  What is the bridge ? THE GOAL : TO MAKE MONEY. Bottom line measurements ... NET PROFIT (Absolute), RETURN ON INVESTMENT (Relative), CASH FLOW (Survival). [diagram: ACTIONS linked to the bottom line measurements] #9
  • TOC  THE DIRECT IMPACT : OPERATIONAL MEASUREMENTS AND THE BOTTOM LINE. [diagram: THROUGHPUT, INVENTORY and OPERATING EXPENSE linked to NET PROFIT, RETURN ON INVESTMENT and CASH FLOW] # 10
  • TOC  THE COMPETITIVE EDGE IMPACT : OPERATIONAL MEASURES LINKED TO ACTIONS THROUGH THE BRIDGE. [diagram: THROUGHPUT (FUTURE), INVENTORY and OPERATING EXPENSE linked through the COMPETITIVE EDGE to NET PROFIT, RETURN ON INVESTMENT and CASH FLOW] # 11
  • TOC  Supported by the Avraham Goldratt Institute. # TOC. • http://www.goldratt.com # ’Crazy About Constraints’. • http://www.rogo.com/cac # Courses and Book Orders. • tel : (01628) 74468  Texts • ‘The Theory of Constraints’ (TOC) • ISBN 0-88427-085-8 • ‘The Goal (Introduction/Manufacturing) • ‘It’s Not Luck’ (Supply Chain Management) • ‘The Race’ (Manufacturing) • ‘Critical Chain’ (Project Management) • ISBN 0-88427-061-0 • ISBN 0-566-07637 • Speak to Analogy about ‘Flexible Access’ and ‘Open Access’ • ISBN 0-88427-062-9 • ISBN 0-88427-153-6 # 12 19/11/97 N.Whitehall
  • JIT # 13
  • JIT  HIGH VERSUS LOW INVENTORY SYSTEMS : ENGINEERING CHANGES. Engineering change one month after start of order. HIGH INVENTORY : improved product will be available only several months after the engineering change. LOW INVENTORY : improved product will be available in less than two weeks. # 14
  • JIT  HIGH VERSUS LOW INVENTORY SYSTEMS : DUE DATE PERFORMANCE. HIGH INVENTORY : production starts based on a guess (forecast validity) ; we oscillate between excess finished goods inventory and missed due dates. LOW INVENTORY : production starts based on good knowledge ; due date performance >> 90%. # 15
  • JIT  # 16 19/11/97 The Role of Reduced Inventory in Establishing a Competitive Edge. • Product • Quality • Features (Engineering) • Price • Higher Margins • Lower Investment per Unit • Responsiveness • Due Date Performance • Shorter Quoted Lead Times N.Whitehall
  • Framework for Synchronised Manufacturing. [diagram: Design Engineering, Manufacturing and Procurement linked through PDM, CDM and MRP systems] PDM = Product Data Management. MRP = Materials Requirements Planning. CDM = Component Data Management. # 17
  • JIT. [diagram: PERT network contrasting the Critical Path with the Critical Chain ; Feeding Buffers (FB) are inserted before the Design Engineer’s (DE) activities, and a Project Buffer protects the Completion Date] # 18
  • Quality Optimisation # 19
  • Quality Optimisation   Quality Optimisation • What do failures do to ‘Time Buffers’ in Production ? • How do you ensure that the most cost effective solution is designed in ?  # 20 Six Sigma. • Focuses on notional ‘Defects’ rather than Product Dependability. • Why choose ‘Six’ ? Why not 3, 4, 5.5, 6.1 or 7 ? • How does ‘Six’ relate to real product dependability. • What would rigid adherence to ‘Six’ cost ? An excellent reference ... • ‘Optimising Quality in Electronics Assembly’ • James Allen Smith & Frank Whitehall. • McGraw Hill, 1997. • ISBN 0-07-059229-2. 19/11/97 N.Whitehall
  • Quality Optimisation. Phase 1 (yield < 100%) : Yield prediction. Phase 2 (yield = 100%) : Yield maximisation. Phase 3 (maximise Cpk, robust) : Design centering. # 21
  • Quality Optimisation ¤ The Process Capability Ratio measures the robustness of a design ¤ A Cpk of 1 is equal to a yield of 100% ¤ The higher the Cpk, the more reliably the product will operate in the field. Cpl = (µ − LSL) / 3σ ; Cpu = (USL − µ) / 3σ ; Cpk = min ( Cpl, Cpu ). [diagram: distribution between LSL and USL, mean µ, ±3σ spread, safety margin] # 22
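The slide's Cpl/Cpu/Cpk formulas can be sketched in code; this is an illustrative aside, and the function name and sample numbers are my own, not from the presentation:

```python
def cpk(mu, sigma, lsl, usl):
    """Process capability index: Cpk = min(Cpl, Cpu), where
    Cpl = (mu - LSL) / 3*sigma and Cpu = (USL - mu) / 3*sigma."""
    cpl = (mu - lsl) / (3 * sigma)
    cpu = (usl - mu) / (3 * sigma)
    return min(cpl, cpu)

# A centred design: mean midway between the specification limits.
print(cpk(mu=5.0, sigma=0.5, lsl=2.0, usl=8.0))   # 2.0
# Same spread, off-centre: Cpk drops, signalling parametric yield risk.
print(cpk(mu=7.0, sigma=0.5, lsl=2.0, usl=8.0))   # ~0.67
```

The second call shows the point of design centering: moving the mean back to the middle of the specification window is what raises Cpk.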
  • Quality Optimisation  Apply Pareto Analysis ? [chart: PERCENT OF TOTAL FAILURES (5% to 35%) attributed to Soldering Failures, Component Failures, Catastrophic Design Errors, Parametric Yield Loss and Build Errors] # 23
  • Concurrent Engineering (CE) # 24
  • Concurrent Engineering (CE)  # 25 Concurrent Engineering (CE) • Definition #1 : Cultural Focus. • “defined as the earliest possible integration of the overall company’s knowledge, resources, and experience in design, development, marketing, manufacturing, and sales into creating successful new products, with high quality and low cost, while meeting customer expectations.” • Sammy G. Shina. • Concurrent Engineering and Design for Manufacture of Electronic Products, 1991. • ISBN 0-442-00616-0. 19/11/97 N.Whitehall
  • Concurrent Engineering (CE)   Can data be managed in the same way as inventory in production ?  # 26 Concurrent Engineering (CE) • Definition #2 : Logistical Focus. • “Get the right data, To the Right Place, At the Right Time, In the Right Format”. • Don Carter & Barbara Stillwell Baker. • ‘Concurrent Engineering - The Product Development Environment for the 1990s’. • Volume 1 & 2 : ISBN 0-201-56349-5. Can people be disciplined enough to change their own working cultures ? 19/11/97 N.Whitehall
  • Concurrent Engineering (CE)  Carter’s book introduces the concept of a layered approach to the support of the process of Concurrent Engineering. : Inter-operable Network Computing. • Layer 1 : Inter-operable Tools. • Layer 2a : Inter-operable Tasks. • Layer 2b : Product Data Management. • Layer 3 : Workflow Management & Metrics. • Layer 4 : Decision Support. • Layer 5 # 27 19/11/97 N.Whitehall
  • SEI Capability Maturity Model # 28
  • SEI Capability Maturity Model. [diagram: People, Process and Technology combine to define Capability, which in turn defines Product/Service Quality] # 29
  • SEI Capability Maturity Model. Organisational Factors (Size, Structure, Culture, Roles) + Business Factors (Strategic Focus, Market Pull Vs Technology Push, Contract Vs Market Driven, Technology/Method Support) + SE-CMM Guidance (Focus Areas (Domains), Capability, Support) = Organisation’s Engineering Process Development (Design of Process, Development of Process, Validation & Verification of Process). # 30
  • SEI Capability Maturity Model. [diagram: SE-CMM architecture. DOMAIN PORTION : Process Area Categories (Organisation, Project, Engineering) → Process Areas (1 to n) → Base Practices (1 to n). CAPABILITY PORTION : Capability Levels (Initial ; Performed ; Planned and Tracked ; Well Defined ; Quantitatively Controlled ; Continuously Improving) → Common Features → Generic Practices (1 to n).] The result of an appraisal is a capability level profile (levels 0 to 5) establishing organisational systems engineering process capability in each process area. # 31
  • SEI CMM ‘Staircase’ of Improvement. Level 0 : NOT PERFORMED. Level 1 : PERFORMED INFORMALLY (Base Practice Performed). Level 2 : PLANNED & TRACKED (Commitment to Planning ; Disciplined Tracking & Verifying of Performance). Level 3 : WELL-DEFINED (Standard Process Defined ; Standard Process Tailored ; Defined Process is Performed). Level 4 : QUANTITATIVELY CONTROLLED (Measurable Quality Goals Defined ; Linkage between Process Capability and Established Goals ; Objective management of performance). Level 5 : CONTINUOUSLY IMPROVING (Quantitative Process Effectiveness is Established ; Continuously Improving Process Effectiveness). # 32
  • Benchmarking
  • Benchmarking Process
  The Benchmarking Process yields Benchmark Metrics and Benchmark Practices. These define the Benchmark Gap (a) How Much, b) Where, c) When) and How To Close the Gap (a) Improved Knowledge, b) Improved Practices, c) Improved Processes), leading to Superior Performance.
  Enablers : Management Commitment; Organisation & Communication; Employee Participation.
  • Projecting the Benchmark Gap
  [Chart : a Performance Metric Under Measurement (Product or Process Related) plotted against Time, from Now to an Endpoint (Years). Industry Practices set the line of Superior Performance; the distance between it and our own projected performance is the Benchmark Gap.]
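  A straight-line projection of that gap can be sketched as follows. The function name, metric values and improvement rates here are invented for illustration; they are not figures from the presentation:

```python
def project_gap(own_now, industry_now, own_rate, industry_rate, years):
    """Project the benchmark gap (industry performance minus our own),
    assuming straight-line improvement rates in metric units per year."""
    own_future = own_now + own_rate * years
    industry_future = industry_now + industry_rate * years
    return industry_future - own_future

# If we trail the industry by 10 units today but improve 2 units/year faster,
# the gap closes after 5 years.
print(project_gap(50, 60, own_rate=4, industry_rate=2, years=5))  # 0
```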
  • Benchmarking Steps
  Planning : 1. Identify what is to be benchmarked. 2. Identify Comparative Companies. 3. Determine Data Collection Method and Collect Data.
  Analysis : 4. Determine Current Performance Gap. 5. Project Future Performance Levels.
  Integration : 6. Communicate benchmark findings and gain acceptance. 7. Establish Functional Goals.
  Action : 8. Develop Action Plans. 9. Implement Specific Actions and Monitor Progress. 10. Recalibrate Benchmarks.
  Maturity : * Leadership Position Attained. * Practices Fully Integrated Into Process.
  • A Benchmarking Appraisal
  [Bar chart : appraisal scores (scale -40 to +20) for the categories Design Process Control, Technology, Facilities, Electronic Design Team, Electronic Design Process, Project Design Process, Trouble Reports, Interfaces To/From the Team, Standards, Customer Requirements, Project Management, Electronic Design Data and Component Engineering.]
  • A Correlation Between Divisions
  [Bar chart : correlation of appraisal scores across divisions (scale -20 to +10) for the categories Customer Requirements, Technology, Design Process Control, Component Engineering, Project Design Process, Interfaces To/From Team, Electronic Design Data, Electronic Design Process, Electronic Design Team, Facilities, Standards, Trouble Reports and Project Management.]
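  One way to quantify agreement between two divisions is the Pearson correlation coefficient over their category scores. A minimal sketch; the helper function and the score lists are invented for illustration, not data from the appraisal:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    std_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    std_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (std_x * std_y)

# Appraisal scores for the same categories from two divisions (invented data).
division_a = [-15, -10, -5, 0, 5, 10]
division_b = [-12, -9, -6, 1, 4, 8]
print(round(pearson(division_a, division_b), 3))
```

  A coefficient near +1 would suggest both divisions see the same strengths and weaknesses; a value near 0 would suggest their problems are largely independent.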
  • A Benchmarking Trend Analysis
  [Radar chart : scores 0-80 per category (Project Management, Customer Requirements, Design Process Control, Trouble Reports, Project Design Process, Standards, Technology, Component Engineering, Electronic Design Data, Facilities, Interfaces To/From Team, Electronic Design Team, Electronic Design Process) for the series 1997, 1996, Best in Class and Aerospace Average.]
  • Benchmarking & Planning
  [Timeline : across the year (Start Year, Mid Year, End Year), Benchmarking Investigations (Benchmark Metrics & Practices, Benchmark Partner Visits) and the Benchmarking Process (Planning, Analysis, Integration, Action, Maturity, Recalibrate) feed a Special Benchmarking Study, the Annual Plans & Budgets and the Long Range Plans.]
  • Phases of Development
  • Phases of Development
  Phases 1 to 4 take the organisation from CMM Level 1 ‘Initial’ through Levels 2 ‘Repeatable’, 3 ‘Defined’ and 4 ‘Managed’ to Level 5 ‘Optimised’, along three dimensions:
  Focus : Delivery & Process Planning → Quality & Process Control → Cost & Process Improvement → Variety & System Integration.
  Culture : Leader/Follower → Participatory → Empowering → Innovative.
  Structure : Product Team, Project Team, Functional, Matrix.
  • A Simplistic ‘Recipe’ Approach
  Across Phases 1 to 4 (CMM Level 1 ‘Initial’ through Levels 2 ‘Repeatable’, 3 ‘Defined’ and 4 ‘Managed’ to Level 5 ‘Optimised’), the ‘recipe’ combines the 13 CMM Process Areas with Process Support (Carter’s Layers), People, Technology, Quality and Optimisation.
  • Questions and Answers.