IIE Conference Presentation
© All Rights Reserved

Presentation Transcript

  • 1. Quality Decisions Come From Quality Data
    Presented Wednesday, May 23, by Brandon Theiss, Brandon.Theiss@gmail.com
  • 2. About Me
    • Academics
      – MS Industrial Engineering, Rutgers University
      – BS Electrical & Computer Engineering, Rutgers University
      – BA Physics, Rutgers University
    • Professional
      – Principal Industrial Engineer - Medtronic
      – Master Black Belt - American Standard Brands
      – Systems Engineer - Johnson Scale Co
    • Awards
      – ASQ Top 40 Leader in Quality Under 40
    • Certifications
      – ASQ Certified Manager of Quality/Org Excellence, Cert # 13788
      – ASQ Certified Quality Auditor, Cert # 41232
      – ASQ Certified Quality Engineer, Cert # 56176
      – ASQ Certified Reliability Engineer, Cert # 7203
      – ASQ Certified Six Sigma Green Belt, Cert # 3962
      – ASQ Certified Six Sigma Black Belt, Cert # 9641
      – ASQ Certified Software Quality Engineer, Cert # 4941
    • Publications
      – "Going with the Flow: The Importance of Collecting Data Without Holding Up Your Processes." Quality Progress, March 2011
      – "Numbers Are Not Enough: Improved Manufacturing Comes From Using Quality Data the Right Way" (cover story). Industrial Engineering Magazine, Journal of the Institute of Industrial Engineers, September 2011: 28-33. Print
  • 3. Problem Statement
    • The era of social networking has enabled a person sitting in a café in New York City to obtain up-to-the-minute "status" updates about a friend or colleague in Shanghai.
      – Do these tweets have value?
      – Is this information "quality data"?
      – Why does business process data not flow with the same velocity and volume?
  • 4. What is Quality Data?
    • Quality is defined as:
      – Joseph Juran -> "fitness for intended use"
      – W. Edwards Deming -> "meeting or exceeding customer expectations"
    • Data is considered quality data if it is:
      – Relevant
      – Accurate
      – Timely
      – Complete
  • 5. What is a Process?
    • Formal definition
      – A systematic series of actions directed to some end
    • Practical definition
      – Any verb-noun combination
        • Eat sandwich
        • Read book
        • Attend conference
    • Implications of the practical definition
      – The same tools, techniques, and methods of the Lean Six Sigma methodologies can be used for virtually anything
  • 6. Limitations of Lean Six Sigma
    • Process intelligence
      – There must exist a critical level of intelligence at the elemental levels of the organization. This can be manifest in the operators or the machines.
    • Process ownership
      – Operators need to be sufficiently trained and empowered to take ownership of the process so that they can creatively maximize the efficiencies the hardware can provide, in order to produce a quality work product for their internal or external customers.
    • Non-quality data as an input will generate bad conclusions
      – The most elegant of mathematical calculations will not correct errors in the dataset.
    • Correlation does not imply causation
      – As the statistics become more elegant, this fact is often forgotten.
    • Utilization of statistical tools does not yield absolute truth
      – Statistical significance does not mean actual significance (see US Supreme Court, Matrixx Initiatives, Inc. v. Siracusano).
      – Their best use is to direct resources toward discovering truth.
  • 7. Problems with Valuing Data
    • Data in the abstract has little value.
    • Data only begins to have value when connections and well-reasoned conclusions are drawn from it, so that these "numbers on a screen" can be translated directly to the accountant's profit and loss statement.
    • "How much should be invested in measuring A when its impact on B might not generate enough additional benefit to cover the cost of measuring A?"
    • Cycle: Data Collected -> Capital Allocated -> Value Determined
  • 8. Scientific Approach to Valuing Data
    • "Our theories determine what we measure" (Albert Einstein)
    • In science, the goal is the confirmation or refutation of a theoretical model.
    • In industry, the costs, purposes, and uses for collected data must be tied directly to the profitability of the enterprise.
    • Many such models for industrial systems exist.
    • The strength of a model is only established through the course of collecting data and testing the predictive validity of the theory.
    • The solution to this dilemma is to declare the value a priori.
  • 9. Data Collection System
    • Historically, the cost of collecting data has been quite high relative to the perceived benefits derived.
    • Traditional data collection systems are considered non-value-added, and thus 'muda'.
    • Data was collected about a product or process either by diverting material to a quality lab for inspection or by doing painstaking time studies of operator actions and then auditing the results.
    • Fortunately, the cost and flexibility of current technology allow process- and product-specific data to be collected in line by the existing operators without increasing cycle time or incurring great expense.
  • 10. Case Study 1 (LTL Trucking)
    • Less-than-truckload trucking handles mostly palletized material that is larger than what FedEx/UPS handle but not large enough to warrant its own truck.
    • Freight is billed based upon:
      – Distance
      – Weight
      – Volume
      – Class of freight
    • Traditional method
      – The shipper contacts the shipping company and provides the complete data.
      – The shipping company then picks up the load and transports it.
      – The shipping company invoices the shipper.
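    The billing factors above can be illustrated with a deliberately simplified rate model. Everything here is hypothetical: the function name, the per-hundredweight-mile rate, the dimensional-weight factor, and the class factor are illustrative assumptions, not the carrier's actual tariff.

    ```python
    # Hypothetical LTL rate model (illustrative only): charges scale with
    # distance and billed weight, where billed weight is the greater of the
    # actual weight and a volume-derived "dimensional" weight, adjusted by
    # a freight-class factor.
    def ltl_charge(miles, weight_lb, volume_ft3, class_factor,
                   rate_per_cwt_mile=0.035, dim_factor_lb_per_ft3=10.0):
        dim_weight = volume_ft3 * dim_factor_lb_per_ft3
        billed_weight = max(weight_lb, dim_weight)
        cwt = billed_weight / 100.0  # LTL rates are quoted per hundredweight
        return round(rate_per_cwt_mile * cwt * miles * class_factor, 2)

    # A 1,200 lb pallet occupying 96 cubic feet, shipped 500 miles
    print(ltl_charge(500, 1200, 96, 1.5))  # billed on actual weight: 315.0
    ```

    Under a model like this, under-reported weight or volume directly understates the invoice, which is why the unverified customer-provided data mattered.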
  • 11. Case Study 1 (cont) (LTL Trucking)
    • Problems with the traditional method
      – Customer-provided weight and volume information is never verified.
      – Heavier pallets = greater fuel costs
      – Larger pallets = fewer pallets in a truck
    • Initial response
      – 30% of inbound freight into the shipping company's terminal was diverted to a re-weighing and re-dimensioning station.
        • The operator weighs the pallet, measures its dimensions, and fills out a paper form.
        • At the end of the shift the papers are turned in to a data entry person who keys in the information.
        • The billing system corrects the invoice and adds a surcharge for customer error.
  • 12. Case Study 1 (cont) (LTL Trucking)
    • Problems with the initial response
      – Only 30% of the pallets are analyzed.
      – Slower cycle time for reweighed pallets
      – Paper based
        • Transcription errors
        • Lost chits
        • Erroneous readings
      – Extra labor required (dimensioning and data entry)
    • Final solution
      – Integrated weighing system in the forks of the forklift
        • As the operator moves the pallet, the pallet is identified by either an RFID tag or a scanned barcode. The weight is then captured and wirelessly transmitted to the billing system.
        • A similar system has been designed that allows the dimensions of the pallet to also be determined.
  • 13. Case Study 2 (Bulk Mixing)
    • A company was manually mixing bulk materials to make a slurry.
    • The batching process consisted of an operator using a Bobcat skid loader to dump bulk material into a large industrial mixer.
      – At the beginning of the shift the operator was given a pre-printed form that prescribed the exact amount of each of the bulk ingredients.
      – Prior to beginning the batching process, the operator moved the Bobcat onto a floor scale and recorded the empty tare weight of the loader.
      – The operator then navigated the Bobcat to the pile of bulk material, scooped up a load, navigated back to the scale to record the weight on the pre-printed form, and finally drove to the mixer to load the material.
      – This process was repeated multiple times for any given ingredient.
      – After completing an ingredient, the operator added up the respective weights using a pocket calculator and recorded the total weight on the form.
      – This cycle continued for all of the ingredients in the recipe.
      – Once the recipe was completed, the operator turned in the form to his supervisor, who in turn submitted the form to the quality lab.
  • 14. Case Study 2 (cont) (Bulk Mixing)
    • A Gauge R&R study was performed, and the process was determined to be highly variable yet adequate.
      – The company was not concerned, as the output of the batching process was the input to another process that was quite tolerant of batching errors (as long as they were reported).
    • The downstream processes were having constant quality problems.
      – Managers in the downstream process complained to the batching area, claiming it was delivering an inferior product.
      – Batching area managers pointed to the quality records, which indicated that everything was made to specification.
      – The batching supervisor emphatically argued that if an error were reported, he would personally reprimand the operator and in most cases fire them.
  • 15. Case Study 2 (cont) (Bulk Mixing)
    • Problems
      – Opportunities for the data to be corrupted, compromised, or lost:
        • manually handwriting data,
        • using a handheld calculator,
        • improperly identifying the materials, and
        • multiple handoffs of the physical form
      – 'Muda' in the wasted motion of the operator having to drive to the floor scale prior to the delivery of each load
      – Measurement resolution
        • Recipes listed ingredients to fractions of a pound; however, the scale read in 2 lb increments.
    • True root cause
      – Managers' response to defects
        • Operators knew that if they reported the actual amount dispensed, they would be jeopardizing their jobs.
        • Consequently, only the "correct" values were reported.
        • The downstream processes were not given accurate data.
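    The measurement-resolution mismatch compounds across scoops, as a small sketch shows. The scoop weight and count here are made-up numbers; only the 2 lb increment comes from the slide.

    ```python
    def scale_reading(true_weight, increment=2.0):
        """Simulate a scale that can only display in fixed increments."""
        return round(true_weight / increment) * increment

    # Suppose an ingredient is delivered in 10 scoops of 12.75 lb each,
    # but the scale reads in 2 lb steps: each scoop displays as 12 lb.
    true_total = 12.75 * 10
    recorded_total = sum(scale_reading(12.75) for _ in range(10))
    print(true_total, recorded_total)  # 127.5 vs 120.0
    ```

    A recipe specified to fractions of a pound cannot be confirmed by such readings, so even an honest operator could not report amounts at the precision the form demanded.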
  • 16. Case Study 3 (Final Assembly/Packaging)
    • As a result of a benchmarking exercise, a multinational manufacturer identified that one of its plants was 3 times more productive than the worst-performing plant in the same geographic region for the final assembly and packaging area.
    • An investigation was performed comparing the standard operating procedures, line layouts, tools, training, and staffing of the lines at the respective plants.
      – Minor differences were observed and were immediately changed to adopt the best practices of the higher-performing plant. As a result, de minimis incremental improvements were observed; however, the plant continued to underperform.
      – A team of interns was assigned to observe the line and record the 'up time'. After collecting data around the clock for a week, the team determined that the line was only meeting its target cycle time 65% of the time.
      – Process maps were drawn, and one of the last steps was to scan the unique part barcode so that the product would be flowed in the ERP system as a finished good.
  • 17. Case Study 3 (cont) (Final Assembly/Packaging)
    • Initial solution
      – Using the timestamp information, the time between successive scans was revealed.
      – By plotting this information on an SPC chart, out-of-control conditions (line stoppages) would be obvious. Once the problems were identified and quantified, resources could be allocated to developing their solution.
      – With the help of operations and IT, software was written that would automatically generate the control chart.
      – The front-line supervisors were instructed that prior to letting the operators leave for the day, each one of the out-of-control conditions had to be explained.
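    The core of that control chart can be sketched in a few lines: compute inter-scan gaps and flag any gap beyond an individuals-chart upper control limit. The timestamps and function name are invented for illustration; the plant's actual software is not shown in the slides.

    ```python
    from statistics import mean

    def stoppage_candidates(scan_times, k=3):
        """Flag inter-scan gaps exceeding the upper control limit of an
        individuals chart built from the scan timestamps (seconds)."""
        gaps = [b - a for a, b in zip(scan_times, scan_times[1:])]
        center = mean(gaps)
        # Estimate sigma from the average moving range (d2 = 1.128 for n = 2)
        moving_ranges = [abs(b - a) for a, b in zip(gaps, gaps[1:])]
        sigma_hat = mean(moving_ranges) / 1.128
        ucl = center + k * sigma_hat
        return [(i + 1, g) for i, g in enumerate(gaps) if g > ucl]

    # Scans arrive roughly every 30 s; one long gap marks a line stoppage.
    scans = [0, 30, 61, 92, 120, 151, 300, 330, 361]
    print(stoppage_candidates(scans))  # the 149 s gap is flagged
    ```

    In practice the limits would be estimated from a stable baseline period rather than from data containing the stoppages themselves, but the idea is the same: a gap far above the typical cycle time is an out-of-control signal worth explaining.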
  • 18. Case Study 3 (cont) (Final Assembly/Packaging)
    • Why the initial solution failed
      – After a 10-hour shift, operators did not want to spend 20 minutes doing paperwork.
      – 9 hours after the fact, no one could recall why exactly the line had stopped.
    • Final solution
      – Modified line software
        • When an operator failed to scan a barcode in the allocated period of time, a window would pop up and would not allow the system to proceed.
        • The window presented 4 options (Materials, Machines, Manning, or Other).
        • The operator would use the touch-screen interface and poke at the option that matched the line stoppage. For instance, if there was a stock-out in a given box, the operator would poke at the option for Materials and then would be presented with a list of all the potential stock-out materials.
      – Stoppage data was then used in Pareto charts, which led to improvement in upstream data collection.
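    The Pareto analysis of the captured stoppage reasons amounts to ranking causes by frequency with a cumulative-percent column. The event list below is invented sample data; the four category names come from the slide.

    ```python
    from collections import Counter

    def pareto(events):
        """Rank causes by frequency and attach a cumulative percentage."""
        ranked = Counter(events).most_common()
        total = sum(count for _, count in ranked)
        rows, cumulative = [], 0
        for cause, count in ranked:
            cumulative += count
            rows.append((cause, count, round(100 * cumulative / total, 1)))
        return rows

    stoppages = ["Materials", "Machines", "Materials", "Manning",
                 "Materials", "Machines", "Materials", "Other"]
    for row in pareto(stoppages):
        print(row)  # e.g. ('Materials', 4, 50.0) leads the chart
    ```

    The top rows of such a table tell management where to aim improvement resources first, which is exactly how the stoppage data drove the upstream fixes.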
  • 19. Case Study 4 (High Volume Manufacturer)
    • A manufacturer of hand tools could not account for discrepancies of 300,000 parts per facility per month.
    • The manufacturer had made a large investment in an ERP system, which automatically translated customer orders directly into the production schedule and the pieces manufactured. The Achilles heel of the system was the quality of the input data.
      – If information were not entered, the ERP would respond by scheduling more production of the piece.
      – If a wrong number were entered, the ERP would fail to schedule the part for production.
      – Managers lacked sufficient information as to the whereabouts of the material or which manufacturing stage it was in.
  • 20. Case Study 4 (cont) (High Volume Manufacturer)
    • The process of collecting data was convoluted:
      – The operator would hand-record on a chit of paper the number of parts produced.
      – At the conclusion of the shift these notes were turned in to the area supervisor.
      – The supervisor would either personally key the information into the ERP system or delegate the task to a data entry person.
    • Problems
      – The paper notes were frequently lost.
      – Accidental and deliberate transcription errors
      – Lots of 'muda'
      – ERP amplifying feedback loop
        • If a sheet was lost, the ERP would drive more production of the part even though in physical reality it had already been produced; conversely, after an over-reporting error, the ERP would not schedule production that was actually needed.
  • 21. Case Study 4 (cont) (High Volume Manufacturer)
    • The solution
      – Hardware
        • a simple counting scale,
        • a barcode scanner, and
        • a printer
      – Process
        • The operator scanned a barcode to identify the product.
        • The associated average piece weight for that part and step in the manufacturing process was downloaded from the database to the scale.
        • Placing items on the scale then instantly yielded the correct number of pieces.
        • The operator then pressed a button to indicate a finished count.
        • The part was then flowed from one WIP state to the next in the ERP system.
        • A label was printed to confirm that the transaction was received.
    • Results
      – The number of misattributed parts at this manufacturer was reduced to fewer than 300, while cycle time increased by less than 15 seconds.
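    The counting-scale step reduces to one calculation: divide the net weight by the stored average piece weight and round to the nearest whole part. The specific weights below are made-up illustration values.

    ```python
    def piece_count(gross_weight_g, tare_weight_g, avg_piece_weight_g):
        """Convert a scale reading into a piece count using the average
        piece weight stored for this part and process step."""
        net = gross_weight_g - tare_weight_g
        return round(net / avg_piece_weight_g)

    # Hypothetical part: 12.5 g average piece weight, 5,000 g container tare
    print(piece_count(6875.0, 5000.0, 12.5))  # 1875 g net -> 150 pieces
    ```

    The accuracy of the count hinges on the average piece weight being right for the scanned part and step, which is why the scale downloads it from the database after the barcode scan rather than trusting a manual entry.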
  • 22. Conclusions
    • "Data enables an individual, company, or society to rise from the bondage of myths and half-truths."
    • The tools, techniques, and methods of Lean Six Sigma help to reveal truth, yet they are limited by the quality of the input data. Inserting garbage data into the most elegant of calculations will still result in nonsensical analysis and decisions.
    • Getting enough of the right data, at the right time, to the right people is a constantly evolving challenge that must be met. Technology now facilitates the real-time flow of this information, but the technology can only be implemented, and process improvements achieved, if an enterprise is willing to make the investment.
  • 23. Questions?
    • Contact information
      – Brandon Theiss
      – Brandon.Theiss@gmail.com
      – LinkedIn: http://www.linkedin.com/in/brandontheiss