NG BB 55 CONTROL Tollgate
Transcript

  • 1. (UNCLASSIFIED / FOUO) Black Belt Control Tollgate Briefing. Fields: Project Name, DEPMS Project Number, Name of Belt, Organization, Date. All slides carry the UNCLASSIFIED / FOUO marking.
  • 2. Tollgate Requirements - Control (NG CPI Black Belt tollgate requirements; Project Deliverables - NGB - Comments):
       Updated Financial / Operational Benefits - Mandatory
       Standardized Process / SOPs / Training Plans - Mandatory (varies by project)
       Process Owner Accountability - Mandatory (includes transition plan)
       Achievement of Results / New Baseline - Mandatory (% goal achievement)
       Process Control Plans - Mandatory (dashboard)
       Replication Opportunities - Recommended (leverage to other areas)
       Visual Process Controls - Recommended (varies by project)
       Mistake Proofing Tools - Recommended (varies by project)
       Storyboard / A3 - Mandatory (1-page project summary)
       Barriers / Issues / Risks - Mandatory
       Quick Wins - Recommended
       Lessons Learned - Optional
  • 3. Control Tollgate Templates. NOTE: THIS IS A TEMPLATE FOR ALL NG CPI BELTS. NG has developed this template as a basic format with standard deliverables to help guide NG CPI belts through the NG tollgate requirements for certification. Each project is unique, with its own deliverables and flow, so this format does not have to be followed to the letter for your project. (DELETE THIS SLIDE FOR YOUR PROJECT)
  • 4. Define - Charter and Timeline. Team Members (Name, Role, Affiliation, DACI): Black Belt - Driver; Master Black Belt - Driver; Project Sponsor - Approver; Process Owner - Approver (takeaway message goes here). Project Charter fields: Business Case (impact of problem), Problem Statement, Goal Statement, Unit, Defect, Customer Specifications, Process Start, Process Stop, Scope. Project Timeline: Define, Measure, Analyze, Improve, Control, each with Start (mm/dd/yy), Stop (mm/dd/yy), and Status. Required Deliverable.
  • 5. Measure Overview. Baseline statistics to report: VOC / VOB; Unit (d) or Mean (c); Defect (d) or St. Dev. (c); DPMO (d); PCE (cycle time only); PLT (cycle time only); Sigma Quality Level; MSA results (show the percentage result of the GR&R or other measurement systems analysis carried out in the project). Example Minitab capability of Delivery Time: LSL 10, Target 20, USL 30; sample mean 29.12, N 266, StDev (within) 2.87, StDev (overall) 2.69; Cp 1.16, Cpk 0.10, Pp 1.24, Ppk 0.11; observed PPM > USL 281,955, expected (within) 379,620. Baseline "As Is" performance (graphical summary of Delivery Time): mean 29.13, StDev 2.68, median 29, minimum 24, maximum 35. Tools used: Detailed Process Mapping, Time Series Plot, Measurement Systems Analysis, Probability Plot, Value Stream Mapping, Pareto Analysis, Data Collection Planning, Operational Definitions, 5S, Basic Statistics, Generic Pull, Process Capability, Histograms, Control Charts. Required Deliverable.
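The capability indices and PPM/DPMO figures on this slide come from Minitab; as a rough cross-check, the same quantities can be computed directly from the sample statistics. A minimal sketch, assuming normally distributed data and using the slide's example values (LSL 10, USL 30, mean 29.12, within-subgroup StDev 2.87):

```python
from scipy.stats import norm

def capability(mean, sigma, lsl, usl):
    """Short-term capability indices and expected DPMO for a normal process."""
    cp = (usl - lsl) / (6 * sigma)
    cpk = min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))
    # Expected fraction outside spec under the normal model, scaled to parts per million
    dpmo = (norm.cdf(lsl, mean, sigma) + norm.sf(usl, mean, sigma)) * 1e6
    # Conventional sigma quality level includes the 1.5-sigma shift
    sigma_level = norm.isf(dpmo / 1e6) + 1.5
    return cp, cpk, dpmo, sigma_level

cp, cpk, dpmo, sql = capability(mean=29.12, sigma=2.87, lsl=10, usl=30)
print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  DPMO={dpmo:,.0f}  SQL={sql:.2f}")
# Cp = 1.16, Cpk = 0.10, DPMO ~ 379,600 (close to the slide's expected-within PPM of 379,620)
```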
  • 6. Analyze Overview. Fishbone diagram (example): causes grouped under Materials, Manpower, Facilities & Equipment, Methods, Mother Nature, and Measurements, feeding the effect (Y): PLT = 5 days (detail on slide 40). Cause and effect matrix: rate each process input against the customer-weighted output requirements to provide the initial input to the FMEA (detail on slide 41). Critical Xs / root cause analysis: list each root cause and its effect. Pareto chart (example): South 100 defects (58.5%), North 50 (29.2%), East 15 (8.8%), Others 6 (3.5%). Required Deliverable.
  • 7. Improve Overview. FMEA (example): for each process step / input, list the potential failure mode, effects, root causes, current controls, and recommended actions, then recalculate the RPN after actions are taken (detail on slide 49). Solution selection matrix (example): score potential improvements against the weighted root causes (detail on slide 45). Implementation plan: solution number, improvement action, responsible individual / solution owner, issues/barriers, risk mitigation, target/actual complete date, and current status (detail on slide 46). New process measurements: new process PLT (if applicable), new process SQL, new DPMO. Required Deliverable.
  • 8. Process Control / Reaction Plan (Example). Project: LD09014 Improve Tollgate Templates; Division: LSS PMO; Division Chief: LTC Garrett Heath; Date: 12/18/2007. Applicable control charts and metrics (Control Action - Responsible Individual - Frequency - Process Step - Target - Upper / Lower Control Limits - Current - Reaction Plan):
       1. Process Cycle Time - Green Belt - Quarterly - Updating Tollgates - target 4 min, upper limit 5 min, lower limit N/A, current 4 min 40 sec. Reaction plan: if the process cycle time for updating templates rises above 8 minutes, the PMO will re-examine the process to identify constraints.
       2. Conduct VOC survey to address issues in FMEA (Improve Phase) - Green Belt - Quarterly - Tollgate Templates - target 100% satisfaction, limits 100% / 85%, current N/A. Reaction plan: the PMO will adjust the tollgate templates to address issues raised by the VOC.
       3. Conduct VOC survey to address issues in FMEA (Improve Phase) - Green Belt - Quarterly - Access to Tools and References - target 100% satisfaction, limits 100% / 85%, current N/A. Reaction plan: the PMO will adjust tools and references access to address issues raised by the VOC.
       Required Deliverable.
  • 9. Standard Operating Procedures (SOPs) and Training Plans (Example). SOPs requiring revision: FY09 SOP Update, 1 Oct 08 - 28 Feb 09 (Lead: Branch Chief; AO: Team Lead) - completed. Next SOP Update, 1 Sep 09 - 30 Sep 10 (Lead: TSC Chief; AO: Team Lead) - complete before the FY10 NSPS cycle. Next SOP Update, 1 Sep 09 - 30 Sep 10 (Lead: TSC Chief; AO: Team Lead) - complete before the FY11 NSPS cycle. Required training: Office Professional Development - each team member is encouraged to provide OPD to other team members; the OPD training schedule plan is to be updated. Individual Professional Development Plan - Branch Chief and team members; suspense: Branch Chief and team member discuss and finalize by 1 May 09. Required Deliverable.
  • 10. Visual Process Control Tools (Example). The SOP is posted on the computer desktop for easy reference access. A mistake-proofing system is in place: a copy of the DD214 with checklist is posted on the counselor's desk (sample DD 214 with checklist shown). Recommended.
  • 11. Mistake Proofing (Poka Yoke) Tools (Example). Solution: make use of existing bar codes on shipping paperwork to mistake proof the receiving function. Validate with suppliers (shipping company) that all shipping containers will have a barcode label ($0). Purchase RF (wireless) bar coding equipment and software ($100/gun x 20 guns). Train all longshoremen on use and care of the equipment ($50/hr x 20 hrs). Install error metrics for error tracking ($100). Recommended.
  • 12. Updated Benefits Estimate (Example). Metric (Baseline / Objective / Achieved): Cycle Time - 30 days / 7 days / 5 days. Cost Savings - $1.054M / $850K / $830K. Defect Rate - not reported. Additional benefits/comments: the solution was approved 15 May 09 and began full implementation on 22 May 09; it is presently being executed by the process owner with no major issues or degradation of service to our customers. Morale has improved significantly with the new layout of the office, as indicated by the recent office survey. Required Deliverable.
  • 13. Future Projects Identified (Example). Based on the outcome of this project, the following is a list of possible areas of concern that will result in LSS projects: PQDR Process; QAR Surveillance Process; QAR Training Process; MRB Corrective Action Process; NDT Process. Optional Deliverable.
  • 14. Project Replication. Replication Project / Best Practice Candidate: this completed project is being submitted as a candidate for replication. It has achieved benefits and contains the appropriate level of documentation to enable a practitioner in another DoD organization to replicate part or all of the project with minimal effort relative to benefit, providing an additional return on investment to DoD. Required documentation and benefits before submitting as a replication candidate: Project Charter; High-Level Process Map or SIPOC; FMEA or Fishbone Diagram; Implementation Plan; Control Plan and Training Plan; demonstrated quality, cost, or speed improvements (e.g., SQL, ROI/savings, PLT, PCE). This project may have replication potential within these DoD organizations: Organization 1; Organization 2; ... Recommended.
  • 15. Project Barriers / Issues / Risks: Barriers; Issues; Risks. Required Deliverable.
  • 16. Completed Project "Storyboard" (Example). One-page storyboard across the phases. Define - project charter: business case - be the #2 financial service provider; goal - reduce loan/lease cycle time from 9.2 to 8.0 days by July 1; financial impact - $2.7M per year. Measure - CCR gap (1.2 days) and sigma performance level (1.3). Analyze - "officer performs both," "officers changed," and color printouts eliminated as contributors to high cycle time; officer work and turnover, waiting, and automation affect cycle time; job aids affect variation in cycle time. Improve - pilot plan, loan or lease screen entry, rewards and recognition. Control. Required Deliverable.
  • 17. 8-Step A3 Project Summary Report. Header: Company, Department, Date, Prepared by. Sections: 1. Define the problem situation; 2. Problem analysis; 3. Action plans to correct problems; 4. Results of activity; 5. Future steps.
  • 18. NG CPI Tollgate Tool - 8-step process mapped to DMAIC: 1. Validate Problem and 2. Identify Gaps (Define); 3. Set Targets (Measure); 4. Find Root Cause (Analyze); 5. Develop C-Ms and 6. See C-Ms Through (Improve); 7. Confirm Results and 8. Standardize (Control).
       Define: Project Charter (problem statement, defect definition, goal statement, project scope, business impact, strategic alignment); Sponsor & Team; Measurable Y; Voice of Customer; Customer Specs; Voice of Business; Project Timeline; Estimated Financial Benefits; Communication Plan (stakeholder analysis); High-Level Process Map (SIPOC).
       Measure: Detailed As-Is Process Map; Value Stream Map; Key Process Metrics; Data Collection Plan; Measurement Systems Analysis; Data Collected; Baseline Data Analysis (descriptive stats, graphs, Pareto charts); FMEA (risk analysis, risk mitigation plan); Process Capability.
       Analyze: Potential Xs (brainstorming, 5 whys, fishbone, affinity diagram); Critical Xs (cause & effect matrix, hypothesis testing, regression, time studies, theory of constraints).
       Improve: "To Be" Process Map; Solution Generation / Prioritization; Improvement Strategy (improvement model, implementation plan, pilot, "X" improvement target); Mistake Proofing; FMEA (risk analysis, risk mitigation plan); Visual Process Control Tools (optional); Control Charts (as needed); Process Capability.
       Control: Process Control Plan; Process Owner Accountability; Updated Financial Benefits; Replication Check / Replication Opportunities; project documentation of revised policies, SOPs, procedures, and training; Control Charts.
       Each tollgate is accepted by the Sponsor, Process Owner, and MBB (Finance Owner where indicated).
  • 19. Appendix.
  • 20. Cross-Functional Team. Team Members (Name, Role, Affiliation, DACI): Black Belt - Driver; Master Black Belt - Driver; Project Sponsor - Approver; Process Owner - Approver; four Contributors; four Inform. Required Deliverable.
  • 21. Replication Check. I confirm that: DEPMS (DoD Enterprise Performance Management System) has been searched for similar projects: Yes / No. Replication: list relevant initiatives / potential replication projects found (if any): Project 1 (DEPMS # or other tracking tool project number); Project 2. Collaboration: identify organizations that can/should be considered for working this project collaboratively: Organization 1; Organization 2. Required Deliverable.
  • 22. Strategic Alignment. The Define Tollgate requires a linkage to organizational strategy. Include the organizational metric or metrics that your project will help improve, and refer to your organization's Strategic Plan and/or other referenced documents. Required Deliverable.
  • 23. Business Impact. Insert as much information as possible about the potential operational and/or financial benefits, and include any assumptions upon which these estimates are based. Example operational benefits: this project is expected to reduce PLT by 35%, improve SQL from 1.2 to 3.0, and save 20 man-hours per shift. Example financial benefits: this project is expected to save $xx in FYxx. Required Deliverable.
  • 24. High-Level Process Map (SIPOC). Columns: Suppliers, Inputs, Process, Outputs, Customers. Measurable Y: Required Deliverable.
  • 25. Voice of Customer / Voice of Business. Voice of the Customer: what does the customer want from us? Key Customer Issue(s): identify the issue(s) that prevent us from satisfying our customers. Critical Customer Requirement: summarize key issues and translate them into specific and measurable requirements. Voice of the Business: what does the business want/need from us? Key Business Issue(s): identify the issue(s) that prevent us from meeting strategic goals/missions. Critical Business Requirement: summarize key issues and translate them into specific and measurable requirements. Required Deliverable.
  • 26. Stakeholder Analysis. Columns: Stakeholder Name/Group; Explanation of Stakeholder's Current Attitude Toward Project (+, 0, -); Stakeholder Score (H=3, M=2, L=1; +=3, 0=1, -=-3); Project Impact on Stakeholder (H, M, L); Stakeholder Level of Influence on Success of Project (H, M, L); Current Attitude; Action Plan for Stakeholder (list). Recommended Deliverable.
  • 27. Communication Plan. Columns: Audience; Media; Purpose; Topics of Discussion / Key Messages; Owner; Frequency; Notes/Status. Required Deliverable.
  • 28. Detailed "As Is" Process Map (Example). Required Deliverable - VSM or Process Map or both.
  • 29. Value Stream Map (Example). Order management value stream from the customer phone call through pick, pack, and ship: service lead time = 384 min; customer call time = 24 min; lost calls = 10%; process steps (screen for account manager, customer info, product need, pricing, shipping info, pick/pack & ship) carry process time (2 to 120 min), error rate (0 to 2%), and volume data. Data box fields: trigger, completion criteria, cycle time, takt time, number of people, number of approvals, items in inbox, % rework, number of iterations (cycles), number of databases, top 3 rework issues. Required Deliverable - VSM or Process Map or both.
  • 30. Key Input, Process, and Output Metrics. SIPOC flow (Suppliers, Inputs, Process, Outputs, Customer) across the process steps, with input metrics, process metrics, and output metrics mapped against VOC/VOB categories: Quality, Speed, Cost. Required Deliverable.
  • 31. Operational Definitions. Define each of the key input, output, and process metrics from your SIPOC that you are going to collect data on (via the Data Collection Plan), as well as any other terms that need clarification for the data collectors and everyone else on the team. Examples: Award Process PLT - the time from when a Director submits the award recommendation to the time when the employee is presented the award in a ceremony. Number of Claims Processed - the number of claims processed per weekday (M-F). Total Hours Worked - the total number of hours worked in the facility, including weekends and holidays. Number of Personnel - the total number of military and civilian personnel working (not including contractors). Include other unique terms that apply to your project that require clear operational definitions for those collecting and interpreting the data. Required Deliverable.
  • 32. Data Collection Plan (Example). Columns: Performance Measure; Operational Definition; Data Source and Location; How Will Data Be Collected; Who Will Collect Data; When Will Data Be Collected; Sample Size; Stratification Factors; How Will Data Be Used.
       1. Ability to update projects and build tollgate reviews - X: steps to update projects - in DEPMS - by counting steps - Name - ASAP - sample size 1 - none - to find VA, BNVA, NVA.
       2. Ability to update projects and build tollgate reviews - X: tollgate template slides that match the POI - in DEPMS - by determining the % of activity steps identified in "Introduction to _____" modules in the POI that are adequately addressed in the templates - Name - ASAP - sample size 40 - none - to determine consistency with the POI.
       3. Easy access to LSS tools and references - X: availability of LSS tools and references - in DEPMS - by determining the percentage of tools, with their references, listed on DMAIC Road Map slides that can be found in PS - Name - ASAP - sample size 63 - none - to determine availability of tools and references.
       4. Easy access to LSS tools and references - X: steps required to find tools and references - in DEPMS - by counting the number of steps required to find the tools and their references - Name - ASAP - sample size 37 - none - to find VA, BNVA, NVA.
       Required Deliverable.
  • 33. Measurement Systems Analysis (Example). The measurement system used to collect data has been calibrated and is considered to have no potential for significant errors. The data collection tool is reliable, has good resolution, shows no signs of bias, and is stable. Types of measurement error (description - considerations for this project):
       Discrimination (resolution) - the ability of the measurement system to divide measurements into data categories - work hours can be measured to < 0.25 hours; radar usage measured to +/- 2 minutes.
       Bias - the difference between an observed average measurement result and a reference value - no bias: work hours and radar start-stop times are consistent through the population.
       Stability - the change in bias over time - no bias of work hours and radar usage data over time.
       Repeatability - the extent to which variability is consistent - not an issue: labor and radar usage data is historical and felt to be accurate enough for insight and analysis.
       Reproducibility - different appraisers produce consistent results - remarks in the usage data were deemed not reproducible and therefore were not considered in determining which radars were used in each operation.
       Variation - the difference between parts - not applicable to this process.
       Required Deliverable.
  • 34. Measurement Systems Analysis - Gage R&R (ANOVA) for Response (Example, Minitab output). Variance components (%Contribution): Total Gage R&R 0.00159 (3.70%); Repeatability 0.00056 (1.29%); Reproducibility 0.00103 (2.40%), of which Operator 0.00034 (0.79%) and Operator*Part 0.00069 (1.61%); Part-to-Part 0.04142 (96.30%); Total Variation 0.04301 (100%). Study variation (%SV): Total Gage R&R 19.22%, Repeatability 11.38%, Reproducibility 15.50%, Part-to-Part 98.13%. Number of distinct categories = 7. Conclusion: the measurement system is acceptable, with Total Gage R&R %Contribution < 10%. Optional BB Deliverable.
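The Gage R&R acceptance judgment above (%Contribution, %Study Variation, number of distinct categories) is a simple function of the ANOVA variance components. A minimal sketch using the variance components reported on the slide (not a full ANOVA-method Gage R&R from raw measurements):

```python
import math

# Variance components from the slide's Minitab output
varcomp = {
    "Repeatability": 0.0005567,
    "Reproducibility": 0.0010330,
    "Part-To-Part": 0.0414247,
}
gage_rr = varcomp["Repeatability"] + varcomp["Reproducibility"]
total = gage_rr + varcomp["Part-To-Part"]

pct_contribution = 100 * gage_rr / total          # compared against the usual 10% / 30% guidelines
pct_study_var = 100 * math.sqrt(gage_rr / total)  # ratio of standard deviations, not variances
ndc = int(1.41 * math.sqrt(varcomp["Part-To-Part"] / gage_rr))  # number of distinct categories

print(f"%Contribution={pct_contribution:.2f}  %StudyVar={pct_study_var:.2f}  ndc={ndc}")
# Roughly 3.7% contribution, 19.2% study variation, 7 distinct categories, as on the slide
```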
  • 35. "As Is" Baseline Statistics (Example, Minitab graphical summary for Workdays). Anderson-Darling normality test: A-squared = 12.65, p-value < 0.005. N = 118; mean = 44.8; StDev = 61.3; minimum = 1; 1st quartile = 12; median = 22; 3rd quartile = 52; maximum = 365. The current process has a non-normal distribution (p-value < 0.05); mean = 44 days; median = 22 days; standard deviation = 61 days; range = 365 days. Required Deliverable.
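This kind of graphical-summary readout can be reproduced outside Minitab by computing the descriptive statistics and a normality test directly. A minimal sketch: the `workdays` array is a hypothetical stand-in for the 118 observations, and scipy's D'Agostino K-squared test is used in place of Minitab's Anderson-Darling test because it returns a p-value directly:

```python
import numpy as np
from scipy import stats

# Hypothetical stand-in for the 118 observed workday values (right-skewed, like the slide)
rng = np.random.default_rng(1)
workdays = rng.lognormal(mean=3.1, sigma=1.1, size=118)

summary = {
    "N": len(workdays),
    "mean": np.mean(workdays),
    "median": np.median(workdays),
    "std dev": np.std(workdays, ddof=1),
    "min": np.min(workdays),
    "max": np.max(workdays),
}
print(summary)

# Normality check (the slide's summary uses Anderson-Darling instead)
stat, p = stats.normaltest(workdays)
print(f"p-value = {p:.4f} ->", "non-normal" if p < 0.05 else "no evidence against normality")
```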
  • 36. Process Control Chart (Example) - I-MR chart of Delivery Time. The current baseline delivery time is stable over time, with both the moving range (3.22 days) and the individual average (29.13 days) experiencing only common cause variation. Individuals chart: X-bar = 29.13, UCL = 37.70, LCL = 20.56. Moving range chart: MR-bar = 3.22, UCL = 10.53, LCL = 0. 255 data points were collected with no subgroups, so the I-MR control chart was selected. Required as Applicable.
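The limits quoted above follow the standard I-MR chart constants (2.66 for the individuals chart, 3.267 for the moving-range chart). A minimal sketch that computes the limits from a series of individual values and reproduces the slide's numbers from its reported center lines:

```python
import numpy as np

def imr_limits(x):
    """Control limits for an I-MR chart from a 1-D array of individual values."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))           # moving ranges of consecutive points
    x_bar, mr_bar = x.mean(), mr.mean()
    return {
        "I":  (x_bar - 2.66 * mr_bar, x_bar, x_bar + 2.66 * mr_bar),   # LCL, CL, UCL
        "MR": (0.0, mr_bar, 3.267 * mr_bar),
    }

# Sanity check against the slide's reported center lines (X-bar = 29.13, MR-bar = 3.22)
ucl_i = 29.13 + 2.66 * 3.22    # ~37.70
lcl_i = 29.13 - 2.66 * 3.22    # ~20.56
ucl_mr = 3.267 * 3.22          # ~10.52
print(round(ucl_i, 2), round(lcl_i, 2), round(ucl_mr, 2))
```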
  • 37. Process Capability (Example) - process capability of Workdays, calculations based on a lognormal distribution model. 118 data points collected; non-normal distribution; mean = 44 days; lower customer spec = 0 days; upper customer spec = 15 days. Overall capability: Z.Bench = -0.31, Ppk = -0.01. 65% of observations fall outside the customer spec (observed % > USL = 65.25; expected overall % > USL = 62.03). Required Deliverable.
  • 38. Process Constraint ID Analysis (Example). Takt rate analysis compares the task time of each process (or process step) to the other steps and to customer demand, to determine whether the time trap is the constraint. Takt Time = Net Process Time Available / Number of Units to Process. Takt Rate = Customer Demand Rate = Number of Units to Process / Net Process Time Available. Example chart: Value Add Analysis - Current State, with takt time = 55 seconds plotted against the task time (seconds) of tasks 1-10, split into CVA, NVA-R, and NVA time. BB Optional Deliverable.
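The takt formulas on this slide reduce to a single division. A minimal sketch with hypothetical inputs (a 7.5-hour working day and 490 units of demand are assumptions chosen only to land near the 55-second takt time shown in the example chart):

```python
def takt_time(net_available_time_s: float, units_to_process: int) -> float:
    """Takt Time = Net Process Time Available / Number of Units to Process (seconds per unit)."""
    return net_available_time_s / units_to_process

def takt_rate(net_available_time_s: float, units_to_process: int) -> float:
    """Takt Rate (customer demand rate) = Number of Units to Process / Net Process Time Available."""
    return units_to_process / net_available_time_s

# Hypothetical example: a 7.5-hour shift (27,000 s) of net available time and 490 units of demand
print(f"takt time = {takt_time(27_000, 490):.1f} s/unit")            # ~55 s, as in the example chart
print(f"takt rate = {takt_rate(27_000, 490) * 3600:.1f} units/hour")
```

Any task whose time exceeds the takt time in the bar chart is a candidate time trap / constraint.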
  • 39. Pareto Plot Analysis (Example). Defect counts by region: South 100 (58.5%), North 50 (29.2%), East 15 (8.8%), Others 6 (3.5%); cumulative percentages 58.5, 87.7, 96.5, 100.0. The South and North contain over 80% of the defects; our project will focus there and not on the East and West. Optional Deliverable.
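The percentages and cumulative percentages in the Pareto table are straightforward to reproduce. A minimal sketch using the slide's defect counts:

```python
defects = {"South": 100, "North": 50, "East": 15, "Others": 6}
total = sum(defects.values())

cumulative = 0.0
for region, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
    pct = 100 * count / total
    cumulative += pct
    print(f"{region:<8} count={count:<4} pct={pct:5.1f}%  cum={cumulative:5.1f}%")
# South and North together account for ~87.7% of defects, matching the slide's cumulative line
```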
  • 40. Cause & Effect Diagram (Fishbone) (Example). Effect (Y): PLT = 5 days. Cause categories: Materials, Manpower, Facilities & Equipment, Methods, Mother Nature, Measurements. Contributing causes include: wrong location; space; lack of seats; no standardization of seats; inequality in seats; lack of knowledge; old buildings; new codes; lack of funds; lack of controls; not suited for current mission (type of space); senior leader "dedicated" to task; no suitable space to assign; getting seats takes time; vague requirements; lack of database; people; unplanned programs; multiple paths (facilities competing for same space); senior leadership; delays in elevating impasse issues; approvals too long (time); collocation; mold and HVAC; crashes; CAO/IPT time available; decision to wait; unforeseen circumstances; funding; competency vs. PMA. Required Deliverable.
  • 41. XY Matrix (Root Cause Analysis). Blank template: rating of importance to the customer across up to 15 requirements (columns), process steps / process inputs (rows), lower spec / target / upper spec, and row and column totals. This table provides the initial input to the FMEA: when an output variable (requirement) is not correct, that represents a potential "Effect"; when an input variable is not correct, that represents a "Failure Mode". Steps: 1. List the key process output variables. 2. Rate each variable on a 1-to-10 scale of importance to the customer. 3. List the key process input variables. 4. Rate each input variable's relationship to each output variable on a 1-to-10 scale. 5. Select the top input variables to start the FMEA process; determine how each selected input variable can "go wrong" and place that in the Failure Mode column of the FMEA. Required Deliverable.
  • 42. Hypothesis Test Summary (Example). Columns: Hypothesis Test (ANOVA, 1- or 2-sample t-test, Chi-Squared, Regression, Test of Equal Variance, etc.); Factor (x) Tested; p-Value; Observations/Conclusion.
       ANOVA - Location - p = 0.030 - significant factor: the 1-hour driving time from DC to the Baltimore office causes ticket cycle time to generally be longer for the Baltimore site.
       ANOVA - Part vs. No Part - p = 0.004 - significant factor: on average, calls requiring parts have double the cycle time (22 vs. 43 hours).
       Chi-Squared - Department - p = 0.000 - significant factor: Department 4 has digitized addition of customer info to the ticket and less human intervention, resulting in fewer errors.
       Pareto - Region - n/a - the South region accounted for 59% of the defects due to their manual process and distance from the parts warehouse.
       Describe any other observations about the root cause (x) data. Optional BB Deliverable.
  • 43. One-Way ANOVA: Root Cause Verification (Example). Boxplots of net hours the call is open, by Part / No Part. Analysis of variance for Net Hours: Part/No Part - DF 1, SS 7421, MS 7421, F 8.65, p 0.004; Error - DF 69, SS 59194, MS 858; Total - DF 70, SS 66615. Level means: No Part - N 27, mean 21.99, StDev 19.95; Part - N 44, mean 43.05, StDev 33.70; pooled StDev 29.29. Because the p-value <= 0.05, we can be confident that calls requiring parts have an impact on ticket cycle time. After further investigation, possible reasons proposed by the team are OEM backorders, lack of technician certifications, and the distance from the OEM to the client site. It is also caused by the need for technicians to make a second visit to the end user to complete the part replacement. The next step is for the team to confirm these suspected root causes. Optional BB Deliverable.
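The one-way ANOVA on this slide can be reproduced with any statistics package. A minimal sketch of the same Part vs. No Part comparison using scipy, where `part_hours` and `no_part_hours` are hypothetical arrays standing in for the slide's 44 and 27 ticket cycle times:

```python
import numpy as np
from scipy import stats

# Hypothetical stand-ins for the "part" and "no part" ticket cycle times (hours)
rng = np.random.default_rng(7)
part_hours = rng.normal(loc=43.0, scale=33.7, size=44)
no_part_hours = rng.normal(loc=22.0, scale=20.0, size=27)

# One-way ANOVA; with exactly two groups this is equivalent to a two-sample t-test (F = t^2)
f_stat, p_value = stats.f_oneway(part_hours, no_part_hours)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
if p_value <= 0.05:
    print("Calls requiring parts have a statistically significant effect on cycle time.")
```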
  • 44. Linear Regression (Example). Fitted line plot: Wait Time = 32.05 + 0.5825 x Deliveries; S = 1.119, R-Sq = 94.1%, R-Sq(adj) = 93.9%. We are 95% confident that 94.1% of the variation in Wait Time is from the quantity of deliveries. Optional BB Deliverable.
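The fitted-line output above (slope, intercept, R-Sq) comes from an ordinary least-squares fit. A minimal sketch where `deliveries` and `wait_time` are hypothetical arrays generated to roughly follow the slide's fitted line:

```python
import numpy as np
from scipy import stats

# Hypothetical data roughly following the slide's line: Wait Time = 32.05 + 0.5825 * Deliveries
rng = np.random.default_rng(3)
deliveries = rng.uniform(10, 35, size=30)
wait_time = 32.05 + 0.5825 * deliveries + rng.normal(scale=1.1, size=30)

fit = stats.linregress(deliveries, wait_time)
print(f"Wait Time = {fit.intercept:.2f} + {fit.slope:.4f} * Deliveries")
print(f"R-Sq = {fit.rvalue**2 * 100:.1f}%")
```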
  • 45. Solution Selection Matrix (Example). Potential improvements are scored (1-10) against weighted root-cause / rating criteria (e.g., per commercial terms, time to issue invoice, presentation, completeness, accuracy, level of effort; root cause significance rating 10), producing an overall impact rating and a risk rating for each. Example impact ratings (risk rating in parentheses): Offshore Costs 320 (8); Commercial Terms 360 (7); Quantity of Source Data 380 (6); Reconciliation 510 (6); Quality of Source Data 420 (7); Training 480 (6); Client (e.g., RCTI) 360 (5); Job Setup 500 (5); Payroll Close Date 290 (8); Pre-Billing 440 (5); Client Reporting Requirements 500 (4); Delivery Method 280 (9); Job Completion 400 (3); Job Manager Requirements 420 (2). Solutions ranked > 450 have been selected to be implemented. Required Deliverable.
  • 46. Implementation Plan. Header: Project Name, Division, Green Belt, Date, Project Sponsor, Service Area / Function / Service, Black Belt. Columns: Solution Number; Improvement Action; Implementation / Control Action Number; Responsible Individual / Solution Owner; Issues/Barriers; Risk Mitigation; Target/Actual Complete Date; Current Status / Comments. Rows 1-8 are blank in the template. Required Deliverable.
  • 47. Pilot Plan (Example). Columns: Pilot Test, Description, Success Criteria, Test Team, Schedule.
       Hand-Chek / Hot-Chek Interface Test - sample check-in data sets entered in the Hand-Chek device; sample data sets transmitted to the Hot-Chek system (all hotel floors, all hotel rooms); confirmation data received from Hot-Chek back to the Hand-Chek device. Success criteria: data set entry accuracy < 3.4 DPMO; data set entry time < 6 seconds; data set transmission/reception accuracy < 3.4 DPMO. Team: BR, KM, plus Hot-Chek tech rep. Schedule: start 3/1, complete 3/3.
       Check-in Staff Verification Test - sample guest data entered in the Hot-Chek system (variety of room requirements); "guests" (hotel employees) walked through the check-in process (90% pre-registered, 10% non-pre-registered); volume stress test simulating the arrival of 20 guests in a "tour bus"; process measurements recorded via observer (see design scorecard); "guest" observations recorded. Success criteria: entry accuracy < 3.4 DPMO; entry time < 6 seconds; transmission/reception accuracy < 3.4 DPMO; design scorecard CCRs. Team: BR, KM, plus 6 check-in staff. Schedule: start 3/6, complete 3/7.
       Check-in Validation Test - 25 guests invited to experience the new hotel check-in process; guests pre-registered with their room requirements in the Hot-Chek system; guests walked through the check-in process (90% pre-registered, 10% non-pre-registered); process measurements recorded via observer (see design scorecard); guests debriefed following the experience. Success criteria: entry accuracy < 3.4 DPMO; entry time < 6 seconds; transmission/reception accuracy < 3.4 DPMO; design scorecard CCRs. Team: BR, KM, plus 6 check-in staff. Schedule: start 3/10, complete 3/10.
       Recommended Deliverable.
  • 48. Pilot Results (Example). Data collected (Measure - Target - Pilot mean - s - Comments): PLT - 1 minute - 0.5 min - 0.05 min - improved PLT. Data Accuracy - < 3.4 DPMO - 100 DPMO - decreased DPMO. PCE - < 10% - < 15% - improved PCE. Pilot observations: 1) the data entry sequence was confusing. Gap analysis / root causes: 1) the SOP wasn't clear and needs to be laid out better before implementation; 2) the order of questions needs to be reevaluated. Follow-up actions: 1) revise the SOP on the order and flow of the questions asked, and run the pilot again. Recommended Deliverable.
  • 49. Failure Mode Effects Analysis (FMEA) (Example). Columns: Process Step / Input (what process step and input are under investigation?); Potential Failure Mode (X) (in what ways does the key input go wrong?); Potential Failure Effects (Y) (what is the impact on the key output variables / customer requirements?); Severity; Potential Root Causes (what causes the key input to go wrong?); Occurrence; Current Controls (what existing controls and procedures, inspection and test, prevent either the cause or the failure mode, or improve detection?); Detection; RPN; Actions Recommended (for reducing the occurrence of the cause or improving detection); Responsibility; Actions Taken; recalculated Severity, Occurrence, Detection, and RPN. Example rows:
       Updating Tollgates - ineffective templates - ineffective reviews (SEV 5): discrepancies between POI and templates (OCC 4, no controls, DET 5, RPN 100) - adjust templates to match POI (PMO) - after 5/2/2, RPN 20. Users and leaders don't buy in to LSS: slide purposes not clear (4/none/4, RPN 80) - adjust slide titles and notes - after RPN 10; redundant and NVA slides (3/none/4, RPN 60) - eliminate NVA slides - after RPN 10; incomplete SOP or "Help" within PS (3/none/5, RPN 75) - develop "read me" slides - after RPN 10.
       Updating Tollgates - too many steps to build/update - inefficient updating (SEV 3): NVA steps (5/none/4, RPN 60) - link templates to PS phase - after RPN 18. Users get frustrated and delay projects: too many choices between templates (4/none/4, RPN 48) - eliminate NVA, group in folders - after RPN 12; inconsistent file names and locations (3/none/4, RPN 36) - simple names, group in folders - after RPN 6.
       LSS Tool Access - not all LSS tools and references in PS - user cannot find tools and references (SEV 4): not all tools available in PS (5/none/4, RPN 80) - revise list of tools and joggers - after RPN 16. Project completion is delayed: poor explanation in some references (3/none/3, RPN 36) - develop direct-access PDF file - after RPN 16.
       LSS Tool Access - too many steps to retrieve tools - inefficient retrieval of LSS tools/references (SEV 2): multiple means for accessing tools (3/none/5, RPN 30) - "read me" file, one folder - after RPN 8; NVA steps (3/none/5, RPN 30) - eliminate NVA steps - after RPN 8.
       Required Deliverable.
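Each RPN in the FMEA is simply Severity x Occurrence x Detection, recomputed after the recommended actions are taken. A minimal sketch using the first example row:

```python
from dataclasses import dataclass

@dataclass
class FmeaRow:
    failure_mode: str
    severity: int    # 1-10: impact on the customer
    occurrence: int  # 1-10: how often the root cause occurs
    detection: int   # 1-10: how hard it is to detect (higher = harder)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

# First example row: ineffective templates due to POI/template discrepancies
before = FmeaRow("Ineffective templates", severity=5, occurrence=4, detection=5)
after = FmeaRow("Ineffective templates", severity=5, occurrence=2, detection=2)
print(before.rpn, "->", after.rpn)   # 100 -> 20, as on the slide
```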
  • 50. Descriptive Statistics / Process Capability (Example). "As Is" process capability (Cholesterol): USL 220, sample N 30, mean 193.1, StDev (within) 26.0, StDev (overall) 22.5; Cpk = 0.34, Ppk = 0.40; observed PPM > USL 133,333; expected within 151,147; expected overall 116,153. "New" process capability (Control): USL 220, sample N 30, mean 185.0, StDev (within) 20.4, StDev (overall) 16.4; Cpk = 0.57, Ppk = 0.71; observed PPM > USL 0; expected within 43,119; expected overall 16,154. Required Deliverable.
  • 51. Control Chart (Example). P chart for total defectives: Feb/Mar data confirms the process has remained in control. Center line P-bar = 0.03817; upper 3.0-sigma limit = 0.08162; lower 3.0-sigma limit = 0. Recommended Deliverable.
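The p-chart limits quoted above follow the standard binomial formula, p-bar +/- 3*sqrt(p-bar*(1 - p-bar)/n). A minimal sketch; the subgroup size n is not shown on the slide, so the value used here is a hypothetical one chosen only to land near the quoted upper limit:

```python
import math

def p_chart_limits(p_bar: float, n: int):
    """3-sigma control limits for a p chart with subgroup size n (LCL floored at 0)."""
    half_width = 3 * math.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - half_width), p_bar + half_width

# p-bar from the slide; n = 175 is a hypothetical subgroup size
lcl, ucl = p_chart_limits(p_bar=0.03817, n=175)
print(f"LCL={lcl:.5f}  UCL={ucl:.5f}")   # UCL comes out near the slide's 0.08162
```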
  • 52. "As Is" vs. "To Be" Process Map (Example). Side-by-side comparison of the "As Is" process and the "To Be" process. Required Deliverable.
  • 53. Related Project Consideration - Multi-Generation Project Plan (MGPP). Rows: Vision; Process Generation; Platforms / Technology. Columns: Generation 1 (Date), Generation 2 (Date), Generation 3 (Date). Optional Deliverable.
