1. UNCLASSIFIED / FOUO
Black Belt
Improve
Tollgate Briefing
Project Name
DEPMS Project Number
Name of Belt
Organization
Date
2. Tollgate Requirements - Improve

NG CPI BLACK BELT TOLLGATE REQUIREMENTS - IMPROVE

Project Deliverable                           Requirement    NGB Comments
Solution Prioritization / Selection           Mandatory      Brainstorm / prioritize
Future State Process Map                      Mandatory
Implementation / Improvement Plan             Mandatory      Action plan
Pilot Plan / Pilot Results                    Recommended
New Process Capability / Sigma Level / DPMO   Mandatory      Based on data type
New Control Charts                            Recommended    Not applicable to all projects
Storyboard / A3                               Mandatory      One-page project summary
Barriers / Issues / Risks                     Mandatory
Quick Wins                                    Recommended
Lessons Learned                               Optional
3. Improve Tollgate Templates

NOTE: THIS IS A TEMPLATE FOR ALL NG CPI BELTS

NG has developed this template as a basic format, with standard deliverables, to
guide NG CPI belts through the NG tollgate requirements for certification. Each
project is unique, with its own deliverables and flow, so this format does not
have to be followed to the letter for your project.

(DELETE THIS SLIDE FOR YOUR PROJECT)
4. Define Charter and Timeline

Team Members

Name   Role                Affiliation   DACI
       Black Belt                        Driver
       Master Black Belt                 Driver
       Project Sponsor                   Approver
       Process Owner                     Approver

Take-away message goes here.

Project Charter
Problem Statement: (impact of problem)
Business Case:
Goal Statement:
Unit:
Defect:
Customer Specifications:
Process Start:
Process Stop:
Scope:

Project Timeline

Phase     Start      Stop       Status
Define    mm/dd/yy   mm/dd/yy
Measure   mm/dd/yy   mm/dd/yy
Analyze   mm/dd/yy   mm/dd/yy
Improve   mm/dd/yy   mm/dd/yy
Control   mm/dd/yy   mm/dd/yy

Required Deliverable
5. Measure Overview

Baseline Statistics
VOC / VOB:
Unit (d) or Mean (c):
Defect (d) or St. Dev. (c):
DPMO (d):
PCE: (Cycle Time Only)
PLT: (Cycle Time Only)
Sigma Quality Level:
MSA Results: show the percentage result of the GR&R or other measurement
systems analysis carried out in the project

Process Capability (Example)
[Minitab "Process Capability of Delivery Time": LSL 10, Target 20, USL 30;
Sample Mean 29.1203, Sample N 266, StDev (Within) 2.87033, StDev (Overall)
2.69154. Potential (within) capability: Cp 1.16, CPL 2.22, CPU 0.10, Cpk 0.10,
CCpk 1.16. Overall capability: Pp 1.24, PPL 2.37, PPU 0.11, Ppk 0.11, Cpm 0.35.
PPM Total: Observed 281,954.89; Expected Within 379,619.67; Expected Overall
371,895.18 (all beyond the USL; PPM < LSL is 0.00 throughout).]

Baseline "As Is" Performance (Example)
[Minitab "Summary for Delivery Time": Anderson-Darling normality test
A-squared 1.95, p-value < 0.005; Mean 29.128, StDev 2.677, Variance 7.169,
Skewness 0.201, Kurtosis -0.472, N 266; Minimum 24.000, 1st Quartile 27.000,
Median 29.000, 3rd Quartile 31.000, Maximum 35.000. 95% confidence intervals:
Mean 28.805-29.451, Median 29.000-29.000, StDev 2.468-2.927.]

Tools Used
Detailed process mapping       Time Series Plot
Measurement Systems Analysis   Probability Plot
Value Stream Mapping           Pareto Analysis
Data Collection Planning       Operational Definitions
5S                             Basic Statistics
Generic Pull                   Process Capability
Histograms                     Control Charts

Required Deliverable
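The within-capability figures on the example chart follow directly from the spec limits and the within-subgroup standard deviation, and the "Expected Within" PPM assumes a normal distribution. A minimal sketch, using only the numbers shown on the slide:

```python
import math

def capability(lsl, usl, mean, stdev):
    """Short-term (within) capability indices from spec limits and process stats."""
    cp = (usl - lsl) / (6 * stdev)       # potential capability (spread only)
    cpu = (usl - mean) / (3 * stdev)     # one-sided index against the USL
    cpl = (mean - lsl) / (3 * stdev)     # one-sided index against the LSL
    cpk = min(cpu, cpl)                  # actual capability (centering counts)
    return cp, cpu, cpl, cpk

def ppm_beyond(lsl, usl, mean, stdev):
    """Expected parts per million outside the spec limits, assuming normality."""
    phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))  # standard normal CDF
    return (phi((lsl - mean) / stdev) + 1 - phi((usl - mean) / stdev)) * 1e6

# Figures from the slide's Delivery Time example
cp, cpu, cpl, cpk = capability(lsl=10, usl=30, mean=29.1203, stdev=2.87033)
ppm = ppm_beyond(10, 30, 29.1203, 2.87033)
# Cp ≈ 1.16, CPL ≈ 2.22, CPU ≈ 0.10, Cpk ≈ 0.10; expected PPM ≈ 379,620,
# matching the chart: the mean sits almost on the USL, so Cpk collapses to CPU.
```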
6. Analyze Overview
Fishbone Diagram (Example)
Effect (Y): Getting Seats Takes Time (too long); PLT = 5 days
Cause categories: Materials, Manpower, Facilities & Equipment, Methods,
Mother Nature, Measurements
Causes identified include: Wrong Location; Space; Lack of Seats; No
Standardization of Seats; Lack of Knowledge; Old Buildings; Inequality in
Seats; New Codes; Lack of Funds; Lack of Controls; Senior Leader "Dedicated"
to Task; Not Suited for Current Mission (Type of Space); No Suitable Space to
Assign; Vague Reqmts; Lack of Database; People; Unplanned Programs; Multiple
Paths; Facilities Location (Competing for Same Space); Senior Leadership;
Delays in Elevating Impasse Issues; Collocation; Mold, HVAC; Crashes;
Approvals; CAO/IPT; Time Avail; Unforeseen Circumstances; Funding; Decision
to Wait; Competency vs. PMA

Cause and Effect Matrix (template)
Columns: Rating of Importance to Customer; Requirements 1-15 (one column per
key process output requirement); Total
Rows: Process Step / Process Input (rows 1-20), each with a weighted Total

This table provides the initial input to the FMEA. When an output variable
(requirement) is not correct, that represents a potential "Effect"; when an
input variable is not correct, that represents a "Failure Mode".
1. List the Key Process Output Variables
2. Rate each variable on a 1-to-10 scale of importance to the customer
3. List the Key Process Input Variables
4. Rate each input variable's relationship to each output variable on a
   1-to-10 scale
5. Select the top input variables to start the FMEA process; determine how
   each selected input variable can "go wrong" and place that in the Failure
   Mode column of the FMEA.
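The weighted Total in step 4 can be sketched in a few lines. The input names, importance ratings, and relationship scores below are hypothetical, purely to illustrate how each input's Total is computed:

```python
# Hypothetical ratings for three key process output requirements (1-10 scale)
importance = [10, 8, 6]

# Hypothetical process inputs with their relationship score to each output
inputs = {
    "Seat assignment process": [9, 3, 1],
    "Space database": [3, 9, 2],
}

def ce_total(relationships, importance):
    """Weighted total: sum of (importance x relationship) across all outputs."""
    return sum(i * r for i, r in zip(importance, relationships))

totals = {name: ce_total(rel, importance) for name, rel in inputs.items()}
# Inputs with the highest totals are carried forward as candidate failure
# modes for the FMEA, per step 5.
```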
Critical X / Root Cause Analysis

Pareto Chart (Example)
[Defect counts by category: South 100, North 50, East 15, Others 6.
Percent: 58.5, 29.2, 8.8, 3.5. Cum %: 58.5, 87.7, 96.5, 100.0.]

Root Cause / Effect
Root cause:
  Effect:
Root cause:
  Effect:
Root cause:
  Effect:

Required Deliverable
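The Percent and Cum % rows of the example Pareto chart follow from the raw counts. A short sketch using the slide's South/North/East/Others figures:

```python
def pareto(counts):
    """Sort defect counts descending and return (counts, percent, cumulative %)."""
    counts = sorted(counts, reverse=True)
    total = sum(counts)
    pct = [round(100 * c / total, 1) for c in counts]
    cum, running = [], 0.0
    for c in counts:
        running += 100 * c / total
        cum.append(round(running, 1))
    return counts, pct, cum

# Counts from the slide's example chart (South, North, East, Others)
counts, pct, cum = pareto([100, 50, 15, 6])
# pct -> [58.5, 29.2, 8.8, 3.5]; cum -> [58.5, 87.7, 96.5, 100.0],
# matching the chart: the top two categories cover ~88% of defects.
```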
8. Implementation Plan

PROJECT NAME      DIVISION                            GREEN BELT   DATE
PROJECT SPONSOR   SERVICE AREA / FUNCTION / SERVICE   BLACK BELT

Solution   Improvement   Implementation   Responsible      Issues/    Risk         Control   Target/Actual   Current Status/
Number     Action        Action           Individual/      Barriers   Mitigation   Number    Complete Date   Comments
           Number                         Solution Owner
1
2
3
4
5
6
7
8

Required Deliverable
9. Pilot Plan (Example)

Pilot Test: Hand-Chek / Hot-Chek Interface Test
Description:
• Sample check-in data sets entered in Hand-Chek device
• Sample data sets transmitted to Hot-Chek system (all hotel floors, all hotel rooms)
• Confirmation data received from Hot-Chek to Hand-Chek device (all hotel floors, all hotel rooms)
Success Criteria: Data set entry accuracy < 3.4 DPMO; data set entry time < 6
seconds; data set transmission/reception accuracy < 3.4 DPMO
Test Team: BR, KM, plus Hot-Chek tech rep
Schedule: Start 3/1, Complete 3/3

Pilot Test: Check-in Verification Test
Description:
• Sample guest data entered in Hot-Chek system (variety of room requirements)
• "Guests" (hotel employees) walked through check-in process (90% pre-registered, 10% non-pre-registered)
• Volume stress test: simulated arrival of 20 guests in a "tour bus"
• Process measurements recorded via observer (see Design Scorecard); "guest" observations recorded
Success Criteria: Data set entry accuracy < 3.4 DPMO; data set entry time < 6
seconds; data set transmission/reception accuracy < 3.4 DPMO; Design Scorecard CCRs
Test Team: BR, KM, + 6 check-in staff
Schedule: Start 3/6, Complete 3/7

Pilot Test: Check-in Validation Test
Description:
• 25 guests invited to experience new hotel check-in process
• Guests "pre-registered" with their room requirements in Hot-Chek system
• Guests walked through check-in process (90% pre-registered, 10% non-pre-registered)
• Process measurements recorded via observer (see Design Scorecard)
• Guests debriefed following experience
Success Criteria: Data set entry accuracy < 3.4 DPMO; data set entry time < 6
seconds; data set transmission/reception accuracy < 3.4 DPMO; Design Scorecard CCRs
Test Team: BR, KM, + 6 check-in staff
Schedule: Start 3/10, Complete 3/10

Recommended Deliverable
10. Pilot Results (Example)

Data Collected:

Measure         Plan Target   Pilot x̄     Pilot s     Comments
PLT             1 minute      0.5 min.    0.05 min.   Improved PLT
Data Accuracy   < 3.4 DPMO    100 DPMO                Decreased DPMO
PCE             < 10 %        < 15 %                  Improved PCE

Pilot Observations: 1) Data entry sequence was confusing
Gap Analysis / Root Causes: 1) SOP wasn't clear; need to lay it out better
before implementation 2) Order of questions needs to be reevaluated
Follow-up Actions: 1) Revise SOP on order of questions asked and flow, and run
the pilot again

Recommended Deliverable
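The DPMO figures compared against the < 3.4 DPMO criterion come from the standard formula: defects divided by (units × opportunities per unit), scaled to one million. A sketch with hypothetical pilot numbers (the 3 errors, 200 data sets, and 5 fields are illustrative, not from the slide):

```python
def dpmo(defects, units, opportunities_per_unit):
    """Defects Per Million Opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# Hypothetical pilot tally: 3 entry errors over 200 data sets of 5 fields each
rate = dpmo(defects=3, units=200, opportunities_per_unit=5)
# rate = 3000.0 DPMO, well above the < 3.4 DPMO success criterion, so the
# pilot would trigger a gap analysis like the one above.
```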
11. Failure Mode Effects Analysis (FMEA) (Example)

Columns: Process Step/Input | Potential Failure Mode (X) | Potential Failure
Effects (Y) | SEV | Potential Root Causes | OCC | Current Controls | DET | RPN
| Actions Recommended | Resp. | Actions Taken | SEV | OCC | DET | RPN

Guiding questions: What is the process step and input under investigation? In
what ways does the Key Input go wrong? What is the impact on the Key Output
Variables (customer requirements)? What causes the Key Input to go wrong? What
are the existing controls and procedures (inspection and test) that prevent
either the cause or the failure mode? What are the actions for reducing the
occurrence of the cause, or improving detection? What are the completed
actions taken, with the recalculated RPN?

Updating Tollgates / Ineffective templates:
• Discrepancies: POI vs templates (SEV 5); cause: ineffective reviews (OCC 4);
  controls: none (DET 5); RPN 100. Recommended: adjust templates to match POI
  (PMO). After action: SEV 5, OCC 2, DET 2, RPN 20.
• Users and leaders don't buy in to LSS; cause: slide purposes not clear
  (OCC 4); controls: none (DET 4); RPN 80. Recommended: adjust slide titles
  and notes (PMO). After action: OCC 1, DET 2, RPN 10.
• Cause: redundant and NVA slides (OCC 3); controls: none (DET 4); RPN 60.
  Recommended: eliminate NVA slides (PMO). After action: OCC 1, DET 2, RPN 10.
• Cause: incomplete SOP or "Help" within PS (OCC 3); controls: none (DET 5);
  RPN 75. Recommended: develop "read me" slides (PMO). After action: OCC 1,
  DET 2, RPN 10.

Updating Tollgates / Inefficient updating:
• Too many steps to build/update (SEV 3); cause: NVA steps (OCC 5); controls:
  none (DET 4); RPN 60. Recommended: link templates to PS phase (PMO). After
  action: SEV 3, OCC 3, DET 2, RPN 18.
• Users get frustrated and delay projects; cause: too many choices between
  templates (OCC 4); controls: none (DET 4); RPN 48. Recommended: eliminate
  NVA; group in folders (PMO). After action: OCC 2, DET 2, RPN 12.
• Cause: inconsistent file names and locations (OCC 3); controls: none
  (DET 4); RPN 36. Recommended: simple names; group in folders (PMO). After
  action: OCC 1, DET 2, RPN 6.

LSS Tool Access / Not all LSS tools & refs in PS:
• User cannot find tools & references (SEV 4); cause: not all tools available
  in PS (OCC 5); controls: none (DET 4); RPN 80. Recommended: revise list of
  tools and joggers (PMO). After action: SEV 4, OCC 2, DET 2, RPN 16.
• Project completion is delayed; cause: poor explanation in some references
  (OCC 3); controls: none (DET 3); RPN 36. Recommended: develop direct-access
  pdf file (PMO). After action: OCC 2, DET 2, RPN 16.

LSS Tool Access / Too many steps to retrieve tools:
• Inefficient retrieval of LSS tools/refs (SEV 2); cause: multiple means for
  accessing tools (OCC 3); controls: none (DET 5); RPN 30. Recommended: "read
  me" file; one folder (PMO). After action: SEV 2, OCC 2, DET 2, RPN 8.
• Cause: NVA steps (OCC 3); controls: none (DET 5); RPN 30. Recommended:
  eliminate NVA steps (PMO). After action: OCC 2, DET 2, RPN 8.

Required Deliverable
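Each RPN in the example is simply the product of the three 1-10 ratings, recalculated once the recommended action is in place. The first row can be verified as:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: SEV x OCC x DET, each rated on a 1-10 scale."""
    return severity * occurrence * detection

# First row of the example FMEA (ineffective templates)
before = rpn(severity=5, occurrence=4, detection=5)   # RPN before action
after = rpn(severity=5, occurrence=2, detection=2)    # RPN after templates
                                                      # are adjusted to the POI
# before = 100, after = 20: severity is unchanged (the effect is just as bad
# if it occurs), but occurrence and detection both improve.
```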
12. Descriptive Statistics / Process Capability (Example)

"As Is" Process Capability
[Minitab "Process Capability Analysis for Cholesterol": USL 220.000, no LSL or
target; Mean 193.133, Sample N 30, StDev (ST) 26.0455, StDev (LT) 22.4931.
Potential (ST) capability: Cp *, CPU 0.34, Cpk 0.34. Overall (LT) capability:
Pp *, PPU 0.40, Ppk 0.40. PPM > USL: Observed 133,333.33; Expected ST
151,146.50; Expected LT 116,152.65.]

"New" Process Capability
[Minitab "Process Capability Analysis for Control": USL 220.000, no LSL or
target; Mean 184.967, Sample N 30, StDev (ST) 20.4206, StDev (LT) 16.3662.
Potential (ST) capability: Cp *, CPU 0.57, Cpk 0.57. Overall (LT) capability:
Pp *, PPU 0.71, Ppk 0.71. PPM > USL: Observed 0.00; Expected ST 43,119.06;
Expected LT 16,153.51.]

Required Deliverable
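Because this example has only an upper spec limit, Minitab reports Cp and Pp as * (undefined) and capability reduces to the one-sided index CPU = Cpk. The improvement from Cpk 0.34 to 0.57 can be checked from the slide's statistics:

```python
def cpk_upper_only(usl, mean, stdev):
    """Cpk when only an upper spec limit exists: the two-sided Cp is undefined
    (shown as * on the chart), so Cpk equals the one-sided CPU index."""
    return (usl - mean) / (3 * stdev)

# Short-term figures from the slide's two capability analyses
as_is = cpk_upper_only(usl=220, mean=193.133, stdev=26.0455)
new = cpk_upper_only(usl=220, mean=184.967, stdev=20.4206)
# as_is ≈ 0.34, new ≈ 0.57: the mean moved away from the USL and the
# spread shrank, so both terms push Cpk up.
```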
13. Control Chart (Example)

P Chart for Total Defectives: Feb/Mar data confirms the process has remained
in control.
[Center line P = 0.03817; UCL (3.0SL) = 0.08162; LCL (-3.0SL) = 0.00;
proportion plotted against sample numbers 0-150.]

Recommended Deliverable
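The limits on a p chart are p̄ ± 3·sqrt(p̄(1−p̄)/n), with the lower limit floored at zero, which is why the chart shows -3.0SL = 0.00E+00. A sketch using the slide's center line; the subgroup size n = 175 is an assumption, since it is not shown on the chart:

```python
import math

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a p chart with subgroup size n.
    The lower limit is floored at 0, since a proportion cannot be negative."""
    margin = 3 * math.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - margin), p_bar + margin

# p_bar from the slide; n = 175 is assumed for illustration only
lcl, ucl = p_chart_limits(p_bar=0.03817, n=175)
# lcl = 0.0 (floored) and ucl lands near the slide's 0.08162.
```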
14. “As Is” vs. “To Be” Process Map (Example)

“As Is” Process

“To Be” Process

Required Deliverable
17. Improve Storyboard (Example)

Define
Project Charter
• BUS CASE: Be #2 Fin Service Provider
• GOAL: Reduce Loan/Lease CT from 9.2 to 8.0 days by July 1
• FIN IMPACT: $2.7M per year

Measure
• CCR Gap: 1.2 days
• Sigma Performance Level: 1.3

Analyze
• Potential Xs: Loan or Lease, Screen Entry, Color Printouts, Rewards & Recog,
  Officer Work, Flex Time
• "Officer performs both" & "Officers changed" eliminated as contributors to
  high cycle time
• Officer Work & Turnover, Waiting, & Automation affect CT; Job Aids affect
  variation in CT

Improve
• Pilot Plan

Required Deliverable
18. 8-Step A3 Project Summary Report

Company:   Department:   Date:   Prepared by:

1. Define the problem situation
2. Problem Analysis
3. Action plans to correct problems
4. Results of Activity
5. Future Steps
19. NG CPI Tollgate Tool

8-STEP PROCESS: 1. Validate Problem; 2. Identify Gaps; 3. Set Targets;
4. Find Root Cause; 5. Develop C-Ms; 6. See C-Ms Through; 7. Confirm Results;
8. Standardize

Define
• Project Charter (Problem Stmt, Defect Definition, Goal Statement, Project
  Scope, Business Impact, Strat Alignment)
• Sponsor & Team
• Replication Check
• Measurable Y
• Voice of Customer / Customer Specs
• Voice of Business
• Project Timeline
• Communication Plan (Stakeholder Analysis)
• High Level Process Map (SIPOC)
Sign-off: I accept the Define Tollgate - (Sponsor) (Process Owner) (MBB)

Measure
• Detailed As Is Process Map
• Value Stream Map
• Key Process Metrics
• Data Collection Plan
• Measure Systems Analysis
• Data Collected
• Baseline Data Analysis (Descriptive Stats, Graphs, Pareto Charts)
• Est Financial Benefits
• Control Charts (as needed)
• Process Capability
Sign-off: I accept the Measure Tollgate - (Sponsor) (Process Owner) (MBB)
(Finance Owner)

Analyze
• Potential Xs (Brainstorming, 5 Whys, Fishbone, Affinity Diagram)
• Critical Xs (Cause & Effect Matrix, Hypothesis Testing, Regression, Time
  Studies, Theory of Constraints)
• FMEA (Risk Analysis, Risk Mitigation Plan)
Sign-off: I accept the Analyze Tollgate - (Sponsor) (Process Owner) (MBB)

Improve
• "To Be" Process Map
• Solution Generation / Prioritization
• Improvement Strategy (Improvement Model, Implementation Plan, Pilot, "X"
  Improvement Target)
• Mistake Proofing
• FMEA (Risk Analysis, Risk Mitigation Plan)
Sign-off: I accept the Improve Tollgate - (Sponsor) (Process Owner) (MBB)

Control
• Process Control Plan
• Process Owner Accountability
• Updated Financial Benefits
• Replication Opportunities
• Project Documentation of revised policies, SOP's, procedures, and training
• Visual Process Control Tools (Optional)
• Control Charts
• Process Capability
Sign-off: I accept the Control Tollgate - (Sponsor) (Process Owner) (MBB)
(Finance Owner)
21. Cross Functional Team

Team Members

Name   Role                Affiliation   DACI
       Black Belt                        Driver
       Master Black Belt                 Driver
       Project Sponsor                   Approver
       Process Owner                     Approver
                                         Contributor
                                         Contributor
                                         Contributor
                                         Contributor
                                         Inform
                                         Inform
                                         Inform
                                         Inform

Required Deliverable
22. Replication Check

I confirm that DEPMS (DoD Enterprise Performance Management System) has been
searched for similar projects: Yes / No

Replication: List relevant initiatives / potential replication projects found
(if any):
• Project 1: (DEPMS # or other tracking tool project number)
• Project 2:

Collaboration: Identify organizations that can/should be considered for
working this project collaboratively:
• Organization 1:
• Organization 2:

Required Deliverable
23. Strategic Alignment

The Define Tollgate requires a linkage to organizational strategy.

Include the organizational metric(s) that your project will help improve.

Refer to your organization's Strategic Plan and/or other referenced documents.

Required Deliverable
24. Business Impact

Insert as much information as possible about the potential operational and/or
financial benefits.

Include any assumptions upon which these estimates are based.

Example: Operational benefits - This project is expected to reduce PLT by 35%,
improve SQL from 1.2 to 3.0, and save 20 man-hours per shift.

Example: Financial benefits - This project is expected to save $xx in FYxx.

Required Deliverable
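The SQL (Sigma Quality Level) figures in the operational-benefits example can be derived from DPMO using the conventional 1.5-sigma long-term shift. A sketch using only the Python standard library:

```python
from statistics import NormalDist

def sigma_level(dpmo):
    """Sigma Quality Level from DPMO, with the conventional 1.5-sigma shift
    between short-term and long-term performance."""
    return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5

# 3.4 DPMO is the classic "six sigma" benchmark: sigma_level(3.4) ≈ 6.0.
# A process at ~308,537 DPMO sits at roughly 2 sigma, so "improve SQL from
# 1.2 to 3.0" implies a large drop in DPMO.
level = sigma_level(3.4)
```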
26. Voice of Customer / Voice of Business

Customer
• Voice of the Customer: What does the customer want from us?
• Key Customer Issue(s): We need to identify the issue(s) that prevent us from
  satisfying our customers.
• Critical Customer Requirement: We should summarize key issues and translate
  them into specific and measurable requirements.

Business
• Voice of the Business: What does the business want/need from us?
• Key Process Issue(s): We need to identify the issue(s) that prevent us from
  meeting strategic goals/missions.
• Critical Business Requirement: We should summarize key issues and translate
  them into specific and measurable requirements.

Required Deliverable
27. Stakeholder Analysis

Columns: Stakeholder Name/Group | Project Impact on Stakeholder (H, M, L) |
Stakeholder's Level of Influence on Success of Project (H, M, L) | Stakeholder
Current Attitude Toward Project (+, 0, -) | Explanation of Current Attitude |
Stakeholder Score (H=3, M=2, L=1; +=3, 0=1, -=-3) | Action Plan for
Stakeholder (list)

Recommended Deliverable
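The score column's rating mappings are given on the slide, but the combination rule is not; the sketch below multiplies the influence weight by the attitude weight as one plausible interpretation (an assumption, not the slide's stated method):

```python
# Mappings as given on the slide
influence_weight = {"H": 3, "M": 2, "L": 1}
attitude_weight = {"+": 3, "0": 1, "-": -3}

def stakeholder_score(influence, attitude):
    """One plausible scoring rule (assumption): weight the stakeholder's
    attitude by their level of influence, so influential resisters stand out."""
    return influence_weight[influence] * attitude_weight[attitude]

# A high-influence resister scores -9; a high-influence supporter scores +9.
# Large negative scores flag where an action plan is most urgently needed.
scores = {
    "high-influence resister": stakeholder_score("H", "-"),
    "high-influence supporter": stakeholder_score("H", "+"),
}
```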
28. Communication Plan

Columns: Audience | Media | Purpose | Topics of Discussion / Key Messages |
Owner | Frequency | Notes/Status

Required Deliverable
29. Detailed “As Is” Process Map (Example)

Required Deliverable - VSM or Process Map or Both