CASE STUDY
Introduction
This is a case study of how Six Sigma, using the DMAIC improvement methodology, can be applied in a
commercial environment.
In January 2004, district manager Ram Singh volunteered the office to be a Six Sigma beta site. The team
was tasked with driving orders growth using the Six Sigma process.
This case study shows how the mission to become a Six Sigma office would drive improvement projects for
two key customer CTQs, mapped here step-by-step to the DMAIC methodology. The sequence and/or
details of some of the actual events have been modified for the sake of simplicity or to better illustrate a
point.
Project Leadership
District Manager
Black Belt
Small Project Specialist
Systems Engineer
Define 1: Project CTQs
1.1 Evaluate VOC
The team knew they needed to drive orders growth by increasing customer satisfaction using the Six
Sigma process; the question was, how? In this initial meeting, the team brainstormed
• Potential beta site customers to be included in an improvement project
• Methods for collecting Voice of the Customer
Beta site customers
The team identified eight beta site customers—four contractors, three distributors, and an original
equipment manufacturer (OEM)—to be the target of their improvement efforts. These customers, each
experiencing varying degrees of growth or stagnation, were carefully chosen so as to provide a true
representation of the market growth drivers.
Methodology for collecting Voice of the Customer data
The team decided that personal interviews would be the best way to gather Voice of the Customer data.
They felt that face-to-face interactions would enable them to read nonverbal cues and probe for more
information. Furthermore, an interview would provide one more opportunity to get in front of the customer
and reinforce their commitment to increasing customer satisfaction.
To collect Voice of the Customer data, the team needed an instrument that would ensure consistency in
how the interviews would be conducted, and as a result, provide high quality data. The instrument the
team developed was a survey—not to be filled out by the customer, but that instead would serve as a basis
for conducting the customer interviews—with the intention of determining how the Delhi office's
performance could help customer growth. The survey was structured around five categories the team
identified that paralleled the sales process. For each category, the team brainstormed some points of
customer concern that might stimulate further discussion.
• Customer service: call response time, availability of technical information, accuracy of information to the customer
• The quotation/preorder process: accuracy, timing, and pricing of quotations
• Post-order service: submittal accuracy and lead time, return material/shipping damage, issue response time
• Items within partial control of the Delhi office: product specifications, product offerings
• Items out of the control of the Delhi office: overall business pricing structure, component product quality, product availability
The team agreed on the following process and desired outcomes for the customer interviews:
• The interviewers would help the customer identify CTQs for each of the five categories, making certain
to keep customers focused on WHAT was important to them, not on HOW to solve the problem
• The customer would force rank the CTQs within each category according to how they help drive mutual
growth
• The customer would define how to measure each CTQ and would set the specifications for what was
considered to be acceptable product or service quality
• The customer would identify their top five CTQs across all categories—again, according to how they
help drive mutual growth
Gathering Voice of the Customer data would be a team effort.
1.2 Contain Problem if Necessary
After reviewing the results of the customer interviews, the team felt they did a good job of forecasting
customer concerns during their brainstorming meeting. Because the customers’ key concerns were
addressed by the CTQs, no containment actions were needed.
1.3 Translate VOC into CTQs
Because of the way the team had gathered the Voice of the Customer data—by helping customers identify
CTQs and asking customers for specific, measurable performance requirements—the task of translating
customer feedback into CTQs had taken care of itself.
1.4 Prioritize CTQs
The next task for the team was to prioritize CTQs based on the results of the customer survey. The
prioritization process was facilitated by the survey's design, since it required customers to force rank CTQs.
Given these considerations, the team identified the following two CTQs to target for their improvement
project:
• Call response time
• On-time field submittals
1.5 Integrate CTQs with Business Strategy
The goal of the sales office was simple: to drive growth for the eight beta site customers by 10% above
their targeted growth for the year. During the Voice of the Customer surveys, the customers had agreed
that improvement in the top-ranked CTQs could enhance current order rates.
Define 2: Define Charter
2.1 Develop Charter
From the Voice of the Customer data and the initial brainstorming session, the team had enough
information to create a project charter, a document that establishes the purpose and plan for the project.
First the team agreed on a statement of the problem, or the unmet customer need targeted for
improvement.
Problem statement:
"Voice of the Customer data indicates that in order to promote growth with our customers, we need to
improve our process capability in the areas of call response time and on-time field submittals."
Call response time: “Customers want your organisation to respond to their calls in less than 4 hours.
Currently customers tell us we respond anywhere from immediately to 48 hours. Baseline evaluation
with three customers shows that delays in responding to customer issues in the past 2 months have
delayed customer construction or bidding projects in five different cases, resulting in a financial loss to
these customers of approximately $50,000.”
On-time submittals: "Customers expect a submittal package returned to them in less than 5 days
from the purchase order date. (A submittal package includes technical drawings, project
documentation, and a cover sheet. It is sent to the customer for approval before the construction
process can begin.) Today, we are returning submittal packages anywhere from 1 to 10 days from the
purchase order date. A delay in getting the submittal package to the customer will delay the
construction process as a whole. Without the drawings, the equipment cannot be approved; without the
approval, the manufacturing process cannot begin. In the past 2 months, extremely late submittals
have caused our customers to have to work overtime in order to meet their customers’ timelines (at a
cost of approximately $10,000)."
Goal statement:
"By the end of December 2004, both of these processes will be operating at a Six Sigma level."
After considering the problem and goal statement, the team created a project scope, which detailed the
process boundaries and project focus.
Project scope:
"The team will target improvement efforts on two CTQs: call response time and on-time field
submittals."
The team then identified the resources required to complete the project.
Resources/team members:
A separate team, led by a Green Belt, was established for each targeted CTQ.
Finally, the team determined the benefits expected to result from their improvement project.
Expected benefits:
• Increased top line growth of 10%, or $1,000,000, for the eight customers involved while reducing
pre- and post-order write-offs (tangible benefit)
• Translation opportunities to the other 27 district offices (tangible benefit)
• Stronger relationships with customers (intangible benefit)
2.2 Obtain Key Business Stakeholder Signoff
The team identified the project's key stakeholders as follows:
• Vice President, Sales, champion at high level and one who could support translation
• Regional Manager, who similarly could support translation within the region
• Ram Singh, District Manager, who could drive action in the office by setting priorities and
measurements, provide critical resources, and remove barriers by influencing other critical constituents
• The sales engineers, who are measured by and paid according to customer orders and whose roles
could be most significantly impacted by the projects, and who could help ensure the process stays in
place once the project is completed.
All key stakeholders agreed to the project's purpose and other provisions as outlined in the charter.
Define 3: High Level Process Map
3.1 Construct Process Map
Call response time CTQ:
The team mapped the process from when a customer calls to when their issue is resolved. This high-level
process map separated calls into four types:
• Product/service (for issues regarding product defect, missing product, RGA)
• Computerized ordering system (for job release, price and availability, SPA, and order entry issues)
• Transportation (shipping status inquiries)
• Technical questions
On-time field submittals CTQ:
The Green Belt team leader and her team created a SIPOC map of their process, which depicted the following
steps:
1. Sales engineer (SE) gets purchase order from customer
2. SE transmits quotation to sales engineer assistant (SEA) via a quotation software application called
Speedi
3. SE hands off the purchase order to the SEA
4. SEA compiles the submittal package
5. SEA sends submittal to customer
3.2 Validate Process Map Against Charter
Call response CTQ:
The project leader and his team confirmed that the four types of calls received by the sales engineers as
presented on their process map were within the scope of the project. Since their CTQ involved
acknowledging customer calls and not resolving the issue, the team agreed that these calls were
something that could be handled within the office. The team felt the early use of the Change Management
Includes/Excludes tool helped to ensure that their efforts were directed to issues within their control.
On-time field submittals CTQ:
The Green Belt (project lead) and the others assigned to this CTQ confirmed that the five steps in their process were
within the scope of the project, and that the boundaries of the project were neither too narrow nor beyond
the control of their Delhi office. This team also felt that early use of the Change Management
Includes/Excludes tool helped to ensure that their efforts were directed to issues within their control.
Measure 4: Project Y
4.1 Identify all possible Ys for CTQs
Along with extracting CTQs during the customer interviews, the team collecting Voice of the Customer data had worked with
customers to determine output measures, or Ys, for each CTQ. Although each team felt they had
adequately identified their Ys during the interviewing process, to be safe, they followed the Coach's
recommendation to brainstorm all possible Ys.
Call response CTQ:
The team working on the call response project identified three possible Ys for the call response CTQ:
Y1. The time it takes to respond to an initial customer call
Y2. The time it takes to resolve the customer's issue
Y3. Whether or not a call is returned within the 4-hour critical time period
On-time field submittals CTQ:
The team working on the submittal time CTQ brainstormed the following possible Ys:
Y1. The number of drawings disapproved by the sales office
Y2. The time it takes for the customer to forward the purchase order to the sales engineer
Y3. The time it takes the sales engineer assistant to compile the submittal package
Y4. The time from when the sales engineer gets the purchase order from the customer to the time when
the submittal drawings are shipped to the customer
4.2 Prioritize and Select Project Y
According to the Coach, the next step for each team was to select the best possible Y and establish the
metric by which it would be measured.
Call response CTQ:
CTQ: Call response time
Metric (Y): Time (in minutes) to return a call, as measured from when the call was initially received
On-time field submittals CTQ:
CTQ: On-time field submittals
Metric (Y): Time (in days) to ship field submittals, as measured from the date of the customer purchase order
4.3 Ensure Project Scope is Manageable
Y: Time (in minutes) for call response
Collecting the data would not seriously impact productivity, since the sales engineer was simply logging information on a time
sheet. The team agreed that the small size of this project and its limited scope made it possible to
accomplish in the period set for the project.
Y: Time (in days) to ship field submittals
Because the project Ys were well defined when the team created their charter, considerations such as
timing, resources, equipment, facilities, and barriers and constraints were addressed at that time.
Measure 5: Performance Standards for Y
5.1 Gather Required Information to Establish a Performance Standard
Here are the customer-defined performance standards for each Y:
• Time (in minutes) to return a call: all phone calls from a customer should be returned within 4 hours
• Time (in days) to ship field submittals: all field submittals generated by the sales office should be shipped no later than 5 working days from the purchase order date
5.2 Set and confirm performance standards for the project
Y: Time (in minutes) for call response
The survey team was surprised to find that each one of the eight customers, independent of each other, set
the same standard for returning a call: 4 hours.
The team also learned that their competitors were returning calls within the 4-hour time period. They
decided that the customer consensus and benchmark data was sufficient to confirm the performance
standard.
Y: Time (in days) to ship field submittals
To confirm the performance standards the customer initially defined, the team decided to 1) conduct
follow-up interviews with customers, and 2) benchmark against other companies.
The team interviewed two contractors and two distributors, who confirmed that the 5-day specification limit
was an acceptable performance standard. In addition, they asked for customer input about the
presentation and quality of their field submittals.
5.3 Define Unit, Opportunity, and Defect
At this point, each team consulted the Coach to establish solid operational definitions of unit, opportunity,
and defect in order to ensure consistency in measurement and focus for improvement. The Coach defines
a unit as any item that is produced or processed. An opportunity is anything that is inspected, measured,
or tested on a unit that provides a chance of not meeting the performance standard, thus allowing a defect.
In addition, each team had to define the number of opportunities per unit.
Y: Time (in minutes) for call response
The team defined unit, defect, and opportunity as follows:
• A unit was defined as an initial call from a customer, including initial voice mail and fax transmissions.
• An opportunity was defined as a call returned to the customer. The team recognized that only one
opportunity for defect existed per call; there could be only one first response to a customer call.
• A defect was defined as any return call to address an initial call/voice mail/fax that was not placed
within the 4-hour upper specification limit.
Y: Time (in days) to ship field submittals
The team defined unit, defect, and opportunity as follows:
• A unit was defined as a project field submittal
• An opportunity was defined as the field submittal shipment. Since there could be only one shipment of
the project field submittal, the number of opportunities for a defect per unit is 1.
• A defect was defined as a project field submittal shipped more than 5 days from the purchase order date (a short sketch of this rule follows below).
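These operational definitions translate directly into a simple classification rule. The sketch below is hypothetical (the function name and dates are illustrative, not from the team's records), but it captures the defect rule as stated: one opportunity per submittal, defective if shipped more than 5 days after the purchase order date.

# Hypothetical sketch of the defect rule for the submittal Y.
from datetime import date

def is_defect(po_date: date, ship_date: date, limit_days: int = 5) -> bool:
    """One unit (a project field submittal), one opportunity; defect if late."""
    return (ship_date - po_date).days > limit_days

print(is_defect(date(2004, 3, 1), date(2004, 3, 9)))  # True: 8 days is a defect
print(is_defect(date(2004, 3, 1), date(2004, 3, 4)))  # False: 3 days is on time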
Measure 6: Project Data Collection Plan
6.1 Draft Project Data Collection Plan
Y: Time (in minutes) for call response
As advised by the Coach, they considered each of the following questions:
What data needed to be collected?
The team looked at their process map to determine the data they would need to collect to help them
understand why it took longer than 4 hours to return a customer's call. One of the team members observed
that the data would fall into one of two areas: 1) Call characteristics, and 2) Call/response time data.
The Green Belt (project lead) created a table and the team brainstormed to identify the data categories for
each area. The final table listing the data to be collected looked like this:
• Call characteristics: type of communication (call, voice mail, or fax); caller's name; caller's company; name of factory called; type of issue
• Call/response time data: date/time of initial call; time of first response; required factory response time; actual factory response time; date/time issue resolved
As part of the data collection effort, the team needed to solidify some definitions. They decided that the
time a call was received would be defined by the time it was noted in the log by the sales engineer. In the
case of a voice mail or fax, the time stamp on the machine would define "received." They also agreed that
call response time would be defined by the time the call to the customer was placed; the call was
considered successful if the sales engineer was able to reach the customer, leave a voice mail, or leave a
message with someone in the customer's office.
When will the data be collected?
Baseline data would be collected over a 2-month period, as described in the sampling plan below.
Who will collect the data?
To provide a representative sample of data, each of the 15 members of the Delhi office would collect data.
Each team member agreed to choose a specific week each month to log all calls and then submit the logs
to an intern student for data entry in a spreadsheet.
How will sampling be done?
To reduce the effort of recording the data manually, the team would record data for 1-week intervals every
other week over a 2-month period, resulting in 4 weeks of data. This guaranteed at least 150 data points
over the period of the study and provided a reasonably representative sample. The sample
population would consist of the eight beta site customers.
Who will analyze the data?
On a monthly basis, Green Belt would work with Black Belt (Coach) to analyze the data using applicable
statistical tools. They would then share the results with the rest of the team.
Y: Time (in days) to ship field submittals
What data needed to be collected?
The team needed to collect data that corresponded to the steps in the process map that were within their
office's full control—the five steps in their SIPOC map. This included the date the purchase order was
received, the date it was handed off to the sales engineer assistant, and the date it was shipped to the
customer.
When will the data be collected?
The team decided they would collect baseline data for a 3-month period, between September and December.
Who will collect the data?
The person who shipped the field submittal package would be responsible for collecting his or her own
data on the process and forwarding that data to Green Belt at the end of each week.
Who will analyze the data?
On a monthly basis, the Green Belt (project lead) would work with the Black Belt (Coach) to analyze the data using
applicable statistical tools. They would then share the results with the rest of the team.
6.2 Complete Measurement System Analysis (MSA)
To ensure that the data collected would be both complete and accurate, each team needed to perform a
measurement system analysis. This would ensure that they could account for and/or eliminate any
variation introduced by their measurement system.
Y: Time (in minutes) for call response
The team then turned to their Black Belt for advice on selecting a tool to measure the variation in their data
collection form.
The team thought about their measurement process and brainstormed a list of possible sources of
variation. They then considered how to eliminate or measure the source of variation.
Measurement sources of variation and how to measure or eliminate them:
• Inaccurate recording of the time a call is received: use the time stamps in the system for voice mail and fax, and audit 10 voice mail messages/fax transmissions to ensure the time stamps are accurate; conduct a Gage R&R for incoming calls by calling a customer representative and having that person record the time the call is received.
• Inaccurate recording of the time a call is responded to: include in the above Gage R&R.
• Inaccurate transcription of data collection: audit data transcription.
• Defective watch or clock: check for linearity and accuracy ahead of time by comparing to a standards clock.
• Intentional manipulation of data: achieve team agreement that data manipulation would only result in lost credibility, sales, and income, since customers would recognize whether or not call response time improved, regardless of what the data showed.
One of the team members discovered that when logging onto the computer network, the clock on her
laptop computer would synchronize to the clock of the network server. Since everyone was logging on to
the same server, this solved the problem and gave a reliable time standard for the project. It would now
become standard office practice for sales engineers to synchronize their watches to the laptop at the time
of log-in each morning.
To conduct the Gage R&R, the Green Belt (project lead) had each sales engineer complete the
GR&R while in the office. The project lead called each sales engineer three times and recorded the time the
phone was answered; the sales engineer also recorded the time he or she answered the phone.
The Green Belt also compared the time stamps on the fax machine and in the voice mail system with his
standards clock and determined that they, too, were accurate. (The same spreadsheet setup was used to
determine the mean and standard deviation.)
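The "same spreadsheet setup" note refers to summarizing the gaps between the two recorded times. A minimal sketch of that calculation follows; the differences below are invented for illustration, not the team's actual Gage R&R readings.

# Hypothetical sketch: mean and standard deviation of the recording gaps
# (project lead's time minus sales engineer's logged time, in minutes).
import statistics

differences_min = [0.0, 0.5, -0.5, 1.0, 0.0, 0.5, 0.0, -1.0, 0.5, 0.0]  # invented

print(f"mean gap = {statistics.mean(differences_min):.2f} min")
print(f"standard deviation = {statistics.stdev(differences_min):.2f} min")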
Y: Time (in days) to ship field submittals
Because the sample size was smaller, the Green Belt decided to audit 100% of her collected data.
Data to be collected and MSA plan:
• Purchase order date: audit from the physical or electronic purchase order copy
• Date submittal handed off to assistant: audit from the Speedi program's "submittal date" field
• Date sales engineer downloaded: audit from the Speedi program's "download date" field
• Date submittal shipped to customer: time stamp and photocopy the cover sheet before it is sent out
6.3 Finalize Project Data Collection Plan
Y: Time (in minutes) for call response
A member of the call response team noticed that the data collection form called for collecting data on
items that the team had decided were beyond the project's scope. After some debate, the team agreed to
refine the data collection form to include only those items related to the CTQ of response time to
an initial call.
Y: Time (in days) to ship field submittals
Considering the low volume of both existing and future data points, the team decided they would collect
data on all field project submittals, not just those for the Six Sigma-targeted customers.
Measure 7: Data for Project Y
7.1 Communicate Data Collection Plan
Y: Time (in minutes) for call response
To emphasize the need to collect data accurately and consistently, the data collection plan was
communicated to the entire office in a full staff meeting, with team members fielding questions.
Y: Time (in days) to ship field submittals
• Baseline data would be collected between September and December 2003.
• The person who shipped the field submittal package would be responsible for collecting his or her own
data and forwarding it to Project lead on a weekly basis.
• Data to be collected included the date the purchase order was received, the date it was handed off to
the sales engineer assistant, the date it was shipped to the customer, as well as additional information
(such as method of shipment, requisition number) that would help with the tracking process.
The project lead presented the data collection form that would be used to track submittals.
At first, some team members resisted—they considered the data collection efforts to be just more demands
on their time. To address this resistance, the project lead reinforced the Coach's assertion that data is the
foundation of the Six Sigma methodology, and that to ensure the success of a project, decisions should be
based on data rather than simply intuition, internal knowledge, or business judgment.
7.2 Train Employees
Y: Time (in minutes) for call response
To ensure consistency and accuracy of the data, a member of the project team distributed the data collection
form to each member of the office staff, explained how to collect the data, and answered individual
questions.
Y: Time (in days) to ship field submittals
Because of the relative simplicity of the data collection process for this project Y, no formal training was
required.
7.3 Collect Data for Project Y and Potential Xs
As outlined in their respective data collection plans, each team collected baseline data on their respective
process. There were no problems with the data collection process.
Measure 8: Process Capability
8.1 Graphically Analyze Data
Now that each team had collected baseline data on their respective process, their next step was to visually
display the data to help them understand how their processes were performing.
Y: Time (in minutes) for call response
The data from the 2-month period was collected and entered into the master spreadsheet.
The team analyzed the data with help from their Black Belt. Using Minitab™, they first performed a
normality test, which resulted in a p-value of 0.0. According to the Coach, as a general rule for normality
testing, you can conclude that the distribution of the sample data is normal if the p-value is greater than
0.05. Therefore, the team concluded that their data was not normally distributed.
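For readers who want to reproduce this kind of check outside Minitab, the sketch below shows a comparable normality test in Python with SciPy. The call response times are invented for illustration (they are not the team's log data), and Shapiro-Wilk is used here simply because SciPy returns a p-value directly; Minitab's default normality test may differ.

# Illustrative sketch only: a normality check comparable to the one the team
# ran in Minitab. The sample values below are invented, not the actual log.
import numpy as np
from scipy import stats

response_hours = np.array([0.5, 1.0, 0.2, 3.5, 6.0, 0.8, 12.0, 2.2,
                           0.3, 24.0, 1.5, 4.8, 0.1, 9.0, 2.0])

stat, p_value = stats.shapiro(response_hours)
print(f"Shapiro-Wilk p-value: {p_value:.4f}")

# Rule of thumb from the case study: treat the sample as normal only if p > 0.05
if p_value > 0.05:
    print("No evidence against normality (p > 0.05)")
else:
    print("Data does not appear normally distributed (p <= 0.05)")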
The Coach says that if you find that your data is not normally distributed, it is important to find out why.
Non-normal distributions will often contain clues that will help you focus your project or identify vital causes
of variation.
The team searched the Coach for help on how to handle non-normal data. The first thing they did was look
for outliers—data points that fall well outside the normal range of values for all the other data points. They
identified two outliers: one at 28.6 hours and another at 38.7 hours.
The Coach suggested that outliers could be removed if they were explainable by an activity that is outside
of the process. For example, data that is improperly transcribed would be an activity outside of the call
response process. The project lead went back to the phone logs to see if an error had been made in
transcription but found that, in fact, those were the actual response times. In this case, there was no valid
reason to remove the data.
The Black Belt recommended treating the data as discrete.
Y: Time (in days) to ship field submittals
The project team collected baseline data on their process between mid-September and mid-December.
A quick visual scan of the data revealed a great deal of variation in the number of days it took to ship
drawings back to the customer. In addition, the team noticed an outlier—a data point that falls well outside
the normal range of values for all the other data points. The outlier represented a single instance where it
took 12 days to ship the submittal package; the rest of the data ranged between 0 (same-day shipment) and 6
days. The team decided to include the outlier in their analysis because it reflected process variation within
their control and there was no special cause for that data point.
To determine whether the data in their sample was normally distributed, the team performed a normality
test. The Coach says that if you find that your data is not normally distributed, it is important to find out
why. Non-normal distributions will often contain clues that will help you focus your project or identify vital
causes of variation.
The normality test resulted in a p-value of 0.193. As a general rule for normality testing, you can conclude
that the distribution of the sample data is normal if the p-value is greater than 0.05.
8.2 Calculate Z Value
Y: Time (in minutes) for call response
The team was now ready to calculate the Z value of their process. First, they needed to determine what
kind of data they had—was it short term or long term?
Because they were interested in knowing the capability of their current process, the data was considered to
be long-term. The data is also considered long-term because the team could not differentiate between
special cause variation (a specific, known factor) and common cause variation (variation caused by
unknown factors); they had not grouped the data into rational subgroups.
Using the Six Sigma product report, the Green Belt and the project team calculated the Z bench value for their
process. However, Z bench yields short-term process capability; to measure long-term capability, they
needed to subtract the Z shift from Z bench.
This yielded a result of 0.55 sigma and 292,683 DPMO, meaning that their process, as it stood, was far
short of being capable of meeting the customers' CTQ of having initial calls returned within 4 hours.
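The arithmetic behind these figures can be sketched in a few lines of Python. The defect and opportunity counts below are assumptions chosen only to reproduce the reported 292,683 DPMO; the conversion from DPMO to Z uses the standard normal quantile and the conventional 1.5 sigma shift.

# Hypothetical sketch of the DPMO-to-sigma arithmetic; counts are illustrative.
from scipy.stats import norm

defects = 48          # assumed number of late call responses (illustrative)
opportunities = 164   # assumed number of initial calls (one opportunity each)

dpmo = defects / opportunities * 1_000_000            # about 292,683
z_long_term = norm.ppf(1 - dpmo / 1_000_000)          # Z_LT, about 0.55
z_short_term = z_long_term + 1.5                      # conventional 1.5 shift, about 2.05

print(f"DPMO: {dpmo:,.0f}  Z_LT: {z_long_term:.2f}  Z_ST: {z_short_term:.2f}")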
Y: Time (in days) to ship field submittals
After visually analyzing the data, the team's next task was to examine the capability of their process, or
how well their process was performing. Using the capability analysis tool with data entered for long-term
analysis, the team calculated long-term process capability for total ship time. This capability analysis
established a baseline for comparing the process before and after improvements were made.
The capability analysis showed a mean ship time of 3.6 days with a standard deviation of 3.02 days. The initial
sigma value, or ZLT, was 0.46—far from the 6 sigma goal. Assuming an estimated Z shift of 1.5, the ZST
was 1.96. The calculated DPMO was 200,000.
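A minimal sketch of this capability calculation is shown below. The mean, standard deviation, and 5-day specification limit come from the text above; the 3-of-15 defect count used for the observed DPMO is an assumption consistent with the reported 200,000 DPMO, not a figure taken from the team's records.

# Hypothetical sketch of the capability arithmetic for the submittal process.
usl = 5.0      # customer spec: ship submittals within 5 days
mean = 3.6     # reported baseline mean ship time (days)
stdev = 3.02   # reported baseline standard deviation (days)

z_lt = (usl - mean) / stdev        # long-term capability, about 0.46
z_st = z_lt + 1.5                  # with the conventional 1.5 shift, about 1.96

defects, units = 3, 15             # assumed: 3 of 15 baseline submittals were late
observed_dpmo = defects / units * 1_000_000   # 200,000 DPMO

print(f"Z_LT = {z_lt:.2f}, Z_ST = {z_st:.2f}, observed DPMO = {observed_dpmo:,.0f}")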
Measure 9: Improvement Goal for Project Y
9.1 Define Statistical Problem
Y: Time (in minutes) for call response
According to the Coach, this step does not apply to projects using discrete data. Because the call response
team was treating their data as discrete, they proceeded to the next step.
Y: Time (in days) to ship field submittals
Now that they had defined their process capability, the team needed to set their improvement goal, or
statement of their project Y's performance that would meet their CTQ. To do this, the Coach states they
first had to define the problem in statistical terms, stating the required reduction of defects in terms of
defects per million opportunities (DPMO) and corresponding Z value.
The statement of the statistical problem for the project Y was, "Reduce the variation of the time to return
field submittals and to shift the mean of submittal response time closer to 0 so that all field-generated
submittals are shipped less than 5 days from receipt of the purchase order.”
9.2 Identify Project Goal
Each team's goal was to reach a Six Sigma level, meaning a process with a DPMO of less than four.
Y: Time (in minutes) for call response
The goal for this Y was to
• Reduce defects from the current process's 292,683 DPMO to less than 4 DPMO
• Increase its ZST value from 2.05 sigma to 6 sigma, or improve ZLT from 0.55 to 4.5 sigma
At this stage, the team could revisit the goal statement and reassess the performance opportunity.
Y: Time (in days) to ship field submittals
The goal for this Y was to
• Reduce defects from the current process's 200,000 DPMO to less than 4 DPMO
• Increase its ZST value from 1.96 sigma to 6 sigma, or improve ZLT from 0.46 to 4.5 sigma
At this point the team updated its charter to include these details of their project goal.
9.3 Determine Improvement Methodology
Y: Time (in minutes) for call response
The team reviewed the information from the Voice of the Customer to determine whether their process did,
in fact, have the potential to reach the 6 sigma goal. They benchmarked other call centers with similar
processes and volumes. Even though their process was running at a sigma level of 0.55, based on the
benchmark data, the team felt it was possible to improve the process to meet the performance goal of 6
Sigma.
Y: Time (in days) to ship field submittals
Now the team needed to determine whether their process did, in fact, have the potential to reach the 6
sigma goal. They benchmarked other sales offices with similar processes and volumes and decided it
would be possible to meet their performance goal by improving their existing process. Because a complete
redesign of the process was not necessary, they continued their project using the DMAIC methodology.
Analyze 10: Prioritized List of All Xs
10.1 Capture all possible Xs
Next, each team looked at all potential sources of variation, or Xs, for each of their project Ys.
Y: Time (in minutes) for call response
The team gathered to brainstorm the causes for variation in returning customer calls. They created the
following list:
1. Different cell phones/service providers: There were three different service plans in use among the
sales engineers, each with its own set of charges, amount of airtime and service area.
2. Staff reduction: The administrative assistant, who used to screen and route calls, was laid off and not
replaced.
3. Inconsistent practice in returning calls: The sales engineers were unaware of the customer's need for a
call back within 4 hours.
4. Inability of the voice mail system to inform sales engineers of new messages
5. Inability of the voice mail system to allow caller to mark messages as urgent
Y: Time (in days) to ship field submittals
The team came up with the following possible Xs:
• A delay in handing off the purchase order to the sales engineer assistant
• Complexity of quotation: A complex quotation may take extra time to put together.
• Technological limitations: A sales engineer on the road may not have access to the Speedi quotation
software and therefore would not be able to transmit the quotation to the sales engineer assistant
• Need for factory input: The price of the job still needs to be cleared by the factory or the need for
factory documentation could delay the process
• Customer need for alternative quotes: The need to compile extra pieces for the submission package
could delay the process
• Sales engineer assistant experience level: In some cases, sales engineers may want to do the
drawings themselves instead of placing the responsibility on less experienced sales engineer
assistants.
The team then consulted their SIPOC map and matched each X to the corresponding steps in their
process.
• Step: Sales engineer hands off purchase order to sales engineer assistant. Corresponding Xs: delay in handing off purchase orders to sales engineer assistants; technological limitations
• Step: Sales engineer assistant compiles submittal package. Corresponding Xs: complexity of quotation; need for factory input; customer need for alternative quotes; sales engineer assistant experience level
10.2 Draft Prioritized List of Xs
In this step, each team selected the Xs that had the highest degree of impact on the process for their Y.
Y: Time (in minutes) for call response
The Green Belt then placed the Xs the team had captured into a table and asked each team member
to rank them in order of priority. Before they got started, one of the team members asked if the inability of
the voice mail system to mark urgent messages was really an X. "After all," she said, "it doesn't matter
whether or not the customer marks this initial call as urgent. We still have to get back to them within 4
hours." After a few minutes of debate, the rest of the team agreed that this really wasn't an X and so it
was taken off the list.
Each team member prioritized the Xs. With the project lead serving as facilitator, they debated the rankings
until they came to an agreement on the prioritization. Here's what they decided:
1. Inconsistent practice in returning calls
2. Different cell phones/service providers
3. Inability of the voice mail system to inform sales engineers of new messages
4. Staff reduction
Y: Time (in days) to ship field submittals
The team looked at their baseline data to help them prioritize their Xs. They looked again at the outlier
representing the single instance where it took 12 days to ship the submittal package. This outlier reflected
a 9-day delay between the time the sales engineer received the purchase order and when it was handed
off to the sales engineer assistant. From looking at their baseline data, the team suspected that their
project Y was mostly impacted by the Xs related to sales engineer-to-sales engineer assistant turnover
time.
Analyze 11: List Vital Few Xs
11.1 Verify Xs with Data
Y: Time (in minutes) for call response
Now it was time for the project lead's team to determine whether the Xs they had uncovered had a significant impact on their
project Y. They turned to the Coach to learn how to do this and found a 7-step process, which they
followed for each of the Xs.
X: Inconsistent practice in returning calls
The Green Belt asked the Black Belt whether they hadn't already verified this X by analyzing the baseline data. The
Black Belt said they had calculated the sigma value of the process, but this X had not been verified with data. They
needed to show whether there was a difference among the sales engineers' practices of returning calls.
The Black Belt created a survey to examine the sales engineers' process for returning calls. He
performed a measurement system analysis to ensure it would not add variation to the data collection
process. The survey showed that the sales engineers fell into three broad categories: those who returned
calls within an hour, those who returned calls within a couple of hours, and those who returned calls only
when they had an answer for the customer, which could take up to several days.
The alternative hypothesis for this X stated, "There is a significant difference in the practice of returning
customer calls among the sales engineers." The null hypothesis stated, "There is no difference in the
practice of returning customer calls among the sales engineers."
Because the team was testing the difference in the means of three levels of a factor, they selected the 1-
way ANOVA tool. Here are the results:
Analysis of Variance
Source DF SS MS F P
C2 2 1737 869 7.01 0.003
Error 38 4711 124
Total 40 6448
The p-value showed a 0.3% chance of error in rejecting the null hypothesis. A p-value of less than 0.05
indicated a significant difference between the means. This was a vital X.
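For reference, the same kind of one-way ANOVA can be run outside Minitab; the sketch below uses SciPy. The three groups mirror the survey categories (returned within an hour, within a few hours, only when an answer was ready), but the response times themselves are invented for illustration.

# Hypothetical sketch of a 1-way ANOVA across the three call-return practices.
from scipy import stats

within_hour   = [0.4, 0.8, 1.1, 0.5, 0.9, 0.7, 1.3, 0.6]        # hours (invented)
within_a_day  = [2.5, 4.0, 3.1, 5.5, 2.8, 6.0, 3.7, 4.4]
when_answered = [10.0, 26.0, 18.5, 40.0, 14.0, 30.0, 22.0, 48.0]

f_stat, p_value = stats.f_oneway(within_hour, within_a_day, when_answered)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# As in the case study, p < 0.05 is read as a significant difference in means.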
X: Different cell phone/service providers
They agreed that this X needed to be verified with data since 1) it was not known to be an X from a
previous project, and 2) it would not be too costly to collect the data to verify this X since there were only
three different service plans in use. The data they had for this X, the amount of air time for a particular
plan, was stable because the time was the same each month for each particular plan.
The team then came up with the alternative hypothesis for this X: "There is a difference in the call response
time of sales engineers with different cellular phone services." The null hypothesis for this X stated, "There
is no difference in the call response time of sales engineers with different cellular phone services."
Now the team needed to select the proper statistical tool. They returned to the Coach and found a table of
tools in the How section of this step. They opted to use the 1-way ANOVA tool. Here are the results:
Analysis of Variance for Different Calling Plans
Source DF SS MS F P
C2 2 1862 931 7.72 0.002
Error 38 4586 121
Total 40 6448
The p-value for this test told the team that there would only be a 0.2% chance of error if they rejected the
null hypothesis. Because the p-value was less than 0.05, they rejected the null hypothesis and concluded that
there is a significant difference in the call response time of the sales engineers with different cellular phone
calling plans. This is a vital X.
X: Inability of the voice mail system to inform sales engineers of new messages
The team applied the same process to this X as they used in the previous one. This lack of capability was a
particular problem for those sales engineers who were out in the field. Unless they called in for their
messages on a regular basis, they had no way to know if a new call was in their voice mailbox.
The team agreed that they needed to test this X because it was not known from another project and it
would not be too costly to collect the data. The data was already known to be acceptable and the process,
the voice mail system, was known to be stable.
The team stated the alternative hypothesis as, "The call response time will be less for sales engineers in
the office than those in the field." Their null hypothesis stated, "There will be no difference in the call
response time between the sales engineers in the office and those in the field."
The team chose to use a 2-sample t-test because they were interested in finding out whether there was a
difference in the mean response times of two groups: sales engineers in the office and those in the field.
Here are the results of the test:
Two sample T for Response Time
N Mean StDev SE Mean
1 (Inside) 27 0.90 1.26 0.24
2 (Outside) 14 14.7 18.9 5.1
95% CI for mu (1) - mu (2): ( -24.74, -2.9)
T-Test mu (1) = mu (2) (vs <): T = -2.73 P = 0.0086 DF = 13
The data clearly shows a difference in the mean response time between the two groups of sales engineers.
The p-value shows only a 0.8% chance of error in rejecting the null hypothesis. Therefore, the inability of
the voice mail system to inform sales engineers of new and urgent messages is also a vital X.
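A comparable 2-sample t-test can be sketched in Python as shown below. Setting equal_var=False gives an unpooled (Welch) test, which is consistent with the DF = 13 in the output above; the inside/outside response times are invented, and the one-sided alternative argument requires SciPy 1.6 or later.

# Hypothetical sketch: Welch 2-sample t-test, inside vs. field sales engineers.
from scipy import stats

inside  = [0.3, 0.5, 1.2, 0.8, 0.4, 2.5, 0.6, 1.0, 0.2, 0.9]   # hours (invented)
outside = [4.0, 30.0, 8.0, 22.0, 2.0, 45.0, 12.0, 6.5]

t_stat, p_value = stats.ttest_ind(inside, outside,
                                  equal_var=False,        # Welch (unpooled) test
                                  alternative="less")     # H1: mean(inside) < mean(outside)
print(f"t = {t_stat:.2f}, one-sided p = {p_value:.4f}")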
X: Staff reduction
The team set this X aside because data had never been collected on the administrative
assistant's call screening and forwarding process, making it impossible to verify with data whether it
had an impact on call response time.
Y: Time (in days) to ship field submittals
Next, the team needed to prioritize their Xs according to their effect on process variation.
List of possible Xs
• Delay in handing off purchase orders to sales
engineer assistants
• Technological limitations
• Complexity of quotation
• Need for factory input
• Customer need for alternative quotes
• Sales engineer assistant experience level
According to the Coach, the prioritization of Xs would have to be based on data. In the last step, the team
suspected that their project Y was mostly impacted by the Xs related to purchase order hand-off time.
Now they needed to prove statistically that this was the case. To obtain this proof, they did a regression
analysis.
The regression equation is
response time = 1.52 + 1.11 hand off
Predictor Coef StDev T P
Constant 1.5190 0.5614 2.71 0.018
hand off 1.1148 0.1937 5.76 0.000
S = 1.663 R-Sq = 71.8% R-Sq(adj) = 69.6%
Analysis of Variance
Source DF SS MS F P
Regression 1 91.639 91.639 33.13 0.000
Error 13 35.961 2.766
Total 14 127.600
Unusual Observations
Obs hand off response Fit StDev Fit Residual St Resid
1 9.00 12.000 11.552 1.447 0.448 0.55 X
3 0.00 6.000 1.519 0.561 4.481 2.86R
R denotes an observation with a large standardized residual
X denotes an observation whose X value gives it large influence.
The high R-sq value from the regression analysis showed that purchase order hand-off time explained 72%
of the variation in ship time. They recognized that this demonstrated correlation, not causation; they had not
conducted any experiments in which they deliberately altered this X to determine whether it caused a change in the
Y, but they could clearly tell that purchase order hand-off time was a vital X.
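The regression itself can be reproduced with a few lines of Python. The 15 hand-off/ship-time pairs below are invented to illustrate the mechanics of the fit; only the structure of the analysis, not the data, reflects the team's work.

# Hypothetical sketch of the simple regression of ship time on hand-off time.
from scipy import stats

hand_off_days = [9, 1, 0, 2, 1, 3, 0, 1, 2, 4, 1, 0, 2, 3, 1]   # invented
ship_days     = [12, 3, 6, 4, 2, 5, 1, 3, 4, 6, 2, 2, 4, 5, 3]  # invented

fit = stats.linregress(hand_off_days, ship_days)
print(f"ship_days = {fit.intercept:.2f} + {fit.slope:.2f} * hand_off_days")
print(f"R-squared = {fit.rvalue**2:.1%}, p = {fit.pvalue:.4f}")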
To be sure they did not overlook another significant source of variation, the team assessed the other Xs on
their list for their effect on process variation. To do this, they reviewed the various logs and tracking
mechanisms they had used in the data collection process and conducted 1-way analyses of variance to
determine the degree of impact each X had on the project Y. They found that no X accounted for more
than 6 percent of total process variation.
The team then prioritized their list of Xs according to statistical significance.
Prioritized list of Xs
1. Delay in handing off purchase orders to sales
engineer assistants
2. Technological limitations
3. Complexity of quotation
4. Need for factory input
5. Sales engineer assistant experience level
6. Customer need for alternative quotes
11.2 Finalize List of Vital Few Xs
Now each team needed to identify the vital few Xs—the small groups of inputs or processes that had the
most direct effect on their project Ys.
Y: Time (in minutes) for call response
Since the project scope didn't permit hiring additional staff and data had not been collected when there was
staff in place, the staff reduction X was taken off the list. This left the following three vital Xs:
Vital Few Xs
1. Inconsistent practice in returning calls
2. Different cell phones/service providers
3. Inability of the voice mail system to inform sales
engineers of new and urgent messages
Y: Time (in days) to ship field submittals
Given the results of the regression analysis, the team could clearly see that most of the variation in their
project Y was caused by purchase order hand-off time.
However, the team realized they needed to "drill down" into this X; it had a CAP element. They determined
that the root cause of the delay seemed to lie in the sales engineers' overall lack of awareness of the importance of
timely purchase order turnover. The team agreed they needed to educate the sales engineers on the
importance of this hand-off, since it would be measured in the future. They also drilled down into the
technological limitations X to find the real problem: access to the Speedi quotation software.
Vital Few Xs
1. Engineers' overall lack of awareness of the
importance of timely purchase order turnover
2. Access to Speedi quotation software
Analyze 12: Quantified Financial Opportunity
12.1 Refine Financial Benefits
The two teams decided that at the end of their projects, they would compare the performance of the
eight targeted accounts against the non-beta site accounts to see whether any improvement in
business was found. In the meantime, each team assessed the financial benefits relating to their
respective project Ys.
Y: Time (in minutes) for call response
The Green Belt's team knew they had an opportunity to improve customer satisfaction by improving
call response time, but had no verifiable way to link that improvement with increased sales. Any connection
was, at best, anecdotal. The call response team did wonder whether a cost savings could be realized by moving
everyone to the same cellular phone service plan; the Green Belt selected two team members to
investigate this.
Y: Time (in days) to ship field submittals
Using their 15 baseline data points, the team ran a chi square test to see if timeliness had any impact so
far on bid win rates. The team recognized they had a small amount of data, but knew they should start at
that point and add to it as they continued to collect data.
• Submitted in 5 days (40 working hours) or less: 4 bids won, 1 bid lost
• Submitted in more than 5 days: 3 bids won, 8 bids lost
Here are the results of the chi square test:
Expected counts are printed below observed counts
win lost Total
<5 days 4 1 5
2.19 2.81
> 5 days 3 8 11
4.81 6.19
Total 7 9 16
Chi-Sq = 1.502 + 1.168 +
0.683 + 0.531 = 3.883
DF = 1, P-Value = 0.049
3 cells with expected counts less than 5.0
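The same test can be run in Python with the contingency table above. Passing correction=False turns off the Yates continuity correction so the statistic matches the Pearson chi-square reported above (3.883, p of about 0.049).

# Sketch of the chi-square test on the observed win/loss counts above.
from scipy.stats import chi2_contingency

observed = [[4, 1],   # within 5 days: 4 wins, 1 loss
            [3, 8]]   # more than 5 days: 3 wins, 8 losses

chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(f"Chi-Sq = {chi2:.3f}, DF = {dof}, p = {p:.3f}")
print("Expected counts:", expected.round(2))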
The Black Belt knew the average bid was worth $50,000. The win rate when submittal
turnaround time was more than 5 days was 3 of 11 bids, or roughly 27%, while the win rate when it was less than 5
days was 4 of 5, or 80%. If the team could raise their win rate to that level, they could expect growth to more
than double. Of course, other factors such as pricing might mitigate the effect, but the team could take that into
account with a two-tiered set of chi-square tests. They needed more data in order to verify the benefits,
and so the project lead continued to collect win/loss and value-of-bid data along with the performance of the
submittal time Y.
12.2 Estimate Costs Associated with Improved Process
Y: Time (in minutes) for call response
The team analyzed the cost associated with the different cellular phone provider/service plans held by the
various members of the sales team. The team then solicited bids from other cellular service providers to
compare costs and services. The team presented this information in a matrix.
The team determined that if the entire office switched to one provider and one service plan, they could
realize an annual savings of $19,000. This included the purchase of new cell phones with increased
functionality including text messaging and paging. The cost structure eliminated roaming fees and
expanded the service area to the entire country. Although there was an additional start-up cost associated
with starting this service, it would quickly be recovered.
Two members of the call response team investigated the cost associated with changing the voice
mail system. They learned that the functionality they wanted was already available on their existing system;
it just hadn't been activated. There would be no new costs associated with activating this feature because it
could be done under the existing service agreement. In fact, the activation of the notification feature
included a way to allow the customer to mark their call as urgent. This would trigger the system to dial the
sales engineers' beepers and alert them to the urgent message, thus interfacing with the new proposed
cellular phones.
Y: Time (in days) to ship field submittals
No new costs were associated with implementing the solution for this particular Y.
12.3 Identify Intangible Benefits
At this point, each team needed to identify the intangible benefits, or favorable outcomes that are not
reportable for formal accounting purposes, but nonetheless help justify the project.
Y: Time (in minutes) for call response
The team identified the following intangible benefits:
• Increased quality of customer communications
• The ability for customers to reach their sales engineer much faster
• Increased customer confidence in the sales engineers’ ability to respond to their calls
Y: Time (in days) to ship field submittals
The team identified the following intangible benefits:
• Increased team cohesiveness as a result of working together toward a common goal
• Enhanced customer relations; noticeable customer enthusiasm and appreciation for the team's efforts
to improve quality of service
• Increased exposure to customers; sharing scorecards provided an opportunity to get in front of the
customer and reinforce the team's commitment to making the process improvement
Improve 13: Proposed Solution
13.1 Identify Improvement Strategy
Y: Time (in minutes) for call response
The team created the following improvement strategy:
X: Inconsistent practice in returning calls
Improvement strategy: The team would add an extra step to the call response process: whether or not
the customer's issue was resolved, sales engineers would return customer calls within 4 hours, at least
advising the customer of the status of their issue.
X: Different cell phones/service providers
Improvement strategy: The team would switch to a single cellular carrier offering consistency in service
and greater functionality. This strategy would be implemented in early July.
X: Inability of the voice mail system to inform sales engineers of new and urgent messages
Improvement strategy: The team decided to activate the functionality on the voice mail system that would
allow urgent/new message flagging and automatic paging. This strategy would be implemented in early
July.
In addition, the sales engineers developed strategies to address some other concerns about the voice mail
system that had surfaced in the Voice of the Customer data. These additional strategies would not cost any
more to implement and would increase the intangible benefits.
• All sales engineers would record a uniform greeting using a standardized format. This greeting would
be updated daily.
• They would simplify the menu system to make it easier to navigate.
• The menu system would clearly instruct customers in how to reach a live person
Y: Time (in days) to ship field submittals
In order to ship field submittals within the 5-day specification limit, the team needed to reduce purchase
order hand-off time and establish a performance standard for that time. Through group discussion, they
determined that it was reasonable to allow no more than 1 day from when the sales engineer received the
purchase order to when it was handed off to the sales engineer assistant. This 1-day turnaround time
became the operating tolerance for this vital X.
The team also identified an improvement strategy to overcome technological limitations related to
accessing the Speedi quotation software. When out-of-town, sales engineers would phone in confirmed
orders to assistants so they could begin the submittal compilation process.
In addition, the team identified two actions they could take to enhance the presentation and quality of
service:
• Send customers a fax notification of submittal shipment
• Standardize the cover sheet that was sent with submittals
These improvements were in response to customer feedback the team received when they gathered Voice
of the Customer data. The fax notification supports the team's measurement system analysis by providing
another time-stamped record to cross-reference.
13.2 Experiment to Determine Solution
Y: Time (in minutes) for call response
The team decided the most effective way to test their improvement strategies was to conduct a full-scale
pilot that involved the entire office. The changes that needed to be made to the voice mail system could
not be made for one or two individuals; rather, they would affect the entire office. Also, it did not make
economic sense to switch only a few staff members over to a new cellular service. Savings would only be
realized by switching the entire office.
Y: Time (in days) to ship field submittals
Black Belt and Project lead identified a pilot as an excellent way to test whether their suspected
vital Xs were indeed vital. As is often the case in commercial processes, classic DOEs with high
and low settings are more difficult to design, typically because of the scarcity of data points and
the difficulty of setting multiple Xs at high and low settings. Instead, the team would use a form of
evolutionary operations where they planned to set the process and Xs at their estimated optimal
settings and run the process. They would then determine whether the Xs indeed had the effect on
the Y.
Improve 14: Piloted Solution
14.1 Plan the Pilot
Y: Time (in minutes) for call response
The team agreed that there would be no need to change the data collection plan or the form they used to
collect their baseline data. Everyone would follow the same procedure and submit their logs for data entry.
Because the new cellular service and changes to the voice mail system would not be implemented until
early July, the team decided they would check their voice mail once an hour as a containment strategy.
They also changed their voice mail greeting to the standardized format.
Y: Time (in days) to ship field submittals
The team turned to the Coach for guidance in how to pilot their solution. The first step, according to the
Coach, was to assess the risks involved. The team determined that running a pilot posed no risk of
financial loss, negative effect on customers, or drain on internal resources.
They also decided their resource requirements and procedure would be identical to what they had outlined
in the data collection plan they used to gather their baseline data. The time frame set for the pilot was 2
months.
14.2 Run the Pilot and Collect Data
Y: Time (in minutes) for call response
The pilot was run from the beginning of May through the end of June. Following the same data collection plan
used to collect the baseline data, the members of the office logged 5 to 10 initial calls and then submitted the log
sheets for data entry.
Y: Time (in days) to ship field submittals
The project lead communicated the strategy for running the pilot and collecting data at a full staff meeting.
Since everyone was already knowledgeable about the data collection procedure, no formal training was
necessary.
14.3 Analyze the Results of the Pilot
Y: Time (in minutes) for call response
At the end of the 2-month pilot, the Black Belt analyzed the results of the data collection in Minitab™
using a 2-sample t-test.
The alternative hypothesis stated, "The mean of the baseline response time is greater than the mean of
the pilot response time." The null hypothesis stated, "The mean of the baseline response time and the
mean of the pilot response time are the same."
The team also needed to know whether they had reduced the variation in their process. Turning to the
Quality Coach, they determined they should use the homogeneity of variance tool.
The Green Belt consulted the Coach to interpret the results of the homogeneity of variance test. The
Coach explained that the p-value from Levene's test is used when the data comes from distributions whose
normality has not been established, or from continuous but non-normal distributions.
Though close, the p-value was less than 0.05, indicating a statistically significant difference in
variation between the baseline data and the pilot data. Together with the t-test result, this meant the
team had succeeded in reducing both the mean and the variation.
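For readers who want to reproduce this kind of analysis outside Minitab™, here is a minimal Python sketch of the two tests described above. The baseline and pilot values are assumed, illustrative numbers, not the team's actual call logs.

# A minimal sketch (assumed, illustrative data, not the team's call logs) of the
# two analyses described above: a one-sided 2-sample t-test on the means and
# Levene's test on the variances of baseline vs. pilot call-response times.
from scipy import stats

baseline_minutes = [35, 250, 90, 410, 15, 120, 600, 75, 180, 45, 300, 20]  # assumed
pilot_minutes = [20, 45, 60, 15, 90, 30, 55, 25, 70, 40, 35, 50]           # assumed

# H0: mean(baseline) = mean(pilot);  Ha: mean(baseline) > mean(pilot)
t_stat, t_p = stats.ttest_ind(baseline_minutes, pilot_minutes,
                              equal_var=False, alternative="greater")

# Levene's test: H0: equal variances; a p-value below 0.05 suggests the
# variation changed between baseline and pilot
lev_stat, lev_p = stats.levene(baseline_minutes, pilot_minutes)

print(f"t = {t_stat:.2f}, one-sided p = {t_p:.4f}")
print(f"Levene W = {lev_stat:.2f}, p = {lev_p:.4f}")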
Y: Time (in hours) to ship field submittals
After running the pilot for 2 months, the Project Lead reran her process capability analysis with the
new data. The capability analysis showed the new ZLT was 5.49. Assuming a Z shift of 1.5, the ZST was
6.99, well above the project goal. This clearly satisfied the Coach's criterion that the output show a
significant difference attributable to the solution.
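The Z arithmetic quoted above follows the usual Six Sigma conventions, where the short-term sigma level is the long-term Z plus the assumed 1.5 shift, and DPMO is read from the upper tail of the standard normal at the long-term Z. A small Python sketch using the figures reported in this study is shown below.

# A small sketch of the Z arithmetic quoted above, assuming the usual conventions:
# sigma level (Z_ST) = long-term Z (Z_LT) + 1.5 shift, and DPMO taken from the
# upper tail of the standard normal at Z_LT.
from scipy.stats import norm

z_lt = 5.49                        # long-term Z reported for the improved process
z_st = z_lt + 1.5                  # short-term sigma level: 6.99

dpmo = norm.sf(z_lt) * 1_000_000   # expected defects per million opportunities
print(f"Z_ST = {z_st:.2f}, DPMO = {dpmo:.3g}")

# The same arithmetic applied to the baseline call-response process (Z_LT of 0.55)
# gives roughly 290,000 DPMO, close to the figure quoted earlier in the study.
print(f"baseline DPMO = {norm.sf(0.55) * 1_000_000:,.0f}")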
The Project Lead then conducted a 2-sample t-test to confirm that the process had been statistically
improved. She established a null hypothesis: "There is no difference in submittal time between the
original process and the improved process." Her alternative hypothesis was, "Submittal time for the
original process is greater than for the improved process."
Here are the results of the 2-sample t-test:
Two-sample T for Original Process vs Improved Process
                     N    Mean   StDev   SE Mean
Original Process    15    24.5    20.1       5.2
Improved Process    10   10.30    5.40       1.7
95% CI for mu (Original) - mu (Improved): (2.6, 25.8)
T-Test mu (Original) = mu (Improved) (vs >): T = 2.59  P = 0.0099  DF = 16
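As a rough cross-check, outside the original study, the Welch t statistic and approximate degrees of freedom can be recomputed directly from the summary statistics printed above; small rounding differences from the Minitab™ output are expected.

# A rough cross-check (not part of the original study) of the output above:
# recomputing the Welch t statistic and approximate degrees of freedom from the
# summary statistics shown (n, mean, standard deviation for each process).
import math

n1, m1, s1 = 15, 24.5, 20.1    # original process
n2, m2, s2 = 10, 10.30, 5.40   # improved process

se = math.sqrt(s1**2 / n1 + s2**2 / n2)
t = (m1 - m2) / se
df = (s1**2 / n1 + s2**2 / n2) ** 2 / (
    (s1**2 / n1) ** 2 / (n1 - 1) + (s2**2 / n2) ** 2 / (n2 - 1)
)
# t comes out near 2.6 and df near 17; Minitab truncates the Welch df (here to 16),
# so small rounding differences from the printed output are expected.
print(f"t = {t:.2f}, Welch df = {df:.1f}")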
Control 15: Sustain Solution
15.1 Develop Control Plan
Y: Time (in minutes) for call response
The team met to develop a control plan for each of the vital Xs. Using brainstorming they came up with the
following plan:
Item | Control Action | Implementation Date
Cellular phone service | Switch to new carrier and purchase new cell phones for all sales engineers | July 1
Data collection | Sales engineers will continue to collect data for one week each month | July 1
Internal dashboard | Create to show previous month's results | August 1
External dashboard | Create to show to customers on sales calls | July 1
Control chart | Create to show previous month's results | August 1
Training manual | Update to show new procedures | July 1
Y: Time (in hours) to ship field submittals
To ensure that the benefits from the solution continued to be realized, the team created a control plan
outlined as follows:
• The team would continue to follow the data collection plan and collect data on every field submittal as
part of their normal record keeping routine. This data would become part of the dashboards (or
scorecards) for the eight beta site customers. In addition, the team would keep an internal metric on
hand-off time for purchase orders.
• The Project Lead would continue to compile and analyze the data using the applicable statistical
tools.
• The Project Lead would be responsible for responding if the process went out of control based on the
control chart for hand-off time (a simple sketch of such a chart follows below). She would investigate
the cause of the defect and take appropriate measures to prevent it from recurring.
In addition, the Project Lead created a standard operating procedure documenting the team's control plan.
This would ensure consistency for future projects and remain in place through future changes in team
personnel.
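The control chart referenced in the plan above would typically be an individuals (I) chart on hand-off time. A simple sketch follows; the data and the 2.66 moving-range constant reflect the standard I-chart convention, not the team's actual records.

# A minimal sketch of an individuals (I) control chart check on hand-off time.
# The data are hypothetical; the limits use the standard I-chart convention of
# mean +/- 2.66 times the average moving range.
handoff_days = [1, 0, 1, 2, 1, 1, 0, 1, 3, 1, 1, 0, 1, 1, 6, 1]  # assumed values

mean = sum(handoff_days) / len(handoff_days)
moving_ranges = [abs(a - b) for a, b in zip(handoff_days[1:], handoff_days)]
mr_bar = sum(moving_ranges) / len(moving_ranges)

ucl = mean + 2.66 * mr_bar
lcl = max(0.0, mean - 2.66 * mr_bar)   # hand-off time cannot be negative

out_of_control = [(i, x) for i, x in enumerate(handoff_days) if x > ucl or x < lcl]
print(f"mean = {mean:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
print("points to investigate:", out_of_control)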
15.2 Implement Solution
Y: Time (in minutes) for call response
Beginning in July, the office implemented the control plan with the solution in place. This included the new
cellular phone service and the changes to the voice mail system. Monthly control charts were produced to
show the results of the plan. As the office became more confident that the plan would become the
established way of doing business and the results showed sustained performance, the sampling would be
cut back to once a quarter.
Y: Time (in hours) to ship submittal drawings
The implemented solution was the same as the proposed solution:
• Sales engineers handed off all quotations to sales engineer assistants within 1 day after receipt to
ensure the customer-defined 5-day performance standard for submittal shipment would be met.
• In the future, 1-day handoffs would be included in the sales engineers' metrics as part of the
control plan.
• If they did not have access to Speedi quotation software, sales engineers phoned in confirmed orders
to assistants so they could begin the submittal compilation process.
In addition to implementing the solution, to enhance the presentation and quality of service, the team
• Sent customers a fax notification of submittal shipment
• Standardized the cover sheet that was sent with submittals
15.3 Confirm Solution
Y: Time (in hours) for call response
As before, the Project Lead and Green Belt tested the normality of the data. Since the p-value was less
than 0.05, the data was still considered to be non-normal. They used the same method to determine
the sigma value of the process as they had when they first analyzed the baseline data. Using the Six
Sigma product report to calculate the Z value, the team found the process had reached a sigma level of
5.5 by the third month after implementing the solution for this Y.
To show that the mean and the variation had been reduced, the Green Belt and Black Belt repeated the
analysis they had performed on the pilot results. Here are the results.
2-sample t-test:
C2     N    Mean   StDev   SE Mean
1     41     5.6    12.7      2.0
2     26   0.698   0.622     0.12
95% CI for mu (1) - mu (2): (0.9, 8.93)
T-Test mu (1) = mu (2) (vs >): T = 2.48  P = 0.0088  DF = 40
The team was able to say with statistical confidence that they had improved both the mean and the
variation of the call response Y. The Green Belt also wanted to be able to show the rest of the team
that the new process for handling call response was in control.
Y: Time (in days) to ship submittal drawings
To statistically prove they had met their Six Sigma goal, the team calculated the capability of the
improved process and compared it to their goal and to the original process data.
A final 2-sample t-test was run comparing the baseline data with data collected from the implemented
solution between August and December. The test confirmed that the mean of their Y had indeed shifted.
An F-test was also conducted; it confirmed that the standard deviation was significantly reduced as well.
Finally, confidence intervals were calculated for the new process ZLT and compared to the confidence
intervals for the old process ZLT. The confidence intervals did not overlap, confirming that the process Y
had been improved.
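The F-test mentioned above compares the ratio of the two sample variances against the F distribution. A minimal Python sketch, again with hypothetical submittal-time data rather than the study's records, is shown below.

# A minimal sketch of the F-test described above (hypothetical data, not the
# study's records): the test statistic is the ratio of the two sample variances,
# compared against the F distribution with (n1 - 1, n2 - 1) degrees of freedom.
from scipy.stats import f

baseline_days = [1, 3, 6, 2, 8, 10, 4, 7, 12, 5, 9, 3]   # assumed values
improved_days = [1, 2, 1, 3, 2, 1, 2, 3, 1, 2]           # assumed values

def sample_variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

F = sample_variance(baseline_days) / sample_variance(improved_days)
p = f.sf(F, len(baseline_days) - 1, len(improved_days) - 1)   # one-sided p-value
print(f"F = {F:.2f}, p = {p:.4f}")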
The impact of the project's success was evident in the Voice of the Customer. Here is what two of the beta
customers had to say:
"This project has made the company much easier to do business with and has helped them gain a
competitive advantage."
"The projects the Delhi team have done truly made a difference in their total service level."
Control 16: Project Documentation
16.1 Finalize Financial Results
The sales results were calculated at the end of the fiscal year and shown to the entire Delhi staff.
"According to this chart," said the Green Belt, "it looks like we saved the year by growing business
with our eight beta customers."
The Black Belt explained that the projects had been very successful in reducing process variation, but
that there was no statistical way to prove the reduced variation had caused the increase in business.
The hope had been that, as a result, customers would remain loyal to the company and do more business,
and that appeared to have been the case.
16.2 Complete Documentation Package
Each improvement team then completed a documentation package that would ensure the effectiveness of
future projects and help other teams with their improvement projects. This package was compiled from a
variety of documents the teams worked with during the course of their projects and included the following
features, as recommended by the Coach:
• Project abstract
• Problem statement
• Baseline data on process performance
• List of vital Xs
• Solution
• Control mechanism
• Performance metrics
• Financial results
• Lessons learned/best practices
• Translation opportunities

More Related Content

What's hot

Lean Six Sigma: DMAIC In-Depth
Lean Six Sigma: DMAIC In-DepthLean Six Sigma: DMAIC In-Depth
Lean Six Sigma: DMAIC In-Depth
GoLeanSixSigma.com
 
Quality Circle Presentation Template
Quality Circle Presentation TemplateQuality Circle Presentation Template
Quality Circle Presentation Template
Ek Pahla Kadam
 
Dmaic overview for slideshare
Dmaic overview for slideshareDmaic overview for slideshare
Dmaic overview for slideshare
Lean Strategies International LLC
 
Dmaic
DmaicDmaic
Basics of Six Sigma
Basics of Six SigmaBasics of Six Sigma
Basics of Six Sigma
Harishankar Sahu
 
Lean Six Sigma-Case study
Lean Six Sigma-Case studyLean Six Sigma-Case study
Lean Six Sigma-Case studysourov_das
 
Lean Manufacturing PowerPoint Presentation Sample
Lean Manufacturing PowerPoint Presentation SampleLean Manufacturing PowerPoint Presentation Sample
Lean Manufacturing PowerPoint Presentation Sample
Andrew Schwartz
 
13. value stream mapping
13. value stream mapping13. value stream mapping
13. value stream mapping
Hakeem-Ur- Rehman
 
Six Sigma Introduction
Six Sigma IntroductionSix Sigma Introduction
Six Sigma IntroductionAbhishek Kumar
 
Six Sigma Session For Production And Project Team By Lt Col Vikram Bakshi
Six Sigma Session For Production And Project Team By Lt Col Vikram BakshiSix Sigma Session For Production And Project Team By Lt Col Vikram Bakshi
Six Sigma Session For Production And Project Team By Lt Col Vikram Bakshi
LT COLONEL VIKRAM BAKSHI ( RETD)
 
6 sigma basic best ppt
6 sigma basic best ppt6 sigma basic best ppt
6 sigma basic best ppt
Jayesh Sarode
 
Value Stream Mapping -The Concept
Value Stream Mapping -The ConceptValue Stream Mapping -The Concept
Value Stream Mapping -The Concept
Subhrajyoti Parida
 
Qc story
Qc storyQc story
Qc story
ssuser283e821
 
An introduction to lean six sigma
An introduction to lean six sigmaAn introduction to lean six sigma
An introduction to lean six sigma
Rahul Singh
 
Analyze phase lean six sigma tollgate template
Analyze phase   lean six sigma tollgate templateAnalyze phase   lean six sigma tollgate template
Analyze phase lean six sigma tollgate templateSteven Bonacorsi
 
Dmaic
DmaicDmaic
Dmaic
jagan339
 
Lean basics
Lean basicsLean basics
Lean basics
Aryan Viswakarma
 
Lean Process Improvement Techniques
Lean Process Improvement TechniquesLean Process Improvement Techniques
Lean Process Improvement Techniques
Jeremy Jay V. Lim, MBB, PMP
 
Six Sigma Principles And Concepts PowerPoint Presentation Slides
Six Sigma Principles And Concepts PowerPoint Presentation SlidesSix Sigma Principles And Concepts PowerPoint Presentation Slides
Six Sigma Principles And Concepts PowerPoint Presentation Slides
SlideTeam
 

What's hot (20)

Lean Six Sigma: DMAIC In-Depth
Lean Six Sigma: DMAIC In-DepthLean Six Sigma: DMAIC In-Depth
Lean Six Sigma: DMAIC In-Depth
 
Quality Circle Presentation Template
Quality Circle Presentation TemplateQuality Circle Presentation Template
Quality Circle Presentation Template
 
Dmaic overview for slideshare
Dmaic overview for slideshareDmaic overview for slideshare
Dmaic overview for slideshare
 
Dmaic
DmaicDmaic
Dmaic
 
Basics of Six Sigma
Basics of Six SigmaBasics of Six Sigma
Basics of Six Sigma
 
Lean Six Sigma-Case study
Lean Six Sigma-Case studyLean Six Sigma-Case study
Lean Six Sigma-Case study
 
Lean Manufacturing PowerPoint Presentation Sample
Lean Manufacturing PowerPoint Presentation SampleLean Manufacturing PowerPoint Presentation Sample
Lean Manufacturing PowerPoint Presentation Sample
 
13. value stream mapping
13. value stream mapping13. value stream mapping
13. value stream mapping
 
Six Sigma Introduction
Six Sigma IntroductionSix Sigma Introduction
Six Sigma Introduction
 
Six Sigma Session For Production And Project Team By Lt Col Vikram Bakshi
Six Sigma Session For Production And Project Team By Lt Col Vikram BakshiSix Sigma Session For Production And Project Team By Lt Col Vikram Bakshi
Six Sigma Session For Production And Project Team By Lt Col Vikram Bakshi
 
Six sigma
Six sigmaSix sigma
Six sigma
 
6 sigma basic best ppt
6 sigma basic best ppt6 sigma basic best ppt
6 sigma basic best ppt
 
Value Stream Mapping -The Concept
Value Stream Mapping -The ConceptValue Stream Mapping -The Concept
Value Stream Mapping -The Concept
 
Qc story
Qc storyQc story
Qc story
 
An introduction to lean six sigma
An introduction to lean six sigmaAn introduction to lean six sigma
An introduction to lean six sigma
 
Analyze phase lean six sigma tollgate template
Analyze phase   lean six sigma tollgate templateAnalyze phase   lean six sigma tollgate template
Analyze phase lean six sigma tollgate template
 
Dmaic
DmaicDmaic
Dmaic
 
Lean basics
Lean basicsLean basics
Lean basics
 
Lean Process Improvement Techniques
Lean Process Improvement TechniquesLean Process Improvement Techniques
Lean Process Improvement Techniques
 
Six Sigma Principles And Concepts PowerPoint Presentation Slides
Six Sigma Principles And Concepts PowerPoint Presentation SlidesSix Sigma Principles And Concepts PowerPoint Presentation Slides
Six Sigma Principles And Concepts PowerPoint Presentation Slides
 

Viewers also liked

Six Sigma DMAIC Case Study
Six Sigma DMAIC Case StudySix Sigma DMAIC Case Study
Six Sigma DMAIC Case Study
Nitesh Verma
 
Six Sigma Case Study
Six Sigma Case StudySix Sigma Case Study
Six Sigma Case Studysanjay_asati
 
Six Sigma Case Study
Six Sigma Case StudySix Sigma Case Study
Six Sigma Case Study
Nitesh Verma
 
Six Sigma the best ppt
Six Sigma the best pptSix Sigma the best ppt
Six Sigma the best ppt
Rabia Sgh S
 
DMAIC-Six sigma process Improvement Approach
DMAIC-Six sigma process Improvement ApproachDMAIC-Six sigma process Improvement Approach
DMAIC-Six sigma process Improvement ApproachConfiz
 
Six sigma presentation
Six sigma presentationSix sigma presentation
Six sigma presentation
Dr. Bikram Jit Singh
 
Basic Six Sigma Presentation
Basic Six Sigma PresentationBasic Six Sigma Presentation
Basic Six Sigma Presentationvivekissar
 
Six Sigma Project on Distribution Efficiency
Six Sigma Project on Distribution EfficiencySix Sigma Project on Distribution Efficiency
Six Sigma Project on Distribution Efficiency
mrt77
 
Improve On-time deliveries_ Lean Six Sigma Green Belt Project
Improve On-time deliveries_ Lean Six Sigma Green Belt ProjectImprove On-time deliveries_ Lean Six Sigma Green Belt Project
Improve On-time deliveries_ Lean Six Sigma Green Belt Project
Mahit Ohri
 
Dell Server Ordering Six Sigma Case Study
Dell Server Ordering Six Sigma Case StudyDell Server Ordering Six Sigma Case Study
Dell Server Ordering Six Sigma Case Study
Steven Bonacorsi
 
six sigma ppt
six sigma pptsix sigma ppt
six sigma ppt
Sanjiv Yadav
 
Six Sigma
Six SigmaSix Sigma
Six Sigma
princegeet17
 
Business Builder Forum September 2011
Business Builder Forum September 2011Business Builder Forum September 2011
Business Builder Forum September 2011
MeadesCompany
 
Six sigma
Six sigmaSix sigma
Six sigma
Shri Theja
 

Viewers also liked (16)

Six Sigma DMAIC Case Study
Six Sigma DMAIC Case StudySix Sigma DMAIC Case Study
Six Sigma DMAIC Case Study
 
Six Sigma Case Study
Six Sigma Case StudySix Sigma Case Study
Six Sigma Case Study
 
Six Sigma Case Study
Six Sigma Case StudySix Sigma Case Study
Six Sigma Case Study
 
Six Sigma the best ppt
Six Sigma the best pptSix Sigma the best ppt
Six Sigma the best ppt
 
DMAIC-Six sigma process Improvement Approach
DMAIC-Six sigma process Improvement ApproachDMAIC-Six sigma process Improvement Approach
DMAIC-Six sigma process Improvement Approach
 
Six sigma presentation
Six sigma presentationSix sigma presentation
Six sigma presentation
 
Six sigma ppt
Six sigma pptSix sigma ppt
Six sigma ppt
 
PAPER_CODE__IE12
PAPER_CODE__IE12PAPER_CODE__IE12
PAPER_CODE__IE12
 
Basic Six Sigma Presentation
Basic Six Sigma PresentationBasic Six Sigma Presentation
Basic Six Sigma Presentation
 
Six Sigma Project on Distribution Efficiency
Six Sigma Project on Distribution EfficiencySix Sigma Project on Distribution Efficiency
Six Sigma Project on Distribution Efficiency
 
Improve On-time deliveries_ Lean Six Sigma Green Belt Project
Improve On-time deliveries_ Lean Six Sigma Green Belt ProjectImprove On-time deliveries_ Lean Six Sigma Green Belt Project
Improve On-time deliveries_ Lean Six Sigma Green Belt Project
 
Dell Server Ordering Six Sigma Case Study
Dell Server Ordering Six Sigma Case StudyDell Server Ordering Six Sigma Case Study
Dell Server Ordering Six Sigma Case Study
 
six sigma ppt
six sigma pptsix sigma ppt
six sigma ppt
 
Six Sigma
Six SigmaSix Sigma
Six Sigma
 
Business Builder Forum September 2011
Business Builder Forum September 2011Business Builder Forum September 2011
Business Builder Forum September 2011
 
Six sigma
Six sigmaSix sigma
Six sigma
 

Similar to Six Sigma DMAIC Case Study

5 tools in dmai...
5 tools in dmai...5 tools in dmai...
5 tools in dmai...
Hamdia Mansour
 
Total Quality Management - II
Total Quality Management - IITotal Quality Management - II
Total Quality Management - II
Nayana(TV) Shrinivas Desai
 
Total Quality Management - II
Total Quality Management - IITotal Quality Management - II
Total Quality Management - II
Nayana(TV) Shrinivas Desai
 
Overview Six sigma by D&H Engineers
Overview Six sigma by D&H EngineersOverview Six sigma by D&H Engineers
Overview Six sigma by D&H Engineers
D&H Engineers
 
QFD-COVID TEST KIT updated-converted.pdf
QFD-COVID TEST KIT updated-converted.pdfQFD-COVID TEST KIT updated-converted.pdf
QFD-COVID TEST KIT updated-converted.pdf
SachinShishodia4
 
Chapter 10 Tools and Techniques for Quality Management.ppt
Chapter 10 Tools and Techniques for Quality Management.pptChapter 10 Tools and Techniques for Quality Management.ppt
Chapter 10 Tools and Techniques for Quality Management.ppt
Dr. Nazrul Islam
 
Tools and Techniques for Quality Management
Tools and Techniques for Quality ManagementTools and Techniques for Quality Management
Tools and Techniques for Quality Management
Nazrul Islam
 
Six sigma define - v1.0
Six sigma   define - v1.0Six sigma   define - v1.0
Six sigma define - v1.0
Sonali Jamakhandi
 
Application sourcing focused on value, not cost alone
Application sourcing focused on value, not cost aloneApplication sourcing focused on value, not cost alone
Application sourcing focused on value, not cost alone
WGroup
 
Inside Gainsight’s New Post-Sales Structure: Reorganizing the Team to Drive C...
Inside Gainsight’s New Post-Sales Structure: Reorganizing the Team to Drive C...Inside Gainsight’s New Post-Sales Structure: Reorganizing the Team to Drive C...
Inside Gainsight’s New Post-Sales Structure: Reorganizing the Team to Drive C...
Gainsight
 
Taylor.randall
Taylor.randallTaylor.randall
Taylor.randallNASAPMC
 
A Research Paper On Customer Satisfaction Evaluation Process
A Research Paper On Customer Satisfaction Evaluation ProcessA Research Paper On Customer Satisfaction Evaluation Process
A Research Paper On Customer Satisfaction Evaluation Process
Tracy Drey
 
How We Reorganized Our Entire Post-Sales Organization
How We Reorganized Our Entire Post-Sales OrganizationHow We Reorganized Our Entire Post-Sales Organization
How We Reorganized Our Entire Post-Sales Organization
Gainsight
 
Thinking Just Like Your Client
Thinking Just Like Your ClientThinking Just Like Your Client
Thinking Just Like Your Client
Michael Goldberg
 
IT Operations Consulting
IT Operations Consulting  IT Operations Consulting
IT Operations Consulting
Anubhav Lal
 
OM2_Lecture 11vvvhhbbjjbjdjjeebjrhvhuuhh
OM2_Lecture 11vvvhhbbjjbjdjjeebjrhvhuuhhOM2_Lecture 11vvvhhbbjjbjdjjeebjrhvhuuhh
OM2_Lecture 11vvvhhbbjjbjdjjeebjrhvhuuhh
rammanoharjharupnaga
 
Assignment .pdf
Assignment .pdfAssignment .pdf
Assignment .pdf
MUHTASIMALBUYESSHETA
 
Salvaging a call center
Salvaging a call centerSalvaging a call center
Salvaging a call center
ERNESTO MANUEL,MBA,CLSS MBB
 
QFD presentation final...............pptx
QFD presentation final...............pptxQFD presentation final...............pptx
QFD presentation final...............pptx
MohamedHafez359059
 
6 Sigma
6 Sigma6 Sigma
6 Sigma
alexjoseph813
 

Similar to Six Sigma DMAIC Case Study (20)

5 tools in dmai...
5 tools in dmai...5 tools in dmai...
5 tools in dmai...
 
Total Quality Management - II
Total Quality Management - IITotal Quality Management - II
Total Quality Management - II
 
Total Quality Management - II
Total Quality Management - IITotal Quality Management - II
Total Quality Management - II
 
Overview Six sigma by D&H Engineers
Overview Six sigma by D&H EngineersOverview Six sigma by D&H Engineers
Overview Six sigma by D&H Engineers
 
QFD-COVID TEST KIT updated-converted.pdf
QFD-COVID TEST KIT updated-converted.pdfQFD-COVID TEST KIT updated-converted.pdf
QFD-COVID TEST KIT updated-converted.pdf
 
Chapter 10 Tools and Techniques for Quality Management.ppt
Chapter 10 Tools and Techniques for Quality Management.pptChapter 10 Tools and Techniques for Quality Management.ppt
Chapter 10 Tools and Techniques for Quality Management.ppt
 
Tools and Techniques for Quality Management
Tools and Techniques for Quality ManagementTools and Techniques for Quality Management
Tools and Techniques for Quality Management
 
Six sigma define - v1.0
Six sigma   define - v1.0Six sigma   define - v1.0
Six sigma define - v1.0
 
Application sourcing focused on value, not cost alone
Application sourcing focused on value, not cost aloneApplication sourcing focused on value, not cost alone
Application sourcing focused on value, not cost alone
 
Inside Gainsight’s New Post-Sales Structure: Reorganizing the Team to Drive C...
Inside Gainsight’s New Post-Sales Structure: Reorganizing the Team to Drive C...Inside Gainsight’s New Post-Sales Structure: Reorganizing the Team to Drive C...
Inside Gainsight’s New Post-Sales Structure: Reorganizing the Team to Drive C...
 
Taylor.randall
Taylor.randallTaylor.randall
Taylor.randall
 
A Research Paper On Customer Satisfaction Evaluation Process
A Research Paper On Customer Satisfaction Evaluation ProcessA Research Paper On Customer Satisfaction Evaluation Process
A Research Paper On Customer Satisfaction Evaluation Process
 
How We Reorganized Our Entire Post-Sales Organization
How We Reorganized Our Entire Post-Sales OrganizationHow We Reorganized Our Entire Post-Sales Organization
How We Reorganized Our Entire Post-Sales Organization
 
Thinking Just Like Your Client
Thinking Just Like Your ClientThinking Just Like Your Client
Thinking Just Like Your Client
 
IT Operations Consulting
IT Operations Consulting  IT Operations Consulting
IT Operations Consulting
 
OM2_Lecture 11vvvhhbbjjbjdjjeebjrhvhuuhh
OM2_Lecture 11vvvhhbbjjbjdjjeebjrhvhuuhhOM2_Lecture 11vvvhhbbjjbjdjjeebjrhvhuuhh
OM2_Lecture 11vvvhhbbjjbjdjjeebjrhvhuuhh
 
Assignment .pdf
Assignment .pdfAssignment .pdf
Assignment .pdf
 
Salvaging a call center
Salvaging a call centerSalvaging a call center
Salvaging a call center
 
QFD presentation final...............pptx
QFD presentation final...............pptxQFD presentation final...............pptx
QFD presentation final...............pptx
 
6 Sigma
6 Sigma6 Sigma
6 Sigma
 

More from Body of Knowledge

Smart Art Graphics Ready to Use Template
Smart Art Graphics Ready to Use TemplateSmart Art Graphics Ready to Use Template
Smart Art Graphics Ready to Use Template
Body of Knowledge
 
Process maturity
Process maturityProcess maturity
Process maturity
Body of Knowledge
 
Powerpoint presentation template library
Powerpoint presentation template libraryPowerpoint presentation template library
Powerpoint presentation template library
Body of Knowledge
 
Hypothesis Test Bank with Solutions
Hypothesis Test Bank with SolutionsHypothesis Test Bank with Solutions
Hypothesis Test Bank with Solutions
Body of Knowledge
 
Project Handover Document Template
Project Handover Document TemplateProject Handover Document Template
Project Handover Document Template
Body of Knowledge
 
Voice of Customer
Voice of CustomerVoice of Customer
Voice of Customer
Body of Knowledge
 
Hypothesis Testing in Six Sigma
Hypothesis Testing in Six SigmaHypothesis Testing in Six Sigma
Hypothesis Testing in Six Sigma
Body of Knowledge
 
Industrialisation of Analytics in India
Industrialisation of Analytics in IndiaIndustrialisation of Analytics in India
Industrialisation of Analytics in India
Body of Knowledge
 

More from Body of Knowledge (8)

Smart Art Graphics Ready to Use Template
Smart Art Graphics Ready to Use TemplateSmart Art Graphics Ready to Use Template
Smart Art Graphics Ready to Use Template
 
Process maturity
Process maturityProcess maturity
Process maturity
 
Powerpoint presentation template library
Powerpoint presentation template libraryPowerpoint presentation template library
Powerpoint presentation template library
 
Hypothesis Test Bank with Solutions
Hypothesis Test Bank with SolutionsHypothesis Test Bank with Solutions
Hypothesis Test Bank with Solutions
 
Project Handover Document Template
Project Handover Document TemplateProject Handover Document Template
Project Handover Document Template
 
Voice of Customer
Voice of CustomerVoice of Customer
Voice of Customer
 
Hypothesis Testing in Six Sigma
Hypothesis Testing in Six SigmaHypothesis Testing in Six Sigma
Hypothesis Testing in Six Sigma
 
Industrialisation of Analytics in India
Industrialisation of Analytics in IndiaIndustrialisation of Analytics in India
Industrialisation of Analytics in India
 

Recently uploaded

一比一原版(CU毕业证)卡尔顿大学毕业证成绩单
一比一原版(CU毕业证)卡尔顿大学毕业证成绩单一比一原版(CU毕业证)卡尔顿大学毕业证成绩单
一比一原版(CU毕业证)卡尔顿大学毕业证成绩单
yhkoc
 
Investigate & Recover / StarCompliance.io / Crypto_Crimes
Investigate & Recover / StarCompliance.io / Crypto_CrimesInvestigate & Recover / StarCompliance.io / Crypto_Crimes
Investigate & Recover / StarCompliance.io / Crypto_Crimes
StarCompliance.io
 
一比一原版(RUG毕业证)格罗宁根大学毕业证成绩单
一比一原版(RUG毕业证)格罗宁根大学毕业证成绩单一比一原版(RUG毕业证)格罗宁根大学毕业证成绩单
一比一原版(RUG毕业证)格罗宁根大学毕业证成绩单
vcaxypu
 
Best best suvichar in gujarati english meaning of this sentence as Silk road ...
Best best suvichar in gujarati english meaning of this sentence as Silk road ...Best best suvichar in gujarati english meaning of this sentence as Silk road ...
Best best suvichar in gujarati english meaning of this sentence as Silk road ...
AbhimanyuSinha9
 
tapal brand analysis PPT slide for comptetive data
tapal brand analysis PPT slide for comptetive datatapal brand analysis PPT slide for comptetive data
tapal brand analysis PPT slide for comptetive data
theahmadsaood
 
FP Growth Algorithm and its Applications
FP Growth Algorithm and its ApplicationsFP Growth Algorithm and its Applications
FP Growth Algorithm and its Applications
MaleehaSheikh2
 
standardisation of garbhpala offhgfffghh
standardisation of garbhpala offhgfffghhstandardisation of garbhpala offhgfffghh
standardisation of garbhpala offhgfffghh
ArpitMalhotra16
 
一比一原版(BU毕业证)波士顿大学毕业证成绩单
一比一原版(BU毕业证)波士顿大学毕业证成绩单一比一原版(BU毕业证)波士顿大学毕业证成绩单
一比一原版(BU毕业证)波士顿大学毕业证成绩单
ewymefz
 
Criminal IP - Threat Hunting Webinar.pdf
Criminal IP - Threat Hunting Webinar.pdfCriminal IP - Threat Hunting Webinar.pdf
Criminal IP - Threat Hunting Webinar.pdf
Criminal IP
 
一比一原版(YU毕业证)约克大学毕业证成绩单
一比一原版(YU毕业证)约克大学毕业证成绩单一比一原版(YU毕业证)约克大学毕业证成绩单
一比一原版(YU毕业证)约克大学毕业证成绩单
enxupq
 
一比一原版(UMich毕业证)密歇根大学|安娜堡分校毕业证成绩单
一比一原版(UMich毕业证)密歇根大学|安娜堡分校毕业证成绩单一比一原版(UMich毕业证)密歇根大学|安娜堡分校毕业证成绩单
一比一原版(UMich毕业证)密歇根大学|安娜堡分校毕业证成绩单
ewymefz
 
Criminal IP - Threat Hunting Webinar.pdf
Criminal IP - Threat Hunting Webinar.pdfCriminal IP - Threat Hunting Webinar.pdf
Criminal IP - Threat Hunting Webinar.pdf
Criminal IP
 
做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样
做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样
做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样
axoqas
 
一比一原版(ArtEZ毕业证)ArtEZ艺术学院毕业证成绩单
一比一原版(ArtEZ毕业证)ArtEZ艺术学院毕业证成绩单一比一原版(ArtEZ毕业证)ArtEZ艺术学院毕业证成绩单
一比一原版(ArtEZ毕业证)ArtEZ艺术学院毕业证成绩单
vcaxypu
 
Jpolillo Amazon PPC - Bid Optimization Sample
Jpolillo Amazon PPC - Bid Optimization SampleJpolillo Amazon PPC - Bid Optimization Sample
Jpolillo Amazon PPC - Bid Optimization Sample
James Polillo
 
Q1’2024 Update: MYCI’s Leap Year Rebound
Q1’2024 Update: MYCI’s Leap Year ReboundQ1’2024 Update: MYCI’s Leap Year Rebound
Q1’2024 Update: MYCI’s Leap Year Rebound
Oppotus
 
Tabula.io Cheatsheet: automate your data workflows
Tabula.io Cheatsheet: automate your data workflowsTabula.io Cheatsheet: automate your data workflows
Tabula.io Cheatsheet: automate your data workflows
alex933524
 
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...
John Andrews
 
一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单
一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单
一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单
ewymefz
 
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单
ewymefz
 

Recently uploaded (20)

一比一原版(CU毕业证)卡尔顿大学毕业证成绩单
一比一原版(CU毕业证)卡尔顿大学毕业证成绩单一比一原版(CU毕业证)卡尔顿大学毕业证成绩单
一比一原版(CU毕业证)卡尔顿大学毕业证成绩单
 
Investigate & Recover / StarCompliance.io / Crypto_Crimes
Investigate & Recover / StarCompliance.io / Crypto_CrimesInvestigate & Recover / StarCompliance.io / Crypto_Crimes
Investigate & Recover / StarCompliance.io / Crypto_Crimes
 
一比一原版(RUG毕业证)格罗宁根大学毕业证成绩单
一比一原版(RUG毕业证)格罗宁根大学毕业证成绩单一比一原版(RUG毕业证)格罗宁根大学毕业证成绩单
一比一原版(RUG毕业证)格罗宁根大学毕业证成绩单
 
Best best suvichar in gujarati english meaning of this sentence as Silk road ...
Best best suvichar in gujarati english meaning of this sentence as Silk road ...Best best suvichar in gujarati english meaning of this sentence as Silk road ...
Best best suvichar in gujarati english meaning of this sentence as Silk road ...
 
tapal brand analysis PPT slide for comptetive data
tapal brand analysis PPT slide for comptetive datatapal brand analysis PPT slide for comptetive data
tapal brand analysis PPT slide for comptetive data
 
FP Growth Algorithm and its Applications
FP Growth Algorithm and its ApplicationsFP Growth Algorithm and its Applications
FP Growth Algorithm and its Applications
 
standardisation of garbhpala offhgfffghh
standardisation of garbhpala offhgfffghhstandardisation of garbhpala offhgfffghh
standardisation of garbhpala offhgfffghh
 
一比一原版(BU毕业证)波士顿大学毕业证成绩单
一比一原版(BU毕业证)波士顿大学毕业证成绩单一比一原版(BU毕业证)波士顿大学毕业证成绩单
一比一原版(BU毕业证)波士顿大学毕业证成绩单
 
Criminal IP - Threat Hunting Webinar.pdf
Criminal IP - Threat Hunting Webinar.pdfCriminal IP - Threat Hunting Webinar.pdf
Criminal IP - Threat Hunting Webinar.pdf
 
一比一原版(YU毕业证)约克大学毕业证成绩单
一比一原版(YU毕业证)约克大学毕业证成绩单一比一原版(YU毕业证)约克大学毕业证成绩单
一比一原版(YU毕业证)约克大学毕业证成绩单
 
一比一原版(UMich毕业证)密歇根大学|安娜堡分校毕业证成绩单
一比一原版(UMich毕业证)密歇根大学|安娜堡分校毕业证成绩单一比一原版(UMich毕业证)密歇根大学|安娜堡分校毕业证成绩单
一比一原版(UMich毕业证)密歇根大学|安娜堡分校毕业证成绩单
 
Criminal IP - Threat Hunting Webinar.pdf
Criminal IP - Threat Hunting Webinar.pdfCriminal IP - Threat Hunting Webinar.pdf
Criminal IP - Threat Hunting Webinar.pdf
 
做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样
做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样
做(mqu毕业证书)麦考瑞大学毕业证硕士文凭证书学费发票原版一模一样
 
一比一原版(ArtEZ毕业证)ArtEZ艺术学院毕业证成绩单
一比一原版(ArtEZ毕业证)ArtEZ艺术学院毕业证成绩单一比一原版(ArtEZ毕业证)ArtEZ艺术学院毕业证成绩单
一比一原版(ArtEZ毕业证)ArtEZ艺术学院毕业证成绩单
 
Jpolillo Amazon PPC - Bid Optimization Sample
Jpolillo Amazon PPC - Bid Optimization SampleJpolillo Amazon PPC - Bid Optimization Sample
Jpolillo Amazon PPC - Bid Optimization Sample
 
Q1’2024 Update: MYCI’s Leap Year Rebound
Q1’2024 Update: MYCI’s Leap Year ReboundQ1’2024 Update: MYCI’s Leap Year Rebound
Q1’2024 Update: MYCI’s Leap Year Rebound
 
Tabula.io Cheatsheet: automate your data workflows
Tabula.io Cheatsheet: automate your data workflowsTabula.io Cheatsheet: automate your data workflows
Tabula.io Cheatsheet: automate your data workflows
 
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...
 
一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单
一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单
一比一原版(UofM毕业证)明尼苏达大学毕业证成绩单
 
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单
一比一原版(IIT毕业证)伊利诺伊理工大学毕业证成绩单
 

Six Sigma DMAIC Case Study

  • 1. Revised 04/13/07 Page 1 of 25 CASE STUDY Introduction This is a case study of how Six Sigma, using the DMAIC improvement methodology, can be applied in a commercial environment. In January 2004, district manager Ram Singh volunteered the office to be a Six Sigma beta site. The team was tasked with driving orders growth using the Six Sigma process. This case study shows how the mission to become a Six Sigma office would drive improvement projects for two key customer CTQs, mapped here step-by-step to the DMAIC methodology. The sequence and/or details of some of the actual events have been modified for the sake of simplicity or to better illustrate a point. Project Leadership District Manager Black Belt Small Project Specialist Systems Engineer Define 1: Project CTQs 1.1 Evaluate VOC The team knew they needed to drive orders growth by increasing customer satisfaction using the Six Sigma process; the question was, how? In this initial meeting, the team brainstormed • Potential beta site customers to be included in an improvement project • Methods for collecting Voice of the Customer Beta site customers The team identified eight beta site customers—four contractors, three distributors, and an original equipment manufacturer (OEM)—to be the target of their improvement efforts. These customers, each experiencing varying degrees of growth or stagnation, were carefully chosen so as to provide a true representation of the market growth drivers. Methodology for collecting Voice of the Customer data The team decided that personal interviews would be the best way to gather Voice of the Customer data. They felt that face-to-face interactions would enable them to read nonverbal cues and probe for more information, Furthermore, an interview would provide one more opportunity to get in front of the customer and reinforce their commitment to increasing customer satisfaction To collect Voice of the Customer data, the team needed an instrument that would ensure consistency in how the interviews would be conducted, and as a result, provide high quality data. The instrument the team developed was a survey—not to be filled out by the customer, but that instead would serve as a basis for conducting the customer interviews—with the intention of determining how the Delhi office's performance could help customer growth. The survey was structured around five categories the team
  • 2. Revised 04/13/07 Page 2 of 25 identified that paralleled the sales process. For each category, the team brainstormed some points of customer concern that might stimulate further discussion. Category Sample points for further discussion Customer service Call response time, availability of technical information, accuracy of information to customer The quotation/preorder process Accuracy, timing, pricing of quotations Post order service Submittal accuracy and lead time, return material/shipping damage, issue response time Items within partial control of the Delhi office Product specifications, product offerings Items out of the control of the Delhi office Overall business pricing structure, component product quality, product availability The team agreed on the following process and desired outcomes for the customer interviews: • The interviewers would help the customer identify CTQs for each of the five categories, making certain to keep customers focused on WHAT was important to them, not on HOW to solve the problem • The customer would force rank the CTQs within each category according to how they help drive mutual growth • The customer would define how to measure each CTQ and would set the specifications for what was considered to be acceptable product or service quality • The customer would identify their top five CTQs across all categories—again, according to how they help drive mutual growth Gathering Voice of the Customer data would be a team effort. 1.2 Contain Problem if Necessary After reviewing the results of the customer interviews, the team felt they did a good job of forecasting customer concerns during their brainstorming meeting. Because the customers’ key concerns were addressed by the CTQs, no containment actions were needed. 1.3 Translate VOC into CTQs Because of the way the team had gathered the Voice of the Customer data—by helping customers identify CTQs and asking customers for specific, measurable performance requirements—the task of translating customer feedback into CTQs had taken care of itself. 1.4 Prioritize CTQs The next task for the team was to prioritize CTQs based on the results of the customer survey. The prioritization process was facilitated by the survey's design, since it required customers to force rank CTQs. Given these considerations, the team identified the following two CTQs to target for their improvement project: • Call response time • On-time field submittals 1.5 Integrate CTQs with Business Strategy The goal of the sales office was simple: to drive growth for the eight beta site customers by 10% above their targeted growth for the year. During the Voice of the Customer surveys, the customers had agreed that improvement in the top-ranked CTQs could enhance current order rates.
  • 3. Revised 04/13/07 Page 3 of 25 Define 2: Define Charter 2.1 Develop Charter From the Voice of the Customer data and the initial brainstorming session, the team had enough information to create a project charter, a document that establishes the purpose and plan for the project. First the team agreed on a statement of the problem, or the unmet customer need targeted for improvement. Problem statement: "Voice of the Customer data indicates that in order to promote growth with our customers, we need to improve our process capability in the areas of call response time and on-time field submittals." Call response time: “Customers want your organisation to respond to their calls in less than 4 hours. Currently customers tell us we respond anywhere from immediately to 48 hours. Baseline evaluation with three customers shows that delays in responding to customer issues in the past 2 months have delayed customer construction or bidding projects in five different cases, resulting in a financial loss to these customers of approximately $50,000.” On-time submittals: "Customers expect a submittal package returned to them in less than 5 days from the purchase order date. (A submittal package includes technical drawings, project documentation, and a cover sheet. It is sent to the customer for approval before the construction process can begin.) Today, we are returning submittal packages anywhere from 1 to 10 days from the purchase order date. A delay in getting the submittal package to the customer will delay the construction process as a whole. Without the drawings, the equipment cannot be approved; without the approval, the manufacturing process cannot begin. In the past 2 months, extremely late submittals have caused our customers to have to work overtime in order to meet their customers’ timelines (at a cost of approximately $10,000)." Goal statement: "By the end of December 2004, both of these processes will be operating at a Six Sigma level." After considering the problem and goal statement, the team created a project scope, which detailed the process boundaries and project focus. Project scope: "The team will target improvement efforts on two CTQs: call response time and on-time field submittals." The team then identified the resources required to complete the project. Resources/team members: A separate team, lead by a Green Belt, was established for each targeted CTQ. Finally, the team determined the benefits expected to result from their improvement project. Expected benefits: • Increased top line growth of 10%, or $1,000,000, for the eight customers involved while reducing pre- and post-order write-offs (tangible benefit)
  • 4. Revised 04/13/07 Page 4 of 25 • Translation opportunities to the other 27 district offices (tangible benefit) • Stronger relationships with customers (intangible benefit) 2.2 Obtain Key Business Stakeholder Signoff The team identified the project's key stakeholders as follows: • Vice President, Sales, champion at high level and one who could support translation • Regional Manager, who similarly could support translation within the region • Ram Singh, District Manager, who could drive action in the office by setting priorities and measurements, provide critical resources, and remove barriers by influencing other critical constituents • The sales engineers, who are measured by and paid according to customer orders and whose roles could be most significantly impacted by the projects, and who could help ensure the process stays in place once the project is completed. All key stakeholders agreed to the project's purpose and other provisions as outlined in the charter. Define 3: High Level Process Map 3.1 Construct Process Map Call response time CTQ: The team mapped the process from when a customer calls to when their issue is resolved. This high-level process map separated calls into four types: • Product/service (for issues regarding product defect, missing product, RGA) • Computerized ordering system (for job release, price and availability, SPA, and order entry issues) • Transportation (shipping status inquiries) • Technical questions On-time field submittals CTQ: Team leader Green Belt and her team created a SIPOC map of their process, which depicted the following steps: 1. Sales engineer (SE) gets purchase order from customer 2. SE transmits quotation to sales engineer assistant (SEA) via a quotation software application called Speedi 3. SE hands off the purchase order to the SEA 4. SEA compiles the submittal package 5. SEA sends submittal to customer 3.2 Validate Process Map Against Charter Call response CTQ: Project leader and his team confirmed that the four types of calls received by the sales engineers as presented on their process map were within the scope of the project. Since their CTQ involved acknowledging customer calls and not resolving the issue, the team agreed that these calls were something that could be handled within the office. The team felt the early use of the Change Management Includes/Excludes tool helped to ensure that their efforts were directed to issues within their control. On-time field submittals CTQ:
  • 5. Revised 04/13/07 Page 5 of 25 Green Belt (PL) and the others assigned to this CTQ confirmed that the five steps in their process were within the scope of the project, and that the boundaries of the project were neither too narrow nor beyond the control of their Delhi office. This team also felt that early use of the Change Management Includes/Excludes tool helped to ensure that their efforts were directed to issues within their control. Measure 4: Project Y 4.1 Identify all possible Ys for CTQs Along with extracting CTQs during customer interviews, team collecting VOC / Interviews had worked with customers to determine output measures, or Ys, for each CTQ. Although each team felt they had adequately identified their Ys during the interviewing process, to be safe, they followed the Coach's recommendation to brainstorm all possible Ys. Call response CTQ: Team working on Call response project identified three possible Ys for the call response CTQ: Y1. The time it takes to respond to an initial customer call Y2. The time it takes to resolve the customer's issue Y3. Whether or not a call is returned within the 4-hour critical time period On-time field submittals CTQ: Team working on Submittal time CTQ brainstormed the following possible Ys: Y1. The number of drawings disapproved by the sales office Y2. The time it takes for the customer to forward the purchase order to the sales engineer Y3. The time it takes the sales engineer assistant to compile the submittal package Y4. The time from when the sales engineer gets the purchase order from the customer to the time when the submittal drawings are shipped to the customer 4.2 Prioritize and Select Project Y According to the Coach, the next step for each team was to select the best possible Y and establish the metric by which it would be measured. Call response CTQ: CTQ Metric (Y) Call response time Time (in minutes) to return a call as measured from when the call was initially received On-time field submittals CTQ: CTQ Metric (Y) On-time field submittals Time (in days) to ship field submittals as measured from the date of the customer purchase order
  • 6. Revised 04/13/07 Page 6 of 25 4.3 Ensure Project Scope is Manageable Y: Time (in minutes) for call response It would not seriously impact productivity since the sales engineer was simply logging information on a time sheet. The team agreed that the small size of this project and its limited scope made it possible to be accomplished in the period set for the project. Y: Time (in days) to ship field submittals Because the project Ys were well defined when the team created their charter, considerations such as timing, resources, equipment, facilities, and barriers and constraints were addressed at that time. Measure 5: Performance Standards for Y 5.1 Gather Required Information to Establish a Performance Standard Here are the customer-defined performance standards for each Y: Y Performance standard Time (in minutes ) to return call All phone calls from a customer should be returned within 4 hours Time (in hours) to ship field submittals All field submittals generated by the sales office should be shipped no later than 5 working days from the purchase order date 5.2 Set and confirm performance standards for the project Y: Time (in minutes) for call response The survey team was surprised to find that each one of the eight customers, independent of each other, set the same standard for returning a call: 4 hours. The team also learned that their competitors were returning calls within the 4-hour time period. They decided that the customer consensus and benchmark data was sufficient to confirm the performance standard. Y: Time (in days) to ship field submittals To confirm the performance standards the customer initially defined, the team they decided to 1) conduct follow-up interviews with customers, and 2) benchmark against other companies. The team interviewed two contractors and two distributors, who confirmed that the 5-day specification limit was an acceptable performance standard. In addition, they asked for customer input about the presentation and quality of their field submittals. 5.3 Define Unit, Opportunity, and Defect At this point, each team consulted the Coach to establish solid operational definitions of unit, opportunity, and defect in order to ensure consistency in measurement and focus for improvement. The Coach defines a unit as any item that is produced or processed. An opportunity is anything that is inspected, measured, or tested on a unit that provides a chance of not meeting the performance standard, thus allowing a defect. In addition, each team had to define the number of opportunities per unit.
  • 7. Revised 04/13/07 Page 7 of 25 Y: Time (in minutes) for call response The team defined unit, defect, and opportunity as follows: • A unit was defined as an initial call from a customer. Even initial fax transmissions from customers. • An opportunity was defined as a call returned to the customer. The team recognized that only one opportunity for defect existed per call; there could be only one first response to a customer call. • A defect was defined as any return call to address an initial call/voice mail/fax that was not placed within the 4-hour upper specification limit. Y: Time (in days) to ship field submittals The team defined unit, defect, and opportunity as follows: • A unit was defined as a project field submittal • An opportunity was defined as the field submittal shipment. Since there could be only one shipment of the project field submittal, the number of opportunities for a defect per unit is 1. • A defect was defined as a project field submittal shipped more than 5 days from the purchase order date. MEASURE 6: Project Data Collection Plan 6.1 Draft Project Data Collection Plan Y: Time (in minutes) for call response As advised by the Coach, they considered each of the following questions: What data needed to be collected? The team looked at their process map to determine the data they would need to collect to help them understand why it took longer than 4 hours to return a customer's call. One of the team members observed that the data would fall into one of two areas: 1) Call characteristics, and 2) Call/response time data. Green Belt ( Project lead) created a table and the team brainstormed to identify the data categories for each area. The final table listing the data to be collected looked like this: Call Characteristics Call/Response Time Data Type of communication: call, voice mail, or fax Date/time of initial call Caller's name Time of first response Caller's company Required Factory response time Name of factory called Actual factory response time Type of issue Date/time issue resolved As part of the data collection effort, the team needed to solidify some definitions. They decided that the time a call was received would be defined by the time it was noted in the log by the sales engineer. In the case of a voice the time stamp on the machine would define mail or fax, “received”. They also agreed that call response time would be defined by the time the call to the customer was placed; the call was considered successful if the sales engineer was able to reach the customer, leave a voice mail or leave a message with someone in the customer’s office. When will the data be collected? Data was collected for the last financial year Who will collect the data?
  • 8. Revised 04/13/07 Page 8 of 25 To provide a representative sample of data, each of the 15 members of the Delhi office would collect data. Each team member agreed to choose a specific week each month to log all calls and then submit the logs to an intern student for data entry in a spreadsheet. How will sampling be done? To reduce the effort of recording the data manually, the team would record data for 1-week intervals every other week over a 2-month period, resulting in 4 weeks of data. This guaranteed at least 150 data points over the period of the study and ensured the data was random and therefore statistically valid. The sample population would consist of the eight beta site customers. Who will analyze the data? On a monthly basis, Green Belt would work with Black Belt (Coach) to analyze the data using applicable statistical tools. They would then share the results with the rest of the team. Y: Time (in days) to ship field submittals What data needed to be collected? The team needed to collect data that corresponded to the steps in the process map that were within their office's full control—the five steps in their SIPOC map. This included the date the purchase order was received, the date it was handed off to the sales engineer assistant, and the date it was shipped to the customer. When will the data be collected? The team decided they would collect baseline data for a 3-month period between Sept to Dec. Who will collect the data? The person who shipped the field submittal package would be responsible for collecting his or her own data on the process and forwarding that data to Green Belt at the end of each week. Who will analyze the data? On a monthly basis, Green Belt (PL) would work with black Belt (Coach) to analyze the data using applicable statistical tools. They would then share the results with the rest of the team. 6.2 Complete Measurement System Analysis (MSA) To ensure that the data collected would be both complete and accurate, each team needed to perform a measurement system analysis. This would ensure that they could account for and/or eliminate any variation introduced by their measurement system. Y: Time (in minutes) for call response The team then turned to their Black Belt, for advice on selecting a tool to measure the variation in their data collection form. The team thought about their measurement process and brainstormed a list of possible sources of variation. They then considered how to eliminate or measure the source of variation. Measurement Source of Variation How to Measure/Eliminate Inaccurate recording of the time a call is received • Use time stamps in system for voice mail and fax. Audit 10 voice mail messages/fax transmissions to ensure that time stamps are accurate. • Conduct a Gage R&R for incoming calls: Call a customer representative and have that person record the time the call is received. Inaccurate recording of the time a call is responded to • Include in above Gage R&R. Inaccurate transcription of data collection • Audit data transcription. Defective watch or clock • Check for linearity accuracy ahead of time by
  • 9. Revised 04/13/07 Page 9 of 25 comparing to a standards clock. Intentional manipulation of data • Achieve team agreement that data manipulation would only result in lost credibility, sales, and income since customers would recognize whether or not call response time improved—regardless of what the data showed. One of his team members discovered that when logging onto the computer network, the clock on her laptop computer would synchronize to the clock of the network server. Since everyone was logging on to the same server, this solved the problem and gave a reliable time standard for the Project. It would now become standard office practice for sales engineers to synchronize their watches to the laptop at the time of log-in each morning. To conduct the Gage R&R, Green Belt of Project (Project lead) had each sales engineer complete the GR&R while in the office. Project lead called each sales engineer three times and recorded the time the phone was answered. The sales engineer also recorded the time he or she answered the phone. Green Belt of Project also compared the timestamps on the fax and in the voice mail system with his standards clock and determined that they, too, were accurate. (Use same spreadsheet setup to determine mean & s.d.) Y: Time (in hours) to ship field submittals Green Belt decided to audit 100% of her collected data. Smaller sample size. Data to be collected MSA plan Purchase order date Audit from physical or electronic purchase order copy Date submittal handed off to assistant Audit from Speedi program: "submittal date" field Date sales engineer downloaded Audit from Speedi program: "download date" field" Date submittal shipped to customer Time stamp and photocopy cover sheet before it is sent out 6.3 Finalize Project Data Collection Plan Y: Time (in minutes) for call response A member of Green Belt of Project's team noticed that the data collection form called for collecting data for items that the team had decided was beyond the project's scope. After some debate, the team agreed to refine the data collection form to include only those items that are related to the CTQ of response time to an initial call. Y: Time (in days) to ship field submittals Considering the low volume of both existing and future data points, the team decided they would collect data on all field project submittals, not just those for the Six Sigma-targeted customers. Measure 7: Data for Project Y 7.1 Communicate Data Collection Plan Y: Time (in minutes) for call response
  • 10. Revised 04/13/07 Page 10 of 25 To add emphasis on the need to collect data accurately and consistently, there was a top level communication for the data collection plan to the entire team in a full staff meeting, with members of team fielding questions. Y: Time (in hours) to ship field submittals • Baseline data would be collected between Sept and Dec, 2003. • The person who shipped the field submittal package would be responsible for collecting his or her own data and forwarding it to Project lead on a weekly basis. • Data to be collected included the date the purchase order was received, the date it was handed off to the sales engineer assistant, the date it was shipped to the customer, as well as additional information (such as method of shipment, requisition number) that would help with the tracking process. Project lead presented the data collection form that would be used to track submittals. At first, some team members resisted—they considered the data collection efforts to be just more demands on their time. To address this resistance, Project lead reinforced the Coach's assertion that data is the foundation of Six Sigma methodology, and that to ensure the success of a project; decisions should be based on data rather than simply intuition, internal knowledge, or business judgment. 7.2 Train Employees Y: Time (in minutes) for call response To ensure consistency and accuracy of the data, a member of Project team distributed the data collection form to each member of the office staff, explained how to collect the data, and answered individual questions. Y: Time (in hours) to ship field submittals Because of the relative simplicity of the data collection process for this project Y, no formal training was required. 7.3 Collect Data for Project Y and Potential Xs As outlined in their respective data collection plans, each team collected baseline data on their respective process. There were no problems with the data collection process. Measure 8: Process Capability 8.1 Graphically Analyze Data Now that each team had collected baseline data on their respective process, their next step was to visually display the data to help them understand how their processes were performing. Y: Time (in minutes) for call response The data from the 2-month period was collected and entered into the master spreadsheet. The team analyzed the data with help from their Black Belt. Using Minitab™, they first performed a normality test, which resulted in a p-value of 0.0. According to the Coach, as a general rule for normality testing, you can conclude that the distribution of the sample data is normal if the p-value is greater than 0.05. Therefore, the team concluded that their data was not normally distributed.
The Coach notes that if you find your data is not normally distributed, it is important to find out why; non-normal distributions often contain clues that help you focus your project or identify vital causes of variation. The team searched the Coach for help on handling non-normal data. The first thing they did was look for outliers: data points that fall well outside the normal range of values of the other data points. They identified two outliers, one at 28.6 hours and another at 38.7 hours. The Coach suggested that outliers may be removed if they can be explained by an activity outside of the process; data that was improperly transcribed, for example, would be outside the call response process. The project lead went back to the phone logs to check for a transcription error but found that these were, in fact, the actual response times. There was therefore no valid reason to remove the data. The Black Belt recommended treating the data as discrete data.

Y: Time (in hours) to ship field submittals
The project team collected baseline data on their process between mid-September and mid-December. A quick visual scan of the data revealed a great deal of variation in the number of days it took to ship drawings back to the customer. In addition, the team noticed an outlier, a data point that falls well outside the normal range of values of the other data points. The outlier represented a single instance where it took 12 days to ship the submittal package; the rest of the data ranged between 0 (same-day shipment) and 6 days. The team decided to include the outlier in their analysis because it reflected process variation within their control and there was no special cause for that data point.

To determine whether the data in their sample was normally distributed, the team performed a normality test. The Coach says that if you find your data is not normally distributed, it is important to find out why; non-normal distributions will often contain clues that help you focus your project or identify vital causes of variation. The normality test resulted in a p-value of 0.193. As a general rule for normality testing, you can conclude that the distribution of the sample data is normal if the p-value is greater than 0.05.

8.2 Calculate Z Value

Y: Time (in minutes) for call response
The team was now ready to calculate the Z value of their process. First, they needed to determine what kind of data they had: was it short term or long term? Because they were interested in knowing the capability of their current process, the data was considered long-term. The data was also considered long-term because the team could not differentiate between special cause variation (a specific, known factor) and common cause variation (variation caused by unknown factors); they had not grouped the data into rational subgroups.

Using the Six Sigma product report, the Green Belt and the project team calculated the Z bench value for their process. Because Z bench yields short-term process capability, they needed to subtract the Z shift from Z bench to estimate long-term capability. This yielded a result of 0.55 sigma and 292,683 DPMO, which meant that the process, as it stood, fell far short of being capable of meeting the customers' CTQ of having initial calls returned within 4 hours.
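The arithmetic behind the product-report figures quoted above is a straight conversion between DPMO and Z through the standard normal distribution, plus the conventional 1.5-sigma shift. A minimal sketch, using the baseline DPMO reported for this Y:

```python
# Sketch of the DPMO <-> Z conversion behind the figures quoted above
# (0.55 sigma long term, 292,683 DPMO, 2.05 sigma short term).
from scipy.stats import norm

dpmo = 292_683                      # baseline defects per million opportunities
p_defect = dpmo / 1_000_000         # long-term defect probability

z_lt = norm.ppf(1 - p_defect)       # long-term (benchmark) Z
z_st = z_lt + 1.5                   # conventional 1.5-sigma shift

print(f"Z_LT = {z_lt:.2f}, Z_ST = {z_st:.2f}")   # ~0.55 and ~2.05

# Going the other way: the DPMO implied by a given long-term Z.
print(f"DPMO at Z_LT = 0.55: {(1 - norm.cdf(0.55)) * 1e6:,.0f}")
```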
Y: Time (in hours) to ship field submittals
After visually analyzing the data, the team's next task was to examine the capability of the process, or how well it was performing. Using the capability analysis tool with the data entered for long-term analysis, the team calculated long-term process capability for total ship time. This capability analysis established a baseline for comparing the process before and after improvements were made. The capability analysis showed a mean ship time of 3.6 days with a standard deviation of 3.02 days. The initial sigma value, or ZLT, was 0.46, far from the 6 sigma goal. Assuming an estimated Z shift of 1.5, the ZST was 1.96. The calculated DPMO was 200,000.
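The ZLT quoted above follows directly from the customer's 5-day upper specification limit and the baseline mean and standard deviation. A minimal sketch of that calculation, using only the figures reported in the case study:

```python
# Sketch of the long-term capability calculation for submittal ship time,
# using the reported baseline figures and the 5-day customer spec limit.
usl = 5.0        # upper spec limit: ship submittals within 5 days
mean = 3.6       # baseline mean ship time (days)
sd = 3.02        # baseline standard deviation (days)

z_lt = (usl - mean) / sd    # long-term Z against the upper spec
z_st = z_lt + 1.5           # with the conventional 1.5-sigma shift

print(f"Z_LT = {z_lt:.2f}, Z_ST = {z_st:.2f}")   # ~0.46 and ~1.96
```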
Measure 9: Improvement Goal for Project Y

9.1 Define Statistical Problem

Y: Time (in minutes) for call response
According to the Coach, this step does not apply to projects using discrete data. Because the call response team was treating their data as discrete, they proceeded to the next step.

Y: Time (in hours) to ship field submittals
Now that they had defined their process capability, the team needed to set their improvement goal: a statement of the project Y's performance that would meet the CTQ. To do this, the Coach states, they first had to define the problem in statistical terms, stating the required reduction of defects in terms of defects per million opportunities (DPMO) and the corresponding Z value. The statistical problem statement for this project Y was, "Reduce the variation of the time to return field submittals and shift the mean of submittal response time closer to 0 so that all field-generated submittals are shipped less than 5 days from receipt of the purchase order."

9.2 Identify Project Goal
Each team's goal was to reach a 6 sigma process, meaning a DPMO of less than four.

Y: Time (in minutes) for call response
The goal for this Y was to
• Reduce defects from the current process's 292,683 DPMO to less than 4 DPMO
• Increase the ZST value from 2.05 sigma to 6 sigma, or improve ZLT from 0.55 to 4.5 sigma
The goal statement can be revisited at this stage to look at the performance opportunity.

Y: Time (in hours) to ship field submittals
The goal for this Y was to
• Reduce defects from the current process's 200,000 DPMO to less than 4 DPMO
• Increase the ZST value from 1.96 sigma to 6 sigma, or improve ZLT from 0.46 to 4.5 sigma
At this point the team updated its charter to include these details of the project goal.

9.3 Determine Improvement Methodology

Y: Time (in minutes) for call response
The team reviewed the Voice of the Customer information to determine whether their process did, in fact, have the potential to reach the 6 sigma goal. They benchmarked other call centers with similar processes and volumes. Even though their process was running at a sigma level of 0.55, the benchmark data convinced the team that it was possible to improve the process to meet the performance goal of 6 sigma.
Y: Time (in hours) to ship field submittals
Next, the team needed to determine whether their process did, in fact, have the potential to reach the 6 sigma goal. They benchmarked other sales offices with similar processes and volumes and decided it would be possible to meet their performance goal by improving the existing process. Because a complete redesign of the process was not necessary, they continued the project using the DMAIC methodology.

Analyze 10: Prioritized List of All Xs

10.1 Capture All Possible Xs
Next, each team looked at all potential sources of variation, or Xs, for its project Y.

Y: Time (in minutes) for call response
The team gathered to brainstorm the causes of variation in returning customer calls. They created the following list:
1. Different cell phones/service providers: There were three different service plans in use among the sales engineers, each with its own set of charges, amount of airtime, and service area.
2. Staff reduction: The administrative assistant, who used to screen and route calls, was laid off and not replaced.
3. Inconsistent practice in returning calls: The sales engineers were unaware of the customer's need for a call back within 4 hours.
4. Inability of the voice mail system to inform sales engineers of new messages
5. Inability of the voice mail system to allow callers to mark messages as urgent

Y: Time (in hours) to ship field submittals
The team came up with the following possible Xs:
• A delay in handing off the purchase order to the sales engineer assistant
• Complexity of quotation: A complex quotation may take extra time to put together.
• Technological limitations: A sales engineer on the road may not have access to the Speedi quotation software and therefore would not be able to transmit the quotation to the sales engineer assistant.
• Need for factory input: The price of the job may still need to be cleared by the factory, or the need for factory documentation could delay the process.
• Customer need for alternative quotes: The need to compile extra pieces for the submission package could delay the process.
• Sales engineer assistant experience level: In some cases, sales engineers may want to do the drawings themselves instead of placing the responsibility on less experienced sales engineer assistants.

The team then consulted their SIPOC map and matched each X to the corresponding steps in the process.

Step: Sales engineer hands off purchase order to sales engineer assistant
Corresponding Xs:
• Delay in handing off purchase orders to sales engineer assistants
• Technological limitations

Step: Sales engineer assistant compiles submittal package
Corresponding Xs:
• Complexity of quotation
• Need for factory input
• Customer need for alternative quotes
• Sales engineer assistant experience level
10.2 Draft Prioritized List of Xs
In this step, each team selected the Xs that had the highest degree of impact on the process for its Y.

Y: Time (in minutes) for call response
The Green Belt placed the Xs the team had captured into a table and asked each team member to rank them in order of priority. Before they started, one team member asked whether the inability of the voice mail system to mark urgent messages was really an X. "After all," she said, "it doesn't matter whether or not the customer marks this initial call as urgent. We still have to get back to them within 4 hours." After a few minutes of debate, the rest of the team agreed that this really wasn't an X, and it was taken off the list.

Each team member prioritized the Xs. With the project lead serving as facilitator, they debated the rankings until they came to an agreement on the prioritization. Here is what they decided:
1. Inconsistent practice in returning calls
2. Different cell phones/service providers
3. Inability of the voice mail system to inform sales engineers of new messages
4. Staff reduction

Y: Time (in hours) to ship field submittals
The team looked at their baseline data to help them prioritize their Xs. They looked again at the outlier representing the single instance where it took 12 days to ship the submittal package; it reflected a 9-day delay between the time the sales engineer received the purchase order and when it was handed off to the sales engineer assistant. From the baseline data, the team suspected that their project Y was mostly impacted by the Xs related to sales engineer to sales engineer assistant turnover time.

Analyze 11: List Vital Few Xs

11.1 Verify Xs with Data

Y: Time (in minutes) for call response
Now it was time to determine whether the Xs they had uncovered had a significant impact on their project Y. The team turned to the Coach to learn how to do this and found a 7-step process, which they followed for each of the Xs.

X: Inconsistent practice in returning calls
The Green Belt asked the Black Belt whether they hadn't already verified this X by analyzing the baseline data. The Black Belt said they had calculated the sigma value of the process, but this X had not been verified with data; they needed to show whether there was a difference among the sales engineers' practices of returning calls. The Black Belt created a survey to examine the sales engineers' process for returning calls and performed a measurement system analysis to ensure the survey would not add variation to the data collection process. The survey showed that the sales engineers fell into three broad categories: those who returned calls within an hour, those who returned calls within a couple of hours, and those who returned calls only when they had an answer for the customer, which could take up to several days.
The alternative hypothesis for this X stated, "There is a significant difference in the practice of returning customer calls among the sales engineers." The null hypothesis stated, "There is no difference in the practice of returning customer calls among the sales engineers." Because the team was testing the difference in the means of three levels of a factor, they selected the 1-way ANOVA tool. Here are the results:

Analysis of Variance
Source   DF     SS    MS     F      P
C2        2   1737   869  7.01  0.003
Error    38   4711   124
Total    40   6448

The p-value showed a 0.3% chance of error in rejecting the null hypothesis. A p-value of less than 0.05 indicated a significant difference between the means. This was a vital X.

X: Different cell phones/service providers
The team agreed that this X needed to be verified with data since 1) it was not known to be an X from a previous project, and 2) it would not be too costly to collect the data to verify it, since there were only three different service plans in use. The data they had for this X, the amount of airtime for a particular plan, was stable because the time was the same each month for each plan. The team then stated the alternative hypothesis for this X: "There is a difference in the call response time of sales engineers with different cellular phone services." The null hypothesis stated, "There is no difference in the call response time of sales engineers with different cellular phone services."

Now the team needed to select the proper statistical tool. They returned to the Coach, found a table of tools in the How section of this step, and again opted for the 1-way ANOVA tool. Here are the results:

Analysis of Variance for Different Calling Plans
Source   DF     SS    MS     F      P
C2        2   1862   931  7.72  0.002
Error    38   4586   121
Total    40   6448

The p-value for this test told the team there would be only a 0.2% chance of error in rejecting the null hypothesis. Because the p-value was less than 0.05, they accepted the alternative hypothesis that there is a significant difference in the call response time of sales engineers with different cellular phone calling plans. This is a vital X.
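For readers who want to reproduce this kind of test outside Minitab, the sketch below runs a 1-way ANOVA with SciPy on invented response-time groups; the group values are illustrative only, not the study's data.

```python
# Sketch of the 1-way ANOVA used above to test whether mean call-response
# time differs across the three call-return practices (or calling plans).
from scipy.stats import f_oneway

# Hypothetical response times (hours) grouped by call-return practice.
within_an_hour   = [0.4, 0.7, 0.9, 0.5, 1.1, 0.8]
within_few_hours = [2.1, 3.5, 2.8, 4.0, 3.2, 2.6]
when_answered    = [9.0, 26.0, 14.5, 38.0, 12.0, 20.5]

f_stat, p_value = f_oneway(within_an_hour, within_few_hours, when_answered)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# Decision rule used by the team: p < 0.05 -> reject the null hypothesis of
# equal means, i.e. the X is vital.
```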
X: Inability of the voice mail system to inform sales engineers of new messages
The team applied the same process to this X as to the previous one. This lack of capability was a particular problem for sales engineers out in the field: unless they called in for their messages on a regular basis, they had no way of knowing whether a new call was waiting in their voice mailbox. The team agreed they needed to test this X because it was not known from another project and it would not be too costly to collect the data. The data was already known to be acceptable, and the process (the voice mail system) was known to be stable. The team stated the alternative hypothesis as, "The call response time will be less for sales engineers in the office than for those in the field." The null hypothesis stated, "There will be no difference in call response time between the sales engineers in the office and those in the field." The team chose a 2-sample t-test because they were interested in whether there was a difference in the means of two samples from the same group. Here are the results of the test:

Two-sample T for Response Time
               N    Mean   StDev   SE Mean
1 (Inside)    27    0.90    1.26      0.24
2 (Outside)   14    14.7    18.9       5.1

95% CI for mu(1) - mu(2): (-24.74, -2.9)
T-Test mu(1) = mu(2) (vs <): T = -2.73  P = 0.0086  DF = 13

The data clearly shows a difference in mean response time between the two groups of sales engineers; the p-value indicates less than a 1% chance of error in rejecting the null hypothesis. Therefore, the inability of the voice mail system to inform sales engineers of new and urgent messages is also a vital X.

X: Staff reduction
The team rejected this as a vital X because data had never been collected on the administrative assistant's call screening and forwarding process, which made it impossible to verify with data that it had an impact on call response time.

Y: Time (in hours) to ship field submittals
Next, the team needed to prioritize their Xs according to their effect on process variation.

List of possible Xs
• Delay in handing off purchase orders to sales engineer assistants
• Technological limitations
• Complexity of quotation
• Need for factory input
• Customer need for alternative quotes
• Sales engineer assistant experience level

According to the Coach, the prioritization of Xs would have to be based on data. In the last step, the team had suspected that their project Y was mostly impacted by the Xs related to purchase order hand-off time. Now they needed to show statistically that this was the case. To obtain this evidence, they ran a regression analysis.

The regression equation is
response time = 1.52 + 1.11 hand off

Predictor    Coef     StDev      T      P
Constant    1.5190   0.5614   2.71  0.018
hand off    1.1148   0.1937   5.76  0.000

S = 1.663   R-Sq = 71.8%   R-Sq(adj) = 69.6%

Analysis of Variance
Source        DF        SS       MS      F      P
Regression     1    91.639   91.639  33.13  0.000
Error         13    35.961    2.766
Total         14   127.600

Unusual Observations
Obs   hand off   response      Fit   StDev Fit   Residual   St Resid
  1       9.00     12.000   11.552       1.447      0.448     0.55 X
  3       0.00      6.000    1.519       0.561      4.481     2.86 R

R denotes an observation with a large standardized residual.
X denotes an observation whose X value gives it large influence.

The high R-sq value showed that purchase order hand-off time explained about 72% of the variation in ship time. The team recognized that this was correlation, not proof of causality: they had not conducted experiments in which they deliberately altered this X to determine whether it caused a change in the Y. Even so, they could clearly see that purchase order hand-off time was a vital X.
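The regression above fits ship time as a linear function of hand-off time. A minimal sketch of the same kind of fit is shown below; the 15 (hand_off, ship_time) pairs are invented placeholders, since the study's raw data is not reproduced here.

```python
# Sketch of the simple linear regression used above: ship time regressed on
# purchase order hand-off time, with R-squared as the share of variation
# explained. Data values are hypothetical.
from scipy.stats import linregress

hand_off  = [0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 4, 4, 5, 6, 9]   # days to hand off PO
ship_time = [6, 1, 2, 3, 2, 4, 3, 5, 5, 4, 6, 7, 8, 9, 12]  # days to ship submittal

result = linregress(hand_off, ship_time)
print(f"ship_time = {result.intercept:.2f} + {result.slope:.2f} * hand_off")
print(f"R-sq = {result.rvalue**2:.1%}, p = {result.pvalue:.4f}")
# A high R-sq says hand-off time explains much of the variation in ship time;
# as noted above, that is correlation, not a designed experiment.
```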
To be sure they did not overlook another significant source of variation, the team assessed the other Xs on their list for their effect on process variation. To do this, they reviewed the various logs and tracking mechanisms used in the data collection process and conducted 1-way analyses of variance to determine the degree of impact each X had on the project Y. They found that no other X accounted for more than 6 percent of total process variation. The team then prioritized their list of Xs according to statistical significance.

Prioritized list of Xs
1. Delay in handing off purchase orders to sales engineer assistants
2. Technological limitations
3. Complexity of quotation
4. Need for factory input
5. Sales engineer assistant experience level
6. Customer need for alternative quotes

11.2 Finalize List of Vital Few Xs
Now each team needed to identify the vital few Xs: the small group of inputs or process factors with the most direct effect on its project Y.

Y: Time (in minutes) for call response
Since the project scope didn't permit hiring additional staff, and no data had been collected while the administrative assistant was in place, the staff reduction X was taken off the list. This left the following three vital Xs:

Vital Few Xs
1. Inconsistent practice in returning calls
2. Different cell phones/service providers
3. Inability of the voice mail system to inform sales engineers of new and urgent messages

Y: Time (in hours) to ship submittal drawings
Given the results of the regression analysis, the team could clearly see that most of the variation in their project Y was caused by purchase order hand-off time. However, the team realized they needed to "drill down" into this X; it had a CAP element. They determined that the root cause of the delay seemed to lie in the sales engineers' overall lack of awareness of the importance of timely purchase order turnover. The team agreed they needed to educate the sales engineers about the importance of this hand-off, since it would be measured in the future. They also drilled the technological limitations X down to the real problem: access to the Speedi quotation software.

Vital Few Xs
1. Engineers' overall lack of awareness of the importance of timely purchase order turnover
2. Access to the Speedi quotation software
Analyze 12: Quantified Financial Opportunity

12.1 Refine Financial Benefits
The two teams decided that at the end of their projects, they would compare the performance of the eight targeted accounts against the non-beta site accounts to see whether any improvement in business was found. In the meantime, each team assessed the financial benefits relating to its project Y.

Y: Time (in minutes) for call response
The team knew they had an opportunity to improve customer satisfaction by improving call response time, but had no verifiable way to link that improvement to increased sales; any connection was, at best, anecdotal. The call response team did wonder whether a cost savings could be realized by moving everyone to the same cellular phone service plan. The Green Belt selected two team members to investigate this.

Y: Time (in hours) to ship field submittals
Using their 15 baseline data points, the team ran a chi-square test to see whether timeliness had so far had any impact on bid win rates. The team recognized they had a small amount of data, but knew they should start there and add to it as they continued to collect data.

                    Win Bid   Lost Bid
Submit < 5 days         4         1
Submit > 5 days         3         8

Here are the results of the chi-square test (expected counts are printed below observed counts):

            win    lost   Total
< 5 days      4       1       5
           2.19    2.81
> 5 days      3       8      11
           4.81    6.19
Total         7       9      16

Chi-Sq = 1.502 + 1.168 + 0.683 + 0.531 = 3.883
DF = 1, P-Value = 0.049
3 cells with expected counts less than 5.0

The Black Belt knew the average bid was worth $50,000. The win rate when submittal turnaround time was more than 5 days was 3 of 11 bids, roughly 30%, while the win rate when it was less than 5 days was 4 of 5, or 80%. If the team could raise the win rate to 80%, they could more than double the business won. Of course, other factors such as pricing may mitigate this, but the team could take that into account with a two-tiered set of chi-square tests. They needed more data to verify the benefits, so the project lead continued to collect win/loss and bid value data along with the performance of the submittal time Y.
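The chi-square result above can be reproduced from the 2x2 win/loss table itself. The sketch below does so with SciPy, disabling Yates' continuity correction so the statistic matches the uncorrected value of 3.883 reported in the case study.

```python
# Sketch of the chi-square test on the 2x2 win/loss table quoted above.
from scipy.stats import chi2_contingency

observed = [[4, 1],   # shipped in < 5 days: 4 bids won, 1 lost
            [3, 8]]   # shipped in > 5 days: 3 bids won, 8 lost

chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(f"Chi-Sq = {chi2:.3f}, DF = {dof}, p = {p:.3f}")   # ~3.883, 1, ~0.049
print("expected counts:", expected)
# Several expected counts are below 5, so (as the team acknowledged) the
# result should be treated as preliminary until more data is collected.
```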
12.2 Estimate Costs Associated with Improved Process

Y: Time (in minutes) for call response
The team analyzed the costs associated with the different cellular phone providers and service plans held by the members of the sales team, then solicited bids from other cellular service providers to compare costs and services. The team presented this information in a matrix. They determined that if the entire office switched to one provider and one service plan, they could realize an annual savings of $19,000. This included the purchase of new cell phones with increased functionality, including text messaging and paging. The cost structure eliminated roaming fees and expanded the service area to the entire country. Although there was an additional start-up cost associated with starting this service, it would quickly be recovered.

Two members of the call response team investigated the cost of changing the voice mail system. They learned that the functionality they wanted was already available on the existing system; it just hadn't been activated. There would be no new costs associated with activating this feature because it could be done under the existing service agreement. In fact, activating the notification feature also provided a way for customers to mark their calls as urgent. This would trigger the system to dial the sales engineers' beepers and alert them to the urgent message, thus interfacing with the newly proposed cellular phones.

Y: Time (in hours) to ship field submittals
No new costs were associated with implementing the solution for this Y.

12.3 Identify Intangible Benefits
At this point, each team needed to identify the intangible benefits: favorable outcomes that are not reportable for formal accounting purposes but nonetheless help justify the project.

Y: Time (in minutes) for call response
The team identified the following intangible benefits:
• Increased quality of customer communications
• The ability for customers to reach their sales engineer much faster
• Increased customer confidence in the sales engineers' ability to respond to their calls

Y: Time (in hours) to ship field submittals
The team identified the following intangible benefits:
• Increased team cohesiveness as a result of working together toward a common goal
• Enhanced customer relations; noticeable customer enthusiasm and appreciation for the team's efforts to improve quality of service
• Increased exposure to customers; sharing scorecards provided an opportunity to get in front of the customer and reinforce the team's commitment to making the process improvement

Improve 13: Proposed Solution

13.1 Identify Improvement Strategy

Y: Time (in minutes) for call response
The team created the following improvement strategy:
X: Inconsistent practice in returning calls
Improvement strategy: The team would add an extra step to the call response process: whether or not the customer's issue was resolved, sales engineers would return customer calls within 4 hours, at a minimum advising the customer of the status of their issue.

X: Different cell phones/service providers
Improvement strategy: The team would change to a single cellular carrier offering consistent service and greater functionality. This strategy would be implemented in early July.

X: Inability of the voice mail system to inform sales engineers of new and urgent messages
Improvement strategy: The team decided to activate the functionality on the voice mail system that would allow urgent/new message flagging and automatic paging. This strategy would be implemented in early July.

In addition, the sales engineers developed strategies to address some other concerns about the voice mail system that had surfaced in the Voice of the Customer data. These additional strategies would cost nothing more to implement and would increase the intangible benefits.
• All sales engineers would record a uniform greeting using a standardized format. This greeting would be updated daily.
• They would simplify the menu system to make it easier to navigate.
• The menu system would clearly instruct customers in how to reach a live person.

Y: Time (in hours) to ship field submittals
In order to ship field submittals within the 5-day specification limit, the team needed to reduce purchase order hand-off time and establish a performance standard for that time. Through group discussion, they determined it was reasonable to allow no more than 1 day from when the sales engineer received the purchase order to when it was handed off to the sales engineer assistant. This 1-day turnaround time became the operating tolerance for this vital X.

The team also identified an improvement strategy to overcome the technological limitations related to accessing the Speedi quotation software: when out of town, sales engineers would phone in confirmed orders to assistants so they could begin the submittal compilation process.

In addition, the team identified two actions they could take to enhance the presentation and quality of service:
• Send customers a fax notification of submittal shipment
• Standardize the cover sheet sent with submittals
These improvements responded to customer feedback the team received when gathering Voice of the Customer data. The fax notification also supports the team's measurement system analysis by providing another time-stamped record to cross-reference.

13.2 Experiment to Determine Solution

Y: Time (in minutes) for call response
The team decided the most effective way to test their improvement strategies was to conduct a full-scale pilot involving the entire office. The changes to the voice mail system could not be made for only one or two individuals; they would affect the entire office. It also did not make economic sense to switch only a few staff members to a new cellular service, since savings would only be realized by switching the entire office.
Y: Time (in hours) to ship field submittals
The Black Belt and the project lead identified a pilot as an excellent way to test whether their suspected vital Xs were indeed vital. As is often the case in commercial processes, classic DOEs with high and low settings are difficult to design, typically because of the scarcity of data points and the difficulty of setting multiple Xs at high and low levels. Instead, the team used a form of evolutionary operation: they planned to set the process and its Xs at their estimated optimal settings, run the process, and then determine whether the Xs did indeed have the expected effect on the Y.

Improve 14: Piloted Solution

14.1 Plan the Pilot

Y: Time (in minutes) for call response
The team agreed there was no need to change the data collection plan or the form used to collect the baseline data; everyone would follow the same procedure and submit their logs for data entry. Because the new cellular service and the changes to the voice mail system would not be implemented until early July, the team decided they would check their voice mail once an hour as a containment strategy. They also changed their voice mail greetings to the standardized format.

Y: Time (in hours) to ship field submittals
The team turned to the Coach for guidance on how to pilot their solution. The first step, according to the Coach, was to assess the risks involved. The team determined that running a pilot posed no risk of financial loss, negative effect on customers, or drain on internal resources. They also decided their resource requirements and procedure would be identical to those outlined in the data collection plan used to gather the baseline data. The time frame set for the pilot was 2 months.

14.2 Run the Pilot and Collect Data

Y: Time (in minutes) for call response
The pilot ran from the beginning of May through the end of June. Following the data collection plan used for the baseline data, the members of the office logged 5 to 10 initial calls each and then submitted the log sheets for data entry.

Y: Time (in hours) to ship field submittals
The project lead communicated the strategy for running the pilot and collecting data at a full staff meeting. Since everyone was already familiar with the data collection procedure, no formal training was necessary.

14.3 Analyze the Results of the Pilot

Y: Time (in minutes) for call response
At the end of the 2-month pilot, the Black Belt analyzed the results of the data collection in Minitab™ using a 2-sample t-test.
The alternative hypothesis stated, "The mean of the baseline response time is greater than the mean of the pilot response time." The null hypothesis stated, "The mean of the baseline response time and the mean of the pilot response time are the same."

The team also needed to know whether they had reduced the variation in their process. Turning to the Quality Coach, they determined they should use the homogeneity of variance tool. The Green Belt consulted the Coach to interpret the results: the p-value from Levene's test is used when the data comes from distributions where normality has not been determined, or from continuous but non-normal distributions. Though close, the p-value was less than 0.05, indicating a statistical difference between the variation of the baseline data and the variation of the pilot data. This meant the team had succeeded in reducing both the mean and the variation.

Y: Time (in hours) to ship field submittals
After running the pilot for 2 months, the project lead reran her process capability analysis with the new data. The results showed the new ZLT was 5.49; assuming a Z shift of 1.5, the ZST was 6.99, well above the project goal. This clearly satisfied the Coach's criterion that the output show a significant difference attributable to the solution.

The project lead then conducted a 2-sample t-test to confirm that the process was statistically improved. She established a null hypothesis, "There is no difference in submittal time between the original process and the improved process," and an alternative hypothesis, "Submittal time for the original process is greater than for the improved process." Here are the results of the 2-sample t-test:

Two-sample T for Original Process vs Improved Process
             N    Mean   StDev   SE Mean
Time to     15    24.5    20.1       5.2
Response    10   10.30    5.40       1.7

95% CI for mu Time to - mu Response: (2.6, 25.8)
T-Test mu Time to = mu Response (vs >): T = 2.59  P = 0.0099  DF = 16
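The pilot analysis above combines a test on variation (Levene's test) with a one-sided test on the mean (a 2-sample t-test). The sketch below shows both with SciPy on invented baseline and pilot samples; the numbers are placeholders, not the study's data.

```python
# Sketch of the pilot analysis: Levene's test for a change in variation and
# a one-sided Welch 2-sample t-test for a shift in the mean.
from scipy.stats import levene, ttest_ind

baseline = [24.5, 3.0, 12.0, 40.0, 8.5, 30.0, 18.0, 55.0, 6.0, 22.0]  # hours
pilot    = [10.0, 4.5, 12.5, 8.0, 15.0, 6.5, 11.0, 9.0, 13.5, 7.0]    # hours

# Levene's test: a small p-value suggests the two samples have different variances.
w_stat, p_var = levene(baseline, pilot)
print(f"Levene: W = {w_stat:.2f}, p = {p_var:.4f}")

# One-sided t-test (baseline mean > pilot mean), without assuming equal variances.
t_stat, p_mean = ttest_ind(baseline, pilot, equal_var=False,
                           alternative="greater")
print(f"t = {t_stat:.2f}, p = {p_mean:.4f}")
```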
Control 15: Sustain Solution

15.1 Develop Control Plan

Y: Time (in minutes) for call response
The team met to develop a control plan for each of the vital Xs. Through brainstorming, they came up with the following plan:

Item                     Control Action                                                                Implementation Date
Cellular phone service   Switch to new carrier and purchase new cell phones for all sales engineers    July 1
Data collection          Sales engineers will continue to collect data for one week each month         July 1
Internal dashboard       Create to show previous month's results                                       August 1
External dashboard       Create to show to customers on sales calls                                    July 1
Control chart            Create to show previous month's results                                       August 1
Training manual          Update to show new procedures                                                 July 1

Y: Time (in hours) to ship field submittals
To ensure the benefits from the solution continued to be realized, the team created a control plan, outlined as follows:
• The team would continue to follow the data collection plan and collect data on every field submittal as part of their normal record-keeping routine. This data would become part of the dashboards (or scorecards) for the eight beta site customers. In addition, the team would keep an internal metric on hand-off time for purchase orders.
• The project lead would continue to compile and analyze the data using the applicable statistical tools.
• The project lead would be responsible for responding if the process went out of control based on the control chart for hand-off time. She would investigate the cause of the defect and take appropriate measures to prevent it from recurring.
In addition, the project lead created a standard operating procedure documenting the team's control plan. This would ensure consistency for future projects and would survive future personnel changes on the team.

15.2 Implement Solution

Y: Time (in minutes) for call response
Beginning in July, the office implemented the control plan with the solution in place, including the new cellular phone service and the changes to the voice mail system. Monthly control charts were produced to show the results of the plan. As the office became more confident that the plan was the established way of doing business and the results showed sustained performance, the sampling would be cut back to once a quarter.

Y: Time (in hours) to ship submittal drawings
The implemented solution was the same as the proposed solution:
• Sales engineers handed off all quotations to sales engineer assistants within 1 day of receipt to ensure the customer-defined 5-day performance standard for submittal shipment would be met.
• In the future, 1-day hand-offs would be included in the sales engineers' metrics as part of the control plan.
• If they did not have access to the Speedi quotation software, sales engineers phoned in confirmed orders to assistants so they could begin the submittal compilation process.
In addition to implementing the solution, to enhance the presentation and quality of service, the team
• Sent customers a fax notification of submittal shipment
• Standardized the cover sheet sent with submittals
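The control plan relies on control charts for hand-off time and call response. The case study does not show how the chart limits were set; the sketch below is one plausible approach, computing individuals-chart (I-chart) limits from hypothetical hand-off times using the standard moving-range estimate of sigma.

```python
# Minimal sketch of individuals-chart control limits for purchase order
# hand-off time, in the spirit of the monthly control charts mentioned above.
# The hand-off times (days) are hypothetical.
import numpy as np

hand_off_days = np.array([1, 0, 1, 1, 0, 2, 1, 0, 1, 1, 0, 1, 2, 1, 0])

center = hand_off_days.mean()
# Average moving range between consecutive points, converted to a sigma
# estimate with the d2 constant for subgroups of size 2 (1.128).
moving_range = np.abs(np.diff(hand_off_days))
sigma_hat = moving_range.mean() / 1.128

ucl = center + 3 * sigma_hat
lcl = max(center - 3 * sigma_hat, 0)   # hand-off time cannot be negative

print(f"center = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
# A point above the UCL would trigger the response plan: investigate the
# cause and prevent recurrence.
```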
15.3 Confirm Solution

Y: Time (in minutes) for call response
As before, the project lead and the Green Belt tested the normality of the data. Since the p-value was less than 0.05, the data was still considered non-normal, so they used the same method to determine the sigma value of the process as when they first analyzed the baseline data. Using the Six Sigma product report to calculate the Z value, the team achieved a sigma level of 5.5 by the third month of implementing the solution for this Y.

To show that both the mean and the variation had been reduced, the Green Belt and the Black Belt repeated the analysis they had performed on the results of the pilot study. Here are the results of the 2-sample t-test:

C2    N    Mean   StDev   SE Mean
1    41     5.6    12.7       2.0
2    26   0.698   0.622      0.12

95% CI for mu(1) - mu(2): (0.9, 8.93)
T-Test mu(1) = mu(2) (vs >): T = 2.48  P = 0.0088  DF = 40

The team could say with statistical confidence that they had improved both the mean and the variation of the call response Y. The Green Belt also wanted to show the rest of the team that the new process for handling call response was in control.

Y: Time (in days) to ship submittal drawings
To show statistically that they had met their 6 sigma goal, the team calculated the capability of the improved process and compared it to their goal and to the original process data. A final 2-sample t-test was run comparing the baseline data with data collected from August to December under the implemented solution; the test confirmed that the mean of the Y had indeed shifted. An F-test was also conducted, confirming that the standard deviation had been significantly reduced as well. Finally, confidence intervals were calculated for the new process ZLT and compared to the confidence intervals for the old process ZLT. The intervals did not overlap, confirming that the process Y had been improved.

The impact of the project's success was evident in the Voice of the Customer. Here is what two of the beta customers had to say:
"This project has made the company much easier to do business with and has helped them gain a competitive advantage."
"The projects the Delhi team have done truly made a difference in their total service level."

Control 16: Project Documentation

16.1 Finalize Financial Results
The sales results were calculated at the end of the fiscal year and shown to the entire Delhi staff. "According to this chart," said the Green Belt, "it looks like we saved the year by growing business with our eight beta customers." The Black Belt explained that the projects had been very successful in reducing process variation, but there was no statistical way to prove that reducing that variation caused the increase in business. The hope had been that, as a result, customers would remain loyal to the company and do more business with it, and that appears to have been the case.
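The F-test mentioned above compares the variances of the baseline and improved processes. A minimal sketch of that ratio-of-variances test is shown below on invented samples; the values are placeholders, and the test itself assumes roughly normal data.

```python
# Sketch of the F-test for confirming that the standard deviation of
# submittal time was reduced. Sample values are hypothetical.
import numpy as np
from scipy.stats import f

baseline = np.array([1, 3, 6, 2, 12, 4, 5, 0, 6, 3, 2, 5, 4, 1, 3])  # days
improved = np.array([1, 2, 1, 3, 2, 1, 2, 2, 1, 3])                  # days

s1_sq, s2_sq = baseline.var(ddof=1), improved.var(ddof=1)
f_stat = s1_sq / s2_sq
df1, df2 = len(baseline) - 1, len(improved) - 1

# One-sided p-value for H1: baseline variance > improved-process variance.
p_value = f.sf(f_stat, df1, df2)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```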
16.2 Complete Documentation Package
Each improvement team then completed a documentation package that would ensure the effectiveness of future projects and help other teams with their improvement projects. This package was compiled from a variety of documents the teams worked with during the course of their projects and included the following features, as recommended by the Coach:
• Project abstract
• Problem statement
• Baseline data on process performance
• List of vital Xs
• Solution
• Control mechanism
• Performance metrics
• Financial results
• Lessons learned/best practices
• Translation opportunities