Overview
Specification Metrics are used to track various aspects of the specification process so we can identify problem areas and make improvements. Several data categories are also tracked that aren't metrics themselves, but they help support improvements to the metrics in various ways.
Types of documents/specifications to track metrics on:
• SUB – Top level single use bioreactor specification
• SUM – Top level single use mixer specification
• SUF – Top level single use fermentor specification
• Controller – Top level controller specification that is generally paired with a SUB/SUM/SUF
• SV5 – Component specification or Subassembly Specification
• SV2B – Component specification
• SUMDS0300 – 300 Liter docking station specification
• WID – Work instruction document
• FCD – Form control document
• Other – Some documents/specifications are initiated and administered by other departments, with us only tracking the metrics. These are generally BPC documents rather than hardware documents.
There are four primary metrics and a handful of secondary metrics. All the primary metrics and some of the secondary metrics are reported at the R&D Tier 3 board on a metric flip book and the identical Tier 4 metric flip book. This gives visibility to the issues we're seeing with the specifications.
Primary metrics: These metrics are currently considered the most important, and the changes being pushed for in the improvement process are driven by information from these metrics.
Rejections: This is the number of times a document was rejected before it was released. We
also track rejection reasons to look for trends.
We report rejections on a weekly, monthly, and yearly basis in the metric book.
Review Time: This is the amount of time any given document spends in the review process.
Each department is meant to have a specific time allotted for review, and the Review Time is
the summation of that time. This metric is used to show us where we might be having trouble
with getting approvals done for a particular category. The different types of specs have different combinations of approvers, so something that isn't a problem in one category may be a problem in another.
This number varies depending on how long the Department Review Times are and the number of times a specification is rejected. For example, if a department averages 5 days to review a document and one document is rejected 3 times, the Review Time will appear to be 20 days (5 days for each of the 3 rejected cycles and another 5 for the cycle where it wasn't rejected). That's a big jump from the 5-day department average. This number highlights problems that might not seem as severe when looking only at Department Review Times and Rejections.
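As a rough illustration of that arithmetic (a minimal sketch using the example numbers above, not anything pulled from the metrics file), assuming each rejection adds one full review cycle:

    # Illustrative sketch only: Review Time contributed by one department,
    # assuming one review cycle per rejection plus the final approved cycle.
    def review_time(avg_days_per_cycle, rejections):
        cycles = rejections + 1          # rejected cycles plus the passing cycle
        return avg_days_per_cycle * cycles

    print(review_time(5, 3))             # 20 days, matching the example above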
We report review times on a weekly, monthly, and yearly basis in the metric book.
Department Review Time: This is the amount of time each individual department is taking on each specification. For instance, a specification that had 2 rejections will have three numbers summed into the department time average. If a department took 3 days to review the first cycle and 2 days to review each of the second and third cycles, that adds up to 7 days. Taking into consideration that this was three different iterations of one document, we divide 7 by 3 to see that the average review time each time the department got the document was about 2.3 days, well within the goal of 4-day reviews. This metric is primarily used to see if a particular department is developing long-term problems with getting specs reviewed. There are always occasional spikes in the numbers; if someone takes a week of PTO and doesn't have a backup for their position, the numbers are going to look bad for a bit. Tracking this metric lets us pinpoint the issues and determine whether they're actual problems or temporary flukes.
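The per-cycle averaging described above works out roughly like this (an illustrative sketch with the example numbers, not the actual calculation in the Excel file):

    # Illustrative sketch only: a department's average review time per cycle.
    cycle_days = [3, 2, 2]               # first, second, and third review cycles
    total_days = sum(cycle_days)         # 7 days summed into the department time
    average_days = total_days / len(cycle_days)
    print(round(average_days, 1))        # 2.3 days, within the 4-day review goal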
We report department review times on a weekly, monthly, and yearly basis in the metric
book.
Overdue: An overdue specification is one that's been in one area of approval for longer than the approval goal. This is usually the first indicator that the other metrics are going to get worse. If a department has a large number of overdue specs one week, there will probably be some poor Department Review Times in the next couple of weeks.
We report overdue specifications on a weekly basis in the metric book.
Secondary metrics: These metrics produce useful information, but they are not as immediately important as the primary metrics. Currently, not all of the secondary metrics are being reported at the tier boards, but it's important to be able to answer questions with the information these metrics provide because they can support the changes driven by the primary metrics.
WIP Time: The Work In Progress metric measures how long a specification spends in collaboration. This is only a marginally useful metric because a number of factors prevent change or improvement to it. For instance, something from the
SUB/SUM category may spend 6-8 months in collaboration because we're waiting for customer
approval. The BPC category specs might spend several months in collaboration because those
that work on them have to research regulatory requirements or wait for experimental or
validation data. The metric is only marginally useful even for SV5 category specs because, though they have the fewest collaboration variables going into them, they vary greatly in size. A small SV5 category spec will have between 1 and 10 line items and take approximately
a half hour to complete. This turns into a lot of 0 day (same day) WIP times. A medium SV5
will have between 15 and 30 line items and take an hour or two to finish. This turns into a lot
of 0-1 day WIP times. A large spec, however, can have upwards of 300 line items in it, taking
up to a week to finish, while a more typical 50 line item large SV5 spec can still take several
hours, increasing WIP times as other specs may have to wait longer for the larger one to be
completed.
Total Time: This metric measures the total time a spec takes from initial request date to release. It's the sum of the WIP and Approval Times. It's less useful than the Approval Times because of the wild variations the WIP numbers can introduce.
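In other words (a trivial sketch with made-up numbers, not data from the file):

    # Illustrative sketch only: Total Time is WIP time plus approval time.
    wip_days = 12                        # hypothetical time in collaboration
    approval_days = 9                    # hypothetical time in the approval route
    total_time = wip_days + approval_days
    print(total_time)                    # 21 days from request to release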
Average Requests Per Month: This is important for determining average workloads, so it's listed in the metric book. It would only become very important if the workload increased dramatically enough that the current resources put toward specification creation and approval could no longer meet demand.
Open, Closed, Total: These three metrics simply show how many requests are still in process,
how many have been completed, and the total number of requests altogether. Data from
these metrics feeds up into the Average Requests Per Month and would only be valuable in and
of themselves in cases of high volume, or when certain requests are exceptionally old (like if
there are still 10 requests open from eleven months ago). These metrics are reported in the
metric book as they can give a sense of how much has been completed, and how long it
sometimes takes to complete things.
Number of Approvals Per Month: Comparing this metric with the number of requests received per month gives a sense of when the approval process might be slowing down for one reason or another and indicates that some research needs to be done. For instance, if we received 38 requests in November but only 10 requests were closed that month, we would want to know why.
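A rough sketch of that kind of check (the 38 and 10 come from the example above; the 50% threshold is an arbitrary illustration, not a documented rule):

    # Illustrative sketch only: flag months where closures lag far behind new requests.
    requests_received = {"November": 38}
    requests_closed = {"November": 10}
    for month, received in requests_received.items():
        closed = requests_closed.get(month, 0)
        if closed < received * 0.5:      # arbitrary threshold for illustration
            print(f"{month}: {received} received, only {closed} closed - worth investigating")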
Percentage of Approvals w/o Rejection: This metric shows whether the number of rejections is increasing or decreasing. It's not immediately important if the Cycles and Approval Times are kept in check, but it can show longer-term improvement. For instance, if the 6-month average six months ago was 24% and the 6-month average now is 51%, then you know the share of specs being approved without rejection has more than doubled.
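For example (a quick sketch of that comparison, using the percentages above):

    # Illustrative sketch only: compare two rolling 6-month averages.
    previous_six_month_avg = 0.24        # share approved without rejection, six months ago
    current_six_month_avg = 0.51         # share approved without rejection, now
    improvement = current_six_month_avg / previous_six_month_avg
    print(round(improvement, 1))         # about 2.1, i.e. more than doubled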
The Specification Metrics file tracks a few more points of information that aren't metrics but are useful for streamlining the spec process. Typically, this data is kept for reference for only 6 months after a request is completed. The important parts of any of these information sets will have been transferred to more permanent data areas in the file before deletion.
MC Task Status: This is just a small note of where the spec is in the approval route. It points out
which specs are likely to be completed soon and which are going to take longer, without constantly
checking the status of every spec in the approval process. It's most useful for noting if something is in
escalation, or which ones are most likely to have changed when the metric book needs to be
updated.
Requested: This is the original request date, which won't always match the date the infocard task in Master Control is created. I keep this information so that the final metric data reflects the actual date of the request rather than the date the Master Control task was started.
Notes: Occasionally a task requires special handling of some kind. This is where I put notes on whether someone has requested to be notified when the task is finished, or why the spec hasn't been sent out for approval yet. It's a quicker reference than opening the task every time I wonder what it's doing there and reading through all the notes that may be scattered between task comments and my separate work email.
Requestor: Noting the person who made the original request makes it easy to know who to go to
with questions in case they come up.
Revision, ECO, IM, Change Table, Attachments, SOP0020: Some of these are my equivalent of the
checklists at the end of this instruction, and some of them are redundancies meant to hold the
information needed if there's a typo on other documentation, such as the New Revision and ECO. The
ECO in particular can be problematic because there's no way to search for an ECO by its content. If
the ECO is accidentally input incorrectly on the spec, say typing 010_12993 instead of 010_12933,
unless it's noted somewhere that the ECO is 010_12933, you will have to manually go through hundreds of ECOs looking for the one you made, or create a new one and deal with the even more complicated problem of the revisions needing to be changed.
How the data for the metrics and other information is obtained varies. At the lowest level, everything is entered manually. Some of the upper-level information is then calculated using Excel's programming capabilities. This file contains mostly simple equations, but it's important not to break them or copy them incorrectly, because that can skew the metrics the data is meant to produce.
The first seven sections in this instruction each cover an individual tab within the Spec Metrics file. Each section goes over the listed information: what it's used for and what it means, whether it feeds into data in other parts of the file, any programming or equations built into it, why things are formatted the way they are (color coding is important in some places and unimportant in others, and may mean something different on another tab or the same thing as on other tabs – I should perhaps work on making this more consistent than it is), and exactly what method is used to do the things the tab is used for.
After the sections explaining the Spec Metrics file, there will be sections containing process checklists, and
instructions for the metric book.
Summary
Purpose/Use
This tab contains historical information on Cycle, Approval Time, WIP, To Release, and Specs Per Month
average. It also lists the number of open, closed, and total requests. Request data is compiled by
request date rather than completion date, and the remainder of the information, while also garnered
from the Completed and Tier4 tabs, shows 6-mo, YTD, and 12-mo averages. This data isn't reported
directly on any tier board, but is useful for forecasting workload and observing data trends over an
extended period of time.
Relation to other tabs
Open requests are pulled from the Tracking tab, but all other non-automatically-calculated information is
taken from the Completed tab.
Programming/Equations
The equations in this tab are AVERAGE and SUM equations. Each section has a 6-month, YTD,
and a 12-month AVERAGE, and the section regarding number of requests also has SUM
equations. The content between the parentheses will be different depending on what
content is being averaged or summed.
Requests column
This is a simple SUM equation. The cell with an existing equation in it can be dragged up or down into a new
cell and the equation will pull from the correct cells to calculate the new value.
6-Month Average column
This is a simple AVERAGE equation. The cell with an existing equation in it can be dragged up or down into a
new cell and the equation will pull from the correct cells to calculate the new value.
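For illustration only (the actual cell references live in the file and will differ), the Requests and average columns compute something like the following sketch, using made-up monthly request counts:

    # Illustrative sketch only: hypothetical monthly request counts, oldest to newest.
    monthly_requests = [22, 31, 27, 19, 35, 28, 30, 24, 26, 33, 29, 38]
    requests_total = sum(monthly_requests)                 # SUM-style Requests value
    six_month_average = sum(monthly_requests[-6:]) / 6     # AVERAGE over the last 6 months
    twelve_month_average = sum(monthly_requests) / 12      # 12-month AVERAGE
    print(requests_total, six_month_average, twelve_month_average)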
Formatting
Average values are grayed out. Otherwise, there is no special formatting.
How to use this tab
Since the data is not vital to metric reporting, this tab is only updated periodically. I've been updating it
about twice a month.
Requests Section
To update the requests section, first check your numbers against the open requests in the Tracking tab.
In this example, the Summary tab shows 15 open requests, and when sorting
the entries in the Tracking tab by date, there are 14 open requests. So
the value changes to 14. This automatically recalculates the total
number of requests for the month and all average values (if the change is large enough to affect the averages).
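Conceptually, the check amounts to something like this sketch (the 15 and 14 are just the example numbers above; in practice the correction is made by typing the value into the Summary tab):

    # Illustrative sketch only: reconcile the Summary tab's open-request count
    # against the number of open entries in the Tracking tab.
    summary_open_requests = 15           # value currently shown on the Summary tab
    tracking_open_requests = 14          # open entries counted in the Tracking tab
    if summary_open_requests != tracking_open_requests:
        summary_open_requests = tracking_open_requests
        # In the file, updating this cell makes the SUM and AVERAGE equations
        # recalculate the month's totals and the rolling averages automatically.
    print(summary_open_requests)         # 14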
Process Checklists
In the Specification Itself
Formatting
□ Document is in Arial 10 pt
□ All paragraphs of equal number rank have the same indents
□ Automatic or manually input section numbering is correct
□ No single row in any given table is split between pages
□ Drawn lines in tables do not have inconsistent or odd formatting
Standard Document Compliance
□ Spec meets SOP0020 requirements (refer to SOP 0020)
□ All changes to standard template have been applied to the spec
Change Table
□ All changes made to the document are listed for the revision (documents on revision 1 note that this is the initial release of the document)
□ All non-standard changes have the reason for the change listed in the "Change Made" column
□ New revision's ECO is shown in far right column (if there was no ECO column, it was added with "N/A"
shown for previous ECOs)
□ Change table is altogether on one page (unless the change table is larger than one page, in which case,
the change table begins at the top of its own page)
Content
□ Part number descriptions match the drawing's description
□ Correct labeling instructions are included (i.e. standard labeling requirements or electrical labeling
requirements)
□ Correct packaging instructions are included (standard for most SV5, and matching packaging table for
SUB/SUM/SUF)
□ Listed suppliers are current and certified to supply the listed products
□ For SV5: Motor, Tank, Cart, E-Box sub-component spec includes Passivation, COC, and FCD requirements
□ For SV5: Standard sub-component spec includes only COC requirements, and no FCD requirements
□ For SUB/SUM/SUF: FCD Requirements are listed
□ For SUB/SUM/SUF: Complete products, those that have an E-Box, include DOC and User Guide
requirements, not DOI or Assembly Instruction requirements
□ For SUB/SUM/SUF: Incomplete products, those that do not have an E-Box, include DOI and Assembly
Instruction requirements, not DOC or User Guide requirements
For Item Masters
Initial Setup
□ Part number is created correctly and matches part number on the drawing
□ For SV5: Part number description matches description on the drawing
□ Product unit is listed as EA
□ Part expiration is set to "No Control"
□ Organization 010 is assigned
□ Template "010 Hardware" is assigned
□ Planning Category "BPC Component" is assigned
□ Planning Class "94" is assigned
□ Storage Category is assigned per specification requirements
□ Receiving instructions are assigned per specification requirements
For Parts with Changes
□ Part number description already in Oracle matches description on the drawing
□ Receiving instructions match what is currently in the specification
For ECOs
□ All part number organization assignments are verified
□ All parts assigned to 010 are listed on the 010 ECO at the same spec revision
□ All parts assigned to 050 are listed on the 050 ECO at the same spec revision
□ For all parts assigned to organizations other than 010 & 050, refer to WID0380 to see if the atypical
organization requires an ECO, and if they do, verify that all parts assigned to these organizations are
listed on the corresponding ECO at the same spec revision
□ Only Active part numbers are listed on the ECO(s)
□ Only part numbers from one specification are listed on the ECO(s)
For Master Control
In the Infocard
□ The Title field matches the Title in the spec exactly
□ The revision number matches the spec
□ The Author field is populated with the name of the person who is responsible for this revision of the spec
□ The Owner field is empty
□ The spec file is attached as the Main File and is not in the Attachments section
□ There are no corrupt files in the Attachments section
□ For SUB/SUM/SUF ETO Specs: The Customer is listed in the "Company Name" field of the Custom Fields
section
In the Task Packet
□ A reasonable due date is listed for completion
□ Any changes regarding drawing revisions are listed in the instructions
□ Any changes regarding new part numbers are listed in the instructions
□ Any changes regarding inspections are listed in the instructions
□ The spec is on the correct approval route
□ The complete spec number is in the task name
□ For SUB/SUM/SUF ETO Specs going through initial release: The Customer Name is listed in the task name
