THOUGHT LEADERS FOR MANUFACTURING & SUPPLY CHAIN
ARC INSIGHTS
By Steve Banker
Effective benchmarking requires good data and comparability: the ability to make meaningful peer-to-peer comparisons. Benchmark data should be actionable and, ideally, should serve as a catalyst to transform a company’s operations.
INSIGHT# 2003-24E
JUNE 11, 2003
Benchmarking Logistics Performance
Keywords
Benchmarking, Logistics
Summary
Logistics professionals believe benchmarking can improve performance, but benchmarking has several dimensions. Understanding these dimensions is critical to doing it effectively.
Analysis
At the recent Richmond Events “Logistics and Supply Chain Forum,” the most popular think tank topic was “Performance Management.” Participating supply chain and logistics executives frequently returned to the topic of benchmarking as one key aspect of managing supply chain performance. Benchmarking can help validate whether a company’s performance is strong, average, or below average compared to industry standards or standards that cross industries. While benchmarking looks simple on the surface, the topic is far more complex than it first appears.
Are We Comparing Apples to Oranges?
One participant said, “Nobody is exactly the same, you have to flesh out what you measure, how you measure it, what those differences are – whether it’s picking efficiencies or (something else) – once you (get) it out on the table, there are some comparisons (that are possible).”
One problem is that key metrics that have the same name can be defined
differently. For example, the perfect order can be defined as orders shipped
on time divided by total orders shipped. Or the perfect order can be much
more stringently defined as shipments that arrive at the customer within 2
hours of the promised delivery time, with no damage, the right SKUs in the
right quantities, with the correct Value Added Services performed, and
billed correctly.
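To make the difference concrete, the following minimal sketch (using hypothetical order data and illustrative field names, not figures from any benchmark) shows how the loose and stringent definitions diverge for the same set of orders:

```python
# Hypothetical illustration: two "perfect order" definitions, one set of orders.
from dataclasses import dataclass

@dataclass
class Order:
    shipped_on_time: bool            # left the dock by the promised ship date
    delivered_within_window: bool    # arrived within 2 hours of promised delivery
    damage_free: bool
    correct_skus_and_quantities: bool
    vas_performed_correctly: bool    # value added services done as specified
    billed_correctly: bool

orders = [
    Order(True,  True,  True,  True,  True,  True),
    Order(True,  False, True,  True,  True,  True),
    Order(True,  True,  True,  False, True,  False),
    Order(False, False, True,  True,  True,  True),
]

# Loose definition: orders shipped on time / total orders shipped
loose = sum(o.shipped_on_time for o in orders) / len(orders)

# Stringent definition: every condition must hold for an order to count
strict = sum(
    o.delivered_within_window and o.damage_free and o.correct_skus_and_quantities
    and o.vas_performed_correctly and o.billed_correctly
    for o in orders
) / len(orders)

print(f"Perfect order (loose):     {loose:.0%}")    # 75%
print(f"Perfect order (stringent): {strict:.0%}")   # 25%
```

With the same four orders, the loose definition reports 75 percent and the stringent one 25 percent, which is exactly why definitions need to be agreed on before results are compared.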
Many of the associations that companies belong to send out benchmark questionnaires, but it is not always clear that the metrics are rigorously defined, or that respondents pay careful attention to the definitions when they are. Even so, participants still believed there was some value in these exercises.
The Supply Chain Council’s Supply Chain Operations Reference (SCOR)
model does provide common definitions across a wide set of supply chain
activities. Attendees at Supply Chain Council meetings are frequently shown best-in-class performance, by industry, on some of these key metrics.
Are the Benchmarks Actionable?
A common benchmarking approach is to use data from the financial statements of public companies to calculate global, enterprise-wide measures of effectiveness. Such metrics often have strong supply chain components embedded within them. But the danger here is that global financial metrics do not provide much perspective or fidelity on why one company’s asset turnover, for example, is significantly higher than another’s. In other words, while accurate, these benchmarks do not provide information in sufficient detail to develop actionable plans for logistics managers.

Sample Financial Metrics

  Measure                   Calculation
  Inventory Turnover        Cost of Goods Sold / Average Inventory
  Asset Turnover            Sales / Total Assets
  Cash-to-Cash Cycle Time   Days Sales Outstanding + Inventory Days of Supply - Days Payables Outstanding
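As a rough illustration of how the metrics in the table above are calculated (all figures below are hypothetical, chosen only to show the arithmetic, not drawn from any company’s statements):

```python
# Hypothetical figures illustrating the enterprise-level metrics in the table above.
cogs = 800.0                    # cost of goods sold, $M
average_inventory = 120.0       # $M
sales = 1_000.0                 # $M
total_assets = 650.0            # $M
days_sales_outstanding = 45.0
inventory_days_of_supply = 55.0
days_payables_outstanding = 38.0

inventory_turnover = cogs / average_inventory          # about 6.7 turns
asset_turnover = sales / total_assets                  # about 1.5
cash_to_cash_days = (days_sales_outstanding
                     + inventory_days_of_supply
                     - days_payables_outstanding)      # 62 days

print(f"Inventory turnover:  {inventory_turnover:.1f} turns")
print(f"Asset turnover:      {asset_turnover:.2f}")
print(f"Cash-to-cash cycle:  {cash_to_cash_days:.0f} days")
```

Numbers like these are easy to compare across public companies, but they say nothing about which facility, lane, or process is driving the difference.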
Cost data can also be very accurate, yet it is far more actionable than global financial metrics. For example, there are firms that audit carriers’ transportation charges on behalf of shippers. These firms end up with a wealth of data that can be used to benchmark transportation costs, and they sell this data for a fee. Collaborative initiatives are another way to gather cost benchmark data. One company in one of the focus groups was part of a collaborative transportation initiative that included suppliers and even other manufacturers in the same industry. A side benefit of this initiative was that the company ended up with a much better understanding of what other companies were paying for transportation. Cost benchmarks such as these will almost always be more useful from a manager’s point of view.

Sample Logistics Cost Metrics

  Measure          Calculation
  Cost per Order   Total Warehouse & Transportation Cost / Total Orders Shipped
  Damage           Shipment Damage in $ / Total Shipment $
  Demurrage        Demurrage Costs / Total Transportation Costs
Beware Single Factor Productivity Metrics!

Granular productivity benchmarks are particularly difficult to use. These measures are often single factor performance metrics based on a ratio of some system output quantity to some resource input quantity. Order lines (the output) per hour is one example. One participant said, “I struggle with benchmarking. For example, if one of my warehouses is picking a thousand order lines an hour, and another is picking 200 per hour, does that mean the first warehouse is better than the second? Obviously not! It depends on whether pallets or broken cases are picked, the amount of material handling equipment, and other things as well.”
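A small numeric illustration of the point (the volumes and pick profiles below are invented for illustration):

```python
# Why order lines per labor hour, a single-factor ratio, can mislead:
# the two hypothetical DCs below do very different kinds of work.
warehouses = {
    # name: (order_lines_shipped, labor_hours, dominant_pick_unit)
    "DC-A": (1_000_000, 1_000, "full pallet"),
    "DC-B": (  200_000, 1_000, "broken case"),
}
for name, (lines, hours, unit) in warehouses.items():
    print(f"{name}: {lines / hours:,.0f} lines per hour ({unit} picking)")
# DC-A looks five times "better", yet pallet picks and broken-case picks are
# not comparable units of work, so the ratio alone says little about efficiency.
```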
An alternative is to create a mathematical model that assesses an asset’s productivity by considering all relevant resource inputs and production outputs simultaneously. A group of professors at the Georgia Institute of Technology has put together a model to benchmark warehouse performance using a statistical tool called Data Envelopment Analysis (DEA). Their “Internet Data Envelopment Analysis for warehousing” (iDEAs) tool can be found online at www.isye.gatech.edu/ideas. Companies that participate enter warehouse inputs such as total labor hours, warehouse area, and equipment replacement cost into an Internet portal. They also enter their warehouse’s outputs, things like total shipments, order lines shipped, and other less obvious data elements. After analysis, results are sent back to the participants.
Using DEA, a hypothetical composite warehouse is then constructed and compared to the candidate’s warehouse. The composite is constructed in such a way that it produces at least as much output as the candidate warehouse while using the minimum possible resources. The DEA score for the candidate warehouse is then reported as a percentage. If the score were 75 percent, for example, that would mean the composite warehouse used no more than 75 percent of any single resource used by the candidate warehouse. In other words, based on this mathematical benchmark, the candidate warehouse could reduce its resource usage by 25 percent.
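For readers who want to see the mechanics, the following is a hedged sketch of an input-oriented DEA efficiency score of this kind, written in Python with SciPy’s linear programming solver. The warehouses, inputs, outputs, and figures are illustrative assumptions; this is not the Georgia Tech iDEAs implementation.

```python
# Sketch of an input-oriented DEA (CCR) efficiency score for warehouses.
# All data below are invented for illustration.
import numpy as np
from scipy.optimize import linprog

# Rows = warehouses. Inputs: labor hours, floor area (sq ft), equipment cost ($K).
inputs = np.array([
    [120_000, 250_000, 900.0],
    [ 60_000, 100_000, 300.0],
    [ 90_000, 180_000, 450.0],
])
# Outputs: order lines shipped, total shipments.
outputs = np.array([
    [2_400_000, 180_000],
    [1_500_000, 120_000],
    [1_600_000,  95_000],
])

def dea_score(candidate: int) -> float:
    """Smallest fraction theta of the candidate's inputs that a composite
    warehouse (a non-negative mix of all warehouses) needs while producing
    at least the candidate's outputs."""
    n = len(inputs)
    c = np.r_[1.0, np.zeros(n)]                  # minimize theta; lambdas cost nothing
    # Input rows: sum_j lambda_j * x_ij <= theta * x_i,candidate
    A_in = np.c_[-inputs[candidate], inputs.T]
    b_in = np.zeros(inputs.shape[1])
    # Output rows: sum_j lambda_j * y_rj >= y_r,candidate
    A_out = np.c_[np.zeros(outputs.shape[1]), -outputs.T]
    b_out = -outputs[candidate]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return float(res.x[0])

for k in range(len(inputs)):
    print(f"Warehouse {k}: DEA efficiency = {dea_score(k):.0%}")
```

A warehouse on the efficient frontier scores 100 percent; lower scores indicate how far its inputs could, in principle, be reduced while still matching its output.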
The Georgia Tech project is not quite ready for prime time. The model does not include relevant factors like the degree of Value Added Services complexity. Furthermore, if managers find that their DCs are performing poorly, at present there is no feedback on how to correct the problem.
However, the project team is busy constructing a second model that includes more variables for better model fidelity and more user-oriented analysis for improved usability.
Another, currently more usable, warehouse model is based on engineered labor standards. Warehouses that have installed Labor Management Systems (LMS) based on engineered labor standards have a highly actionable internal benchmark for the labor performance their facility can optimally achieve.
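Conceptually, such an internal benchmark boils down to comparing hours actually worked against the standard hours “earned” for the tasks completed. A minimal sketch, with assumed task names, standard minutes, and counts rather than real LMS data:

```python
# Illustrative internal benchmark from engineered labor standards.
engineered_standards_min = {     # assumed standard minutes allowed per task
    "pallet_pick": 2.0,
    "case_pick": 0.5,
    "putaway": 2.5,
}
tasks_completed = {"pallet_pick": 300, "case_pick": 4_000, "putaway": 250}
actual_hours_worked = 60.0

earned_standard_hours = sum(
    count * engineered_standards_min[task] for task, count in tasks_completed.items()
) / 60.0
performance_vs_standard = earned_standard_hours / actual_hours_worked

print(f"Earned standard hours: {earned_standard_hours:.1f}")        # 53.8
print(f"Performance vs. standard: {performance_vs_standard:.0%}")   # about 90%
```

Unlike external benchmarks, the standard is built from the facility’s own layout, equipment, and task mix, which is what makes it actionable.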
Recommendations
• Realize that benchmarking the wrong things can lead to counterproductive decisions. What is benchmarked should fit the company’s strategy and key business philosophies.
• Several WMS, TMS, and other supply chain software suppliers are releasing supply chain or logistics analytics modules. These modules can provide the data needed to begin benchmarking projects.
• A number of universities, industry consortia, and other interest groups such as the Supply Chain Council give companies an opportunity to share benchmarking information. These may often be the best sources of information for your own benchmarking initiatives.
Please help us improve our deliverables to you – take our survey linked to this
transmittal e-mail or at www.arcweb.com/myarc in the Client Area. For further
information, contact your account manager or the author at sbanker@arcweb.com.
Recommended circulation: All EAS clients. ARC Insights are published and
copyrighted by ARC Advisory Group. The information is proprietary to ARC and
no part of it may be reproduced without prior permission from ARC.
