Operations Management, Volume 41, Number 3, 2015 – www.iomnet.org.uk
Proactively driving operational performance through visual management
Steven White FIOM
In an ever more competitive business landscape, organisations are constantly looking for new ways to improve performance and make efficiencies. This article looks at how Severn Trent Water approached improving performance, examining some of the key considerations when developing metrics to track and drive performance, targeted at different audiences within the business.
'If you can't measure it, you can't improve it'[1]
Severn Trent Water is a water company operating in the UK, servicing 4.3 million homes and businesses across the Midlands and mid-Wales, providing clean water and taking waste water away. In the pursuit of improving levels of service, we set out to understand how teams were driving and tracking their own performance, sharing best practice through internal and external benchmarking and drawing on management tools, techniques and academic literature.
In 2008, an internal improvement team was set up to help shape the culture of Severn Trent. Continuous improvement from all staff was, and still is, actively encouraged, supported by tools and techniques to aid and embed improvements for a lasting impact. The pace of change was driven by empowering all employees, especially those in front-line positions who are closest to customers and processes and in the
perfect position to identify opportunities. However, we found a
trade-off had occurred. With teams developing metrics to drive
their own performance, metrics to monitor the full end-to-end
process performance had not been considered, which meant
that some key performance indicators (KPIs) were not being
reviewed. We realised we had a golden opportunity to learn
from teams that were driving performance and apply some
theoretical principles to help teams understand how they can
proactively influence performance, as well as ensure measures
are in place to review overall end-to-end performance.
We defined our goal statement, such that for each key commitment we should:
• Use proactive leading indicators, as well as lagging measures
• Present this information to the teams that can influence them
• Provide a line of sight of how each team can influence commitment performance
Define key commitments
Our starting point was to define our key commitments as a
business. These consisted of external commitments to
customers, such as 'Responding to customer requests within four
hours’, and to regulators, such as ‘How many times we leave
customers without a supply of water’. Internal commitments to
the board of directors or executive committee could include
commitments such as reducing operating costs or increasing
productivity by fixing more leaks each day, or commitments to
staff such as reducing the number of accidents at work.
The water industry in the UK is regulated and a business plan
is submitted to the regulator every five years. For the latest regulatory period, 2015–20, water companies submitted a
business plan where, working with regulators and customers, SMART (specific, measurable, achievable, realistic and time-bound)[2] commitments were identified, with targets agreed based on acceptability to the customer, historic performance and planned interventions.
A prioritisation exercise was undertaken on these commitments
to understand where the business would gain the largest benefit
through focused effort. We considered the size of the gap
between projected performance vs desired performance, as
well as the resulting impact of poor performance, be that service
or customer failures, fines, reputational impact or licence
conditions.
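A prioritisation of this kind can be sketched as a simple weighted scoring exercise, ranking each commitment by the size of its performance gap multiplied by the impact of failing it. The commitment names, gap values and impact scores below are invented for illustration, not Severn Trent data:

```python
# Illustrative sketch of prioritising commitments by performance gap x impact.
# All names and scores are hypothetical examples, not real Severn Trent data.

commitments = [
    # (name, gap between projected and desired performance, impact of failure)
    ("Respond to customer requests within four hours", 0.30, 5),
    ("Minimise supply interruptions",                  0.10, 9),
    ("Reduce operating costs",                         0.25, 4),
    ("Reduce accidents at work",                       0.05, 8),
]

def priority(gap, impact):
    """Score a commitment: bigger gaps and bigger consequences rank higher."""
    return gap * impact

# Highest-priority commitments first.
ranked = sorted(commitments, key=lambda c: priority(c[1], c[2]), reverse=True)
for name, gap, impact in ranked:
    print(f"{priority(gap, impact):.2f}  {name}")
```

In practice the gap and impact figures would come from projected performance data and an agreed impact scale (service failures, fines, reputational damage, licence conditions), but the ranking mechanics are this simple.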
Once a list of commitments had been finalised, a senior
management sponsor was agreed for each measure. The role
of the sponsor was multifaceted: to act as a champion of the
measure through supporting the activity; to help manage
resistance by removing blockers and bottlenecks in the process;
and, most important, to promote and communicate the
importance of the activity so teams and individuals would be
more willing to engage fully in the process, participate in
workshops and use the newly developed measures to drive
performance.
Use proactive leading indicators
We suspected there might be gaps in reporting, so we brought people together to share existing reporting outputs and procedures formally and to exchange best practice. Working with
key stakeholders and subject matter experts from around the
business, workshops were organised to identify any gaps in
reporting and define new measures. Each of the new measures
was peer reviewed with the end-users, so their learning and
feedback could be included in the final build. Engaging with
teams at this early stage paved the way for an easier rollout
process, with teams primed to accept new measures.
One of the most striking observations made during the
workshops was that although we had a good understanding of
our processes, we focused on lagging indicators at almost all of
the levels within the business. Lagging indicators are after-the-event measurements, such as counts of failure – for example, the amount of energy we use, which can only be calculated once the energy has been used. To drive performance effectively,
we needed to include more leading indicators or proactive
measures. These fall under three main categories:
• Behaviours: a cultural change to encourage the business to do something differently – for example, turning off all personal computers after use
• Process – for example, installing software to put computers into hibernation mode automatically when not in use
• Asset upgrade – for example, replacing old energy-hungry equipment with more energy-efficient models
Leading indicators act as an early warning system for the business, indicating potential future changes in performance of the lagging indicators or commitments. For example, if we have a target of reducing energy consumed in the office by 10%, but budgets have been frozen, halting the upgrade of energy-efficient computing hardware and lighting (seen as two of the biggest contributors to performance), we can read this as a sign that we are unlikely to achieve the 10% target.
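The early-warning role of leading indicators can be sketched as a status check: if the leading measures feeding a commitment are running behind plan, flag the commitment as at risk before the lagging measure ever moves. The measure names, planned values and tolerance below are hypothetical:

```python
# Minimal sketch of a leading-indicator early-warning check.
# Measure names, actuals and planned values are hypothetical illustrations.

leading_indicators = {
    # measure: (actual progress, planned progress at this point in the year)
    "PCs upgraded to energy-efficient models": (0.10, 0.40),  # frozen budget
    "Lighting upgraded to efficient fittings": (0.15, 0.35),  # frozen budget
    "PCs shut down overnight":                 (0.80, 0.75),
}

def at_risk(indicators, tolerance=0.05):
    """Return the leading measures running behind plan by more than tolerance."""
    return [name for name, (actual, planned) in indicators.items()
            if planned - actual > tolerance]

behind = at_risk(leading_indicators)
if behind:
    print("Commitment at risk - leading measures behind plan:")
    for name in behind:
        print(" -", name)
```

Here two of the three leading measures are behind plan, so the 10% energy commitment would be flagged well before the annual energy bill confirms the shortfall.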
Presenting the right information to the right people
[Figure 1: Example of how teams can proactively influence performance of a commitment to reduce energy consumption by 10%]
During these exploratory workshops, we found teams performing the same function but in different geographical locations had been measuring performance of the same commitment slightly differently. Reports had been created using data from different
sources, different reporting timescales and even different filters on the data, including and excluding different criteria. Each report then presented different outputs, which prevented comparison of performance and made sharing lessons learned or best practice more difficult. We resolved these issues by aligning standards across areas to provide consistency in reporting, enabling areas to be compared. These reports were then produced by a central team, which freed up resource in the departments.
To ensure we did not overwhelm teams by expecting them to review too many KPIs, measures were targeted at specific teams rather than rolled out on a wider scale. This meant that teams would be able to focus on key metrics where they could directly influence performance, rather than on more generalised 'for info'
measures that do not drive specific teams into action. Where teams are relatively diverse, influencing a number of the key measures, a Pareto prioritisation exercise[3] was conducted to identify which measures would have the greatest impact on performance in the first instance.
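A Pareto-style cut can be sketched as sorting candidate measures by estimated impact and keeping only those that account for the bulk (say 80%) of the total. The measure names and impact estimates are invented for illustration:

```python
# Sketch of a Pareto prioritisation: keep the measures covering ~80% of impact.
# Measure names and impact estimates are hypothetical.

impacts = {
    "Energy used by computing equipment": 45,
    "Energy used by lighting":            30,
    "Energy used by heating":             15,
    "Kitchen appliance usage":             6,
    "Chargers left plugged in":            4,
}

def pareto_cut(impact_by_measure, share=0.80):
    """Return the highest-impact measures whose cumulative share reaches `share`."""
    total = sum(impact_by_measure.values())
    selected, running = [], 0.0
    for name, impact in sorted(impact_by_measure.items(),
                               key=lambda kv: kv[1], reverse=True):
        selected.append(name)
        running += impact
        if running / total >= share:
            break
    return selected

print(pareto_cut(impacts))
```

With these figures, the first three measures cover 90% of the estimated impact, so the remaining two would be parked in the record of lower-priority measures described below.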
Even though some of the lower-priority measures might not be used immediately, they were recorded for two reasons:
1. If we found performance dipping even though we were on track with our lower-level performance indicators, we might not have prioritised correctly. By keeping a record of all measures, new measures could be swapped in to replace those found to have less impact on performance than expected.
2. Looking holistically across the suite of previously identified commitments, if performance is sustained at an acceptable level and change initiatives have been embedded to hold it there, these measures could be retired. Where performance is sustained over the longer term, review of these measures could move to a monthly or quarterly check-in to ensure it remains stable.
Providing a line of sight of how each team can
influence performance
As the suite of measures was developed at the workshops, it was represented visually following the principles of critical-to-quality trees[4], to make clear how each team and each level of management within the business can contribute to business performance. We found this visual representation of key metrics and the teams involved brought previously disparate teams closer together, creating better relationships through a common cause and breaking down working silos as colleagues started to see how each team influences overall performance of the commitment.
Let us look at the monthly energy bill in an office – see Figure 1.
Our key commitment is to reduce energy consumption by 10%
by September 2016 following principles of SMART objectives.
The four levels of the pyramid represent the layers of
management in the business from the director level, who would
review the 10% reduction in energy consumption outlined in
orange, through the organisation to the team leaders on the
lower level of the pyramid highlighted in purple. It is important to note that, in most instances, approximately 80% of the business sits at the team management level or among their direct reports, so metrics need to be targeted at these teams to maximise impact.
Whilst developing measures, if we ask the question 'What can we influence that will drive this performance?', we will identify themes for improvement. In this example, we have focused on the two areas of energy consumption that teams can influence and that analysis showed would have the highest impact: energy used by key types of equipment, and energy consumed by lighting. These two measures will be monitored
by the senior management. In this example, it is important to note that these measures are targeted at those who can influence them. The call centre managers are responsible for the
equipment they use, shown by the grey-shaded boxes, whereas
office lighting is controlled by the facilities management teams,
shown by blue-shaded boxes.
The second level of the organisational structure – middle managers who report to senior management – would review measures that influence the level above. In this example, call
centre middle management would have measures around energy consumed by heating, and a second measure covering the energy used by personal computing. The team managers would in turn influence each of these, so there may be a target around installing thermostatic controls at each site, which will influence energy consumption through heating. Three key measures to influence energy used through personal computers would be:
• Computers can be standby-enabled through an update that will put them into hibernation when not in use
• The percentage of computers left on overnight can be influenced by changing behaviours to ensure computers are shut down when users leave the office
• Understanding current assets and replacing them with more energy-efficient models where available
Looking at the facilities management teams, the senior
management would have a measure looking at the overall
energy consumed by lighting, middle management would look
at lighting by floor, and finally team managers may be targeted with installing energy-saving lightbulbs, all of which will enable teams to drive performance.
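The pyramid in Figure 1 can be sketched as a simple tree: the commitment at the root, each management layer owning the level beneath it, and team-level measures as the leaves. The structure below mirrors the worked example in the text; the exact labels are illustrative:

```python
# Sketch of the Figure 1 measure pyramid as a nested tree.
# Dicts are management layers; lists are team-level (leaf) measures.
# Entries mirror the worked example in the text and are illustrative only.

ctq_tree = {
    "Reduce energy consumption by 10% by Sep 2016": {        # director level
        "Energy used by key equipment": {                    # senior mgmt (call centre)
            "Energy consumed by heating": [                  # middle mgmt
                "Sites fitted with thermostatic controls",   # team level
            ],
            "Energy used by personal computing": [
                "PCs standby-enabled via software update",
                "% of PCs left on overnight",
                "Old PCs replaced with efficient models",
            ],
        },
        "Energy consumed by lighting": {                     # senior mgmt (facilities)
            "Lighting energy by floor": [
                "Energy-saving lightbulbs installed",
            ],
        },
    },
}

def team_measures(node):
    """Collect the team-level (leaf) measures under a node."""
    if isinstance(node, list):
        return list(node)
    return [m for child in node.values() for m in team_measures(child)]

# Every team-level measure rolls up through the tree to the single commitment.
print(len(team_measures(ctq_tree)))  # prints 5
```

Walking the tree top-down gives each management layer its review list, while the leaf collection shows a team leader exactly which measures are theirs – the 'line of sight' the article describes.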
Lessons learnt
It has been an interesting journey and we have learnt many
lessons along the way:
1. The workforce had been equipped with the tools to improve continuously, and teams had embraced this. However, some teams struggled to share lessons learnt and best practice, primarily because of a lack of visibility of the roles and responsibilities of other teams. Getting a cross-section of key stakeholders together at workshops helped facilitate better working relationships between teams and the sharing of lessons learnt. Gaps in reporting were spotted and key performance metrics were developed, with the added benefit of smoothing the rollout of metrics by giving those teams the ability to shape the metrics they would be using.
2. Proactive leading measures help to arrest slipping
performance before it becomes a problem.
3. When developing metrics to drive performance against commitments, it is important to understand the capacity of the teams using them. Initially, too many metrics were proposed: some teams that influenced several key measures had around 40 measures, and it was not feasible to expect them to review and drive performance against each one. This was countered by reducing the set to a more manageable number of targeted measures per team and, as noted earlier, swapping metrics in and out to refocus teams once performance stabilised.
4. Documenting measures and the proposed metrics that drive performance helped cement cross-team cohesion, because individuals have a line of sight to how their work impacts other teams and, ultimately, the commitment.
In summary, there will always be competing views on which
performance indicators are the most important. Agreeing KPIs
and getting buy-in from the business was crucial to success, as it
allowed the business to focus improvement activities in a
prioritised order. Getting key stakeholders together to discuss how best to track performance around a prioritised measure, sharing best practice, standardising measures and exploring more proactive leading indicators got teams thinking about how their day-to-day activities could contribute to higher-level business commitments, as well as providing a line of sight of how each team can influence performance.
Performance needs to be measured before it can be improved.
At Severn Trent, we now have a framework in place to monitor
performance proactively against key commitments and are
much more nimble, enabling us to react quickly to changing
performance.
About the author
Steven White FIOM is a Senior Business Analyst at Severn Trent. He has used his black belt in Lean Six Sigma to implement a system that proactively drives performance.
References
1. DRUCKER, P (1993), The Practice of Management (reissue), Harper Business
2. DORAN, G T (1981), 'There's a S.M.A.R.T. way to write management's goals and objectives', Management Review, 70, 35
3. GALLOWAY, L, ROWBOTHAM, F and AZHASHEMI, M (2000), Operations Management in Context, Butterworth-Heinemann
4. ECKES, G (2003), Six Sigma for Everyone, John Wiley & Sons