University of Belgrade
Faculty of Civil Engineering
Postgraduate Programme in Water Resources and Environmental
Management
BENCHMARKING AND PERFORMANCE INDICATORS
IN WATER SUPPLY AND WASTEWATER SERVICES
by
BORISAV MILUTINOVIĆ
2013
DEDICATION
This thesis is dedicated to my family,
to my beloved son “Archer” Stevan and
to my dearest wife Zorica.
To my sister Dejana.
To my mother Mara and father Stevan
who have not lived to see this moment.
To all young generations in my family, as a role model to them,
just to convince them, that everything is possible and
everything in life depends just on you,
with a little help of luck
given by God the Almighty.
Try always to do your best, making efforts day by day.
ACKNOWLEDGEMENTS
The author would like to thank Dr. Zorana Naunović, a dear lecturer. Without
her commitment, these postgraduate studies would not have been possible.
Special thanks to the major professor of this thesis, Professor Dr. Jovan
Despotović, for his valuable guidance and suggestions.
All gratitude to Professor Dr. Rafaela Matos, for sharing her early works and
valuable information about IWA performance indicators.
Thanks to dear colleague Aleksandar Krstić, for sharing the information about
benchmarking program in the Republic of Serbia, “Benchmarking I and II” and
giving me the unpublished papers about it. This thesis would not have been
possible without this.
Here, I also want to thank all of my dear colleagues in EUREAU, for giving
me the idea for this thesis.
I also owe appreciation to colleagues from the CEN/TC 165 committee, the
Association for Water Technology and Sanitary Engineering (UTV) and the
Chamber of Commerce and Industry of Serbia (CCIS).
Special thanks to my dear Bros.: and Comp.: M. B. and B.B., for their support,
untiring encouragement and everything else that makes life easier with real
friends. Special thanks to Mr. Branislav Babić, who had the patience to read
all versions of the thesis.
Silent gratitude to a brotherhood of free men of good standing, which has had
a great influence on me.
Thanks to all of my unmentioned friends and colleagues, who have
encouraged me in these two years and influenced my career.
Once again, particular thanks to my family, my beloved son Stevan and my
dearest wife and friend Zorica. They have had patience, inspired me, had
understanding for my work, given me the infinite love, comfort when it was
hard and encouraged me throughout this study and finishing this thesis.
Heaven knows that only with my love can I make this up to them.
Thanks to my sister Dejana, who raised me, with so much love and to my
mother Mara and father Stevan who have not lived long enough to see this.
Moreover, first and foremost, thanks to God the Almighty, for making this
possible.
TABLE OF CONTENTS
Page
DEDICATION .................................................................................................III
ACKNOWLEDGEMENTS............................................................................... V
TABLE OF CONTENTS ................................................................................VII
LIST OF TABLES ...........................................................................................IX
LIST OF FIGURES......................................................................................... X
ABSTRACT ..................................................................................................XIII
CHAPTER 1. INTRODUCTION.....................................................................15
1.1. OBJECTIVES....................................................................................................................15
1.2. ORGANIZATION................................................................................................................15
CHAPTER 2. BENCHMARKING ...................................................................17
2.1. INTRODUCTION................................................................................................................17
2.2. EARLY HISTORY OF BENCHMARKING ................................................................................20
2.3. BENCHMARKING DEFINITIONS AND EXPLANATIONS.............................................................21
2.4. CONCEPT OF BENCHMARKING..........................................................................................22
2.5. PERFORMANCE ASSESSMENT AND PERFORMANCE IMPROVEMENT (OR METRIC
BENCHMARKING AND PROCESS BENCHMARKING) ....................................................................26
2.6. BENCHMARKING PROCESS...............................................................................................30
2.6.1. IWA (INTERNATIONAL WATER ASSOCIATION) BENCHMARKING PROCESS ........................30
2.6.2. EBC BENCHMARKING PROCESS ...................................................................................34
2.7. EXAMPLES OF BENCHMARKING INITIATIVES AND PROJECTS IN WATER AND WASTEWATER
INDUSTRY ..............................................................................................................................35
2.7.1. THE BENCHMARKING PROGRAM OF EUROPEAN BENCHMARKING CO-OPERATION (EBC)..37
2.7.2. THE BENCHMARKING PROGRAM OF INTERNATIONAL BENCHMARKING NETWORK FOR
WATER AND SANITATION UTILITIES (IBNET)............................................................................39
CHAPTER 3. PERFORMANCE ASSESSMENT (METRIC BENCHMARKING)
...................................................................................................................... 43
CHAPTER 4. ISO STANDARDS SERIES 24500 (24510, 24511 AND 24512)
AND STANDARDIZATION OF PERFORMANCE INDICATORS .................. 51
CHAPTER 5. PERFORMANCE INDICATORS SYSTEM.............................. 57
5.1. IWA PERFORMANCE INDICATORS SYSTEMS IN WATER SUPPLY AND WASTEWATER SECTOR..58
5.2. ELEMENTS OF THE PI SYSTEM..........................................................................................60
5.3. ADVANTAGES OF IWA PERFORMANCE INDICATORS SYSTEM..............................................67
5.4. IMPLEMENTATION OF IWA PERFORMANCE INDICATORS SYSTEM........................................69
5.5. STRUCTURE OF IWA PERFORMANCE INDICATORS SYSTEM FOR WATER SUPPLY SERVICES 74
5.6. STRUCTURE OF IWA PERFORMANCE INDICATORS SYSTEM FOR WASTEWATER SERVICES ..76
5.7. PERFORMANCE INDICATORS AND RELATED COMPONENTS - CONFIDENCE-GRADING SCHEME 83
5.8. SIGMA LITE SOFTWARE..................................................................................................86
CHAPTER 6. BENCHMARKING INITIATIVES IN REPUBLIC OF SERBIA.. 87
CHAPTER 7. CONCLUSION ........................................................................ 91
LIST OF REFERENCES ............................................................................... 95
APPENDIX.................................................................................................. 105
LIST OF TABLES
Table 2.1: Overview of some benchmarking efforts in the water industry
(Cabrera et al. 2006) and as found in literature........................ 36
Table 3.1: Scope of application of PI systems (Alegre and Baptista 2002)46
Table 4.1: Structure of Standards on service activities related to drinking
water supply and wastewater systems – comparison review (ISO
24510, 24511, 24512) and (Talib et al. 2005)........................... 52
Table 5.5.1: IWA PI system structure for water supply services................... 75
Table 5.6.1: Structure of the PI framework (Matos et al. 2003).................... 78
Table 5.6.2: IWA PI system structure for wastewater services..................... 79
Table 5.7.1: Reliability bands of data for PI system...................................... 83
Table 5.7.2: Accuracy bands of data for PI system ...................................... 84
Table 5.7.3: Matrix of confidence grades, according to ISO standard series
24511 (Matos et al. 2003)......................................................... 84
Table 5.7.4: Reporting of confidence grades (c.g.) for a sequence of years,
according to ISO standard series 24511 (Matos et al. 2003).... 85
APPENDIX Table 7.1: List of Performance Indicators for Wastewater services,
adopted from Matos et al. 2003, Cabrera et al. 2006 and Matos
et al. 2002b............................................................................. 107
LIST OF FIGURES
Figure 2.1: Plan – Do – Check – Act CYCLE (Cabrera et al. 2006) and
EBC (European Benchmarking Co-operation) Web site:
http://www.waterbenchmark.org/content/benchmarking.html. 23
Figure 2.2: Benchmarking Process “Plan, Do, Check, Act”, according to
ISO standard series 24500 .................................................... 23
Figure 2.3: Illustration of “performance assessment” (metric benchmarking)
and “performance improvement” (process benchmarking) as
described by Kingdom and Knapp (1996) (adopted from
Cabrera 2006)........................................................................ 28
Figure 2.4: Benchmarking cycle according to DVGW and DWA (2008) with
the aims of performance assessment and performance
improvement as well as the components of benchmarking
according to the IWA (Cabrera et al. 2006)............................ 29
Figure 2.5: The IWA benchmarking process (Cabrera et al. 2006) .......... 30
Figure 2.6: EBC’s levels of participation (sources Cabrera et al. 2006 and
EBC, Web site
http://www.waterbenchmark.org/content/benchmarking.html) 38
Figure 2.7: Wastewater sets of input variables and performance indicators
for basic, standard and advanced level of benchmarking,
according to Benchmarking Program of European
Benchmarking Cooperation (EBC)......................................... 38
Figure 3.1: Illustration of typical flowchart in performance assessment
(Sjøvold et al. 2008)............................................................... 43
Figure 3.2: PI as a part of a performance measurement system, followed
by example for water supply service (Alegre et al. 2006)....... 44
Figure 3.3: Water undertaking context (Algere et al. 2006)...................... 47
Figure 3.4: Wastewater undertaking context (Matos et al. 2003, adopted
from Algere et al. 1997).......................................................... 47
Figure 3.5: Metric benchmarking process (Sjøvold et al. 2008) and
(adapted, Cabrera 2001)........................................................ 49
Figure 4.1: Fields of application of ISO 24511 (Wastewater), source of the
schematic: IWA Performance Indicators for Wastewater
Services ................................................................................. 53
Figure 4.2: Fields of application of ISO 24511 (Wastewater), source of
figure: based on a scheme from Hydroconseil, France, 2002.53
Figure 4.3: Fields of application of ISO 24512 (Drinking Water)............... 54
Figure 4.4: Relevant relationships between stakeholders for establishing
objectives, according to ISO standard 24511 and 24512 ....... 55
Figure 5.2.1: Structure of PI system – six separate categories. .................. 60
Figure 5.2.2: General outline of IWA variable definition, according to Alegre
et al. 2002 and Alegre et al. 2006........................................... 61
Figure 5.2.3: IWA variable definition, classification and description,
according to Alegre et al. 2002 and Alegre et al. 2006........... 61
Figure 5.2.4: General outline of the PI definitions within the IWA Manual,
according to Alegre et al. 2002 and Alegre et al. 2006........... 63
Figure 5.2.5: PI identification and classification, description, terms and
processing rules, according to Alegre et al. 2002 and Alegre et
al. 2006 .................................................................................. 63
Figure 5.2.6: Illustration of components (data elements) of a performance
indicators system (Sjøvold et al. 2008 and adapted Alegre et al.
2006)...................................................................................... 66
Figure 5.4.1: Phases of the PI system implementation process for water
supply services (Alegre et al. 2006) ....................................... 70
Figure 5.4.2: Phases of the PI system implementation process for
wastewater services (Matos et al. 2003, adapted from Alegre
2000 and 2002) ...................................................................... 71
Figure 5.4.3: PI and CI selection procedure (Alegre et al. 2006 and Matos et
al. 2003) ................................................................................. 72
Figure 5.4.4: Example of data flows concerning PI and CI and team
responsibilities (Alegre et al. 2006 and Matos et al. 2003) ..... 73
Figure 5.6.1: Wastewater balance (Matos et al. 2002a and Matos et al.
2003)...................................................................................... 76
Figure 5.6.2: Illustration of service provision in terms of customers and main
impacts (Matos et al. 2003).................................................... 77
Figure 5.6.3: Structure of wastewater CI and PI (Matos et al. 2002a and
Matos et al. 2003) .................................................................. 78
Figure 5.6.4: Context Information data for wastewater services (Matos et al.
2002b).................................................................................... 81
Figure 5.8.1: Recommended PI evaluation procedure using SIGMA Lite and
SIGMA Lite WW (Cabrera et al. 2003)................................... 86
ABSTRACT
Milutinovic, Borisav. 2013. Benchmarking in Water Supply and Wastewater
Services. Major Professor (Mentor): Jovan Despotovic.
This thesis reviews and analyses the following methods:
• Benchmarking;
• Performance Assessment (Metric Benchmarking);
• Performance Improvement (Process Benchmarking); and,
• Performance Indicators (PIs) system (especially ISO-standardized PIs
and International Water Association PIs).
Special attention was given to the International Water Association (IWA)
methods of benchmarking and performance assessment, and to the
performance indicators system: its structure and components, the
performance indicators themselves, the data confidence-grading scheme, and
the implementation of these processes and methods. The specific objective
was to reflect more deeply on the IWA Performance Indicators for wastewater
services.
CHAPTER 1. INTRODUCTION
1.1. Objectives
This thesis is written to be an instructional manual or reference for
colleagues wanting to work on benchmarking and performance indicators
systems in water utility companies (especially wastewater services). The
thesis should be understood as a literature review or “quick guidelines” to
benchmarking, performance assessment and performance indicators.
The overall goal of this thesis research was to theoretically explore tools for
comparison and improvement of efficiency and effectiveness in water utilities
(water supply and wastewater sector).
The benchmarking initiatives are presented for three levels of analysis: the
worldwide level (IBNET); the European level (EBC); and the national – local
level (IBNET and IPM). For the Republic of Serbia, information in the form of a
report on benchmarking is presented.
1.2. Organization
The thesis consists of seven chapters.
Chapter 2 provides a summary of the early history, definitions and
explanations, and the concept and process of benchmarking, with examples
of benchmarking initiatives. An overview of two benchmarking programs is
also presented.
Chapter 3 presents the performance assessment (metric benchmarking)
process, with detailed explanations and a literature review.
The ISO standards series 24500:2007 (24510, 24511 and 24512) are
presented in Chapter 4.
Chapter 5 is the core chapter, in which the International Water Association
(IWA) performance indicators system for water supply and especially for
wastewater services is presented and discussed. An emphasis is placed on
the elements, the data confidence-grading scheme, the structure, the
advantages, and the implementation process and strategy of the system.
Information on the benchmarking project initiative in the Republic of Serbia is
given in the form of a report in Chapter 6.
Chapter 7 provides a conclusion to the thesis, where the results of this
research and recommendations are once again presented and conclusion on
benchmarking, performance assessment and performance indicators are
highlighted.
In the appendix, the complete list of IWA performance indicators for
wastewater services is presented.
CHAPTER 2. BENCHMARKING
2.1. Introduction
Water, which is essential to sustaining life and livelihoods, is a core sector of
the global economy.
Enhancing operational and financial performance of the water industry and
water utilities will provide the basis necessary for expanding access and
improving the quality of service.
Water companies are different in nature, as noted in Cabrera (2006):
• They are of public interest, doing business as public services;
• They are natural monopolies; and,
• They have no direct competition.
The international water and wastewater industry is going through a period of
great change. The water sector faces one of the most – if not the most –
challenging strategic outlooks in its history.
Numerous global trends are placing pressures on the water sector, and in turn
on asset management and other business processes (Water Services
Association of Australia, 2008). These inter-related trends include:
• Responses to global warming / climate change;
• Significant asset development and growth;
• Skills shortages arising from a variety of different factors;
• Changes in, and meeting the demands of, other industries;
• New technologies enabling data collection and analysis on a previously
unprecedented scale;
• Increasing levels of stakeholder involvement and engagement in
decision making;
• Increasing complexity in customer needs and relationships;
• Regulatory scrutiny and control; and,
• Access to capital for investment.
These aspects of the strategic context for water utilities are driving changes to
the way water utilities are being managed (Gee and Vincent, 2009).
Challenges such as climate change, increased regulation and competition for
funds, skills shortages, technological development, environmental constraints,
increasing customer expectations and ageing infrastructure mean that water
businesses need to be more and more efficient and effective each day. The
industry needs to understand how it can best manage the various
expectations required of it.
The formerly monopolistic industry sectors of water and wastewater treatment
and services are today increasingly influenced by free-market mechanisms
and are being pressured by both society and governments for more
transparency and efficiency.
Urban water and wastewater utilities are under increasing pressure to
perform. In addition, regulators and citizens demand increasingly higher
standards of environmental, social, and economic sustainability. If water and
wastewater utilities are to meet these increasing demands and expectations in
both developed and developing countries, they must first take stock of their
performance over time. The need for improved performance is not limited
to developing countries.
The crucial statement (Principles for Economic Regulation), made by the UK
Government in April 2011, was a catalyst for many improvements in the
water sector, and it reads as follows:
“In certain sectors network effects and/or economies of scale create
circumstances, such as natural monopolies, which limit the prospects for
effective competition. In these areas, independent economic regulation will be
needed over the long term to continue to provide vital consumer protections
and ensure consumers’ interests are promoted through efficient provision of
good quality, reliable and sustainable services.”
Recognizing the challenges facing the industry, and with an appreciation of
how modern asset management may help these businesses meet their
various commercial, environmental, social and regulatory obligations, many
organizations, institutions and individuals have developed methods and tools
for benchmarking best practice within the water industry.
The objectives of those methods and tools are to identify how well particular
participants are currently managing their strategy, and to identify those
participants that represent best practice in a range of key process areas.
To achieve their management goals, the water undertakings need to strive for
high degrees of:
• Efficiency – the extent to which the resources of water undertakings
are utilized optimally to produce the service (Makropoulos 2009); and,
• Effectiveness – the extent to which declared objectives (specifically
and realistically defined, in advance) are achieved (Makropoulos 2009).
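These two notions can be illustrated with a minimal sketch. The figures and helper functions below are hypothetical illustrations invented for this example, not part of the Makropoulos definitions:

```python
# Hypothetical illustration of efficiency and effectiveness measures
# for a water undertaking. All input figures are invented for the example.

def efficiency(output_delivered: float, resources_used: float) -> float:
    """Output produced per unit of resource consumed (higher is better)."""
    return output_delivered / resources_used

def effectiveness(achieved: float, target: float) -> float:
    """Share of a declared, pre-set objective that was actually achieved."""
    return achieved / target

# Example: a utility planned to connect 10,000 new households and
# connected 8,500, spending 1.7 million EUR of budgeted resources.
print(f"effectiveness: {effectiveness(8_500, 10_000):.0%}")
print(f"efficiency: {efficiency(8_500, 1_700_000):.4f} connections/EUR")
```

The point of the sketch is the distinction: effectiveness compares the result against a declared objective, while efficiency compares it against the resources consumed.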
To improve efficiency and effectiveness to the desired level, each utility must
first know what the best practice is. The benchmarking method was
developed to this effect. The term benchmarking is used to describe the
technique of identifying best practices from those "best in class" for specific
and critical processes, adapting them and providing continuous performance
improvement. A well-known example of this management tool is "Xerox
benchmarking", named after its early application at Xerox in the 1980s
(Camp, 1989; Sjøvold et al., 2008).
2.2. Early History of Benchmarking
Performance assessment is a natural need in human psychology. People, as
individuals, social groups, regions and states, always want to know if they can
be better, or how they are doing compared to others. Often people want to
know if they have made an improvement over time. It is difficult to determine
whether one is good at something, if there are no previous references.
Through the process of benchmarking, these references can be established.
The early history of benchmarking begins in the 1970s with a problem that
arose in the Xerox Corporation, USA, a leading copier manufacturer. At that
time, Xerox was heavily losing market share to Japanese manufacturers.
Looking for explanations of what caused the problem in the market, Xerox
made a comparative analysis of different companies. They realized that Fuji
sold copiers at Xerox's production cost, and the results confirmed significantly
higher production costs in the USA.
This was the first step in establishing the modern concept of benchmarking.
Robert C. Camp described this concept in 1989 in his book “Benchmarking –
The Search for Industry Best Practices that Lead to Superior Performance”,
based on the case of Xerox Corporation (Cabrera et al., 2006).
The next steps were transferring and implementing the benchmarking concept
in the water industry, as the concept is universally applicable to any kind of
industry.
It must be underlined that benchmarking is not the only tool for improving the
water industry (water and wastewater services). There are other options,
which include process optimization, business process redesign, restructuring,
merging utilities and so on.
The scope of this work deals with benchmarking; over the past two decades,
many projects have proven benchmarking to be a powerful management
instrument in the water industry.
2.3. Benchmarking definitions and explanations
“Benchmarking is the process of comparing one's business processes
and performance metrics to industry bests or best practices from other
industries” (WIKIPEDIA).
Performance benchmarking is learning how others do business, whether they
are more efficient, and, if they are, whether their methods can be understood
and used to one's own advantage.
By the definition given in IWA's Manuals of Best Practice: "Benchmarking is a
tool for performance improvement through systematic search and adaptation
of leading practices".
The American Water Works Association (AWWA) has defined benchmarking
as ”a systematic process of searching for best practices, innovative ideas, and
highly effective operating procedures that lead to superior performance, and
then adopting those practices, ideas, and procedures to improve the
performance of one’s own organization”.
Benchmarking is a process of comparative evaluation of system performance
across similar systems (elements of the same operating system or, more
often, competing systems). This process enables the identification of the best
techniques of the "best in class". Benchmarking is therefore a key instrument
for establishing a quality management system based on continuous
improvement (Gspan et al., 2009).
Comparisons with similar utilities elsewhere in the country or region or with
standards of international good practice can shed light on how well a utility is
performing, identify areas for improvement, and help indicate a plan of action.
A major challenge for measuring and benchmarking water and wastewater
utility performance has been the lack of standardized information, as Van den
Berg and Danilenko pointed out in 2011.
2.4. Concept of Benchmarking
Benchmarking is essential for those developing and implementing water
policy. The tools are important for documenting past performance,
establishing baselines for gauging productivity improvements, and making
comparisons across service providers. In addition, if managers do not know
how well their organization or division has performed (or is performing) they
cannot set reasonable targets for future performance.
Benchmarking provides regulators and utility managers with a way to make
performance comparisons over time, across water utilities, and across
countries. It can promote conflict resolution between these two groups by
allowing participants to focus on performance, and can help bridge the gap
between technical researchers and those practitioners currently conducting
studies for government agencies and water utilities (Berg and Padowski,
2007).
As Van den Berg and Danilenko stated in 2011, a wide range of stakeholders
can use benchmarking, such as:
• Utilities: to identify areas of improvement and set realistic targets;
• Governments: to monitor and adjust sector policies and programs;
• Regulators: to ensure that adequate incentives are provided for
improved utility performance and that consumers obtain value for the
services;
• Consumers and civil society: to express valid concerns;
• International agencies and advisers: to perform an evaluation of utilities
for lending purposes; and,
• Private investors: to identify opportunities and viable markets for
investments.
Benchmarking is organized in projects with clear starting and ending dates.
Nevertheless, from a management point of view, benchmarking should be
considered a continuous, permanent process, because the search for better
practices never ends. Accordingly, benchmarking should follow the Plan-
Do-Check-Act principle presented in Figure 2.1.
Figure 2.1: Plan – Do – Check – Act CYCLE (Cabrera et al., 2006) and EBC
(European Benchmarking Co-operation) Web site:
http://www.waterbenchmark.org/content/benchmarking.html
The same principle is a part of ISO standard 24500:2007 (more detailed) and
its illustration is given as Figure 2.2.
Figure 2.2: Benchmarking Process “Plan, Do, Check, Act”,
according to ISO standard series 24500:2007
On this basis, benchmarking should be done on an annual basis and
embedded in the (annual) business cycle.
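The annual Plan-Do-Check-Act cycle described above can be sketched as a simple loop. The stage names follow the figure; the handler functions are placeholders invented for this sketch, not part of the ISO standard:

```python
# Minimal sketch of the Plan-Do-Check-Act cycle run once per business year.
# The four stages are executed in a fixed order; what each stage does is
# supplied by the caller, so the handlers below are only placeholders.

PDCA_STAGES = ["Plan", "Do", "Check", "Act"]

def run_annual_cycle(year: int, handlers: dict) -> list:
    """Run one benchmarking year through the four PDCA stages in order."""
    log = []
    for stage in PDCA_STAGES:
        result = handlers[stage](year)  # each stage receives the year
        log.append((year, stage, result))
    return log

# A trivial set of handlers, just to show the ordering:
handlers = {stage: (lambda y, s=stage: f"{s} done for {y}") for stage in PDCA_STAGES}
for entry in run_annual_cycle(2013, handlers):
    print(entry)
```

Because the search for better practices never ends, an actual program would re-run this cycle every year, feeding the "Act" outcomes of one year into the "Plan" stage of the next.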
Benchmarking is both a science and an art. Even the best statistical analysis
will be ignored if it is not carefully presented. Several methodologies are
available for benchmarking, and it is important to keep in mind that a single
index of utility performance has the same problems as any indicator: it will be
neither comprehensive nor fully diagnostic. Therefore, when conducting
benchmarking analyses, water professionals must understand the strengths
and limitations of the different methodologies.
Benchmarking is a fundamental requirement of good management and can
help managers and regulators identify historical trends, determine today’s
baseline performance, and quantify relative performance across utilities (Berg
and Padowski, 2007).
Benchmarking is a practical management and decision-making tool to
measure and compare the performance of utilities. The objective of
benchmarking is to improve performance by comparing with and learning from
other, similar organizations.
Performance monitoring can play a significant role in the sector as a tool for
performance improvement. Benchmarking can help utilities identify
performance gaps and effect improvements through the sharing of information
and best practices, ultimately resulting in better water services.
Van den Berg and Danilenko in 2011 concluded that the primary objectives of
benchmarking are as follows:
• To provide a set of Performance Indicators (PIs) related to a utility's
managerial, financial, operational, and regulatory activities that can be
used to measure internal performance and provide managerial
guidance.
• To enable an organization to compare its performance on PIs with
those of other relevant utilities to identify areas needing improvement,
with the expectation of developing more efficient or effective methods
to formulate and attain company goals as set forth in its business plan.
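A performance indicator is typically a ratio derived from measured variables. The sketch below uses non-revenue water as a familiar example; the exact IWA variable codes and definitions are considerably more detailed than shown here, and the utility names and figures are invented:

```python
# Sketch of how a performance indicator (PI) is derived from measured
# variables. Non-revenue water (water put into the system but not billed)
# is used as a familiar example; figures are hypothetical.

def non_revenue_water_pct(system_input_m3: float, billed_m3: float) -> float:
    """Non-revenue water as a percentage of system input volume."""
    return 100.0 * (system_input_m3 - billed_m3) / system_input_m3

def pi_report(utility: str, system_input_m3: float, billed_m3: float) -> str:
    """Format one PI value for managerial reporting."""
    nrw = non_revenue_water_pct(system_input_m3, billed_m3)
    return f"{utility}: NRW = {nrw:.1f}% of system input"

# Invented figures for two hypothetical utilities:
print(pi_report("Utility A", 12_000_000, 8_400_000))
print(pi_report("Utility B", 5_000_000, 4_250_000))
```

The same PI, computed the same way from the same variable definitions, serves both objectives above: tracked internally it measures a utility's own performance, and shared externally it makes utilities comparable.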
Performance comparisons and benchmarking projects can be organized on a
voluntary basis but can also be obligatory. Depending on the purpose, these
projects are initiated by different organizations, e.g. associations, consultants
or government agencies.
Performance comparisons provide useful information about the
water/wastewater sector and are, therefore, of the highest interest for
deducing standard values for the sector.
2.5. Performance Assessment and Performance Improvement
(or Metric Benchmarking and Process Benchmarking)
In general, there are two different approaches to benchmarking:
• "Metric Benchmarking" and
• "Process Benchmarking".
Metric benchmarking is intended as a quantitative comparative assessment of
company performance, normally measured by performance indicators (PIs).
Process benchmarking is intended as a mechanism for identifying
specific work procedures to be improved by emulating external examples of
excellence that can be set as the best standard.
“Metric benchmarking identifies areas of under-performance where changes
need to be made to the way things are done, whilst process benchmarking is
a vehicle for achieving change, and the improvement required can be
imported from other best practice partners.” (Larsson et al. 2002)
Van den Berg and Danilenko in 2011 distinguished two types of
benchmarking: metric and process benchmarking.
Metric benchmarking involves systematically comparing the performance of
one utility with that of other similar utilities, and even more importantly,
tracking one utility’s performance over time. A water or wastewater utility can
compare itself to other utilities of a similar size in the same country or in other
countries. Similarly, a nation’s regulators can compare the performance of the
utilities operating there. Metric benchmarking, essentially an analytical tool,
can help utilities better understand their performance. Such benchmarking is
most powerful when carried out over time, tracking year-to-year changes in
performance.
Process benchmarking is a normative tool with which one utility can compare
the effectiveness of its processes and procedures for carrying out different
functions to those of selected peers. A utility can compare its billing and
collection system, for example, to those used by other utilities to see which
system performs better. When the comparison reveals one utility’s system to
be more effective or efficient than the others, the underperforming utility can
adopt and internalize those processes and procedures as appropriate. The
performance indicator constitutes the building block of both types of
benchmarking. Indicators are quantitative, comparable measurements of a
specific type of activity or output.
According to the definition given by Gspan et al. (2009), benchmarking can be
of two kinds:
 Metric benchmarking: regular periodic measurement of relevant
(internal) metric variables, calculating relevant indicators and
comparing their values over time;
 Process benchmarking: comparison of internal indicators for certain
processes with those of other companies in the same industry, with the
aim of discovering vulnerabilities, weaknesses, needs and opportunities
for improvement in efficiency, competence and competitiveness,
analysing the impact of introducing a specific action on the process,
and forecasting a future state.
Gspan et al. (2009) also give a good, short characterization of metric and
process benchmarking:
Metric benchmarking gives thus a response to the questions such as: "Where
am I, what am I doing?"
Process benchmarking answers the questions: "Where and what are the
opportunities for improvement?"
Instead of this differentiation between metric and process benchmarking,
clearer terminology will be used here. The IWA Specialist Group on
Benchmarking strongly recommends abandoning the use of the terms “metric
benchmarking” and “process benchmarking”. Instead, “performance assessment”
and “performance improvement” should be considered consecutive components of
benchmarking (Cabrera et al., 2006).
Performance assessment and improvement should be understood as two
differentiated phases, which are the necessary parts of benchmarking. Figure
2.3 illustrates this concept.
Figure 2.3: Illustration of “performance assessment” (metric benchmarking)
and “performance improvement” (process benchmarking) as described by
Kingdom and Knapp (1996) (adopted from Cabrera, 2006)
As introduced in previous paragraphs, benchmarking consists of two
consecutive components. The first step, performance assessment, aims at
analysing performance, comparing it with other organizations within or outside
the industry, and identifying performance gaps. The next step, performance
improvement, is designed to find improvements by learning from the leading
practices and adapting them to one’s own situation.
To illustrate this process, Figure 2.4 gives a schematic view of the
benchmarking process in the German water and wastewater industry,
standardized in 2008 by the DVGW (Deutsche Vereinigung des Gas- und
Wasserfaches) and the DWA (Deutsche Vereinigung für Wasserwirtschaft,
Abwasser und Abfall).
Figure 2.4: Benchmarking cycle according to DVGW and DWA (2008) with the
aims of performance assessment and performance improvement as well as
the components of benchmarking according to the IWA (Cabrera et al. 2006)
2.6. Benchmarking Process
2.6.1. IWA (International Water Association) Benchmarking Process
The literature review shows that every benchmarking reference defines its own
benchmarking process, with a different number of steps; more or less, however,
they all follow the same procedure.
A typical IWA benchmarking process with six steps is presented here, as
described by Cabrera et al. (2006).
Figure 2.5 illustrates those steps:
1. Project Planning
2. Orientation, Training and Project Control
3. Data Acquisition and Validation
4. Data Analysis and Assessment Reporting
5. Improvement Actions
6. Review of Improvement Actions
Figure 2.5: The IWA benchmarking process (Cabrera et al., 2006)
The following explanation of the six steps of the IWA benchmarking process is
taken from Cabrera et al. (2006):
1. Project Planning
At the start of a benchmarking project, the scope and level of detail are
determined based on the demands and needs of the interested utilities. The
performance assessment model and the data requirements also need to be
elaborated to show participants what they can expect, and to estimate project
resources. Based on this information, a detailed project plan with budget and
planning can be drafted. Interested utilities are invited to participate at this
stage and based on their response the project may or may not be launched.
2. Orientation, Training and Project Control
Before starting to benchmark, all staff involved in the project needs to be
prepared. The objectives of the exercise and the project plan should be clear.
Furthermore, staff needs to be informed about the methodology and data
requirements and trained in the data methods that will be applied in the
project.
These considerations include the staff from participating utilities and staff from
the project team (organizing body and/or commissioned third parties).
3. Data Acquisition and Validation
One of the most time consuming activities in a benchmarking project is the
data acquisition by the participants. This step requires significant efforts from
the participants, depending on their experience, availability of the information
and accessibility. The role of the project team in this step is to assist utilities in
clearing up methodology issues and definition problems and to secure
meeting deadlines.
When the required data are collected, they need to be validated by the utilities
and by the project team, for instance by looking at consistency with data from
previous years, outliers, on-site visits or auditing. Although this activity may be
intensely time consuming, the availability of a reliable dataset is key to
successful benchmarking. Participants in a benchmarking project expect good
quality comparison and, accordingly, proper identification of performance
gaps, as this is the trigger for improvement actions.
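As an editorial aside, the kind of plausibility check mentioned above (consistency with data from previous years) can be sketched in a few lines of Python; the variable names, tolerance and data below are illustrative assumptions only:

```python
def flag_inconsistent(current, previous, tolerance=0.25):
    """Return the names of variables whose value deviates from last
    year's by more than the given relative tolerance -- a simple
    plausibility check, not a substitute for audits or site visits."""
    flags = []
    for name, value in current.items():
        prev = previous.get(name)
        if prev and abs(value - prev) / prev > tolerance:
            flags.append(name)
    return flags

# Hypothetical submissions for two consecutive reporting years.
previous = {"mains_length_km": 410, "connections": 52_000, "staff": 180}
current = {"mains_length_km": 415, "connections": 53_000, "staff": 95}

suspect = flag_inconsistent(current, previous)
# "staff" is flagged: the reported value roughly halved, worth querying.
```

In practice such automated flags would only trigger the manual validation steps named in the text (queries, on-site visits, auditing).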
4. Data Analysis and Assessment Reporting
Once data are validated, they are analysed, performance indicators are
calculated and performance comparison is made between the participants. In
this stage, possible remaining errors in the dataset can be identified and
cleared up to improve data quality. Performance gaps are then determined
and explained (if possible) keeping in mind the differences in the operating
environment of the utilities.
The result of this step is a draft report (at individual and/or group level) with
the preliminary results of the performance assessment. This is the basis for
discussing performance differences with the participating utilities in a
workshop.
After discussing the preliminary results of the performance assessment in a
workshop, possible errors and further explanations of the performance gaps
or differences in the operating environment of utilities are processed. Final
reports on the performance assessment are produced and distributed to
disseminate the results within the company and to its stakeholders. These
assessment reports can be supplemented by improvement action plans after
the upcoming steps.
5. Improvement Actions
One of the most important activities for reaching the final goal of a
benchmarking exercise involves taking theory into action. Based on the
performance assessment and the knowledge that is available in the network,
the project team and the participating utilities jointly try to discover best
practices, present and discuss these in the workshop and identify
improvement opportunities. For further analysis of interesting practices, site
visits or task groups may additionally be organized.
With the best practices identified, participants should be able to draft their own
improvement action plan. The action plan can be quite different for each utility
and needs to be prioritized, based on the contribution of the proposed actions
to the strategic objectives of the utility and the cost/benefit ratio.
Benchmarking without improvement usually equals frustration. The
implementation step is often overlooked but is crucial in finishing the job that
was started at step 1. In order to implement the suggested improvement
initiatives, senior utility management should approve the necessary changes
and the necessary internal procedures should be followed to secure
investments, organizational changes, etc.
6. Review of Improvement Actions
After implementing improvement actions, the results should be assessed to
review if the objectives have been reached. Usually, this is done in a following
benchmarking exercise. In order for the benchmarking process to be
complete, this needs to be documented and evaluated, including lessons
learnt and new benchmarking needs. Closing the cycle provides essential
information for preparing a new benchmarking effort.
2.6.2. EBC Benchmarking Process
The European Benchmarking Co-operation (EBC) benchmarking process is
given here, as an example. A typical EBC benchmarking cycle comprises the
following steps and activities:
 Performance Assessment:
 defining benchmarking objectives;
 defining a performance assessment model;
 preparing tools;
 inviting participants;
 collecting data;
 validation and analysis;
 identifying performance gaps;
 reporting.
 Performance Improvement:
 identifying good/best practices;
 preparing a Performance Improvement Plan with targets and priorities;
 implementing performance improvement measures;
 evaluating.
It should be noted that the EBC methodology is almost the same as the IWA’s.
Both consist of the two main steps already explained, performance assessment
and performance improvement, and share essentially the same sub-steps under
different names.
The same cycle is followed by ISO 24500:2007 (see Figure 2.2). In conclusion,
all benchmarking processes are basically the same, with minor differences, and
follow the same cycle of performance assessment and performance improvement.
2.7. Examples of Benchmarking Initiatives and Projects in Water and
Wastewater Industry
Benchmarking has been evolving in the water sector since the early 1990s. A
number of benchmarking methodologies have been developed through different
initiatives and projects and successfully implemented all over the world. Some
of these projects are mentioned here.
Table 2.1 provides some examples of key benchmarking initiatives in the water
industry, as found in the literature. These projects are not necessarily the
most important or relevant, but were chosen to illustrate the evolution of
benchmarking in the water industry.
Table 2.1: Overview of some benchmarking efforts in the water industry (Cabrera et al., 2006) and as found in literature

Program Name | Country | Program type | Level of detail | Type of activity | Geographical scope | IWA manuals based
6 – Cities Group | Scandinavia | BM | U, F & P | WS & WW | R | No
DANVA | Denmark | BM | U & F | WS | N | No
European Benchmarking Co-operation (EBC) | Europe | BM | U, F & P | WS & WW | R & I | Yes
Germany (several) | Germany | BM | U, F & P | WS & WW | N | Yes
NWWBI | Canada | BM | U, F & P | WS & WW | N | No
OEWAV | Austria | BM | U & F | WW | N | No
OVGW | Austria | BM | U & F | WS | N | Yes
QualServe | USA | BM | U | WS & WW | N | No
SEAWUN | South-East Asia | BM | U | WS | R | No
VEWIN | The Netherlands | BM | U, F & P | WS | N | No
WSAA | Australia | BM | F & P | WS | R & I | No
ADERASA | Latin America | PA | U | WS & WW | R | No
FIWA | Finland | PA | U | WS & WW | N | No
IBNET | World Bank | PA | U | WS & WW | I | No
Norsk Vann | Norway | PA | U | WS & WW | N | No
OFWAT | England & Wales | PA | U & F | WS & WW | N | No
Svensk Vatten | Sweden | PA | U | WS & WW | N | No
PAS | India | BM | U & F | WS & WW | N & R & L | No
FEDERGASAQUA | Italy | BM | U, F & P | WS & WW | R | No
IRAR & ERSAR | Portugal | BM | U | WS & WW | N | Yes
CARE-W | Europe | BM | U & F | WS | I | Yes
ALUSEAU | Luxembourg | PA | U & F | WS & WW | N | No
France (several) | France | BM | U, F & P | WS & WW | N | No
WaBe | Czech Republic | PA | U & F | WS | N | Yes
WSOP | Slovenia | BM | U & F | WS | N | Yes
AWSR | Slovak Republic | BM | U | WS | N | Yes
IPM | Republic of Serbia | PA | U | WS & WW | N & I | No

PA – Performance Assessment; BM – Benchmarking (Performance & Improvement); U – Utility level; F – Functions level (core process); P – Process level; WS – Water supply; WW – Wastewater; I – International; N – National; R – Regional; L – Local
2.7.1. The Benchmarking Program of
European Benchmarking Co-operation (EBC)
It is important to mention here the project of the European Benchmarking
Co-operation (EBC) in the field of water and wastewater benchmarking.
The drivers for the EBC initiative to carry out cross-national benchmarking
projects were, on the one hand, the discussions on liberalization and
privatization and, on the other, the demand for transparent and efficient
public services. The requirements of the EU directives also played an
important role, for example the full cost recovery required by the Water
Framework Directive (WFD).
Through international benchmarking, utilities can show stakeholders their
intention to optimize procedures, and the initiative gives them greater
opportunities for networking and identifying best practices. For these
purposes, EBC provides a platform for exchanging best practices in
management and operations, as well as benchmarking knowledge and
experience. The approach is fully supported by the International Water
Association (IWA).
The EBC survey is restricted to a manageable amount of data.
Because water utilities differ greatly (in size, type of activities,
technological development, etc.), EBC has developed a three-level performance
assessment model (Figure 2.6), from which participants can choose one of
three participation levels:
 Basic level: only service quality, finance and efficiency data, in
addition to context information, are evaluated.
 Standard level: at this level, the three remaining pillars, namely water
quality, reliability and sustainability, are added.
 Advanced level: at this level, the focus is directed at sustainability
and economy.
The numbers of input variables in the different sets needed to calculate the
performance indicators are shown in Figure 2.7.
Figure 2.6: EBC’s levels of participation (Cabrera et al., 2006; EBC, Web site:
http://www.waterbenchmark.org/content/benchmarking.html)
Figure 2.7: Wastewater sets of input variables and performance indicators for
basic, standard and advanced level of benchmarking, according to
Benchmarking Program of European Benchmarking Cooperation (EBC)
Up to now, 41 participants from 18 different countries have joined the EBC
project (Thaler and Dimitrova, 2009).
2.7.2. The Benchmarking Program of International Benchmarking Network for
Water and Sanitation Utilities (IBNET)
Two international initiatives on benchmarking of water supply and wastewater
utilities are very relevant for the utility benchmarking currently being
implemented in Serbia. The first is benchmarking by the model of the
International Benchmarking Network for Water and Sanitation Utilities
(IBNET), initiated in 1996, which has had great success in developing
countries. The IBNET methodology is currently used in Serbia through projects
implemented by the Inter-institutional Professional Network in the Water
Sector of Serbia (IPM), with financial support from the World Bank. The
second initiative has shown important success in developed countries: the
European Benchmarking Co-operation (EBC), initially established in 2006 by
the Scandinavian and Dutch water associations and several individual water
companies (Krstić, 2013).
The International Benchmarking Network for Water and Sanitation Utilities
(IBNET) helps to build the resources for meeting this demand and suggests
ways of providing improved services.
Through its performance-assessment standards and continually updated
database, IBNET serves as a global yardstick with which national policy
makers, the public, governments, municipalities, utilities, investors and
other users can compare and evaluate the performance of water and wastewater
utilities throughout the world.
IBNET provides a set of tools that allows water and sanitation utilities to
measure their performance both against their own past performance and
against the performance of similar utilities at the national, regional, and global
levels.
IBNET consists of three major tools. The first is the IBNET Data Collection
Toolkit, which can be downloaded from the IBNET Web site at
http://www.ib-net.org; this Excel spreadsheet indicates a set of data to be
completed and offers detailed instructions on the precise data to enter.
The second tool is a continuously updated database of water and sewerage
utilities’ performance. This database allows utilities and other sector
stakeholders to search for data in different formats and provides the means
for simple benchmarking of utility data. The benchmarking tool enables the
utility to compare itself to other utilities with similar characteristics (for
example, size, factors related to location, and management structure).
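The kind of peer-group selection described above can be sketched as a simple filter. The records, field names and size band below are invented for illustration and do not reflect IBNET's actual implementation:

```python
def peer_group(utilities, me, size_band=0.5):
    """Select utilities serving a similar population (within +/- size_band
    of the reference utility) in the same region."""
    lo = me["population"] * (1 - size_band)
    hi = me["population"] * (1 + size_band)
    return [u["name"] for u in utilities
            if u["name"] != me["name"]
            and lo <= u["population"] <= hi
            and u["region"] == me["region"]]

# Hypothetical utility records.
utilities = [
    {"name": "A", "population": 200_000, "region": "ECA"},
    {"name": "B", "population": 240_000, "region": "ECA"},
    {"name": "C", "population": 900_000, "region": "ECA"},
    {"name": "D", "population": 210_000, "region": "LAC"},
]

peers = peer_group(utilities, utilities[0])  # only "B" is a comparable peer
```

Further criteria (management structure, water source, etc.) would be added as extra filter conditions in the same way.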
The third tool provides data on participating agencies. This information helps
organizations interested in measuring utility performance to contact
neighbouring utilities and other organizations to build local networks for
performance assessment and benchmarking.
IBNET has three key aspects. The first is that participation is voluntary, with
the result that organizations contributing to IBNET are very diverse. They
include, for example, regulatory associations, national water associations,
government departments and agencies involved in monitoring urban water
supplies and sewerage utilities, and, more recently, individual utilities.
A second feature of IBNET is that it does not itself collect data. Rather, it sets
up mechanisms by which many different organizations conduct data
collection.
From its start, IBNET’s strategy has been to use a highly decentralized
approach. IBNET’s role is to provide instruments, such as the IBNET Toolkit,
to support this process. In its feedback, IBNET checks the quality of the data
to ensure internal consistency and helps participants to analyse the data.
The third key IBNET feature, one fairly rare among agencies involved in utility
benchmarking, is its focus on developing time-series data. Without time-series
data, trends in utility performance and the impact of water and sanitation
policies are difficult to detect. Effective development of time-series data
requires ensuring that the data remain comparable over time through the
rigorous use of a standardized data set and indicators as well as frequent
data updating. In IBNET practice, most of the data are updated every two
years. As performance assessment and benchmarking gain more
prominence in the sector as regulation and monitoring tools, obtaining data on
an annual basis is becoming easier, especially in countries with increasingly
institutionalized performance assessment. This database allows innovative
time-series performance analysis as well as cross-section analysis (Berg and
Danilenko, 2011; Krstić, 2013).
CHAPTER 3. PERFORMANCE ASSESSMENT
(METRIC BENCHMARKING)
The scope of this work is benchmarking and performance indicators. As
explained before, performance assessment is a part of the benchmarking
process. The following figure illustrates the steps in performance
assessment.
Figure 3.1: Illustration of typical flowchart in performance assessment
(Sjøvold et al., 2008)
Performance assessment is a widespread activity used in economics,
business, and sports and in many other areas of life in general, in order to
compare and score entities and individuals and make management decisions.
Assessment is defined as a “process or result of this process, comparing a
specific subject matter to relevant references” (ISO 24500).
Performance assessment is therefore any approach that allows evaluation of
the efficiency or the effectiveness of a process or activity through the
production of performance measures. (Sjøvold et al., 2008)
The ISO series 24500 provides the following definitions:
 Effectiveness is the extent to which the planned activities are realized
and the planned results achieved;
 Efficiency is the relationship between the result achieved and the
resources used.
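Read as ratios, the two ISO definitions can be expressed directly. The rehabilitation example below is an invented illustration, not taken from the standard:

```python
def effectiveness(achieved, planned):
    """Extent to which planned activities are realized (ISO 24500 sense)."""
    return achieved / planned

def efficiency(result, resources_used):
    """Result achieved per unit of resources used (ISO 24500 sense)."""
    return result / resources_used

# Hypothetical: 45 of 50 planned pipe rehabilitations completed,
# consuming 900 crew-hours.
eff = effectiveness(45, 50)        # 0.9: 90 % of the plan realized
per_hour = efficiency(45, 900)     # 0.05 rehabilitations per crew-hour
```

The distinction matters in practice: a utility can be effective (plan fulfilled) while being inefficient (high resource use), and vice versa.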
Performance measures are the specific parameters that are used to inform
assessment. (Sjøvold et al. 2008).
Figure 3.2: PI as a part of a performance measurement system, followed by
example for water supply service (Alegre et al., 2006)
Performance indicators are efficiency or effectiveness measures of the activity
of a utility (Sjøvold et al., 2008).
To repeat: performance assessment (metric benchmarking) is a quantitative
comparative assessment of company performance. It enables comparison between
peer utilities and is used for quantitative analysis (it answers the question
“Where am I?”).
Performance assessment (metric benchmarking) is a natural and intuitive way
of comparison. Because it is one of the last natural monopolies, the water
industry is a business that needs some form of regulation. This is why
performance assessment is a great tool for comparison and has gained great
importance; indeed, comparing performance indicators has become the natural
tool for regulators of the water industry worldwide.
All around the world, all industries carry out rudimentary metric
benchmarking (performance assessment). Nevertheless, simply comparing the
figures is not enough and does not by itself constitute metric benchmarking
(performance assessment). It is much more than that: it implies the analysis
of the results, a key phase needed to account for differences in local
conditions and to assess the impact of all factors on performance (Cabrera et
al., 2006).
Potential users (entities or “stakeholders”) of performance assessment in
water and wastewater services can be (Alegre et al., 2006; Matos et al.,
2003):
 The water and wastewater undertakings (regardless of ownership
status);
 The consumers or direct users;
 The indirect stakeholders (affected by impact on surrounding
environment);
 The pro-active stakeholders (environmental organisations, consumer
protection agencies and other pressure groups);
 The policy-making bodies (at local, regional or national level);
 The regulatory agencies (responsible for economics and quality of
service regulation);
 The auditors, financing bodies and agencies;
 The quality certifying organisations;
 The multi-lateral organisations.
Table 3.1: Scope of application of PI systems (Alegre and Baptista, 2002)

Scopes of application considered: exclusively within the undertaking; in the
framework of benchmarking initiatives; as part of a regulatory framework; as
part of contractual agreements; in the scope of Quality Certification
Systems; in the scope of Guaranteed Standard Schemes; in the scope of
statistical reports publicly available.

Users of PI systems covered by the table: water/wastewater utilities (all
seven scopes); policy-making bodies; regulatory agencies; financing bodies;
quality certifying entities; auditors; direct users and indirect and
pro-active stakeholders; supra-national organisations.
The following figures illustrate water (Alegre et al., 2006) and wastewater
(Matos et al., 2003) undertaking context and interconnection with potential
users of performance assessment.
Figure 3.3: Water undertaking context (Alegre et al., 2006)
Figure 3.4: Wastewater undertaking context
(Matos et al., 2003; Alegre et al., 1997)
Performance assessment of water utilities is not an easy job. The amount of
data present in a single utility can be overwhelming, which is why the number
of data items and PIs should be chosen very carefully. Performance
assessment can therefore be described as the art of simplification: the more
concise the data, the better; but an over-simplification of the whole picture
can provide insufficient information for making good decisions.
Indicators are a great tool to assess performance. The traditional ratio
combines at least two relevant variables measured in the real world and
provides significant information. By combining the adequate indicators, a
general picture of reality can be achieved. An indicator is a very intuitive tool,
and is easily understood. Indicators facilitate comparisons, as denominators
usually provide a size or quantity reference (Cabrera et al., 2006).
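The role of the denominator as a size reference can be shown with a short sketch; the failure counts and network lengths below are invented:

```python
def per_100km(events, mains_length_km):
    """Normalize an event count by network length so that utilities of
    different sizes become directly comparable."""
    return 100.0 * events / mains_length_km

# Hypothetical mains failures in one year, small vs large utility.
small = per_100km(events=24, mains_length_km=400)    # 6.0 failures per 100 km
large = per_100km(events=90, mains_length_km=3000)   # 3.0 failures per 100 km
# Raw counts (24 vs 90) suggest the large utility performs worse;
# the normalized indicator shows the opposite.
```

This is exactly the intuition in the paragraph above: the ratio, not the raw count, carries the comparable information.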
Performance indicators are useful only when they are compared to an
established reference. Without explanatory (additional) information, the
value of an indicator can be meaningless. Designing a performance assessment
system therefore requires that the method of comparison be clear.
The usage of indicators can serve different purposes (Cabrera et al., 2006):
 Assessing the fulfilment of objectives/targets. This depends on the
goals the utility wants to achieve. These objectives/targets should be
established in advance, with fixed indicator values that should be
reached.
 Trend analysis. If a utility, or part of it, wants to follow its
evolution over time, indicators provide trends and can even be used for
prediction. In this case, the indicators are compared to values of the
same indicators obtained in the past, delivering information about the
evolution in time and whether performance has improved or not.
 Peer comparison. A natural follow-up to any indicator system is to
compare the indicator values with those obtained by another utility. In
this case, the required system is more complex, as the analysis of the
results needs to take into account size and context differences.
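These three uses can be sketched side by side in a few lines of Python; the non-revenue water (NRW) series, target and peer values are illustrative assumptions:

```python
def target_met(value, target, lower_is_better=True):
    """Has a fixed, pre-established target been reached?"""
    return value <= target if lower_is_better else value >= target

def improving(series):
    """Trend analysis for a lower-is-better indicator: True if the
    latest value is below the first one."""
    return series[-1] < series[0]

def gap_to_peer_median(value, peer_values):
    """Peer comparison: difference to the peer median (positive means
    worse than the median for a lower-is-better indicator)."""
    s = sorted(peer_values)
    n = len(s)
    median = s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    return value - median

nrw = [31.0, 29.5, 27.2]            # our NRW (%) over three years
ok = target_met(nrw[-1], 28.0)      # True: the pre-set target is reached
better = improving(nrw)             # True: the trend is downwards
gap = gap_to_peer_median(nrw[-1], [18.0, 22.0, 35.0])  # about +5.2 points
```

A real peer comparison would additionally control for size and context differences, as the last bullet above notes.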
The following figure illustrates the implementation phases in metric
benchmarking (performance assessment) given by Sjøvold et al. (2008),
adapted from Cabrera (2001).
Figure 3.5: Metric benchmarking process
(Sjøvold et al., 2008; Cabrera, 2001)
1. Identify the goals and necessary resources
2. Find metric benchmarking partners
3. Define PI system and data collection procedures
4. Collect data, calculate and validate indicators
5. Results analysis
6. Communicate results
7. Identify potential improvement areas
8. Undertake improvement actions
The process moves through four phases: planning, execution, integration and
action.
CHAPTER 4. ISO STANDARDS SERIES
24500:2007 (24510, 24511 AND 24512) AND STANDARDIZATION OF
PERFORMANCE INDICATORS
The International Organization for Standardization (ISO) is a worldwide
federation of national standards bodies (ISO member bodies). The work of
preparing international standards is normally carried out through ISO technical
committees (TC). In 2007, the ISO TC 224 published the international
standard series ISO 24500:2007 (Koelbl, 2008).
The full series consists of the following international standards:
 ISO 24510 (2007): Activities relating to drinking water and wastewater
services — Guidelines for the assessment and for the improvement of
the service to users,
 ISO 24511 (2007): Activities relating to drinking water and wastewater
services — Guidelines for the management of wastewater utilities and
for the assessment of wastewater services,
 ISO 24512 (2007): Activities relating to drinking water and wastewater
services — Guidelines for the management of drinking water utilities
and for the assessment of drinking water services.
The objective of these international standards series is to provide the relevant
stakeholders with guidelines for assessing and improving the service to users,
and with guidance for managing water utilities, consistent with the overarching
goals set by the relevant authorities and by international intergovernmental
organizations.
ISO/TC 224 developed this series of standards, which is applicable on a
voluntary basis: the standards are not normative, but informative guidelines.
The standards were developed by consensus internationally, with the
objective of being applicable worldwide. All three standards share common
parts, including terminology, annexes and structure of PIs. The standards
recommend building PI Systems according to IWA recommendations (IWA
Manual of Best Practice Handbooks). The examples provided are mostly from
the IWA Handbooks.
The three standards share the same structure:
 Scope
 Components
 Objectives
 Guidelines for the Management
 Service Quality Assessment
 Related Performance Indicators (PIs)
 Use of PIs for Operation
ISO 24511:2007 and ISO 24512:2007 are twin standards, but ISO
24510:2007 is slightly different.
Table 4.1: Structure of Standards on service activities related to drinking
water supply and wastewater systems – comparison review
(ISO 24510:2007, 24511:2007, 24512:2007) and (Talib et al., 2005)
ISO/WD 24510 | ISO/WD 24511 | ISO/WD 24512
Scope | Scope | Scope
Normative References | Normative References | Normative References
Terms and Definitions | Terms and Definitions | Terms and Definitions
Components of Services | Components of Wastewater Systems | Components of Water Supply Systems
User’s Needs and Expectations | Management Components of a Wastewater Service | Management Components of a Drinking Water Supply Service
Performance Indicators | Wastewater Service Objectives | Drinking Water Supply Service Objectives
– | Guidance on the Management of Wastewater Service | Guidance on the Management of Drinking Water Supply Service
– | Service Assessment | Service Assessment
– | Performance Indicators | Performance Indicators
Components of wastewater system, according to ISO 24511:2007:
Figure 4.1: Fields of application of ISO 24511:2007 (Wastewater), source of
the schematic: IWA Performance Indicators for Wastewater Services
Types of wastewater system, according to ISO 24511:2007:
Figure 4.2: Fields of application of ISO 24511:2007 (Wastewater), source of
figure: based on a scheme from Hydroconseil, France, 2002.
Components of water supply system (drinking water), according to ISO
24512:2007:
Figure 4.3: Fields of application of ISO 24512:2007 (Drinking Water)
ISO 24511:2007 and ISO 24512:2007 introduce the term responsible body,
denoting the entity with the overall legal responsibility for providing
drinking water and/or wastewater services and for establishing the policy and
the general organization of the relevant water utility, for a given
geographic area.
The responsible body should establish the objectives, associated service
criteria and performance indicators for a wastewater utility, taking as a
basis the legal requirements of the relevant authorities and the expectations
of the users and other stakeholders, in conjunction with its operators.
Figure 4.4 illustrates the relevant relationships between stakeholders for
establishing objectives.
Figure 4.4: Relevant relationships between stakeholders for establishing
objectives, according to ISO standard 24511:2007 and 24512:2007
According to these two standards, the management of a wastewater utility requires:
 Formulation of objectives and service assessment criteria,
 Targeting the service assessment criteria by the use of a set of
performance indicators,
 Evaluation of the performance by measuring and assessment.
The ISO 24500:2007 series of standards gives definitions (the same as the IWA definitions) concerning Performance Indicators (PI); these are presented in Chapter 5.2.
NOTE: The responsible body (RB) and the operator can be the same body.
CHAPTER 5. PERFORMANCE INDICATORS SYSTEM
Definition
Performance Indicators (PI) represent a quantitative measure of a particular aspect of the undertaking’s performance or standard of service. They assist in the monitoring and evaluation of the efficiency and effectiveness of the undertaking, thus simplifying an otherwise complex evaluation (Makropoulos, 2009).
There are many Performance Indicator systems (PIs) worldwide. The ISO standards for PIs have already been presented in the previous section. IBNET (The International Benchmarking Network for Water and Sanitation Utilities) is one of them; it is used by the IPM (Inter-institutional Professional Network in the Water Sector of Serbia) initiative, and its results in the Republic of Serbia are commented on later.
In this chapter, the IWA (International Water Association) Performance Indicator systems for the water supply and wastewater sector are presented in detail. IWA offers a large number of PIs for drinking water and wastewater services (approximately 170 each), issued in two IWA handbooks (Alegre et al., 2006; Matos et al., 2003).
IWA manuals on performance indicators provide a structure that may prove to
be a valuable guide when building up such a system.
5.1. IWA PERFORMANCE INDICATORS SYSTEMS IN WATER SUPPLY
AND WASTEWATER SECTOR
Performance indicator systems (PIs) and benchmarking are instruments for
internal corporate management but also for comparisons of utilities on a
regional, national and international scale (Merkel, 2001).
Standardized performance indicator systems form the basis for corporate benchmarking; they evaluate all the tasks of a sustainable water supply and wastewater sector holistically, considering supply safety, supply quality, customer service, sustainability and efficiency. Such a “quasi-competition” on a voluntary basis can display performance, but also enables the derivation of measures for improvement (Hirner and Merkel, 2002).
According to these principles, a large number of benchmarking projects have
been carried out all over the world in the water supply sector over the last
decade.
At the end of the 1990s a committee of the International Water Association developed a system of performance indicators for water supply services (Alegre et al., 2000) and carried out several national field tests in order to
adapt the system to practical applications. Six years later, after a field test
with more than 70 undertakings worldwide, Alegre et al. (2006) published an
updated, improved version of the manual of best practice. Matos et al. (2003)
did the same for wastewater services, with the manual of best practice
published by IWA.
Undoubtedly, the IWA PI system is the state-of-the-art performance indicator system in the water supply and wastewater sector and is the basis for many projects worldwide, although individual adaptations (e.g. additional PIs) to the frame conditions of single countries may be useful.
The main objective of both manuals is to provide guidelines for the
establishment of a management tool for water supply and wastewater utilities
based on the use of performance indicators. Further objectives are to provide
a coherent framework of indicators for benchmarking initiatives but also for
regulatory agencies and international statistics collected by the IWA (Alegre et
al., 2006; Matos et. al., 2003).
This chapter gives an overview of the IWA performance indicator system for
water supply and wastewater services described by Alegre et al. (2006) and
Matos et al. (2003).
5.2. ELEMENTS OF THE PI SYSTEM
The methodology of the data elements of the PI system is the same for water supply and wastewater services, and they are explained here for both. The structure of the PIs for the two services is explained further on.
Figure 5.2.1: Structure of PI system – six separate categories (Environmental/Water Resources, Personnel, Physical, Operational, Quality of Service, Economic and Financial).
The PI system consists of four types of data elements, each of them with
different rules within the system:
 variables;
 performance indicators (PI);
 context information (CI);
 explanatory factors.
Variables
Variables are the data elements from which the performance indicators are calculated. The variables are values (resulting from a measurement or record) expressed in a specific unit (e.g. “length of mains”, unit: km; “average service pressure head”, unit: m; “total sub-process costs”, unit: €/a).
Confidence grades indicate the data quality for each variable.
Variables should fulfil the following requirements:
 have univocal definitions,
 be reasonably achievable,
 refer to the same geographical area and the same assessment period as the PI and CI they are used for, and fit their definition,
 be as reliable and accurate as the decisions made based on them require.
Figure 5.2.2: General outline of IWA variable definition,
according to Alegre et al. 2002 and Alegre et al. 2006
Figure 5.2.3: IWA variable definition, classification and description,
according to Alegre et al. 2002 and Alegre et al. 2006
Some of the variables are external data and mainly informative, and their
availability, accuracy, reference dates and limits of the corresponding
geographical area are generally out of the control of the utility. In this case,
variables should also:
 whenever possible be collected from official sources, and
 be essential for the performance indicator assessment or interpretation.
Performance indicators (PI)
Performance Indicators are measures of the efficiency and effectiveness of a utility in achieving its objectives; they result from the combination (ratio) of several variables. Each PI should express the level of performance
achieved in a certain area and during a given assessment period (e.g. one
year). A clear processing rule should be defined for each performance
indicator to specify all the variables required and their algebraic combination.
As with variables, the performance indicators also consist of values expressed
in specific units and confidence grades indicate the quality of data
represented by the indicator. Performance indicators are typically expressed
as ratios between variables.
These ratios may be commensurate (e.g. “non-revenue water”, unit: %) or non-commensurate (e.g. “total process costs”, unit: €/km or €/100 service connections, or indicators expressed in l/conn·d). In general, the latter case allows a better performance comparison, because the denominators represent the dimension of the water supply or wastewater system (e.g. number of service connections or total mains length). This allows for comparisons through time, or between systems of different sizes.
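As a minimal sketch of how a PI is computed from variables, both a commensurate ratio and a per-connection normalization can be written in a few lines. All variable names and values below are hypothetical, chosen only for illustration:

```python
# Sketch: computing performance indicators as ratios of variables.
# Variable names and values are hypothetical, for illustration only.

system_input_volume = 5_000_000   # m3/year (from metering records)
revenue_water = 3_900_000         # m3/year (billed volume)
service_connections = 42_000      # number of service connections
days = 365

# Commensurate ratio: numerator and denominator share the same unit,
# so the result is a percentage.
non_revenue_water_pct = 100 * (system_input_volume - revenue_water) / system_input_volume

# Non-commensurate ratio: losses normalized per connection and per day
# (l/conn·d), which makes systems of different sizes comparable.
losses_l_per_conn_day = ((system_input_volume - revenue_water) * 1000
                         / service_connections / days)

print(f"Non-revenue water: {non_revenue_water_pct:.1f} %")
print(f"Losses: {losses_l_per_conn_day:.1f} l/conn·d")
```

The per-connection form illustrates why non-commensurate PIs travel better between utilities: the denominator scales with the size of the system.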
Figure 5.2.4: General outline of the PI definitions within the IWA Manual,
according to Alegre et al. 2002 and Alegre et al. 2006
Figure 5.2.5: PI identification and classification, description, terms and
processing rules, according to Alegre et al. 2002 and Alegre et al. 2006
Performance indicators should fulfil the following requirements. They should
be:
 clearly defined with a concise and unequivocal meaning,
 reasonably achievable (depending on the related variables) at a
reasonable cost,
 auditable,
 as universal as possible and provide a measure which is independent
from the particular condition of the utility,
 simple and easy to understand,
 quantifiable so as to provide an objective measurement of the service,
avoiding any personal or subjective appraisal,
 every PI should provide information significantly different from other
PIs,
 only PIs, which are deemed essential for effective performance
evaluation, should be established.
Context Information (CI)
These data elements provide information on the characteristics of an
undertaking and account for differences between water supply/wastewater
systems.
There are two possible types of context information:
 External factors that cannot be changed by management decisions. This information describes the frame conditions of a system (e.g. geography, demography, topography, climate), which are relatively constant through time and are not under the control of the utility.
 Data elements that are not modifiable by management decisions in the short or medium term, but which management policies can influence in the long term (e.g. the condition of the infrastructure of a system, pipe material).
Context information is necessary when comparing differently structured systems and supports cause analyses.
The requirements for context information are, in general, the same as for performance indicators and variables, even if the level of detail and the confidence grading are not the same. Context information should:
 have univocal definitions,
 be reasonably achievable,
 if external, be collected whenever possible from official survey departments,
 be fundamental for the interpretation of PIs,
 be kept as few as possible.
The context information related to the undertaking and to the region profiles is
rather alike in both cases (water supply and wastewater services). The
system profiles have the same organization, although the contents are
different (Duarte et al. 2003).
Explanatory factors
Explanatory factors are key elements of PI systems; they are used to explain PI values and also for the grouping of comparable water supply/wastewater systems. Explanatory factors may be context information, variables or PIs (e.g. average age of network, service connection density or network delivery rate).
Figure 5.2.6: Illustration of components (data elements) of a performance
indicators system (Sjøvold et al. 2008 and adapted Alegre et al. 2006)
The figure distinguishes: data elements (all information in the system); variables (data elements used to calculate PIs); and explanatory factors (data elements used to explain PIs), comprising context information (data elements not modifiable by management) and PIs used to explain other PIs.
5.3. ADVANTAGES OF IWA PERFORMANCE INDICATORS SYSTEM
There are several advantages to using the PIs proposed by IWA for water supply and wastewater services (Cabrera et al., 2006):
 The system is as universal as possible and the proposed indicators
have become an industry standard.
 All indicators fulfill the previously listed requirements. The indicators published by IWA went through a revision process involving many contributors and are proven in practice.
 The indicators and the corresponding variables are well defined. IWA
definitions may not be perfect, but they are detailed enough to
guarantee that the debate will only concern project specific details.
 The IWA PIs can be used as a set of indicators chosen off the shelf, or at least as a base to be modified to create new ones.
 The structure of a performance assessment system provided by IWA is
a framework, which allows adding, replacing or modifying indicators
with the assurance of the system being coherent and compatible with
other systems in the world.
A very important fact should be stated here: the number of performance indicators should be well balanced. Too many indicators will significantly increase the costs and difficulties of implementing the system. Too few will result in the system not being able to provide a proper assessment of the performance of the undertaking in the terms defined by the objectives and the selected strategies (Alegre et al., 2006).
First, the goals and targets to achieve should be established, and the PIs should then be chosen according to those objectives. The reverse process is a serious mistake: if the PIs are chosen first, without the objectives, the result will be tracking PIs without a straight and clear idea of what the improvement should be.
Another recommendation is to start with no more than about 20 PIs and then to review the system, modifying, tailoring and adapting new PIs or changing the starting ones according to the objectives.
Both PI systems aim to be a kind of ‘yellow pages’ where the undertaking
managers may find relevant indicators regarding all their key activities. After
defining the object of the performance assessment (e.g. the undertaking as a
whole, water losses, rehabilitation, etc.), the intended use of the results (e.g.
objective-oriented management) and the type of initiative (e.g. internal
analysis, benchmarking, reporting for a regulator), undertaking managers may
select the subset of the IWA indicators considered relevant to respond to their
needs (Duarte et al., 2003).
5.4. IMPLEMENTATION OF IWA PERFORMANCE INDICATORS SYSTEM
In this chapter, implementation process of IWA Performance Indicators
System is going to be presented by illustration in the following figures.
The implementation of any Performance Indicators system has to be
objective-oriented. Performance Indicators are the last step of a larger
management strategy that should link the undertaking’s objectives to
strategies, define critical success factors and only then bring performance
indicators both as means to evaluate the success of these strategies and as a
mechanism to detect problems in advance. Objectives need to be precise and
clear. They need to be both demanding and realistic and most importantly
reflect the mission and vision of the company (Alegre et al., 2006).
After establishing the objectives, the implementation process can be carried out. The implementation process is the same for both water supply services and wastewater services, and consists of three phases:
 Strategy,
 Development and
 Assessment.
There is no crucial difference between the next two figures: Alegre et al. first presented the methodology in 2000, and Matos et al. (2003) adapted, improved and upgraded it with additional detail.
Figure 5.4.3 shows the sub-processes of the PI and CI selection procedure, and Figure 5.4.4 shows, as a sub-process, an example of a possible solution for the data flows concerning PI and CI and for team responsibilities.
All the processes presented in the figures are intended to form a continuous improvement process within the utility. Defining objectives and strategies to reach them is a periodic task in the management of any organization at any level.
Figure 5.4.1: Phases of the PI system implementation process for water
supply services (Alegre et al., 2006)
Strategy: appoint a strategic team; define objectives, strategies and critical success factors; define a PI team profile; appoint a PI team.
Development: identify suitable PI system elements for the critical success factors and prepare the SIGMA Lite file; establish data management routines, schedules and responsibilities; carry out a pilot test (with SIGMA Lite), refining the selection if the set of PIs is not optimal for the critical success factors.
Assessment: data collection, validation, input and assessment; result interpretation according to objectives, and global reporting; determination of success in achieving objectives and preparation of new strategies.
Figure 5.4.2: Phases of the PI system implementation process for wastewater
services (Matos et al. 2003, adapted from Alegre 2000 and 2002)
Phase 1 – Definition of the strategic performance assessment policy: definition of the objectives, of the scope of application and of the PI team profile; appointment of a PI team; adoption of the IWA Manual of Best Practice “Performance Indicators for Wastewater Services”.
Phase 2 – Selection of PIs to be assessed: PI significance level assignment; selection of performance indicators (PI) and context information (CI) (see the separate figure on the PI and CI selection procedure); definition of data collection and PI assessment frequencies; preparation of the SIGMA Lite file by selecting PI and CI; definition of the internal data flows and team-member responsibilities (see the separate figure with an example of data flows and team responsibilities); pilot test with the SIGMA Lite WW software, with improvements fed back into the selection; elaboration of written procedures for data collection; list of important PIs.
Phase 3 – Implementation of the important PIs to be assessed: full implementation and PI assessment, with continuous improvements.
Figure 5.4.3: PI and CI selection procedure
(Alegre et al. 2006 and Matos et al. 2003)
For each pre-selected significant PI and CI item, the data required are identified. If all necessary data are available, their reliability and accuracy are checked: if acceptable, the selection of the PI or CI is confirmed; if not, data collection procedures are modified or added where this is efficient, otherwise the item is rejected. If the necessary data are not available, the item is kept only if it is efficient to obtain the data and the PI or CI is really needed, in which case data collection procedures are modified or added; otherwise the PI or CI is rejected.
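The decision logic of the selection procedure can be sketched as follows. This is a simplified reading of the flowchart; the function and flag names are illustrative, not from the manuals:

```python
# Sketch of the PI/CI selection logic from Figure 5.4.3.
# Function and parameter names are hypothetical, for illustration only.

def select_item(kind, data_available, worth_obtaining, reliable_and_accurate):
    """Return 'confirm', 'reject' or 'collect' for one pre-selected PI/CI item.

    kind                  -- 'PI' or 'CI'
    data_available        -- is all necessary data available?
    worth_obtaining       -- is it efficient to obtain the data / really needed?
    reliable_and_accurate -- are reliability and accuracy acceptable?
    """
    if not data_available:
        if worth_obtaining:
            return "collect"   # modify/add data collection procedures first
        return "reject"        # reject the PI or CI
    if reliable_and_accurate:
        return "confirm"       # confirm selection of the PI and CI
    if worth_obtaining:
        return "collect"       # improve collection, then re-assess
    return "reject"

print(select_item("PI", data_available=True, worth_obtaining=True,
                  reliable_and_accurate=True))   # confirm
```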
Figure 5.4.4: Example of data flows concerning PI and CI and team
responsibilities (Alegre et al. 2006 and Matos et al. 2003)
Top management: defines the strategic performance assessment policy and the guidelines for the selection of PIs and related CIs; pre-selects the PIs and related CIs; decides on the PI and CI use and on the set of PIs and related CIs to be assessed; refines the PI and CI listings; decides on improvements to be carried out on data collection procedures; interprets the PI results; decides on improvement measures in the undertaking and on the next steps according to the intended uses.
PI team: checks data availability in the departments and the possibility of assessing the pre-selected PIs; proposes refinements of the PI and CI listings; selects possible additional PIs (more detailed information) and the relevant variables; prepares the data collection forms and the SIGMA Lite/SIGMA Lite WW file; collects the data from the departments, enters them into SIGMA Lite/SIGMA Lite WW, calculates the PIs and outputs the SIGMA spreadsheets; prepares reports on PI and CI.
Departments: verify data availability (value, reliability and accuracy), each person responsible within a department verifying their own data; collect the data and fill in the data collection forms.
5.5. STRUCTURE OF IWA PERFORMANCE INDICATORS SYSTEM FOR
WATER SUPPLY SERVICES
Within the IWA PI system for water supply, the performance indicators are
structured into six main groups (following Table 5.5.1):
 Water Resources (WR),
 Personnel (Pe),
 Physical (Ph),
 Operational (Op),
 Quality of Service (QS) and
 Economic and Financial (Fi).
These main groups are divided into subgroups and some of the indicators are
broken down into sub-indicators.
Table 5.5.1: IWA PI system structure for water supply services
(Group code – main PI group: subgroups with number of PIs, sub-indicators in parentheses; main group total.)
WR – Water Resources: no subgroup 3 (+1); main group total 3 (+1).
Pe – Personnel: Total personnel 2; Personnel per main function 5 (+2); Technical services personnel per activity 6; Personnel qualification 3; Personnel training 1 (+2); Personnel health and safety 2 (+2); Overtime work 1; main group total 20 (+6).
Ph – Physical: Water treatment 1; Water storage 2; Pumping 4; Valve, hydrant and meter availability 2 (+4); Automation and control 2; main group total 11 (+4).
Op – Operational: Inspection and maintenance 6; Instrumentation and calibration 5; Electrical and signal transmission equipment inspection 3; Vehicle availability 1; Rehabilitation 2 (+5); Operational water losses 3 (+4); Failure 6; Water metering 4; Water quality monitoring 1 (+4); main group total 31 (+13).
QS – Quality of Service: Service coverage 3 (+2); Public taps and standpipes 4; Pressure and continuity of supply 8; Quality of supplied water 1 (+4); Service connection and meter installation and repair 3; Customer complaints 5 (+4); main group total 24 (+10).
Fi – Economic and Financial: Revenue 1 (+2); Cost 1 (+2); Composition of running costs per type of costs (+5); Composition of running costs per main function of the water undertaking (+5); Composition of running costs per technical function activity (+6); Composition of capital costs (+2); Investment 1 (+2); Average water charges 2; Efficiency 9; Leverage 2; Liquidity 1; Profitability 4; Economic water losses 2; main group total 23 (+24).
Total number of PIs (+ sub-indicators): 112 (+58).
5.6. STRUCTURE OF IWA PERFORMANCE INDICATORS SYSTEM FOR
WASTEWATER SERVICES
In this chapter, the structure of the IWA Performance Indicators system for wastewater services is elaborated and focused on in more detail than that of the system for water supply services.
There are many possible ways to define a wastewater system. Figures 4.1 and 4.2 show the main components and linkages from an infrastructure point of view. Another way is presented in Figure 5.6.1, which illustrates the way in which a wastewater flow balance may be determined. Key to implementation is the construction of a flow balance for the catchments served. Flow, conveyed solids and other waterborne substances are illustrated as a simplistic wastewater balance, or flux: the inputs to the system are equal to the sum of the system losses and the outputs from the system. The water services PI manual provides definitive information about establishing a water balance (Matos et al. 2002a and Matos et al. 2003).
Figure 5.6.1: Wastewater balance (Matos et al. 2002a and Matos et al. 2003)
Inputs: stormwater (runoff, from properties, from industry & commerce, from wrong connections, imported, infiltration); sanitary sewage (authorized, wrong connections, from on-site systems, imported); industrial inputs; imported sludge; sewer solids extracted and re-introduced downstream.
Losses: stormwater (wrong connections, exfiltration); sanitary sewage (wrong connections); SSO spills; CSO spills; solids extracted from systems and disposed off-site; on-site systems (local disposal); on-site stormwater handling (local disposal), source control, BMP, SUDS. (The on-site services are usually not the responsibility of the wastewater undertaking.)
Outputs: treated effluents (to watercourses, to land, reused, exported); untreated effluents (to watercourses, to land, exported); sludge (wasted and removed/disposed, reused).
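The wastewater balance of Figure 5.6.1 can be checked numerically as a short sketch. All figures below are invented for illustration, and the balance is written here as inputs = losses + outputs:

```python
# Sketch: checking a simplified wastewater flow balance (m3/day).
# All values are hypothetical; the categories follow Figure 5.6.1.

inputs = {
    "stormwater_runoff": 4000, "infiltration": 1500,
    "sanitary_sewage_authorized": 9000, "industrial_inputs": 2500,
}
losses = {
    "exfiltration": 300, "cso_spills": 700,
}
outputs = {
    "treated_effluent": 14500, "sludge_removed": 1500,
}

total_in = sum(inputs.values())
total_losses = sum(losses.values())
total_out = sum(outputs.values())

# Balance: inputs = losses + outputs; in practice a tolerance is needed
# because every component carries measurement error.
imbalance = total_in - (total_losses + total_out)
print(f"inputs={total_in}, losses={total_losses}, "
      f"outputs={total_out}, imbalance={imbalance}")
```

A non-zero imbalance flags either a measurement problem or a missing component (e.g. unrecorded exfiltration or wrong connections).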
The services provided by the undertaking must be related to the users of those services; the inter-relationship between the customers served by the wastewater undertaking and the direct impacts arising from those services is illustrated below. Figure 5.6.2 shows how the various customers (including industry) utilise the services and how the population and other customer base can comprise residents and temporary (visiting) service users. It also shows that parts of the population and other potential customers may manage their own wastewater services on-site or otherwise. The non-users of the services are represented as those not served (Matos et al. 2003).
Figure 5.6.2: Illustration of service provision in terms of customers and main
impacts (Matos et al. 2003)
The service population equivalent in the area comprises the resident population, the seasonal population (period equivalent), the industry, commercial, services and other population equivalents, and the imported-wastewater population equivalent. This population is split into those not served, those with on-site systems that are under the responsibility of the wastewater undertaking, and those connected to the sewer system. The wastewater produced passes to wastewater drainage and/or treatment (WWTP) within the area under the wastewater undertaking’s responsibility and is discharged to the environment (surface water, ground water, soil and air), either compliant or non-compliant with the discharge consents.
Performance Indicators for Wastewater services
The IWA Performance Indicators system for wastewater is structured into six
separate categories of performance:
Table 5.6.1: Structure of the PI framework (Matos et al. 2003)
wEn – Environmental indicators
wPe – Personnel indicators
wPh – Physical indicators
wOp – Operational indicators
wQS – Quality of service indicators
wFi – Economic and financial indicators
The interpretation of the performance of an undertaking cannot be carried out without taking into account the context in which it operates. In addition, it is also necessary to consider the characteristics of the infrastructure and resource system and the characteristics of the region in which the services are provided. With this in mind, the structure of the PI system also includes profiles for the context, the system and the region.
Figure 5.6.3: Structure of wastewater CI and PI
(Matos et al. 2002a and Matos et al. 2003)
The complete list of Performance Indicators for wastewater is given in the APPENDIX, according to Matos et al. 2003, Cabrera et al. 2006 and Matos et al. 2002b.
The figure shows the context information (external data and undertaking information), organized into the undertaking profile, the system profile and the region profile, alongside the performance indicators grouped as environmental, personnel, physical, operational, quality of service, and economic and financial.
Table 5.6.2: IWA PI system structure for wastewater services
(Group code – main PI group: subgroups with number of PIs, sub-indicators in parentheses; main group total.)
wEn – Environmental indicators: Wastewater 5; Solid residues 7 (+3); main group total 12 (+3).
wPe – Personnel indicators: Total personnel 2; Personnel per main function 5 (+2); Technical personnel per activity 5; Personnel qualification 2; Personnel training 1; Personnel vaccination and safety 3 (+1); Absenteeism 1 (+2); Overtime work 1; main group total 20 (+5).
wPh – Physical indicators: Wastewater treatment 4; Sewers 3; Pumping headroom 3; Automation and control 2; main group total 12.
wOp – Operational indicators: Sewer inspection and maintenance 5; Tanks and CSOs inspection and maintenance 4; Pumps and pumping stations inspection 2; Equipment calibration 3; Electrical and signal transmission equipment inspection 3; Energy consumption 3; Sewer system rehabilitation 4 (+3); Pump rehabilitation 2; Inflow/infiltration/exfiltration (I/I/E) 4; Failures 8 (+1); CSO control 1; Wastewater and sludge quality monitoring 3 (+7); Vehicle availability 1; Safety equipment 2; main group total 45 (+11).
wQS – Quality of service indicators: Population served 4; Treated wastewater 1 (+4); Flooding 5; Interruptions 1; Reply to customer requests 3; Complaints 2 (+7); Third party damages 1; Impact on traffic 1; main group total 18 (+11).
wFi – Economic and financial indicators: Revenues 2 (+2); Costs 2 (+4); Composition of running costs per type of costs 5; Composition of running costs per main function (internal and outsourced) 5; Composition of running costs per technical function activity 4; Composition of capital costs 2; Investment 1 (+2); Efficiency indicators 9; Leverage indicators 2; Liquidity indicators 1; Profitability indicators 4; main group total 37 (+8).
Total number of PIs (+ sub-indicators): 144 (+38).
Context Information for Wastewater services
The Context Information data are organized into sectors for the undertaking profile, the system profile and the region profile.
The undertaking profile outlines the organizational structure for the
Undertaking. The system profile focuses mainly on the type of
water/wastewater infrastructure and service provided, i.e. the physical assets,
technology used and type of customer. The latter goes into more detail than
the other profiles because it also contains descriptive information that is
helpful for the interpretation of the PIs. The region profile is essential for
meaningful comparisons between undertakings as it allows for a better
understanding of the demographic, economic, geographical and
environmental context (Matos et al. 2002a and Matos et al. 2003).
Figure 5.6.4: Context Information data for wastewater services
(Matos et al. 2002b)
Undertaking profile: undertaking identification; geographical scope; type of activity; type of assets ownership; type of operations; total personnel; annual revenue and annual total costs; outsourcing costs (management and support, financial and commercial, planning and design, construction, operation and maintenance, and laboratory services); average annual investment; service taxes or charges.

System profile – Service data: types of system managed by the undertaking; population, population served by different types of systems or unserved; peak population served; catchment area and impermeable area; annual average daily dry weather flow; industrial wastewater; imported/exported wastewater; daily peak factor; level of treatment (without treatment, preliminary, primary, secondary and tertiary treatment); sludge production, treatment and disposal. Customer service: complaint record systems; guaranteed standards scheme. Physical assets: total sewer length and sewer length increase; combined sewer systems, separate domestic and separate storm water sewers, pump mains and other sewers; sewer materials, diameters (or equivalent) and age; manholes, sewer overflows, sewer gullies and sea outfalls; sewer connections (domestic, industrial pre-treatment facilities, septic tank and other connections); storage (storage tanks, stormwater storage tanks, other and total storage volume); pumping stations (number, total capacity, pumped wastewater); wastewater treatment plants (number by size and type of upstream system); treatment capacity (preliminary, primary, secondary and tertiary treatment capacities); peak flow storage capacity at WWTP; pumping capacity; sludge production and sludge treated. Technological resources: computerized information systems (planning and decision, inspection, maintenance, customer complaints, other); monitoring, automation and control (flow meters, quality monitors, pumping, treatment, monitoring and control, integrated control). Mapping: updated mapping and digital mapping.

Region profile – Demography and economics: population density; household occupancy; population growth rate (current and forecast); Gross National Product per capita; inflation rate; yearly working time; unemployment rate. Environment: yearly rainfall (average, minimum and maximum); short duration rainfall (10 and 60 minute, 10 year return period); air temperature (daily average, minimum and maximum); topography (maximum and minimum altitude). Receiving bodies: types and special protected areas.
Variables for Wastewater services
Variables are marked with capital letters, from A to H.
Variables are divided in sections, as follows:
 Section A, Environmental Data, marked as wAi (from wA1 to wA26), 26
in total;
 Section B, Personnel Data, marked as wBi (from wB1 to wB27), 27 in
total;
 Section C, Physical Assets Data, marked as wCi (from wC1 to wC33),
33 in total;
 Section D, Operational Data, marked as wDi (from wD1 to wD69), 69 in
total;
 Section E, Demography (and Customer) Data, marked as wEi (from
wE1 to wE8), 8 in total;
 Section F, Quality of Service Data, marked as wFi (from wF1 to wF26),
26 in total;
 Section G, Economic and Financial Data, marked as wGi (from wG1 to
wG52), 52 in total and
 Section H, Time Data, marked as wHi (wH1), 1 in total.
The total number of variables is large (242 in total), but this does not mean that all of them are needed. As already mentioned, the first phase is establishing the objectives; the PIs are then chosen according to those objectives of the undertaking, and only the variables for those PIs are needed. In practice, the number of variables actually used is therefore kept as small as possible.
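The section counts above can be tallied in a few lines as a quick arithmetic check (the dictionary keys are just shorthand labels for the sections):

```python
# Quick check: total number of wastewater variables per section (A-H).
sections = {
    "wA environmental": 26, "wB personnel": 27, "wC physical assets": 33,
    "wD operational": 69, "wE demography/customer": 8,
    "wF quality of service": 26, "wG economic/financial": 52, "wH time": 1,
}
total = sum(sections.values())
print(total)  # 242
```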
5.7. PERFORMANCE INDICATORS AND RELATED COMPONENTS -
CONFIDENCE-GRADING SCHEME
The confidence-grading scheme is the same for any Performance Indicators system; it was standardized in the ISO 24500:2007 series, which adopted it from IWA. In the literature, reference books such as Alegre et al. (2006) and Matos et al. (2003) cite the same structure and methodology.
This chapter has been adopted, without any changes, from ISO 24510:2007 Annex B, ISO 24511:2007, ISO 24512:2007 Annex E, and also from Alegre et al. 2002, Alegre et al. (2006) and Matos et al. (2003):
Reliability bands

Table 5.7.1: Reliability bands of data for PI system

A – Highly reliable: Data based on sound records, procedures, investigations or analyses that are properly documented and recognized as the best available assessment methods.
B – Reliable: Generally as in band A, but with minor shortcomings, e.g. some of the documentation is missing, the assessment is old, or some reliance is placed on unconfirmed reports or extrapolations.
C – Unreliable: Data based on extrapolation from a limited sample for which band A or B data is available.
D – Highly unreliable: Data based on unconfirmed verbal reports and/or cursory inspections or analyses.
Accuracy bands

Accuracy is defined as the closeness of agreement between the result of a given measurement and the (conventionally) correct value of the variable being measured. The accuracy bands presented below are based on the system adopted in England and Wales.

They are to be applied to the measurement, not to the measuring equipment; for example, in some cases the equipment may be highly accurate but is used out of range. Whenever the measurement accuracy cannot be assessed, it should be graded as greater than 100%.
Table 5.7.2: Accuracy bands of data for PI system

1 – Error (%): [0; 1]: Better than or equal to +/- 1%
2 – Error (%): ]1; 5]: Not band 1, but better than or equal to +/- 5%
3 – Error (%): ]5; 10]: Not bands 1 or 2, but better than or equal to +/- 10%
4 – Error (%): ]10; 25]: Not bands 1, 2 or 3, but better than or equal to +/- 25%
5 – Error (%): ]25; 50]: Not bands 1, 2, 3 or 4, but better than or equal to +/- 50%
6 – Error (%): ]50; 100]: Not bands 1, 2, 3, 4 or 5, but better than or equal to +/- 100%
Error (%): > 100: Values which fall outside the valid range, such as > 100%, or small numbers
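The band boundaries above are half-open intervals ]lower; upper], i.e. lower &lt; error &lt;= upper. A minimal sketch of the band assignment (an illustration only, not part of any standard tooling):

```python
# Accuracy-band thresholds from Table 5.7.2: band n applies when the
# measurement error (%) satisfies lower < error <= upper.
ACCURACY_BANDS = [(1, 1.0), (2, 5.0), (3, 10.0), (4, 25.0), (5, 50.0), (6, 100.0)]

def accuracy_band(error_percent):
    """Return the accuracy band (1-6) for a measurement error in percent,
    or None for values outside the valid range (graded as > 100 %)."""
    for band, upper in ACCURACY_BANDS:
        if error_percent <= upper:
            return band
    return None

print(accuracy_band(0.5))    # band 1
print(accuracy_band(7.0))    # band 3
print(accuracy_band(150.0))  # None: graded as "greater than 100 %"
```

An unassessable measurement accuracy maps to the same `None` outcome, matching the rule that it should be graded as greater than 100%.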
Overall confidence grades

The confidence grade (c.g.) is an alphanumeric code that couples the reliability band with the accuracy band, for instance:

C4 – Data based on extrapolation from a limited sample (Unreliable, band C), which is estimated to be within +/- 25% (Accuracy band 4).

The reliability and accuracy bands form the matrix of confidence grades shown below:
Table 5.7.3: Matrix of confidence grades, according to ISO standard series 24511:2007 (Matos et al. 2003)

Accuracy bands (%)    Reliability bands
                      A    B    C    D
[0; 1]                A1   ++   ++   ++
]1; 5]                A2   B2   C2   ++
]5; 10]               A3   B3   C3   D3
]10; 25]              A4   B4   C4   D4
]25; 50]              ++   ++   C5   D5
]50; 100]             ++   ++   ++   D6

NOTE: ‘++’ indicates confidence grades that are considered to be incompatible
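Coupling a reliability band with an accuracy band, while rejecting the combinations marked ‘++’, can be sketched as follows (an illustration assuming the matrix in Table 5.7.3; the function name is a hypothetical helper, not standard software):

```python
# Compatible reliability/accuracy combinations from Table 5.7.3;
# any pair not listed here is marked '++' (incompatible).
COMPATIBLE = {
    "A": {1, 2, 3, 4},
    "B": {2, 3, 4},
    "C": {2, 3, 4, 5},
    "D": {3, 4, 5, 6},
}

def confidence_grade(reliability, accuracy):
    """Couple a reliability band (A-D) with an accuracy band (1-6)
    into an alphanumeric confidence grade, e.g. 'C4'."""
    if accuracy not in COMPATIBLE.get(reliability, set()):
        raise ValueError(f"{reliability}{accuracy} is an incompatible combination")
    return f"{reliability}{accuracy}"

print(confidence_grade("C", 4))  # → C4
```

The incompatible cells encode a plausibility check: highly reliable data (band A) should not carry errors above 25%, while highly unreliable data (band D) cannot credibly claim accuracy better than 5%.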
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic
BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic

More Related Content

Similar to BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic

INVESTIGATION INTO SOME OF THE ENGINEERING PROPERTIES OF SOILS FOUND IN MOJO ...
INVESTIGATION INTO SOME OF THE ENGINEERING PROPERTIES OF SOILS FOUND IN MOJO ...INVESTIGATION INTO SOME OF THE ENGINEERING PROPERTIES OF SOILS FOUND IN MOJO ...
INVESTIGATION INTO SOME OF THE ENGINEERING PROPERTIES OF SOILS FOUND IN MOJO ...Mastewal Getahun
 
Msc Development Sudies Dissertation
Msc Development Sudies DissertationMsc Development Sudies Dissertation
Msc Development Sudies DissertationNhlanhla Mlilo
 
An Integrated Library System On The CERN Document Server
An Integrated Library System On The CERN Document ServerAn Integrated Library System On The CERN Document Server
An Integrated Library System On The CERN Document ServerAmanda Summers
 
đáNh giá công tác quản lý môi trường tại công ty thực phẩm ping rong – bình v...
đáNh giá công tác quản lý môi trường tại công ty thực phẩm ping rong – bình v...đáNh giá công tác quản lý môi trường tại công ty thực phẩm ping rong – bình v...
đáNh giá công tác quản lý môi trường tại công ty thực phẩm ping rong – bình v...TÀI LIỆU NGÀNH MAY
 
Emona-based Interactive Amplitude Modulation/Demodulation iLab
Emona-based Interactive Amplitude Modulation/Demodulation iLabEmona-based Interactive Amplitude Modulation/Demodulation iLab
Emona-based Interactive Amplitude Modulation/Demodulation iLabHuynh MVT
 
Lusaka Ecological Sanitation Conference Final report 2004hpm
Lusaka Ecological Sanitation Conference Final report 2004hpmLusaka Ecological Sanitation Conference Final report 2004hpm
Lusaka Ecological Sanitation Conference Final report 2004hpmCharles Bwalya
 
MSc Thesis: Ecosystem Services of Tropical Silvopastoral Systems
MSc Thesis: Ecosystem Services of Tropical Silvopastoral SystemsMSc Thesis: Ecosystem Services of Tropical Silvopastoral Systems
MSc Thesis: Ecosystem Services of Tropical Silvopastoral SystemsHyeonju (Callie) Ryu
 
OFAH + Conservation Halton_urban-creeks-2008
OFAH + Conservation Halton_urban-creeks-2008OFAH + Conservation Halton_urban-creeks-2008
OFAH + Conservation Halton_urban-creeks-2008Mhat Briehl
 
Nghiên cứu công nghệ hợp khối xử lý nước thải sinh hoạt tại hộ gia đình vùng ...
Nghiên cứu công nghệ hợp khối xử lý nước thải sinh hoạt tại hộ gia đình vùng ...Nghiên cứu công nghệ hợp khối xử lý nước thải sinh hoạt tại hộ gia đình vùng ...
Nghiên cứu công nghệ hợp khối xử lý nước thải sinh hoạt tại hộ gia đình vùng ...TÀI LIỆU NGÀNH MAY
 
Luận văn tốt nghiệp chế tạo máy thiết kế robot làm sạch tấm pin mặt trời
Luận văn tốt nghiệp chế tạo máy thiết kế robot làm sạch tấm pin mặt trờiLuận văn tốt nghiệp chế tạo máy thiết kế robot làm sạch tấm pin mặt trời
Luận văn tốt nghiệp chế tạo máy thiết kế robot làm sạch tấm pin mặt trờihttps://www.facebook.com/garmentspace
 
E.Vatamidou -- PhD thesis
E.Vatamidou -- PhD thesisE.Vatamidou -- PhD thesis
E.Vatamidou -- PhD thesisEleni Vatamidou
 
đáNh giá hiện trạng môi trường trang trại chăn nuôi gà giống chất lượng cao t...
đáNh giá hiện trạng môi trường trang trại chăn nuôi gà giống chất lượng cao t...đáNh giá hiện trạng môi trường trang trại chăn nuôi gà giống chất lượng cao t...
đáNh giá hiện trạng môi trường trang trại chăn nuôi gà giống chất lượng cao t...TÀI LIỆU NGÀNH MAY
 

Similar to BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic (20)

INVESTIGATION INTO SOME OF THE ENGINEERING PROPERTIES OF SOILS FOUND IN MOJO ...
INVESTIGATION INTO SOME OF THE ENGINEERING PROPERTIES OF SOILS FOUND IN MOJO ...INVESTIGATION INTO SOME OF THE ENGINEERING PROPERTIES OF SOILS FOUND IN MOJO ...
INVESTIGATION INTO SOME OF THE ENGINEERING PROPERTIES OF SOILS FOUND IN MOJO ...
 
Ephrem Tibebu.pdf
Ephrem Tibebu.pdfEphrem Tibebu.pdf
Ephrem Tibebu.pdf
 
Final Report
Final ReportFinal Report
Final Report
 
Water Handbook
Water HandbookWater Handbook
Water Handbook
 
Msc Development Sudies Dissertation
Msc Development Sudies DissertationMsc Development Sudies Dissertation
Msc Development Sudies Dissertation
 
An Integrated Library System On The CERN Document Server
An Integrated Library System On The CERN Document ServerAn Integrated Library System On The CERN Document Server
An Integrated Library System On The CERN Document Server
 
GeversDeynoot2011
GeversDeynoot2011GeversDeynoot2011
GeversDeynoot2011
 
đáNh giá công tác quản lý môi trường tại công ty thực phẩm ping rong – bình v...
đáNh giá công tác quản lý môi trường tại công ty thực phẩm ping rong – bình v...đáNh giá công tác quản lý môi trường tại công ty thực phẩm ping rong – bình v...
đáNh giá công tác quản lý môi trường tại công ty thực phẩm ping rong – bình v...
 
Emona-based Interactive Amplitude Modulation/Demodulation iLab
Emona-based Interactive Amplitude Modulation/Demodulation iLabEmona-based Interactive Amplitude Modulation/Demodulation iLab
Emona-based Interactive Amplitude Modulation/Demodulation iLab
 
Lusaka Ecological Sanitation Conference Final report 2004hpm
Lusaka Ecological Sanitation Conference Final report 2004hpmLusaka Ecological Sanitation Conference Final report 2004hpm
Lusaka Ecological Sanitation Conference Final report 2004hpm
 
MSc Thesis: Ecosystem Services of Tropical Silvopastoral Systems
MSc Thesis: Ecosystem Services of Tropical Silvopastoral SystemsMSc Thesis: Ecosystem Services of Tropical Silvopastoral Systems
MSc Thesis: Ecosystem Services of Tropical Silvopastoral Systems
 
OFAH + Conservation Halton_urban-creeks-2008
OFAH + Conservation Halton_urban-creeks-2008OFAH + Conservation Halton_urban-creeks-2008
OFAH + Conservation Halton_urban-creeks-2008
 
Elisavet D. Michailidi Thesis
Elisavet D. Michailidi ThesisElisavet D. Michailidi Thesis
Elisavet D. Michailidi Thesis
 
ThesisCIccone
ThesisCIcconeThesisCIccone
ThesisCIccone
 
Nghiên cứu công nghệ hợp khối xử lý nước thải sinh hoạt tại hộ gia đình vùng ...
Nghiên cứu công nghệ hợp khối xử lý nước thải sinh hoạt tại hộ gia đình vùng ...Nghiên cứu công nghệ hợp khối xử lý nước thải sinh hoạt tại hộ gia đình vùng ...
Nghiên cứu công nghệ hợp khối xử lý nước thải sinh hoạt tại hộ gia đình vùng ...
 
Water & sanitation handbook
Water & sanitation handbookWater & sanitation handbook
Water & sanitation handbook
 
Luận văn tốt nghiệp chế tạo máy thiết kế robot làm sạch tấm pin mặt trời
Luận văn tốt nghiệp chế tạo máy thiết kế robot làm sạch tấm pin mặt trờiLuận văn tốt nghiệp chế tạo máy thiết kế robot làm sạch tấm pin mặt trời
Luận văn tốt nghiệp chế tạo máy thiết kế robot làm sạch tấm pin mặt trời
 
E.Vatamidou -- PhD thesis
E.Vatamidou -- PhD thesisE.Vatamidou -- PhD thesis
E.Vatamidou -- PhD thesis
 
đáNh giá hiện trạng môi trường trang trại chăn nuôi gà giống chất lượng cao t...
đáNh giá hiện trạng môi trường trang trại chăn nuôi gà giống chất lượng cao t...đáNh giá hiện trạng môi trường trang trại chăn nuôi gà giống chất lượng cao t...
đáNh giá hiện trạng môi trường trang trại chăn nuôi gà giống chất lượng cao t...
 
SI Thesis
SI ThesisSI Thesis
SI Thesis
 

BENCHMARKING AND PERFORMANCE INDICATORS - Borisav Milutinovic

  • 1. University of Belgrade Faculty of Civil Engineering Postgraduate Programme in Water Resources and Environmental Management BENCHMARKING AND PERFORMANCE INDICATORS IN WATER SUPPLY AND WASTEWATER SERVICES by BORISAV MILUTINOVIĆ 2013
  • 2. II
  • 3. III DEDICATION This thesis is dedicated to my family, to my beloved son “Archer” Stevan and to my dearest wife Zorica. To my sister Dejana. To my mother Mara and father Stevan who have not lived to see this moment. To all young generations in my family, as a role model to them, just to convince them, that everything is possible and everything in life depends just on you, with a little help of luck given by God the Almighty. Try always to do your best, making efforts day by day.
  • 4. IV
  • 5. V ACKNOWLEDGEMENTS The author would like to thank Dr. Zorana Naunović, a dear lecturer. Without her and her commitment, these postgraduate studies would not be possible. Special thanks to the major professor of this thesis, Professor Dr. Jovan Despotović and his valuable guidance and suggestions. All gratitude to Professor Dr. Rafaela Matos, for sharing her early works and valuable information about IWA performance indicators. Thanks to dear colleague Aleksandar Krstić, for sharing the information about benchmarking program in the Republic of Serbia, “Benchmarking I and II” and giving me the unpublished papers about it. This thesis would not have been possible without this. Here, I also want to thank to all of my dear colleagues in EUREAU, for giving me the idea for this thesis. I also owe appreciation to colleagues from committee of CEN/TC165, Association for Water Technology and Sanitary Engineering (UTV) and Chamber of Commerce and Industry of Serbia (CCIS). Special thanks to my dear Bros.: and Comp.: M. B. and B.B., for their support, untiring encouragement and everything else that makes life easier with real friends. Special thanks to Mr. Branislav Babić, who had the patience to read all versions of the thesis.
  • 6. VI Silent gratitude to a brotherhood of free men of good standing, which have a great influence on me. Thanks to all of my unmentioned friends and colleagues, who have encouraged me in these two years and influenced my career. Once again, particular thanks to my family, my beloved son Stevan and my dearest wife and friend Zorica. They have had patience, inspired me, had understanding for my work, given me the infinite love, comfort when it was hard and encouraged me throughout this study and finishing this thesis. Heaven knows that only with my love can I make this up to them. Thanks to my sister Dejana, who raised me, with so much love and to my mother Mara and father Stevan who have not lived long enough to see this. Moreover, first and foremost, thanks to God the Almighty, for making this possible.
  • 7. VII TABLE OF CONTENTS Page DEDICATION .................................................................................................III ACKNOWLEDGEMENTS............................................................................... V TABLE OF CONTENTS ................................................................................VII LIST OF TABLES ...........................................................................................IX LIST OF FIGURES......................................................................................... X ABSTRACT ..................................................................................................XIII CHAPTER 1. INTRODUCTION.....................................................................15 1.1. OBJECTIVES....................................................................................................................15 1.2. ORGANIZATION................................................................................................................15 CHAPTER 2. BENCHMARKING ...................................................................17 2.1. INTRODUCTION................................................................................................................17 2.2. EARLY HISTORY OF BENCHMARKING ................................................................................20 2.3. BENCHMARKING DEFINITIONS AND EXPLANATIONS.............................................................21 2.4. CONCEPT OF BENCHMARKING..........................................................................................22 2.5. PERFORMANCE ASSESSMENT AND PERFORMANCE IMPROVEMENT (OR METRIC BENCHMARKING AND PROCESS BENCHMARKING) ....................................................................26 2.6. BENCHMARKING PROCESS...............................................................................................30 2.6.1. 
IWA (INTERNATIONAL WATER ASSOCIATION) BENCHMARKING PROCESS ........................30 2.6.2. EBC BENCHMARKING PROCESS ...................................................................................34 2.7. EXAMPLES OF BENCHMARKING INITIATIVES AND PROJECTS IN WATER AND WASTEWATER INDUSTRY ..............................................................................................................................35 2.7.1. THE BENCHMARKING PROGRAM OF EUROPEAN BENCHMARKING CO-OPERATION (EBC)..37 2.7.2. THE BENCHMARKING PROGRAM OF INTERNATIONAL BENCHMARKING NETWORK FOR WATER AND SANITATION UTILITIES (IBNET)............................................................................39
  • 8. VIII CHAPTER 3. PERFORMANCE ASSESSMENT (METRIC BENCHMARKING) ...................................................................................................................... 43 CHAPTER 4. ISO STANDARDS SERIES 24500 (24510, 24511 AND 24512) AND STANDARDIZATION OF PERFORMANCE INDICATORS .................. 51 CHAPTER 5. PERFORMANCE INDICATORS SYSTEM.............................. 57 5.1. IWA PERFORMANCE INDICATORS SYSTEMS IN WATER SUPPLY AND WASTEWATER SECTOR..58 5.2. ELEMENTS OF THE PI SYSTEM..........................................................................................60 5.3. ADVANTAGES OF IWA PERFORMANCE INDICATORS SYSTEM..............................................67 5.4. IMPLEMENTATION OF IWA PERFORMANCE INDICATORS SYSTEM........................................69 5.5. STRUCTURE OF IWA PERFORMANCE INDICATORS SYSTEM FOR WATER SUPPLY SERVICES 74 5.6. STRUCTURE OF IWA PERFORMANCE INDICATORS SYSTEM FOR WASTEWATER SERVICES ..76 5.7. PERFORMANCE INDICATORS AND RELATED COMPONENTS - CONFIDENCE-GRADING SCHEME 83 5.8. SIGMA LITE SOFTWARE..................................................................................................86 CHAPTER 6. BENCHMARKING INITIATIVES IN REPUBLIC OF SERBIA.. 87 CHAPTER 7. CONCLUSION ........................................................................ 91 LIST OF REFERENCES ............................................................................... 95 APPENDIX.................................................................................................. 105
  • 9. IX LIST OF TABLES Table 2.1: Overview of some benchmarking efforts in the water industry (Cabrera et al. 2006) and as found in literature........................ 36 Table 3.1: Scope of application of PI systems (Alegre and Baptista 2002)46 Table 4.1: Structure of Standards on service activities related to drinking water supply and wastewater systems – comparison review (ISO 24510, 24511, 24512) and (Talib et al. 2005)........................... 52 Table 5.5.1: IWA PI system structure for water supply services................... 75 Table 5.6.1: Structure of the PI framework (Matos et al. 2003).................... 78 Table 5.6.2: IWA PI system structure for wastewater services..................... 79 Table 5.7.1: Reliability bands of data for PI system...................................... 83 Table 5.7.2: Accuracy bands of data for PI system ...................................... 84 Table 5.7.3: Matrix of confidence grades, according to ISO standard series 24511 (Matos et al. 2003)......................................................... 84 Table 5.7.4: Reporting of confidence grades (c.g.) for a sequence of years, according to ISO standard series 24511 (Matos et al. 2003).... 85 APPENDIX Table 7.1: List of Performance Indicators for Wastewater services, adopted from Matos et al. 2003, Cabrera et al. 2006 and Matos et al. 2002b............................................................................. 107
  • 10. X LIST OF FIGURES Figure 2.1: Plan – Do – Check – Act CYCLE (Cabrera et al. 2006) and EBC (European Benchmarking Co-operation) Web site: http://www.waterbenchmark.org/content/benchmarking.html. 23 Figure 2.2: Benchmarking Process “Plan, Do, Check, Act”, according to ISO standard series 24500 .................................................... 23 Figure 2.3: Illustration of “performance assessment” (metric benchmarking) and “performance improvement” (process benchmarking) as described by Kingdom and Knapp (1996) (adopted from Cabrera 2006)........................................................................ 28 Figure 2.4: Benchmarking cycle according to DVGW and DWA (2008) with the aims of performance assessment and performance improvement as well as the components of benchmarking according to the IWA (Cabrera et al. 2006)............................ 29 Figure 2.5: The IWA benchmarking process (Cabrera et al. 2006) .......... 30 Figure 2.6: EBC’s levels of participation (sources Cabrera et al. 2006 and EBC, Web site http://www.waterbenchmark.org/content/benchmarking.html) 38 Figure 2.7: Wastewater sets of input variables and performance indicators for basic, standard and advanced level of benchmarking, according to Benchmarking Program of European Benchmarking Cooperation (EBC)......................................... 38 Figure 3.1: Illustration of typical flowchart in performance assessment (Sjøvold et al. 2008)............................................................... 43 Figure 3.2: PI as a part of a performance measurement system, followed by example for water supply service (Alegre et al. 2006)....... 44 Figure 3.3: Water undertaking context (Algere et al. 2006)...................... 47 Figure 3.4: Wastewater undertaking context (Matos et al. 2003, adopted from Algere et al. 1997).......................................................... 47 Figure 3.5: Metric benchmarking process (Sjøvold et al. 
2008) and (adapted, Cabrera 2001)........................................................ 49
  • 11. XI Figure 4.1: Fields of application of ISO 24511 (Wastewater), source of the schematic: IWA Performance Indicators for Wastewater Services ................................................................................. 53 Figure 4.2: Fields of application of ISO 24511 (Wastewater), source of figure: based on a scheme from Hydroconseil, France, 2002.53 Figure 4.3: Fields of application of ISO 24512 (Drinking Water)............... 54 Figure 4.4: Relevant relationships between stakeholders for establishing objectives, according to ISO standard 24511 and 24512 ....... 55 Figure 5.2.1: Structure of PI system – six separate categories. .................. 60 Figure 5.2.2: General outline of IWA variable definition, according to Alegre et al. 2002 and Alegre et al. 2006........................................... 61 Figure 5.2.3: IWA variable definition, classification and description, according to Alegre et al. 2002 and Alegre et al. 2006........... 61 Figure 5.2.4: General outline of the PI definitions within the IWA Manual, according to Alegre et al. 2002 and Alegre et al. 2006........... 63 Figure 5.2.5: PI identification and classification, description, terms and processing rules, according to Alegre et al. 2002 and Alegre et al. 2006 .................................................................................. 63 Figure 5.2.6: Illustration of components (data elements) of a performance indicators system (Sjøvold et al. 2008 and adapted Alegre et al. 2006)...................................................................................... 66 Figure 5.4.1: Phases of the PI system implementation process for water supply services (Alegre et al. 2006) ....................................... 70 Figure 5.4.2: Phases of the PI system implementation process for wastewater services (Matos et al. 2003, adapted from Alegre 2000 and 2002) ...................................................................... 
71 Figure 5.4.3: PI and CI selection procedure (Alegre et al. 2006 and Matos et al. 2003) ................................................................................. 72 Figure 5.4.4: Example of data flows concerning PI and CI and team responsibilities (Alegre et al. 2006 and Matos et al. 2003) ..... 73 Figure 5.6.1: Wastewater balance (Matos et al. 2002a and Matos et al. 2003)...................................................................................... 76
  • 12. XII Figure 5.6.2: Illustration of service provision in terms of customers and main impacts (Matos et al. 2003).................................................... 77 Figure 5.6.3: Structure of wastewater CI and PI (Matos et al. 2002a and Matos et al. 2003) .................................................................. 78 Figure 5.6.4: Context Information data for wastewater services (Matos et al. 2002b).................................................................................... 81 Figure 5.8.1: Recommended PI evaluation procedure using SIGMA Lite and SIGMA Lite WW (Cabrera et al. 2003)................................... 86
  • 13. XIII ABSTRACT Milutinovic, Borisav. 2013. Benchmarking in Water Supply and Wastewater Services. Major Professor (Mentor): Jovan Despotovic. This thesis reviews and analyses throughout the following methods:  Benchmarking;  Performance Assessment (Metric Benchmarking);  Performance Improvement (Process Benchmarking); and,  Performance Indicators (PIs) system (especially ISO standardized PIs and International Water Association PIs). Special attention was given to the International Water Association (IWA) methods of benchmarking, performance assessment, and performance indicators system, its structure and components, performance indicators, data confidence-grading scheme and implementation of those processes and methods. The specific objective was to make a deeper reflection on the IWA Performance Indicators for wastewater services.
  • 14. XIV
  • 15. 15 CHAPTER 1. INTRODUCTION 1.1. Objectives This thesis is written to be an instructional manual or reference for colleagues wanting to work on benchmarking and performance indicators systems in water utility companies (especially wastewater services). The thesis should be understood as a literature review or “quick guidelines” to benchmarking, performance assessment and performance indicators. The overall goal of this thesis research was to theoretically explore tools for comparison and improvement of efficiency and effectiveness in water utilities (water supply and wastewater sector). The benchmarking initiatives are presented for three levels of analysis: the worldwide level (IBNET); the European level (EBC); and the national – local level (IBNET and IPM). For the Republic of Serbia, information in the form of a report on benchmarking is presented.
  • 16. 16 1.2. Organization The thesis consists of seven chapters. Chapter 2 is the introduction chapter and provides a summary of the early history, definitions and explanations, concept and process of benchmarking, with examples of benchmarking initiatives. An overview of two benchmarking programs is also presented. Chapter 3 presents the performance assessment (metric benchmarking) process, with detailed explanations and a literature review. The ISO standards series 24500:2007 (24510, 24511 and 24512) are presented in Chapter 4. Chapter 5 is the core chapter in which the International Water Association (IWA) performance indicators system for water supply and especially for wastewater services is presented and discussed An emphasis is placed on the elements, confidence-grading scheme of data, structure, advantages, implementation process and strategy of the system.,. Information on the benchmarking project initiative in the Republic of Serbia is given in the form of a report in Chapter 6.. Chapter 7 provides a conclusion to the thesis, where the results of this research and recommendations are once again presented and conclusion on benchmarking, performance assessment and performance indicators are highlighted. In the appendix, the complete list of IWA performance indicators for wastewater services is presented.
  • 17. 17 CHAPTER 2. BENCHMARKING 2.1. Introduction Water, which is essential to sustaining life and livelihoods, is a core sector of the global economy. Enhancing operational and financial performance of the water industry and water utilities will provide the basis necessary for expanding access and improving the quality of service. Water companies are different in nature, as noted in Cabrera (2006):  They are of public interest, doing business as a public services;  They are natural monopolies; and,  They have no direct competition. The international water and wastewater industry is going through a period of great change. The water sector faces one of the most – if not the most – challenging strategic outlooks in its history. Numerous global trends are placing pressures on the water sector, and in turn on asset management and other business processes (Water Services Association of Australia, 2008). These inter-related trends include:  Responses to global warming / climate change;  Significant asset development and growth;  Skills shortages arising from a variety of different factors;  Changes and meeting the demands from other industries;  New technologies enabling data collection and analysis on a previous unprecedented scale;  Increasing levels of stakeholder involvement and engagement in decision making;  Increasing complexity in customer needs and relationships;  Regulatory scrutiny and control; and,  Access to capital for investment.
  • 18. 18 These aspects of the strategic context for water utilities are driving changes to the way water utilities are being managed (Gee and Vincent, 2009). Challenges such as climate change, increased regulation and competition for funds, skills shortages, technological development, environmental constraints, increasing customer expectations and ageing infrastructure mean that water businesses need to be more and more efficient and effective each day. The industry needs to understand how it can best manage the various expectations required of it. The formerly monopolistic industry sectors of water and wastewater treatment and services are today increasingly influenced by free market mechanisms and been pressurized by both societies and governments for more transparency and efficiency. Urban water and wastewater utilities are under increasing pressure to perform. In addition, regulators and citizens demand increasingly higher standards of environmental, social, and economic sustainability. If water and wastewater utilities are to meet these increasing demands and expectations in both developed and developing countries, they must first take stock of their performance over time and the need for improved performance is not limited only to developing countries. The crucial statement (Principles for Economic Regulation) made by the UK Government in April 20011 was a catalyst for many improvements in the water sector in April 2011 and it reads as follows: “In certain sectors network effects and/or economies of scale create circumstances, such as natural monopolies, which limit the prospects for effective competition. In these areas, independent economic regulation will be needed over the long term to continue to provide vital consumer protections and ensure consumers’ interests are promoted through efficient provision of good quality, reliable and sustainable services.”
  • 19. 19 Recognizing the challenges facing the industry and with an appreciation of how modern asset management may assist these businesses meet their various commercial, environmental, social and regulatory obligations, a lot of organizations, institutions and individuals have developed a lot of methods and tools of benchmarking best practice within the water industry. The objectives of those methods and tools are to identify how well particular participants are currently managing their strategy as well as identifying those participants that represent best practice in a range of key process areas. To achieve their management goals, the water undertakings need to strive for high degrees of:  Efficiency – the extent to which the resources of water undertakings are utilized optimally to produce the service (Makropoulos 2009); and,  Effectiveness – the extent to which declared objectives (specifically and realistically defined, in advance) are achieved (Makropoulos 2009). To improve the desired efficiency and effectiveness to a certain level, each utility must first know what the best practice is. The benchmarking method was developed in this effect. The term benchmarking is used to describe the technique of identifying best practices from those “best in class” for specific and critical processes, adapting them and providing continuous performance improvement. This management tool is a well-known example of the “Xerox benchmarking” (resulting from its early application in Xerox in the 1980s) (Camp, 1989; Sjøvold et al., 2008).
  • 20. 20 2.2. Early History of Benchmarking Performance assessment is a natural need in human psychology. People, as individuals, social groups, regions and states, always want to know if they can be better, or how they are doing compared to others. Often people want to know if they have improved over time. It is difficult to determine whether one is good at something if there are no previous references. Through the process of benchmarking, these references can be established. The early history of benchmarking begins in the 1970s with a problem raised in the Xerox Corporation, USA, a leading copier manufacturer. At that time, Xerox was heavily losing market share to Japanese manufacturers. Looking for explanations as to what caused the problem in the market, Xerox made a comparative analysis of different companies. They realized that Fuji sold copiers at Xerox’s production cost, and the analysis confirmed significantly higher production costs in the USA. This was the first step in establishing the modern concept of benchmarking. Robert C. Camp described this concept in 1989 in his book “Benchmarking – The Search for Industry Best Practices that Lead to Superior Performance”, based on the case of Xerox Corporation (Cabrera et al., 2006). The next steps were transferring and implementing the benchmarking concept in the water industry, as the concept is universally applicable to any kind of industry. It must be underlined that benchmarking is not the only tool for improving the water industry (water and wastewater services). There are other options, which include process optimization, business process redesign, restructuring, merging utilities and so on. The scope of this work deals with benchmarking, and over the past two decades many projects have proven benchmarking to be a powerful management instrument in the water industry.
  • 21. 21 2.3. Benchmarking definitions and explanations “Benchmarking is the process of comparing one's business processes and performance metrics to industry bests or best practices from other industries” (WIKIPEDIA). Performance benchmarking is learning how others do business, whether they are more efficient, and, if they are, whether their methods can be understood and used to one’s own advantage. By the definition given in the IWA’s Manuals of Best Practice: ”Benchmarking is a tool for performance improvement through systematic search and adaption of leading practices”. The American Water Works Association (AWWA) has defined benchmarking as ”a systematic process of searching for best practices, innovative ideas, and highly effective operating procedures that lead to superior performance, and then adopting those practices, ideas, and procedures to improve the performance of one’s own organization”. Benchmarking is a process of comparative evaluation of system performance among similar systems (elements of the same operating system or, more often, competing systems). This process enables the identification of the best techniques of the "best in class". Benchmarking is therefore a key instrument for establishing a quality management system based on continuous improvement (Gspan et al., 2009). Comparisons with similar utilities elsewhere in the country or region, or with standards of international good practice, can shed light on how well a utility is performing, identify areas for improvement, and help indicate a plan of action. A major challenge for measuring and benchmarking water and wastewater utility performance has been the lack of standardized information, as Van den Berg and Danilenko pointed out in 2011.
  • 22. 22 2.4. Concept of Benchmarking Benchmarking is essential for those developing and implementing water policy. The tools are important for documenting past performance, establishing baselines for gauging productivity improvements, and making comparisons across service providers. In addition, if managers do not know how well their organization or division has performed (or is performing), they cannot set reasonable targets for future performance. Benchmarking provides regulators and utility managers with a way to make performance comparisons over time, across water utilities, and across countries. It can promote conflict resolution between these two groups by allowing participants to focus on performance, and can help bridge the gap between technical researchers and those practitioners currently conducting studies for government agencies and water utilities (Berg and Padowski, 2007). As Van den Berg and Danilenko stated in 2011, a wide range of stakeholders can use benchmarking, such as:  Utilities: to identify areas of improvement and set realistic targets;  Governments: to monitor and adjust sector policies and programs;  Regulators: to ensure that adequate incentives are provided for improved utility performance and that consumers receive value for the services provided;  Consumers and civil society: to express valid concerns;  International agencies and advisers: to perform an evaluation of utilities for lending purposes; and,  Private investors: to identify opportunities and viable markets for investments. Benchmarking is organized in projects with clear starting and ending dates. Nevertheless, from a management point of view, benchmarking should be considered a continuous, permanent process, because the search for better
  • 23. 23 practices never ends. Accordingly, benchmarking should follow the Plan-Do-Check-Act principle presented in Figure 2.1. Figure 2.1: Plan – Do – Check – Act CYCLE (Cabrera et al., 2006) and EBC (European Benchmarking Co-operation) Web site: http://www.waterbenchmark.org/content/benchmarking.html The same principle is a part of ISO standard 24500:2007 (in more detail) and its illustration is given in Figure 2.2. Figure 2.2: Benchmarking Process “Plan, Do, Check, Act”, according to ISO standard series 24500:2007
  • 24. 24 According to this, benchmarking should be done on an annual basis and should be embedded in the (annual) business cycle. Benchmarking is both a science and an art. Even the best statistical analysis will be ignored if it is not carefully presented. Several methodologies are available for benchmarking, and it is important to keep in mind that a single index of utility performance has the same problems as any indicator: it will be neither comprehensive nor fully diagnostic. Therefore, when conducting benchmarking analyses, water professionals must understand the strengths and limitations of the different methodologies. Benchmarking is a fundamental requirement of good management and can help managers and regulators identify historical trends, determine today’s baseline performance, and quantify relative performance across utilities (Berg and Padowski, 2007). Benchmarking is a practical management and decision-making tool to measure and compare the performance of utilities. The objective of benchmarking is to improve performance by comparing with and learning from other, similar organizations. Performance monitoring can play a significant role in the sector as a tool for performance improvement. Benchmarking can help utilities identify performance gaps and effect improvements through the sharing of information and best practices, ultimately resulting in better water services. Van den Berg and Danilenko (2011) concluded that the primary objectives of benchmarking are as follows:  To provide a set of Performance Indicators (PIs) related to a utility’s managerial, financial, operational, and regulatory activities that can be used to measure internal performance and provide managerial guidance.
  • 25. 25  To enable an organization to compare its performance on PIs with those of other relevant utilities to identify areas needing improvement, with the expectation of developing more efficient or effective methods to formulate and attain company goals as set forth in its business plan. Performance comparisons and benchmarking projects can be organized on a voluntary basis but can also be obligatory. Depending on the purpose, these projects are initiated by different organizations, e.g. associations, consultants or government agencies. Performance comparisons provide useful information about the water/wastewater sector and are, therefore, of the highest interest for deducing standard values for the sector.
  • 26. 26 2.5. Performance Assessment and Performance Improvement (or Metric Benchmarking and Process Benchmarking) In general, there are two different approaches to benchmarking:  “Metric Benchmarking” and  “Process Benchmarking”. Metric benchmarking is intended as a quantitative comparative assessment of company performance, normally measured by performance indicators (PIs). Process benchmarking is intended as a mechanism for identifying specific work procedures to be improved by emulating external examples of excellence that can be set as the best standard. “Metric benchmarking identifies areas of under-performance where changes need to be made to the way things are done, whilst process benchmarking is a vehicle for achieving change, and the improvement required can be imported from other best practice partners.” (Larsson et al., 2002) Van den Berg and Danilenko (2011) distinguished two types of benchmarking: metric and process benchmarking. Metric benchmarking involves systematically comparing the performance of one utility with that of other similar utilities and, even more importantly, tracking one utility’s performance over time. A water or wastewater utility can compare itself to other utilities of a similar size in the same country or in other countries. Similarly, a nation’s regulators can compare the performance of the utilities operating there. Metric benchmarking, essentially an analytical tool, can help utilities better understand their performance. Such benchmarking is most powerful when carried out over time, tracking year-to-year changes in performance.
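The peer comparison and year-to-year tracking described above can be sketched in a few lines. In this minimal, hypothetical illustration, a single indicator — non-revenue water (NRW) as a percentage of system input — is compared against the peer median and against each utility's own earlier value; all utility names and figures are invented for the example:

```python
# Hypothetical metric-benchmarking sketch: compare one performance
# indicator (non-revenue water, % of system input volume) across peer
# utilities and over time. All names and values are invented.
from statistics import median

# Indicator values per utility, by year
nrw = {
    "Utility A": {2019: 34.0, 2020: 31.5, 2021: 29.8},
    "Utility B": {2019: 22.1, 2020: 21.7, 2021: 21.0},
    "Utility C": {2019: 45.3, 2020: 44.9, 2021: 43.2},
}

year = 2021
peer_median = median(series[year] for series in nrw.values())

for name, series in nrw.items():
    first_year = min(series)                       # earliest year on record
    gap = series[year] - peer_median               # positive = worse than peers
    change = series[year] - series[first_year]     # negative = improvement
    print(f"{name}: NRW {series[year]:.1f}% "
          f"(gap to peer median {gap:+.1f} pp, "
          f"change since {first_year}: {change:+.1f} pp)")
```

In a real exercise the indicator set is far larger, and — as the text stresses — the operating context of each utility must be analysed before performance gaps are interpreted.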
  • 27. 27 Process benchmarking is a normative tool with which one utility can compare the effectiveness of its processes and procedures for carrying out different functions to those of selected peers. A utility can compare its billing and collection system, for example, to those used by other utilities to see which system performs better. When the comparison reveals one utility’s system to be more effective or efficient than the others, the underperforming utility can adopt and internalize those processes and procedures as appropriate. The performance indicator constitutes the building block of both types of benchmarking. Indicators are quantitative, comparable measurements of a specific type of activity or output. According to the definition given by Gspan et al. (2009), benchmarking can be of two kinds:  Metric benchmarking: regular periodic measurement of relevant (internal) metric variables, calculating relevant indicators and comparing their values to date;  Process benchmarking: comparison of internal indicators for certain processes to those of other companies in the same industry, with the aim of discovering vulnerabilities, weaknesses, needs and opportunities for improvement in efficiency, competence and competitiveness, analysing the impact of introducing a specific action on the process, and projecting a future state. Gspan et al. (2009) also offer a short and useful characterization of the two kinds: metric benchmarking responds to questions such as "Where am I, what am I doing?", while process benchmarking answers the questions "Where and what are the opportunities for improvement?" Instead of this differentiation between metric and process benchmarking, clearer terminology will be used here. The IWA Specialist Group on Benchmarking strongly recommends abandoning the use
  • 28. 28 of the terms “metric benchmarking” and “process benchmarking”. Instead, “performance assessment” and “performance improvement” should be considered consecutive components of benchmarking (Cabrera et al., 2006). Performance assessment and improvement should be understood as two differentiated phases, which are the necessary parts of benchmarking. Figure 2.3 illustrates this concept. Figure 2.3: Illustration of “performance assessment” (metric benchmarking) and “performance improvement” (process benchmarking) as described by Kingdom and Knapp (1996) (adopted from Cabrera, 2006) As introduced in previous paragraphs, benchmarking consists of two consecutive components. The first step, performance assessment, aims at analysing performance, comparing it with other organizations within or outside the industry, and identifying performance gaps. The next step, performance improvement, is designed to find improvements by learning from the leading practices and adapting them to one’s own situation.
  • 29. 29 To illustrate this process, Figure 2.4 gives a schematic view of the benchmarking process in the German wastewater industry, standardized by the DVGW (Deutsche Vereinigung des Gas- und Wasserfaches) and the DWA (Deutsche Vereinigung für Wasserwirtschaft, Abwasser und Abfall), according to the IWA in 2008. Figure 2.4: Benchmarking cycle according to DVGW and DWA (2008) with the aims of performance assessment and performance improvement as well as the components of benchmarking according to the IWA (Cabrera et al., 2006)
  • 30. 30 2.6. Benchmarking Process 2.6.1. IWA (International Water Association) Benchmarking Process The literature review shows that every benchmarking reference has its own benchmarking process, with a different number of steps. However, they all more or less follow the same procedure. Here, a typical IWA benchmarking process with six different steps is presented, as Cabrera et al. (2006) have described them. Figure 2.5 illustrates those steps: 1. Project Planning 2. Orientation, Training and Project Control 3. Data Acquisition and Validation 4. Data Analysis and Assessment Reporting 5. Improvement Actions 6. Review of Improvement Actions Figure 2.5: The IWA benchmarking process (Cabrera et al., 2006) The following explanation of the six steps of the IWA benchmarking process has been adopted from Cabrera et al. (2006) without any changes:
  • 31. 31 1. Project Planning At the start of a benchmarking project, the scope and level of detail are determined based on the demands and needs of the interested utilities. The performance assessment model and the data requirements also need to be elaborated to show participants what they can expect, and to estimate project resources. Based on this information, a detailed project plan with budget and planning can be drafted. Interested utilities are invited to participate at this stage and based on their response the project may or may not be launched. 2. Orientation, Training and Project Control Before starting to benchmark, all staff involved in the project needs to be prepared. The objectives of the exercise and the project plan should be clear. Furthermore, staff needs to be informed about the methodology and data requirements and trained in the data methods that will be applied in the project. These considerations include the staff from participating utilities and staff from the project team (organizing body and/or commissioned third parties). 3. Data Acquisition and Validation One of the most time consuming activities in a benchmarking project is the data acquisition by the participants. This step requires significant efforts from the participants, depending on their experience, availability of the information and accessibility. The role of the project team in this step is to assist utilities in clearing up methodology issues and definition problems and to secure meeting deadlines.
  • 32. 32 When the required data are collected, they need to be validated by the utilities and by the project team, for instance by looking at consistency with data from previous years, outliers, on-site visits or auditing. Although this activity may be intensely time consuming, the availability of a reliable dataset is key to successful benchmarking. Participants in a benchmarking project expect good quality comparison and, accordingly, proper identification of performance gaps, as this is the trigger for improvement actions. 4. Data Analysis and Assessment Reporting Once data are validated, they are analysed, performance indicators are calculated and performance comparison is made between the participants. In this stage, possible remaining errors in the dataset can be identified and cleared up to improve data quality. Performance gaps are then determined and explained (if possible) keeping in mind the differences in the operating environment of the utilities. The result of this step is a draft report (at individual and/or group level) with the preliminary results of the performance assessment. This is the basis for discussing performance differences with the participating utilities in a workshop. After discussing, the preliminary results of the performance assessment in a workshop, possible errors and further explanations on the performance gaps or differences in the operating environment of utilities are processed. Final reports on the performance assessment are produced and distributed to disseminate the results within the company and to its stakeholders. These assessment reports can be supplemented by improvement action plans after the upcoming steps.
  • 33. 33 5. Improvement Actions One of the most important activities for reaching the final goal of a benchmarking exercise involves taking theory into action. Based on the performance assessment and the knowledge that is available in the network, the project team and the participating utilities jointly try to discover best practices, present and discuss these in the workshop and identify improvement opportunities. For further analysis of interesting practices, site visits or task groups may additionally be organized. With the best practices identified, participants should be able to draft their own improvement action plan. The action plan can be quite different for each utility and needs to be prioritized, based on the contribution of the proposed actions to the strategic objectives of the utility and the cost/benefit ratio. Benchmarking without improvement usually equals frustration. The implementation step is often overlooked but is crucial in finishing the job that was started at step 1. In order to implement the suggested improvement initiatives, senior utility management should approve the necessary changes and the necessary internal procedures should be followed to secure investments, organizational changes, etc. 6. Review of Improvement Actions After implementing improvement actions, the results should be assessed to review if the objectives have been reached. Usually, this is done in a following benchmarking exercise. In order for the benchmarking process to be complete, this needs to be documented and evaluated, including lessons learnt and new benchmarking needs. Closing the cycle provides essential information for preparing a new benchmarking effort.
  • 34. 34 2.6.2. EBC Benchmarking Process The European Benchmarking Co-operation (EBC) benchmarking process is given here as an example. A typical EBC benchmarking cycle comprises the following steps and activities:  Performance Assessment:  defining benchmarking objectives;  defining a performance assessment model;  preparing tools;  inviting participants;  collecting data;  validation and analysis;  identifying performance gaps;  reporting.  Performance Improvement:  identifying good/best practices;  preparing a Performance Improvement Plan with targets and priorities;  implementing performance improvement measures;  evaluating. It should be noted that the EBC methodology is almost the same as the IWA’s. Both consist of the two main steps already explained, Performance Assessment and Performance Improvement, and share essentially the same sub-steps under different names. The same cycle is followed by ISO 24500:2007 (see Figure 2.2). In conclusion, all benchmarking processes are basically the same, with minor differences, and they all follow the same cycle of Performance Assessment and Performance Improvement.
  • 35. 35 2.7. Examples of Benchmarking Initiatives and Projects in the Water and Wastewater Industry Benchmarking has evolved in the water sector since the early 1990s. A number of benchmarking methodologies have been developed through different initiatives and projects and successfully implemented all over the world. Here, some of the projects are mentioned. Table 2.1 provides some examples of key benchmarking initiatives in the water industry, as found in the literature. Those projects are not necessarily the most important or relevant, but were chosen to illustrate the evolution of benchmarking in the water industry.
  • 36. 36 Table 2.1: Overview of some benchmarking efforts in the water industry (Cabrera et al., 2006) and as found in literature

Program Name | Country | Program type | Level of detail | Type of activity | Geographical scope | IWA manuals based
6 – Cities Group | Scandinavia | BM | U, F & P | WS & WW | R | No
DANVA | Denmark | BM | U & F | WS | N | No
European Benchmarking Co-operation (EBC) | Europe | BM | U, F & P | WS & WW | R & I | Yes
Germany (several) | Germany | BM | U, F & P | WS & WW | N | Yes
NWWBI | Canada | BM | U, F & P | WS & WW | N | No
OEWAV | Austria | BM | U & F | WW | N | No
OVGW | Austria | BM | U & F | WS | N | Yes
QualServe | USA | BM | U | WS & WW | N | No
SEAWUN | South-East Asia | BM | U | WS | R | No
VEWIN | The Netherlands | BM | U, F & P | WS | N | No
WSAA | Australia | BM | F & P | WS | R & I | No
ADERASA | Latin America | PA | U | WS & WW | R | No
FIWA | Finland | PA | U | WS & WW | N | No
IBNET | World Bank | PA | U | WS & WW | I | No
Norsk Vann | Norway | PA | U | WS & WW | N | No
OFWAT | England & Wales | PA | U & F | WS & WW | N | No
Svensk Vatten | Sweden | PA | U | WS & WW | N | No
PAS | India | BM | U & F | WS & WW | N & R & L | No
FEDERGASAQUA | Italy | BM | U, F & P | WS & WW | R | No
IRAR & ERSAR | Portugal | BM | U | WS & WW | N | Yes
CARE-W | Europe | BM | U & F | WS | I | Yes
ALUSEAU | Luxembourg | PA | U & F | WS & WW | N | No
France (several) | France | BM | U, F & P | WS & WW | N | No
WaBe | Czech Republic | PA | U & F | WS | N | Yes
WSOP | Slovenia | BM | U & F | WS | N | Yes
AWSR | Slovak Republic | BM | U | WS | N | Yes
IPM | Republic of Serbia | PA | U | WS & WW | N & I | No

Legend: PA – Performance Assessment; BM – Benchmarking (Performance Assessment & Improvement); U – Utility level; F – Functions level (core process); P – Process level; WS – Water supply; WW – Wastewater; I – International; N – National; R – Regional; L – Local
  • 37. 37 2.7.1. The Benchmarking Program of the European Benchmarking Co-operation (EBC) It is important to mention here the project of the European Benchmarking Co-operation (EBC) in the field of water and wastewater benchmarking. The drivers for the EBC initiative to carry out cross-national benchmarking projects were, on the one hand, the discussions on liberalization and privatization and, on the other hand, the demand for transparent and efficient public services. The requirements of the EU Directives also played an important role, for example the full cost recovery required by the Water Framework Directive (WFD). Using the benefits of international benchmarking, utilities are able to show stakeholders their intention to optimize procedures, and the initiative gives them greater opportunities for networking and identifying best practices. For this purpose, EBC provides a platform to exchange best practices of management and operations, as well as benchmarking knowledge and experience. The approach is fully supported by the International Water Association (IWA). The EBC survey is restricted to a manageable amount of data. A three-level performance assessment model has been developed by EBC (Figure 2.6), because water utilities differ a lot (regarding size, type of activities, technological development etc.); for performance assessment, they can choose among three different participation levels:  Basic level: only service quality data and finance and efficiency data, in addition to context information, are evaluated.  Standard level: on this level, the three remaining pillars, namely water quality, reliability and sustainability, are added.  Advanced level: on this level, the focus is directed at sustainability and economy.
  • 38. 38 The numbers of the different sets of input variables, which are necessary to calculate the performance indicators, are shown in Figure 2.7. Figure 2.6: EBC’s levels of participation (Cabrera et al., 2006; EBC, Web site: http://www.waterbenchmark.org/content/benchmarking.html) Figure 2.7: Wastewater sets of input variables and performance indicators for basic, standard and advanced level of benchmarking, according to Benchmarking Program of European Benchmarking Cooperation (EBC) Up to now, 41 participants from 18 different countries have joined the EBC project (Thaler and Dimitrova, 2009).
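The tiered participation model described above can be illustrated with a small sketch. The grouping of pillars per level paraphrases the text; the data structure and function below are invented purely for illustration:

```python
# Hypothetical sketch of an EBC-style tiered participation model:
# each level adds further assessment areas ("pillars") on top of the
# previous one. Pillar names paraphrase the text; the structure is an
# invented illustration, not EBC's actual data model.
LEVEL_PILLARS = {
    "basic":    ["service quality", "finance & efficiency"],
    "standard": ["water quality", "reliability", "sustainability"],
    "advanced": ["sustainability (in depth)", "economy (in depth)"],
}
ORDER = ["basic", "standard", "advanced"]

def pillars_for(level: str) -> list[str]:
    """Cumulative list of pillars a utility reports at a given level."""
    idx = ORDER.index(level)
    pillars = []
    for lvl in ORDER[: idx + 1]:
        pillars.extend(LEVEL_PILLARS[lvl])
    return pillars

print(pillars_for("standard"))
```

The cumulative lookup mirrors the idea that a utility joining at the standard or advanced level also supplies everything required at the levels below it.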
  • 39. 39 2.7.2. The Benchmarking Program of the International Benchmarking Network for Water and Sanitation Utilities (IBNET) There are two international initiatives on water supply and wastewater utility benchmarking that are very relevant for the benchmarking of utilities currently being implemented in Serbia. The first initiative is benchmarking by the model of the International Benchmarking Network for Water and Sanitation Utilities (IBNET), initiated in 1996. This initiative has had great success in developing countries. The IBNET methodology is currently being used in Serbia through projects implemented by the Inter-institutional Professional Network in the Water Sector of Serbia (IPM), with financial support from the World Bank. The second international initiative has shown important success in developed countries: the European Benchmarking Co-operation (EBC), initially established in 2006 by the Scandinavian and Dutch water associations and several individual water companies (Krstić, 2013). The International Benchmarking Network for Water and Sanitation Utilities (IBNET) helps to build the resources for meeting this demand and suggests ways of providing improved services. Through its performance-assessment standards and continually updated database, IBNET serves as a global yardstick with which utilities and national policy makers, as well as the public, governments, municipalities, utilities, investors, and other users, can compare and evaluate the performance of water and wastewater utilities throughout the world. IBNET provides a set of tools that allows water and sanitation utilities to measure their performance both against their own past performance and against the performance of similar utilities at the national, regional, and global levels.
  • 40. 40 IBNET consists of three major tools. The first is the IBNET Data Collection Toolkit, which can be downloaded from the IBNET Web site at http://www.ib-net.org; this Excel spreadsheet indicates a set of data to be completed and offers detailed instructions on the precise data to enter. The second tool is a continuously updated database of water and sewerage utilities’ performance. This database allows utilities and other sector stakeholders to search for data in different formats and provides the means for simple benchmarking of utility data. The benchmarking tool enables a utility to compare itself to other utilities with similar characteristics (for example, size, factors related to location, and management structure). The third tool provides data on participating agencies. This information helps organizations interested in measuring utility performance to contact neighbouring utilities and other organizations to build local networks for performance assessment and benchmarking. IBNET has three key aspects. The first is that participation is voluntary, with the result that the organizations contributing to IBNET are very diverse. They include, for example, regulatory associations, national water associations, government departments and agencies involved in monitoring urban water supply and sewerage utilities, and, more recently, individual utilities. A second feature of IBNET is that it does not itself collect data. Rather, it sets up mechanisms by which many different organizations conduct data collection. From its start, IBNET’s strategy has been to use a highly decentralized approach. IBNET’s role is to provide instruments, such as the IBNET Toolkit, to support this process. In its feedback, IBNET checks the quality of the data to ensure internal consistency and helps participants to analyse the data. The third key IBNET feature, one fairly rare among agencies involved in utility benchmarking, is its focus on developing time-series data.
Without time-series
  • 41. 41 data, trends in utility performance and the impact of water and sanitation policies are difficult to detect. Effective development of time-series data requires ensuring that the data remain comparable over time through the rigorous use of a standardized data set and indicators as well as frequent data updating. In IBNET practice, most of the data are updated every two years. As performance assessment and benchmarking gain more prominence in the sector as regulation and monitoring tools, obtaining data on an annual basis is becoming easier, especially in countries with increasingly institutionalized performance assessment. This database allows innovative time-series performance analysis as well as cross-section analysis (Berg and Danilenko, 2011; Krstić, 2013).
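As a toy illustration of the kind of comparison the IBNET database supports — matching a utility against peers with similar characteristics — a simple size-band filter might look like the sketch below. The selection rule, utility names and figures are invented for the example and are not IBNET's actual method:

```python
# Hypothetical sketch of peer selection for benchmarking: compare a
# utility only against peers of similar size (population served).
# All names, sizes and coverage figures are invented.
utilities = [
    {"name": "U1", "pop_served": 120_000, "coverage_pct": 95.0},
    {"name": "U2", "pop_served": 140_000, "coverage_pct": 88.0},
    {"name": "U3", "pop_served": 900_000, "coverage_pct": 99.0},
    {"name": "U4", "pop_served": 110_000, "coverage_pct": 91.0},
]

def peers_of(target, pool, size_ratio=2.0):
    """Peers whose served population is within a factor of `size_ratio`."""
    lo = target["pop_served"] / size_ratio
    hi = target["pop_served"] * size_ratio
    return [u for u in pool
            if u["name"] != target["name"] and lo <= u["pop_served"] <= hi]

target = utilities[0]
peers = peers_of(target, utilities)
peer_avg = sum(u["coverage_pct"] for u in peers) / len(peers)
print(f"{target['name']}: coverage {target['coverage_pct']}% "
      f"vs peer average {peer_avg:.1f}%")
```

Here the very large utility U3 is excluded from U1's peer group, reflecting the point that comparisons are only meaningful between utilities with comparable characteristics.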
  • 43. 43 CHAPTER 3. PERFORMANCE ASSESSMENT (METRIC BENCHMARKING) The scope of this work covers benchmarking and performance indicators. As explained before, performance assessment is a part of the benchmarking process. The following figure illustrates the steps in performance assessment. Figure 3.1: Illustration of a typical flowchart in performance assessment (Sjøvold et al., 2008) [flowchart: utility challenges → data collection → PI system → benchmarking → forecasting → decision support] Performance assessment is a widespread activity used in economics, business, sports and many other areas of life in general, in order to compare and score entities and individuals and to make management decisions. Assessment is defined as a “process or result of this process, comparing a specific subject matter to relevant references” (ISO 24500). Performance assessment is therefore any approach that allows evaluation of the efficiency or the effectiveness of a process or activity through the production of performance measures (Sjøvold et al., 2008).
  • 44. 44 The ISO series 24500 provides the following definitions:  Effectiveness is the extent to which the planned activities are realized and the planned results achieved;  Efficiency is the relationship between the result achieved and the resources used. Performance measures are the specific parameters that are used to inform assessment (Sjøvold et al., 2008). Figure 3.2: PI as a part of a performance measurement system, followed by an example for the water supply service (Alegre et al., 2006) [figure content: Objectives – which results are to be reached in the future? (e.g. reduce non-revenue water by 2%) → Strategies – how can those results be reached? (e.g. a new metering program, increased leakage detection) → Critical success factors – the optimum strategies to reach objectives, depending on the constraints and the context (e.g. replacing inaccurate meters with new, more accurate ones; reading/reporting meters more accurately; increasing the detected leakage volume) → PIs – have the objectives been reached? what happened with the critical success factors? (e.g. Op8 – Meter replacement; Op30 – Customer reading efficiency; Op4 – Leakage control; Op23 – Apparent losses; Op28 – Real losses per mains length)]
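Following the water supply example above (reducing non-revenue water through metering and leakage control), two of the indicators mentioned can be computed directly from annual water-balance figures. The sketch below is a hypothetical illustration: the volumes and mains length are invented, and only the indicator names (NRW, and a real-losses-per-mains-length indicator in the spirit of Op28) come from the text:

```python
# Hypothetical PI calculation for the water-supply example in the text.
# All input figures (volumes, mains length) are invented.
system_input_m3      = 5_000_000   # annual system input volume
billed_authorized_m3 = 3_600_000   # annual billed authorized consumption
real_losses_m3       = 900_000     # annual real (physical) losses
mains_length_km      = 480         # length of water mains

# Non-revenue water as a percentage of system input volume
nrw_pct = 100 * (system_input_m3 - billed_authorized_m3) / system_input_m3

# Real losses normalized per mains length (m3/km/year),
# in the spirit of the Op28 indicator named in the figure
real_losses_per_km = real_losses_m3 / mains_length_km

print(f"NRW: {nrw_pct:.1f}% of system input")
print(f"Real losses: {real_losses_per_km:.0f} m3/km/year")
```

Tracking such indicators year by year is what lets a utility check whether an objective like "reduce non-revenue water by 2%" has actually been met.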
• 45. 45 Performance indicators are efficiency or effectiveness measures of the activity of a utility (Sjøvold et al., 2008). To repeat, performance assessment (metric benchmarking) is a quantitative comparative assessment of a company's performance. It enables comparison between peer utilities and is used for quantitative analysis (it answers the question “Where am I?”). Performance assessment (metric benchmarking) is a natural and intuitive way of comparison. As one of the last natural monopolies, the water industry is a business that needs some form of regulation. This is why performance assessment is a great tool for comparison and has gained great importance; indeed, comparing performance indicators has become the natural tool for regulators of the water industry worldwide. All around the world, industries are carrying out rudimentary metric benchmarking (performance assessment). Nevertheless, simply comparing the figures is not enough, and does not by itself constitute metric benchmarking (performance assessment). It is much more than that; it implies the analysis of the results, a key phase needed to account for differences in local conditions and to assess the impact of all factors on performance (Cabrera et al., 2006). Potential users (entities or “stakeholders”) of performance assessment in water and wastewater services can be (Alegre et al., 2006; Matos et al., 2003):  The water and wastewater undertakings (regardless of ownership status);  The consumers or direct users;  The indirect stakeholders (affected by impacts on the surrounding environment);  The pro-active stakeholders (environmental organisations, consumer protection agencies and other pressure groups);  The policy-making bodies (at local, regional or national level);
• 46. 46  The regulatory agencies (responsible for economics and quality of service regulation);  The auditors, financing bodies and agencies;  The quality certifying organisations;  The multi-lateral organisations. Table 3.1: Scope of application of PI systems (Alegre and Baptista, 2002). Columns: exclusively within the undertaking | in the framework of benchmarking initiatives | as part of a regulatory framework | as part of contractual agreements | in the scope of Quality Certification Systems | in the scope of Guaranteed Standard Schemes | in the scope of statistic reports publicly available. Rows: Water/Wastewater utilities ; Policy-making bodies ; Regulatory agencies ; Financing bodies ; Quality certifying entities ; Auditors ; Direct users and indirect and pro-active stakeholders ; Supra-national organisations . The following figures illustrate the water (Alegre et al., 2006) and wastewater (Matos et al., 2003) undertaking context and the interconnection with potential users of performance assessment.
• 47. 47 Figure 3.3: Water undertaking context (Alegre et al., 2006). The diagram places the water utility and its supply system within a region, drawing on water, energy and other resources, the environment, technological assets, human and financial resources, framed by national/regional policies, demography, economics and legislation, and serving direct, indirect and proactive consumers. Figure 3.4: Wastewater undertaking context (Matos et al., 2003; Alegre et al., 1997). The diagram similarly places the wastewater undertaking within a region, together with its economics and demographics, environmental resources (water, energy, soil, air, biota), assets (human, technological, financial and other), and its stakeholders: customers, indirect users, proactive users, and the authorities (policy makers, regulators).
• 48. 48 Performance assessment of water utilities is not an easy job. The amount of data present in a single utility can be overwhelming. That is the reason why the data elements and PIs should be very carefully chosen. Performance assessment could, for this reason, be described as the art of simplification: the more concise the data, the better; but an over-simplification of the whole picture can provide insufficient information for making good decisions. Indicators are a great tool to assess performance. The traditional ratio combines at least two relevant variables measured in the real world and provides significant information. By combining adequate indicators, a general picture of reality can be achieved. An indicator is a very intuitive tool, and is easily understood. Indicators facilitate comparisons, as denominators usually provide a size or quantity reference (Cabrera et al., 2006). Performance indicators are useful only when they are compared to an established reference. Without explanatory (additional) information, the value of an indicator can be meaningless. Designing a performance assessment system means that the method of comparison must be clear. Indicators can be used for different purposes (Cabrera et al., 2006):  Assessing the fulfilment of objectives/targets. This depends on what goals the utility wants to achieve. These objectives/targets should be established in advance, with fixed values of the indicators that should be reached.  Trend analysis. If the utility, or parts of it, wants to follow its evolution over time, indicators provide trends and can even be used for prediction. In this case, the indicators are compared to previous values of the same indicators obtained in the past, and deliver information about the evolution in time and whether there has been improvement or not.  Peer comparison. A natural follow-up to any indicator system is to try to compare the indicator values with those obtained by another utility.
In this case, the required system is more complex, as the analysis of the results needs to take into account size and context differences.
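The three uses of indicators listed above can be sketched as small checks on an indicator series. All names and values below are hypothetical, chosen only to make the three comparisons concrete.

```python
# Hypothetical yearly values of a leakage indicator (m3/km/day) for one utility.
leakage = {2009: 14.2, 2010: 13.5, 2011: 13.1, 2012: 12.4}

# 1. Fulfilment of objectives/targets: compare the latest value to a pre-set target.
target = 12.5
target_met = leakage[2012] <= target

# 2. Trend analysis: compare each year's value with the previous one.
years = sorted(leakage)
improving = all(leakage[b] < leakage[a] for a, b in zip(years, years[1:]))

# 3. Peer comparison: rank against context-comparable peer utilities.
peers = {"utility_A": 12.4, "utility_B": 10.9, "utility_C": 15.8}
ranking = sorted(peers, key=peers.get)  # best (lowest leakage) first
```

The peer comparison is the weakest of the three without further work: as the text notes, size and context differences must be analysed before the ranking carries any meaning.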
• 49. 49 The following figure illustrates the implementation phases in metric benchmarking (performance assessment) given by Sjøvold et al. in 2008 and adapted by Cabrera in 2001. Figure 3.5: Metric benchmarking process (Sjøvold et al., 2008; Cabrera, 2001): 1. Identify the goals and necessary resources 2. Find metric benchmarking partners 3. Define PI system and data collection procedures 4. Collect data, calculate and validate indicators 5. Results analysis 6. Communicate results 7. Identify potential improvement areas 8. Undertake improvement actions. The steps are grouped into the phases Planning, Execution, Integration and Action.
  • 50. 50
• 51. 51 CHAPTER 4. ISO STANDARDS SERIES 24500:2007 (24510, 24511 AND 24512) AND STANDARDIZATION OF PERFORMANCE INDICATORS The International Organization for Standardization (ISO) is a worldwide federation of national standards bodies (ISO member bodies). The work of preparing international standards is normally carried out through ISO technical committees (TC). In 2007, ISO TC 224 published the international standard series ISO 24500:2007 (Koelbl, 2008). The full series consists of the following international standards:  ISO 24510 (2007): Activities relating to drinking water and wastewater services — Guidelines for the assessment and for the improvement of the service to users,  ISO 24511 (2007): Activities relating to drinking water and wastewater services — Guidelines for the management of wastewater utilities and for the assessment of wastewater services,  ISO 24512 (2007): Activities relating to drinking water and wastewater services — Guidelines for the management of drinking water utilities and for the assessment of drinking water services. The objective of this international standard series is to provide the relevant stakeholders with guidelines for assessing and improving the service to users, and with guidance for managing water utilities, consistent with the overarching goals set by the relevant authorities and by international intergovernmental organizations. ISO/TC 224 developed this series of standards, and it is applicable on a voluntary basis. This means that the standards are not normative, but informative guidelines. The standards were developed by consensus internationally, with the objective of being applicable worldwide. All three standards share common parts, including terminology, annexes and the structure of PIs. The standards
• 52. 52 recommend building PI Systems according to IWA recommendations (IWA Manual of Best Practice Handbooks). The examples provided are mostly from the IWA Handbooks. The three standards share the same structure:  Scope  Components  Objectives  Guidelines for the Management  Service Quality Assessment  Related Performance Indicators (PIs)  Use of PIs for Operation ISO 24511:2007 and ISO 24512:2007 are twin standards, but ISO 24510:2007 is slightly different. Table 4.1: Structure of Standards on service activities related to drinking water supply and wastewater systems – comparison review (ISO 24510:2007, 24511:2007, 24512:2007) and (Talib et al., 2005). Columns: ISO/WD 24510 | ISO/WD 24511 | ISO/WD 24512. Rows: Scope | Scope | Scope; Normative References | Normative References | Normative References; Terms and Definitions | Terms and Definitions | Terms and Definitions; Components of Services | Components of Wastewater Systems | Components of Water Supply Systems; User’s Needs and Expectations | Management Components of a Wastewater Service | Management Components of a Drinking Water Supply Service; Performance Indicators | Wastewater Service Objectives | Drinking Water Supply Service Objectives; (none) | Guidance on the Management of Wastewater Service | Guidance on the Management of Drinking Water Supply Service; (none) | Service Assessment | Service Assessment; (none) | Performance Indicators | Performance Indicators.
  • 53. 53 Components of wastewater system, according to ISO 24511:2007: Figure 4.1: Fields of application of ISO 24511:2007 (Wastewater), source of the schematic: IWA Performance Indicators for Wastewater Services Types of wastewater system, according to ISO 24511:2007: Figure 4.2: Fields of application of ISO 24511:2007 (Wastewater), source of figure: based on a scheme from Hydroconseil, France, 2002.
• 54. 54 Components of the water supply system (drinking water), according to ISO 24512:2007: Figure 4.3: Fields of application of ISO 24512:2007 (Drinking Water) ISO 24511:2007 and 24512:2007 introduce the term responsible body, which has the overall legal responsibility for providing drinking water and/or wastewater services and for establishing the policy and the general organization of the relevant water utility, for a given geographic area. The responsible body should establish the objectives, associated service criteria and performance indicators for a wastewater utility, taking as a basis the legal requirements of the relevant authorities, together with the expectations of the users and other stakeholders, in conjunction with its operators. Figure 4.4 illustrates the relevant relationships between stakeholders for establishing objectives.
• 55. 55 Figure 4.4: Relevant relationships between stakeholders for establishing objectives, according to ISO standards 24511:2007 and 24512:2007 (note: the responsible body and the operator can be the same body). According to these two standards, the management of a wastewater utility requires:  Formulation of objectives and service assessment criteria,  Targeting the service assessment criteria by the use of a set of performance indicators,  Evaluation of the performance by measuring and assessment. The ISO 24500:2007 series of standards gives definitions (the same as IWA's) concerning Performance Indicators (PI), and they are presented in Chapter 5.2.
  • 56. 56
• 57. 57 CHAPTER 5. PERFORMANCE INDICATORS SYSTEM Definition Performance Indicators (PI) represent a quantitative measure of a particular aspect of the undertaking's performance or standard of service. They assist in the monitoring and evaluation of the efficiency and effectiveness of the undertaking, thus simplifying an otherwise complex evaluation (Makropoulos, 2009). There are many Performance Indicator systems (PIs) worldwide. ISO standards for PIs have already been presented in the previous section. IBNET (The International Benchmarking Network for Water and Sanitation Utilities) is one of them; it is used by the IPM (Inter-institutional Professional Network in Water Sector of Serbia) initiative and will be commented on for its results in the Republic of Serbia. In this chapter, the IWA (International Water Association) Performance Indicator systems for the water supply and wastewater sectors are presented in detail. IWA offers a large number of PIs for Drinking Water and Wastewater Services (approx. 170 each), issued in two IWA Handbooks (Alegre et al., 2006; Matos et al., 2003). The IWA manuals on performance indicators provide a structure that may prove to be a valuable guide when building up such a system.
• 58. 58 5.1. IWA PERFORMANCE INDICATORS SYSTEMS IN WATER SUPPLY AND WASTEWATER SECTOR Performance indicator systems (PIs) and benchmarking are instruments for internal corporate management, but also for comparisons of utilities on a regional, national and international scale (Merkel, 2001). The basis for corporate benchmarking is standardized performance indicator systems, which evaluate all the tasks of a sustainable water supply and wastewater sector holistically, considering supply safety, supply quality, customer service, sustainability and efficiency. Such a “quasi-competition” on a voluntary basis can display the performance, but also enables the derivation of measures for improvement (Hirner and Merkel, 2002). According to these principles, a large number of benchmarking projects have been carried out all over the world in the water supply sector over the last decade. At the end of the 1990s a committee of the International Water Association developed a system of performance indicators for water supply services (Alegre et al., 2000) and carried out several national field tests in order to adapt the system to practical applications. Six years later, after a field test with more than 70 undertakings worldwide, Alegre et al. (2006) published an updated, improved version of the manual of best practice. Matos et al. (2003) did the same for wastewater services, with the manual of best practice published by IWA. Undoubtedly, the IWA PI system is the state-of-the-art performance indicator system in the water supply and wastewater sector and is the basis for many projects worldwide, although individual adaptations (e.g. additional PIs) to the frame conditions of individual countries may be useful.
• 59. 59 The main objective of both manuals is to provide guidelines for the establishment of a management tool for water supply and wastewater utilities based on the use of performance indicators. Further objectives are to provide a coherent framework of indicators for benchmarking initiatives, but also for regulatory agencies and international statistics collected by the IWA (Alegre et al., 2006; Matos et al., 2003). This chapter gives an overview of the IWA performance indicator system for water supply and wastewater services described by Alegre et al. (2006) and Matos et al. (2003).
• 60. 60 5.2. ELEMENTS OF THE PI SYSTEM The methodology of the data elements of the PI system is the same for water supply and wastewater services, and they are explained here for both; the structure of the PIs for the two services is explained further on. Figure 5.2.1: Structure of the PI system – six separate categories: Environmental/Water Resources, Personnel, Physical, Operational, Quality of Service, and Economic and Financial. The PI system consists of four types of data elements, each of them playing a different role within the system:  variables;  performance indicators (PI);  context information (CI);  explanatory factors. Variables Variables are the data elements from which the performance indicators are calculated. The variables are values (resulting from a measurement or record) expressed in a specific unit (e.g. “length of mains”, unit: km; “average
service pressure head”, unit: m; “total sub-process costs”, unit: €/a). Confidence grades indicate the data quality for each variable. Variables should fulfil the following requirements:  have univocal definitions,  be reasonably achievable,  refer to the same geographical area and the same assessment period as the PI and CI they are used for, and fit their definition,  be as reliable and accurate as the decisions made based on them require. Figure 5.2.2: General outline of the IWA variable definition, according to Alegre et al. 2002 and Alegre et al. 2006 Figure 5.2.3: IWA variable definition, classification and description, according to Alegre et al. 2002 and Alegre et al. 2006
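A variable as described above (a value, a unit and a confidence grade) can be sketched as a small data record. This is only an illustration: the variable code, the example values and the two-character grade format (a reliability band plus an accuracy band) are a simplified sketch of the IWA-style confidence grading, not an exact reproduction of it.

```python
from dataclasses import dataclass

@dataclass
class Variable:
    """A PI-system variable: a value from a measurement or record,
    its unit, and a confidence grade describing the data quality.
    The grade format "A1".."D6" is a simplified, hypothetical scheme."""
    code: str         # variable code, e.g. "C8" (hypothetical)
    value: float
    unit: str         # e.g. "km", "m", "EUR/a"
    confidence: str   # e.g. "A2": band "A" reliability, band "2" accuracy

# Hypothetical example: length of mains, recorded with high confidence.
mains_length = Variable(code="C8", value=412.0, unit="km", confidence="A2")
```

Keeping the unit and the confidence grade attached to the value makes it possible to propagate data quality into the PIs that are later computed from the variables.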
• 62. 62 Some of the variables are external data and mainly informative; their availability, accuracy, reference dates and the limits of the corresponding geographical area are generally out of the control of the utility. In this case, variables should also:  whenever possible be collected from official sources, and  be essential for the performance indicator assessment or interpretation. Performance indicators (PI) Performance Indicators are measures of the efficiency and effectiveness of a utility in achieving its objectives, which result from the combination (ratio) of several variables. Each PI should express the level of performance achieved in a certain area and during a given assessment period (e.g. one year). A clear processing rule should be defined for each performance indicator to specify all the variables required and their algebraic combination. As with variables, the performance indicators also consist of values expressed in specific units, and confidence grades indicate the quality of the data represented by the indicator. Performance indicators are typically expressed as ratios between variables. These ratios may be commensurate (e.g. “non-revenue water”, unit: %) or non-commensurate (e.g. “total process costs”, unit: €/km or €/100 service connections, or indicators with units such as l/conn·d). In general, the latter case allows a better performance comparison due to the fact that the denominators represent the dimension of the water supply or wastewater system (e.g. number of service connections or total mains length). This allows for comparisons through time, or between systems of different sizes.
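The two kinds of ratios can be written out as simple processing rules. All variable values below are hypothetical, chosen only to illustrate a commensurate PI (a percentage) versus a non-commensurate PI (normalized by system dimension).

```python
# Hypothetical variables for one assessment period (one year).
system_input_m3 = 5_000_000.0    # water entering the system
revenue_water_m3 = 3_900_000.0   # water billed to customers
mains_length_km = 412.0          # system dimension (total mains length)

# Commensurate ratio: numerator and denominator share a unit, so the PI
# is dimensionless and reported as a percentage.
non_revenue_water_pct = 100.0 * (system_input_m3 - revenue_water_m3) / system_input_m3

# Non-commensurate ratio: the denominator expresses the dimension of the
# system, making utilities of different sizes (and the same utility over
# time) comparable.
losses_per_km = (system_input_m3 - revenue_water_m3) / mains_length_km  # m3/km/year
```

Note how the second PI only becomes meaningful because the denominator carries the size reference mentioned in the text.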
  • 63. 63 Figure 5.2.4: General outline of the PI definitions within the IWA Manual, according to Alegre et al. 2002 and Alegre et al. 2006 Figure 5.2.5: PI identification and classification, description, terms and processing rules, according to Alegre et al. 2002 and Alegre et al. 2006 Performance indicators should fulfil the following requirements. They should be:  clearly defined with a concise and unequivocal meaning,  reasonably achievable (depends on the related variables) at a reasonable cost,
• 64. 64  auditable,  as universal as possible, providing a measure which is independent of the particular conditions of the utility,  simple and easy to understand,  quantifiable so as to provide an objective measurement of the service, avoiding any personal or subjective appraisal. In addition, every PI should provide information significantly different from the other PIs, and only PIs deemed essential for effective performance evaluation should be established.
• 65. 65 Context Information (CI) These data elements provide information on the characteristics of an undertaking and account for differences between water supply/wastewater systems. There are two possible types of context information:  External factors that cannot be changed by management decisions. This information describes the frame conditions of a system (e.g. geography, demography, topography, climate), which are relatively constant through time. They are not under the control of the utility.  Data elements that are not modifiable by management decisions in the short or medium term, but which management policies can influence in the long term (e.g. the condition of the infrastructure of a system, pipe material). Context information is necessary when comparing differently structured systems and supports cause analyses. The requirements for context information are, in general, the same as for performance indicators and variables, even if the level of detail and confidence grading is not the same. Context information should:  have univocal definitions,  be reasonably achievable,  if external, be collected whenever possible from official survey departments,  be fundamental for the interpretation of PIs,  be as few as possible. The context information related to the undertaking and to the region profiles is rather alike in both cases (water supply and wastewater services). The system profiles have the same organization, although the contents are different (Duarte et al. 2003).
• 66. 66 Explanatory factors Explanatory factors are key elements of PI systems; they are used to explain PI values, but also for the grouping of comparable water supply/wastewater systems. Explanatory factors may be context information, variables or PIs (e.g. average age of network, service connection density or network delivery rate). Figure 5.2.6: Illustration of the components (data elements) of a performance indicators system (Sjøvold et al. 2008, adapted from Alegre et al. 2006). The figure shows that the data elements (all information in the system) comprise the variables (data elements used to calculate PIs) and the context information (data elements not modifiable by management), while the explanatory factors (data elements used to explain PIs) may draw on variables, CI and the PIs themselves.
• 67. 67 5.3. ADVANTAGES OF IWA PERFORMANCE INDICATORS SYSTEM There are several advantages to using the PIs proposed by IWA for water supply and wastewater services (Cabrera et al., 2006):  The system is as universal as possible and the proposed indicators have become an industry standard.  All indicators fulfil the previously listed requirements. The indicators published by IWA went through a revision process with many contributors and have been proven in practice.  The indicators and the corresponding variables are well defined. The IWA definitions may not be perfect, but they are detailed enough to guarantee that any debate will only concern project-specific details.  The IWA PIs can be used as a set of indicators chosen off the shelf, or at least as a base for modification to create new ones.  The structure of the performance assessment system provided by IWA is a framework which allows adding, replacing or modifying indicators with the assurance of the system remaining coherent and compatible with other systems in the world. A very important fact should be stated here: the number of performance indicators should be well balanced. Too many indicators will significantly increase the costs and difficulties of implementing the system. Too few will result in the system not being able to provide a proper assessment of the performance of the undertaking in the terms defined by the objectives and the selected strategies (Alegre et al., 2006). First, the goals and targets to achieve should be established. According to those objectives, the PIs should be chosen. The reverse process is a serious mistake: if the PIs are chosen first, without the objectives, the result will be tracking PIs without a clear idea of what the improvement should be.
• 68. 68 Another recommendation is to start with no more than about 20 PIs and then to review the system, modifying, tailoring and adapting new PIs or changing the starting ones according to the objectives of the PIs. Both PI systems aim to be a kind of ‘yellow pages’ where undertaking managers may find relevant indicators regarding all their key activities. After defining the object of the performance assessment (e.g. the undertaking as a whole, water losses, rehabilitation, etc.), the intended use of the results (e.g. objective-oriented management) and the type of initiative (e.g. internal analysis, benchmarking, reporting to a regulator), undertaking managers may select the subset of the IWA indicators considered relevant to respond to their needs (Duarte et al., 2003).
• 69. 69 5.4. IMPLEMENTATION OF IWA PERFORMANCE INDICATORS SYSTEM In this chapter, the implementation process of the IWA Performance Indicators System is presented by illustration in the following figures. The implementation of any Performance Indicators system has to be objective-oriented. Performance Indicators are the last step of a larger management strategy that should link the undertaking's objectives to strategies, define critical success factors and only then bring in performance indicators, both as a means to evaluate the success of these strategies and as a mechanism to detect problems in advance. Objectives need to be precise and clear. They need to be both demanding and realistic and, most importantly, reflect the mission and vision of a company (Alegre et al., 2006). After establishing the objectives, the implementation process can be performed. The implementation process is the same (unique) for both water supply services and wastewater services. Both consist of three phases:  Strategy,  Development and  Assessment. There is no crucial difference between the next two figures: Alegre et al. presented the first one in 2000, and Matos et al. (2003) adapted, improved and upgraded the methodology with more detail. Figure 5.4.3 shows the sub-processes of the PI and CI selection procedure, and figure 5.4.4 shows, as a sub-process, an example of a possible solution for the data flows concerning PI and CI and the team responsibilities. All the processes presented in the figures are intended to be a continuous improvement process within the utility. Defining objectives, and strategies to reach them, is a periodic task in the management of any organization at any level.
• 70. 70 Figure 5.4.1: Phases of the PI system implementation process for water supply services (Alegre et al., 2006). Strategy phase: appoint a strategic team; define objectives, strategies and critical success factors; define a PI team profile and appoint a PI team. Development phase: identify suitable PI system elements for the critical success factors and prepare the SIGMA Lite file; establish data management routines, schedules and responsibilities; run a pilot test (with SIGMA Lite), refining the selection if the PIs are not optimal for the critical success factors. Assessment phase: data collection, validation, input and assessment; result interpretation according to objectives, and global reporting; determine success in achieving objectives and prepare new strategies.
• 71. 71 Figure 5.4.2: Phases of the PI system implementation process for wastewater services (Matos et al. 2003, adapted from Alegre 2000 and 2002). Phase 1. Definition of the strategic performance assessment policy: definition of objectives; definition of the scope of application; definition of the PI team profile; appointment of a PI team; adoption of the IWA Manual of Best Practice “Performance Indicators for Wastewater Services”. Phase 2. Selection of PIs to be assessed (see separate figure: PI and CI selection procedure): PI significance level assignment; selection of performance indicators (PI) and context information (CI); definition of data collection and PI assessment frequencies; preparation of the SIGMA Lite file by selecting PI and CI; definition of the internal data flows and team-member responsibilities; pilot test with the SIGMA Lite WW software; improvements. Phase 3. Implementation of the important PIs to be assessed (see separate figure: example of data flows concerning PI and CI and team responsibilities): list of important PIs; elaboration of written procedures for data collection; full implementation – PI assessment; improvements.
• 72. 72 Figure 5.4.3: PI and CI selection procedure (Alegre et al. 2006 and Matos et al. 2003). For each pre-selected significant PI and CI item, the data required are identified and the question “is all necessary data available?” is asked. If the data are available, their reliability and accuracy are checked; if acceptable, the selection of the PI and CI is confirmed. If the data are not available, the questions “is the PI important?” (for a PI) or “do you really need this CI?” (for a CI) and “is it efficient to obtain the data?” are asked; if the answers are yes, the data collection procedures are modified or added to, otherwise the PI or CI is rejected.
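The decision path of Figure 5.4.3 can be reduced to a plain function. This is a simplified sketch of the flowchart's logic, not part of the IWA methodology itself; the function name and the string outcomes are hypothetical.

```python
# Sketch of the PI/CI selection decisions of Figure 5.4.3 for one item.
# Inputs are the yes/no answers gathered for that item.

def select_item(data_available: bool, quality_acceptable: bool,
                needed: bool, efficient_to_obtain: bool) -> str:
    """Return 'confirm', 'collect' (modify/add data collection
    procedures, then re-check) or 'reject' for one PI or CI item."""
    if data_available and quality_acceptable:
        return "confirm"
    if needed and efficient_to_obtain:
        return "collect"
    return "reject"
```

In a real implementation the "collect" outcome would loop back: after the collection procedures are improved, the item is evaluated again until it is either confirmed or rejected.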
• 73. 73 Figure 5.4.4: Example of data flows concerning PI and CI and team responsibilities (Alegre et al. 2006 and Matos et al. 2003). The figure is organized in three lanes (top management, PI team, departments) across the selection and assessment stages. Top management: definition of the strategic performance assessment policy; guidelines for the selection of PIs and related CIs; pre-selection of PI and related CI; refinement of the PI and CI listings; decision on improvements to be carried out on data collection procedures; decision on PI and CI use; definition of the set of PI and related CI to be assessed; PI results interpretation; decision on improvement measures in the undertaking; decision on next steps according to intended uses. PI team: selection of possible additional PI (more detailed information) and of the relevant variables; checking data availability from the departments and the possibility of assessing the pre-selected PI; proposing refinements of the PI and CI listings; preparation of data collection forms and of the SIGMA Lite/SIGMA Lite WW file; collecting data from the departments; entering data in SIGMA Lite/SIGMA Lite WW; calculating PI; outputting SIGMA spreadsheets; preparing reports. Departments: data availability verification (value, reliability and accuracy) by each responsible person in the departments; data collection (value, reliability and accuracy); filling in the data collection forms. The lanes exchange the list of pre-selected PIs and related CI, requests for data availability, variables and CI data, and reports on PI and CI.
• 74. 74 5.5. STRUCTURE OF IWA PERFORMANCE INDICATORS SYSTEM FOR WATER SUPPLY SERVICES Within the IWA PI system for water supply, the performance indicators are structured into six main groups (see Table 5.5.1):  Water Resources (WR),  Personnel (Pe),  Physical (Ph),  Operational (Op),  Quality of Service (QS) and  Economic and Financial (Fi). These main groups are divided into subgroups and some of the indicators are broken down into sub-indicators.
• 75. 75 Table 5.5.1: IWA PI system structure for water supply services. Columns: group code | main PI group | subgroup | number of PIs in the subgroup (+ sub-indicators) | number of PIs in the main group (+ sub-indicators). WR Water Resources: no subgroup 3 (+1); group total 3 (+1). Pe Personnel: Total personnel 2; Personnel per main function 5 (+2); Technical services personnel per activity 6; Personnel qualification 3; Personnel training 1 (+2); Personnel health and safety 2 (+2); Overtime work 1; group total 20 (+6). Ph Physical: Water treatment 1; Water storage 2; Pumping 4; Valve, hydrant and meter availability 2 (+4); Automation and control 2; group total 11 (+4). Op Operational: Inspection and maintenance 6; Instrumentation and calibration 5; Electrical and signal transmission equipment inspection 3; Vehicle availability 1; Rehabilitation 2 (+5); Operational water losses 3 (+4); Failure 6; Water metering 4; Water quality monitoring 1 (+4); group total 31 (+13). QS Quality of Service: Service coverage 3 (+2); Public taps and standpipes 4; Pressure and continuity of supply 8; Quality of supplied water 1 (+4); Service connection and meter installation and repair 3; Customer complaints 5 (+4); group total 24 (+10). Fi Economic and Financial: Revenue 1 (+2); Cost 1 (+2); Composition of running costs per type of costs (+5); Composition of running costs per main function of the water undertaking (+5); Composition of running costs per technical function activity (+6); Composition of capital costs (+2); Investment 1 (+2); Average water charges 2; Efficiency 9; Leverage 2; Liquidity 1; Profitability 4; Economic water losses 2; group total 23 (+24). Total number of PIs (+ sub-indicators): 112 (+58).
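The group totals of Table 5.5.1 can be cross-checked with a few lines of code; the snippet below simply re-adds the main-group counts to confirm the overall total of 112 PIs (+58 sub-indicators).

```python
# Main-group counts (PIs, sub-indicators) from Table 5.5.1.
groups = {
    "WR": (3, 1),    # Water Resources
    "Pe": (20, 6),   # Personnel
    "Ph": (11, 4),   # Physical
    "Op": (31, 13),  # Operational
    "QS": (24, 10),  # Quality of Service
    "Fi": (23, 24),  # Economic and Financial
}

total_pis = sum(n for n, _ in groups.values())
total_subs = sum(s for _, s in groups.values())
```

Such a sanity check is a cheap way to catch transcription errors when a PI system of this size is copied into a utility's own reporting tools.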
• 76. 76 5.6. STRUCTURE OF IWA PERFORMANCE INDICATORS SYSTEM FOR WASTEWATER SERVICES In this chapter, the structure of the IWA Performance Indicators system for wastewater services is elaborated and focused on in more detail than the structure of the IWA Performance Indicators system for water supply services. There are many possible ways to define a wastewater system. Figures 4.1 and 4.2 show the main components and linkages from an infrastructure point of view. Another way is presented in the following Figure 5.6.1, which illustrates the way in which a wastewater flow balance may be determined. Key to implementation is the construction of a flow balance for the catchments served. Flow, conveyed solids and other waterborne substances are illustrated as a simplistic wastewater balance, or flux, in which the inputs to the system are equal to the sum of the system losses and the outputs from the system. The water services PI manual provides definitive information about establishing a water balance (Matos et al. 2002a and Matos et al. 2003). Figure 5.6.1: Wastewater balance (Matos et al. 2002a and Matos et al. 2003). INPUTS – stormwater (runoff, from properties, from industry & commerce, from wrong connections, imported, infiltration); sanitary sewage (authorized, wrong connections, from on-site systems, imported); industrial inputs; imported sludge; sewer solids extracted and re-introduced downstream. LOSSES – stormwater (wrong connections, exfiltration); sanitary sewage (wrong connections, SSO spills, CSO spills); solids extracted from systems and disposed off-site; on-site systems (local disposal); on-site stormwater handling (local disposal), source control, BMP, SUDS (these services are usually not the responsibility of the wastewater undertaking). OUTPUTS – treated effluents (to watercourses, to land, reused, exported); untreated effluents (to watercourses, to land, exported); sludge (wasted & removed/disposed, reused).
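A catchment flow balance of the kind shown in Figure 5.6.1 can be sketched as a simple closure check, assuming the conventional closure in which the inputs equal the system losses plus the outputs. All component names follow the figure's groups, but every volume below is hypothetical.

```python
# Sketch of a wastewater flow-balance check for one catchment and one
# assessment period. All volumes (m3/year) are hypothetical.

inputs = {"stormwater": 1_200_000, "sanitary_sewage": 2_600_000,
          "industrial_inputs": 400_000, "imported_sludge": 50_000}
losses = {"exfiltration": 120_000, "wrong_connections": 80_000,
          "cso_sso_spills": 150_000}
outputs = {"treated_effluents": 3_700_000, "untreated_effluents": 150_000,
           "sludge": 50_000}

def balance_error(inputs, losses, outputs):
    """Closure error: inputs minus (losses + outputs).
    A value near zero means the balance closes."""
    return sum(inputs.values()) - sum(losses.values()) - sum(outputs.values())
```

In practice, the residual of this check is itself informative: a persistent non-zero error points at unmeasured components such as infiltration or exfiltration.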
The services provided by the undertaking must be related to the users of those services; the inter-relationship between the customers served by the wastewater undertaking and the direct impacts arising from those services is illustrated below. Figure 5.6.2 shows how the various customers (including industry) utilise the services and how the population and other customer base can be comprised of residents and temporary (visiting) service users. It also shows that parts of the population and other potential customers may manage their own wastewater services on-site or otherwise. The non-users of the services are represented as those not served (Matos et al. 2003).

Figure 5.6.2: Illustration of service provision in terms of customers and main impacts (Matos et al. 2003). The figure combines the resident population, the seasonal population (period equivalent), the industry, commercial, services and other population equivalents, and the imported wastewater population equivalent into a service population equivalent for the area. Part of this population is not served; part is served by on-site systems that are under the responsibility of the wastewater undertaking; the remainder is connected to the sewer system. The wastewater produced is drained and/or treated (WWTP) and discharged to the environment (surface water, ground water, soil and air), either compliant or non-compliant with discharge consents.
Performance Indicators for Wastewater services

The IWA Performance Indicators system for wastewater is structured into six separate categories of performance:

Table 5.6.1: Structure of the PI framework (Matos et al. 2003)
wEn - Environmental indicators
wPe - Personnel indicators
wPh - Physical indicators
wOp - Operational indicators
wQS - Quality of service indicators
wFi - Economic and financial indicators

The interpretation of the performance of an undertaking cannot be carried out without taking into account the context in which it operates. In addition, it is also necessary to consider the characteristics of the infrastructure and resource system and the characteristics of the region in which the services are provided. Having this in mind, the structure of the PI system also includes profiles for context, system and region.

Figure 5.6.3: Structure of wastewater CI and PI (Matos et al. 2002a and Matos et al. 2003). The figure links the context information (external data and undertaking information, organized into the undertaking, system and region profiles) to the six groups of performance indicators (environmental, personnel, physical, operational, quality of service, and economic and financial).

The complete list of Performance Indicators for wastewater is given in the APPENDIX, according to Matos et al. 2003, Cabrera et al. 2006 and Matos et al. 2002b.
Table 5.6.2: IWA PI system structure for wastewater services
(counts are given as number of PIs (+ sub-indicators), per subgroup and per main group)

wEn - Environmental indicators: 12 (+3)
    Wastewater: 5
    Solid residues: 7 (+3)
wPe - Personnel indicators: 20 (+5)
    Total personnel: 2
    Personnel per main function: 5 (+2)
    Technical personnel per activity: 5
    Personnel qualification: 2
    Personnel training: 1
    Personnel vaccination and safety: 3 (+1)
    Absenteeism: 1 (+2)
    Overtime work: 1
wPh - Physical indicators: 12
    Wastewater treatment: 4
    Sewers: 3
    Pumping headroom: 3
    Automation and control: 2
wOp - Operational indicators: 45 (+11)
    Sewer inspection and maintenance: 5
    Tanks and CSOs inspection and maintenance: 4
    Pumps and pumping stations inspection: 2
    Equipment calibration: 3
    Electrical and signal transmission equipment inspection: 3
    Energy consumption: 3
    Sewer system rehabilitation: 4 (+3)
    Pump rehabilitation: 2
    Inflow/infiltration/exfiltration (I/I/E): 4
    Failures: 8 (+1)
    CSO control: 1
    Wastewater and sludge quality monitoring: 3 (+7)
    Vehicle availability: 1
    Safety equipment: 2
wQS - Quality of service indicators: 18 (+11)
    Population served: 4
    Treated wastewater: 1 (+4)
    Flooding: 5
    Interruptions: 1
    Reply to customer requests: 3
    Complaints: 2 (+7)
    Third party damages: 1
    Impact on traffic: 1
wFi - Economic and financial indicators: 37 (+8)
    Revenues: 2 (+2)
    Costs: 2 (+4)
    Composition of running costs per type of costs: 5
    Composition of running costs per main function (internal and outsourced): 5
    Composition of running costs per technical function activity: 4
    Composition of capital costs: 2
    Investment: 1 (+2)
    Efficiency indicators: 9
    Leverage indicators: 2
    Liquidity indicators: 1
    Profitability indicators: 4

Total number of PIs (+ sub-indicators): 144 (+38)
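The per-group totals of Tables 5.5.1 and 5.6.2 can be cross-checked with a short script. The sketch below is only a sanity check of the arithmetic in the two tables; each tuple transcribes the (PIs, sub-indicators) counts for one main group.

```python
# Per-group (PIs, sub-indicators) counts transcribed from Tables 5.5.1 and 5.6.2.
water_supply = {
    "WR": (3, 1), "Pe": (20, 6), "Ph": (11, 4),
    "Op": (31, 13), "QS": (24, 10), "Fi": (23, 24),
}
wastewater = {
    "wEn": (12, 3), "wPe": (20, 5), "wPh": (12, 0),
    "wOp": (45, 11), "wQS": (18, 11), "wFi": (37, 8),
}

def totals(groups):
    """Sum PIs and sub-indicators over all main groups."""
    pis = sum(p for p, _ in groups.values())
    subs = sum(s for _, s in groups.values())
    return pis, subs

assert totals(water_supply) == (112, 58)  # matches Table 5.5.1
assert totals(wastewater) == (144, 38)    # matches Table 5.6.2
```

Both asserts hold, so the subgroup counts are consistent with the stated group and grand totals.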
Context Information for Wastewater services

The Context Information data are organized into three sectors: the undertaking profile, the system profile and the region profile. The undertaking profile outlines the organizational structure of the undertaking. The system profile focuses mainly on the type of water/wastewater infrastructure and service provided, i.e. the physical assets, the technology used and the type of customer; it goes into more detail than the other profiles because it also contains descriptive information that is helpful for the interpretation of the PIs. The region profile is essential for meaningful comparisons between undertakings, as it allows for a better understanding of the demographic, economic, geographical and environmental context (Matos et al. 2002a and Matos et al. 2003).
Figure 5.6.4: Context Information data for wastewater services (Matos et al. 2002b). The figure groups the context information into three profiles:

SYSTEM PROFILE
  Service data: types of system managed by the undertaking; population, population served by different types of systems or unserved; peak population served; catchment area and impermeable area; annual average daily dry weather flow; industrial wastewater; imported/exported wastewater; daily peak factor; level of treatment (without treatment, preliminary, primary, secondary and tertiary treatment); sludge production, treatment and disposal.
  Customer service: complaint record systems; guaranteed standards scheme.
  Physical assets: wastewater systems; total sewer length; sewer length increase; combined sewer systems, separate domestic and separate stormwater sewers, pump mains and other sewers; sewer materials, diameters (or equivalent) and age; manholes, sewer overflows, sewer gullies and sea outfalls.
  Sewer connections: domestic, industrial pre-treatment facilities, septic tank and other connections.
  Storage: storage tanks, stormwater storage tanks, other and total storage volume; pumping stations (number, total capacity, pumped wastewater); wastewater treatment plants (number by size and type of upstream system); treatment capacity (preliminary, primary, secondary and tertiary); peak flow storage capacity at WWTP; pumping capacity; sludge production and sludge treated.
  Technological resources: computerized information systems (planning and decision, inspection, maintenance, customer complaints, other); monitoring, automation and control (flow meters, quality monitors, pumping, treatment, monitoring and control, integrated control).
  Mapping: updated mapping and digital mapping.

REGION PROFILE
  Demography and economics: population density; household occupancy; population growth rate (current and forecast); Gross National Product per capita; inflation rate; yearly working time; unemployment rate.
  Environment: yearly rainfall (average, minimum and maximum); short duration rainfall (10 and 60 minute, 10 year return period); air temperature (daily average, minimum and maximum); topography (maximum and minimum altitude).
  Receiving bodies: types and special protected areas.

UNDERTAKING PROFILE
  Undertaking identification; geographical scope; type of activity; type of assets ownership; type of operations; total personnel; annual revenue and annual total costs; outsourcing costs (management and support, financial and commercial, planning and design, construction, operation and maintenance, and laboratory services); average annual investment; service taxes or charges.
Variables for Wastewater services

Variables are marked with capital letters, from A to H, and are divided into sections as follows:
- Section A, Environmental Data, marked wAi (wA1 to wA26), 26 in total;
- Section B, Personnel Data, marked wBi (wB1 to wB27), 27 in total;
- Section C, Physical Assets Data, marked wCi (wC1 to wC33), 33 in total;
- Section D, Operational Data, marked wDi (wD1 to wD69), 69 in total;
- Section E, Demography (and Customer) Data, marked wEi (wE1 to wE8), 8 in total;
- Section F, Quality of Service Data, marked wFi (wF1 to wF26), 26 in total;
- Section G, Economic and Financial Data, marked wGi (wG1 to wG52), 52 in total; and
- Section H, Time Data, marked wHi (wH1), 1 in total.

The number of variables is clearly large (242 in total), but this does not mean that all of them are needed. As already mentioned, the first phase is establishing the objectives of the undertaking; PIs are then chosen according to those objectives, and only the variables required by those PIs need to be collected. In practice, this large set of variables is therefore reduced to as small a number as possible.
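The tallying and selection logic described above can be sketched in a few lines. The section totals are transcribed from the list; the PI-to-variable mapping is purely hypothetical (both the PI codes and the variable sets were invented here for illustration).

```python
# Number of variables per section, transcribed from the list above.
section_counts = {"wA": 26, "wB": 27, "wC": 33, "wD": 69,
                  "wE": 8, "wF": 26, "wG": 52, "wH": 1}
assert sum(section_counts.values()) == 242  # the stated total

# Hypothetical example: two selected PIs and the variables each one requires.
pi_variables = {
    "wOp_example": {"wD1", "wC5", "wH1"},
    "wFi_example": {"wG2", "wG7", "wH1"},
}
# Union of the requirements: shared variables (here wH1) are collected once.
required = set().union(*pi_variables.values())
print(f"Variables to collect: {len(required)} of {sum(section_counts.values())}")
```

For these two invented PIs only 5 of the 242 variables would have to be collected, which is the point of choosing the PIs before gathering data.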
5.7. PERFORMANCE INDICATORS AND RELATED COMPONENTS: CONFIDENCE-GRADING SCHEME

The confidence-grading scheme is the same for any Performance Indicators system; it has been standardized in the ISO 24500 series (ISO 24510, 24511 and 24512:2007), which adopted it from the IWA. Reference books such as Alegre et al. (2006) and Matos et al. (2003) cite the same structure and methodology. This chapter has been adopted without changes from ISO 24510:2007 (Annex B), ISO 24511:2007 and ISO 24512:2007 (Annex E), as well as Alegre et al. (2002), Alegre et al. (2006) and Matos et al. (2003):

Reliability bands

Table 5.7.1: Reliability bands of data for PI system
A - Highly reliable: data based on sound records, procedures, investigations or analyses that are properly documented and recognized as the best available assessment methods.
B - Reliable: generally as in band A, but with minor shortcomings, e.g. some of the documentation is missing, the assessment is old, or some reliance on unconfirmed reports or some extrapolations is made.
C - Unreliable: data based on extrapolation from a limited sample for which band A or B data are available.
D - Highly unreliable: data based on unconfirmed verbal reports and/or cursory inspections or analyses.

Accuracy bands

Accuracy is defined as the approximation between the result of a given measurement and the (conventionally) correct value of the variable being measured. The accuracy bands presented below are based on the system adopted in England and Wales. They apply to the measurement and not to the measuring equipment; for example, in some cases the equipment may be highly accurate but is used out of range. Whenever the measurement accuracy cannot be assessed, it should be graded as greater than 100 %.

Table 5.7.2: Accuracy bands of data for PI system
1 - Error (%) in [0; 1]: better than or equal to +/- 1 %
2 - Error (%) in ]1; 5]: not band 1, but better than or equal to +/- 5 %
3 - Error (%) in ]5; 10]: not bands 1 or 2, but better than or equal to +/- 10 %
4 - Error (%) in ]10; 25]: not bands 1, 2 or 3, but better than or equal to +/- 25 %
5 - Error (%) in ]25; 50]: not bands 1, 2, 3 or 4, but better than or equal to +/- 50 %
6 - Error (%) in ]50; 100]: not bands 1, 2, 3, 4 or 5, but better than or equal to +/- 100 %
Error (%) > 100: values which fall outside the valid range, such as > 100 %, or small numbers

Overall confidence grades

The confidence grade (c.g.) is an alphanumeric code which couples the reliability band and the accuracy band, for instance: C4 - data based on extrapolation from a limited sample (unreliable, band C), estimated to be within +/- 25 % (accuracy band 4). The reliability and accuracy bands form the matrix of confidence grades shown below:

Table 5.7.3: Matrix of confidence grades, according to ISO standard series 24511:2007 (Matos et al. 2003)

Accuracy band (%) |  A    B    C    D
[0; 1]            |  A1   ++   ++   ++
]1; 5]            |  A2   B2   C2   ++
]5; 10]           |  A3   B3   C3   D3
]10; 25]          |  A4   B4   C4   D4
]25; 50]          |  ++   ++   C5   D5
]50; 100]         |  ++   ++   ++   D6

NOTE: '++' indicates confidence grades that are considered to be incompatible.
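The coupling of reliability and accuracy bands lends itself to a small helper. The sketch below (function and constant names are our own, not from the standards) encodes the compatible cells of Table 5.7.3 and rejects the combinations marked '++'.

```python
# Valid reliability/accuracy combinations from Table 5.7.3 ('++' cells excluded).
VALID_GRADES = {
    "A1", "A2", "A3", "A4",
    "B2", "B3", "B4",
    "C2", "C3", "C4", "C5",
    "D3", "D4", "D5", "D6",
}

def confidence_grade(reliability: str, accuracy: int) -> str:
    """Couple a reliability band (A-D) with an accuracy band (1-6).

    Raises ValueError for combinations the matrix marks as incompatible.
    """
    grade = f"{reliability}{accuracy}"
    if grade not in VALID_GRADES:
        raise ValueError(f"incompatible confidence grade: {grade}")
    return grade

assert confidence_grade("C", 4) == "C4"  # the worked example from the text
```

For instance, confidence_grade("D", 1) raises ValueError, mirroring the matrix's view that highly unreliable data cannot plausibly be accurate to within 1 %.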