
WIPAC Monthly - July 2017

WIPAC Monthly is the monthly magazine produced by the LinkedIn Group Water Industry Process Automation & Control.

In this month's issue, alongside the usual news from the world of data, information and ICA in the water industry, we have:

An article on the legal issues of Big Data
An article focusing on Electro-magnetic flow meters
An article focusing on the OFWAT PR19 Methodology and its impact on future programmes
An article on Dissolved Oxygen in wastewater


Page 1

WIPAC MONTHLY
The Monthly Update from Water Industry Process Automation & Control
www.wipac.org.uk
Issue 7/2017 - July 2017
Page 2

In this Issue

From the Editor .................................................... 3

Industry News ...................................................... 4-10
Highlights of the news of the month from the global water industry, centred around the successes of a few of the companies in the global market.

Legal Issues in Big Data ........................................... 11-12
As we adopt technology it is easy to forget the legal aspects of what the water industry does, and the privacy of customers and their data. This article by Fred Greguras, originally published in Water Online, discusses the legal issues around Big Data, with special reference to the USA.

Focus on: Electro-magnetic flow meters in wastewater ............... 13-16
In this month's feature article we focus on electro-magnetic flow meters. The technology has been in existence for decades, and this article looks at the basics, from the theory behind the technology to the practice of installation and operation.

OFWAT's PR19 Methodology: Implications for resilience, asset health & leakage ... 17-18
In this opinion piece, George Heywood of Servelec Technologies examines the recently published methodology of the UK financial regulator, OFWAT, for the next price review, and highlights its implications for resilience, asset health and leakage.

Dissolved Oxygen Measurement in Wastewater ......................... 19-21
The measurement of dissolved oxygen in wastewater is one of the fundamental parameters that the industry measures. This white paper by ABB discusses the importance, measurement principles and performance of the current technology.

Workshops, Conferences & Seminars .................................. 22-23
The highlights of the conferences and workshops in the coming months.

WIPAC Monthly is a publication of the Water Industry Process Automation & Control Group. It is produced by the group manager and WIPAC Monthly editor, Oliver Grievson. This is a free publication for the benefit of the water industry; please feel free to distribute it to anyone who you feel may benefit. All enquiries about WIPAC Monthly, including from those who want to publish news or articles within these pages, should be directed to the publications editor, Oliver Grievson, at olivergrievson@hotmail.com
Page 3

From the Editor

In the UK we have the Price Review process. Those of you in the UK will know all about it; for those who don't, it is where hundreds if not thousands of people around the country work for up to two years to make sure that everything that needs to be done in a five-year period is done. What is important about this particular time is that everything that needs doing has to be thought of, ideas floated with customers, and then budgeted for. It is an exhausting time, as the industry has to have a very good idea of exactly what it is going to do and how much it is going to cost.

In this period everyone is also publishing their ideas on strategic direction, and we have seen this in recent weeks. OFWAT have published a plethora of documents setting out the various directions they want the industry to take. In the methodology published this month we saw discussions around resilience, asset health and leakage, amongst other things. They have also recently published a report on the use of data and its various aspects in the modern industry.

Reading through all of these reports over the past few weeks, and with a slightly biased technological view, there is a vision of huge potential. The OFWAT report talked about utilising data for customer benefit, but in another breath it talks about efficiency and the need to keep the cost of water at sustainable levels. This is where data really comes to the forefront. Over the past few years we have seen the benefits of data and innovations in its use, and the supply chain is developing products for the water industry to use. There are barriers to this approach, including data and its protection. The industry as a whole is installing thousands of smart meters that are collecting data at an unprecedented rate. What is going to be done with this data?

We see in an article in this month's issue that it has already saved one customer 50,000 litres of water a day that was being lost on the customer side of the meter. It's a great saving, but where are the next 50,000 litres? Leakage, to be honest, has adopted smart technology, seen huge benefits and reached a good level of maturity, but what's next?

Gamification - using customers' data to monitor and encourage neighbourhoods to save water; in areas of water shortage it is a technology that has been well used.

Electronic billing and app communication - can a customer pay a bill via their smartphone, be informed of problems, or ask a question through the same app? It is something that is certainly coming.

This is for customers, but what about operational efficiency? There is a huge amount that can be done in managing the data. A quick flick through the reports reveals the systems that are out there for managing assets and maintaining instruments with CMMS systems, as well as things like condition-based monitoring. All of these technologies have been around for a while, and it seems there is a drive for the industry to go in this direction. The really important thing is that it has to be encouraged and has to fit in the price review process, as it all has to be thought of now so that the ideas can be delivered in the future; otherwise things wait another five years until we are pricing things up next time.

As always it is an interesting time, and if we step outside the water industry there are also questions of how it works with the other utilities, and how this can all be brought into the concept of smart cities. All of these programmes come together, and reading another report published recently, the Sustainable Cities report by Arcadis, water is of course a fundamental part of the city concept; it will be a case of how everything is integrated and working together....

Oliver
Industry News

New directors at the Sensors for Water Interest Group

Richard Bragg of United Utilities and Michael Strahand of Analytical Technologies Inc have recently joined the Board of Directors of the Sensors for Water Interest Group, following the retirement from the board of Anthony Kyriacou of Severn Trent and John Marsh of Siemens. SWIG is a UK not-for-profit organisation.

Richard Bragg is currently a Principal ICA Engineer at United Utilities, one of the UK water industry's water & sewerage companies, covering the north west of England. Richard is a chartered instrumentation, control and automation specialist with over 15 years' experience in the oil, gas, energy and utilities industries, from FEED through to project delivery, commissioning and optimisation. A motivated and driven BEng-qualified engineer, he has worked on large-scale projects both on- and offshore with multinational teams.

Dr Michael Strahand has 25 years' experience building businesses that operate in the water industry, in both the clean and wastewater treatment markets. He has a PhD in chemistry and the ability to apply that knowledge to solving process application problems, and has helped thousands of people in hundreds of companies to optimise their processes. He has worked at all stages of the business: the design, manufacture and application of online instrumentation for the monitoring and control of processes, especially water processes. Through in-depth knowledge, and the application of that knowledge, he has earned the nickname "Doctor Chlorine" from his customers and peers. He works with all the UK's water utility plcs and has connections at all levels in these organisations. He specialises in applying his in-depth knowledge of chemistry to the application of sensors for the monitoring and control of water and wastewater treatment processes, and in building sales organisations based on this.

Smart sensors for water: sensor design and performance

Professor Richard Luxton, Director of the Institute of Bio-sensing Technology at the University of the West of England, takes a look at some of the latest developments in sensor and biosensor technology for water analysis ahead of SWIG's major Sensing in Water conference in September.

Professor Richard Luxton: The development of sensor and biosensor technology for water analysis has mushroomed, seeking to exploit the development of new materials and connected smart technologies. Certainly, over the last few years there have been many sensing technologies developed in our universities that demonstrate exquisite sensitivity in the laboratory but are yet to make the transition into a viable product for real-world application.

Many new sensing technologies are based on the application of nanomaterials such as graphene or carbon nanotubes, using electrochemical or impedimetric measurement techniques. Other sensors rely on nanoscale features in materials, such as nano-pores which detect material transiting through the structure, or atomic defects in materials such as diamond. For example, boron-doped diamond is being used to develop a new pH electrode with the potential for greater performance than other pH electrodes.

Although there has been a great focus on these nanomaterials, many new sensors that are closer to market, or are being used in current products, are based on optical methods such as the measurement of fluorescence. For example, the integration of flow cytometry and fluorescence measurement forms the basis of a technology for the detection of bacteria in water which can be applied to online monitoring. In another application, recording both the excitation and emission spectra of water allows multiple compounds to be detected simultaneously.

Despite the development of ultra-sensitive measurement technologies, there remains the problem of integrating the sensing elements into a real-world device where a representative sample is presented to the measurement system. In the era of the Internet of Things, smart sensing and remote monitoring will be the future, integrating multiple types of sensing technology to monitor water, whether drinking water or wastewater. Connectivity to the cloud and interoperability are paramount in the new smart, connected world of sensor systems. We have sensor networks, but connectivity with the cloud now opens up the opportunity to develop new data analytics, which will generate greater opportunities for the early detection of impending problems, reducing the cost of response and enhancing the efficiency of the operator. Only by validating new technologies will we have the reliable, cost-effective sensors required for smart monitoring.

The Sensors for Water Interest Group (SWIG) Sensing in Water conference on 27 September at the Nottingham Belfry will cover the above topics in depth.

Page 4
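The point about cloud analytics enabling early detection of impending problems can be made concrete with a small sketch: a rolling z-score check that flags a sensor reading which jumps well outside its recent baseline. This is only an illustration of the general idea; the function name, window size and threshold are invented choices, not anything prescribed by SWIG or any particular product.

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate strongly from the recent rolling baseline.

    Returns the indices of readings whose z-score against the preceding
    `window` samples exceeds `threshold`.
    """
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# A turbidity series (NTU) with a sudden spike at index 8
turbidity = [0.9, 1.0, 1.1, 1.0, 0.9, 1.0, 1.1, 1.0, 6.5, 1.0]
print(detect_anomalies(turbidity))  # -> [8]
```

In a real deployment the same logic would run against cloud-held telemetry rather than an in-memory list, and a production system would also need to handle sensor dropout and drift.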
Underground 3D maps touted to reduce streetworks disruption

The use of 3D maps to plot the presence of underground utility networks and make the planning of streetworks easier was one of the top ideas to emerge from Northumbrian Water Group's Innovation Festival. NWG's Innovation Festival, held this month at Newcastle Racecourse, was a unique week-long event that brought people to the region from around the world to tackle social and environmental challenges.

One of the organisations taking part was Ordnance Survey, which is looking to work with utility companies on a trial of 3D mapping to identify the exact locations of gas and water mains, electricity cables and more, in one place for the first time. If the trial is successful, it has the potential to be rolled out across the country, making maintenance and upgrades to these networks easier and less costly. Ordnance Survey worked with Northumbrian Water on a week-long "sprint" as part of the festival, looking for innovative ways that businesses can contribute to an improved environment.

Richard Crump, a managing consultant for Ordnance Survey, said: "The topic of where the various companies' underground assets are, such as pipes and cables, comes up regularly. There have been various attempts to tackle it in the past, but if we can pull all of this data together there are many different benefits, for the public and for the businesses that supply these utilities.

"Improving the knowledge of the various networks and the areas where they come together will help massively when there are problems with one or more of the underground cables or pipes, or when they need replacing. For example, it has the potential to reduce the chances of affecting other companies' services, making it quicker and easier to make repairs or upgrades, and much more.

"We would like to get all of this data in one digital space that shows where it is, how deep it is and what its critical nature is, and then we can work with the companies to see how we can improve things for everyone."

Nigel Watson, Director of Information Services at Northumbrian Water Group, said: "This idea of improving the knowledge that we all have about the various networks of cables and pipes beneath the ground has really caught the imagination of people at the NWG Innovation Festival.

"We have maps that show where our road networks and pavements are, and people create maps of the electrical wires inside buildings, so the idea that we can do the same beneath our streets is something that makes real sense.

"Many older pipes and wires are not recorded digitally, so some of the data is incomplete or held only in paper format, but the benefits of resolving that and creating a complete network map are huge.

"Greater knowledge of what we are facing when we go out to work on our network means we can complete jobs more quickly, and this is a great new idea for improving that understanding. Likewise, if we need to dig a hole in the road to repair a water burst, knowing what else is in the ground can reduce the chances of causing problems on another company's network, so we avoid unwanted disruption or loss of services to customers."

The other challenges discussed at the festival included flooding, leakage, optimising the mobile workforce, upgrading ageing infrastructure and the use of artificial intelligence in the workplace.

WEF Formalizes Commitment To Intelligent Water Systems, Smart Water Networks

The Water Environment Federation (WEF) has signed a memorandum of understanding with the Smart Water Networks Forum (SWAN), agreeing to jointly promote the development of best industry practices for more efficient and sustainable smart water networks.

The water sector continues to promote and embrace innovation, and smart water networks have emerged as a popular way to use technology to optimize system operations. Smart water networks have a range of applications, from detecting system leaks to managing energy. As technological advancements continue to change the water sector, the qualifications for water sector jobs change too, presenting an opportunity to equip water professionals with new skill sets and knowledge. Through this partnership, WEF and SWAN will continue to advance workforce development.

"Supporting innovation is essential to the water sector, and to the further development of intelligent water systems," WEF Executive Director Eileen O'Neill said. "The memorandum of understanding between WEF and SWAN is the first step to create the kind of collaborative engagement needed for the future of the sector."

SWAN's focus on smart wastewater network management enables efficiencies and improvements in three categories: customer, environmental and operational benefits. This complements WEF's attention on the value of integrating intelligent water practices into the water sector, determining common barriers to implementing intelligent water practices, technology trends and new solutions. This year WEF published "Intelligent Water Systems: The Path to a Smart Utility," which provides a glimpse into the potential benefits of implementing intelligent water systems, and global examples from which the water sector can learn.

Page 5
ATi Joins Forces with Langham and Caption Data using their NanoULTRA for Better Control of Water Networks

Langham Industrial Controls Ltd (LIC), ATi and Caption Data have joined forces to offer water companies "better control" when maintaining water networks. This new partnership has created huge benefits, as flow, pressure and turbidity can now be used together to control how much water is used during a mains flush, whilst also measuring water quality.

Since its 2014 launch, the NephNet has proven to be the industry's most accurate and reliable portable turbidity monitor. Several water companies have used this technology to improve their systems, prove that flushing is effective and speed up the process, which saves money, time and, of course, water. Some time ago, LIC took the step of making their standpipes "NephNet ready", so that the turbidity reading can now be communicated alongside the flow and pressure loggers on the standpipes, with all three parameters displayed together.

ATi's UK Sales Manager, Tristen Preger, said: "ATi strives to continue developing what the market demands, which has been proven with the NephNet. Our continual development of our ground-breaking network monitors is born from our desire to work closely with our customers, providing solutions that help improve water quality, drive down complaints, increase credit ratings and result in proactive network management to safeguard water quality."

EU funds development of integrated observing system for Atlantic Ocean

EU-funded researchers are driving the development of an integrated system to apply a common strategy to a range of data about the state of the Atlantic, collected by a variety of organizations via a number of means including buoys, floats, moorings and research vessels. The €20m-plus EU AtlantOS project currently underway is dedicated to the creation of an integrated Atlantic Ocean observing system.

Work in the project, which is taken forward by a consortium of 62 partners from 18 countries on both sides of the Atlantic, began in April 2015. It is due to end in June 2019, following the delivery of a blueprint for the development of the proposed system.

Project coordinator Martin Visbeck said: "What we have at the moment is different types of observing system, which are usually dedicated to one single question - climate, biodiversity, ocean chemistry, fisheries, or rapid response to hazards such as oil spills, for instance. By combining these approaches and looking at them strategically, as a whole system, you can boost efficiency. Among other things, we try to point out opportunities. When you go out to assess fish stocks, you can easily take some other measurements alongside. This would significantly enhance the capability of the whole system and increase efficiency."

The consortium is also aiming to extend the scope of information available from observation of the Atlantic, Visbeck added: "There is a rapidly growing desire among decision-makers - both in the public and the private sector - for more, and in particular more data-based, information from the ocean."

New possibilities are also arising with the emergence of new technology, such as increasingly powerful robotic systems and new sensor technology. Wider adoption of such technologies could help to reduce the cost of ocean observing, or to make the investment stretch further. The project is also striving to establish common procedures, data formats and calibration features, so that it doesn't matter where a measurement is taken and the data are all consistent with each other. These improvements would make it easier to assemble the various types of data into a comprehensive overview of the state of the ocean, both as a way of documenting change and as a starting point for predicting its likely evolution.

An evolutionary process

One of the project's aims is to enhance the contribution of Atlantic observing to broader collective efforts such as GEO/Blue Planet, as well as GOOS, the Global Ocean Observing System. A new partnership that is about to be formalised between Brazil, South Africa and the EU will mark another milestone for the collaboration. The plan, in AtlantOS, is to generate awareness of the capabilities and possibilities of the various systems already in operation, in a bid to show how their activities and further development could contribute to the common goal of an integrated system.

Martin Visbeck concluded: "The key point is to look at Atlantic observing as a system, rather than a random collection of individual bits, and to really understand the capability of this system, rather than the individual capabilities of discrete components such as a network of floats or a fleet of research vessels. This is new, because the various groups involved have mostly been working in their own circles."

Page 6
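The AtlantOS goal of common procedures and data formats, so that it does not matter where a measurement is taken, amounts in software terms to mapping each provider's records onto one shared schema with consistent units. The sketch below is purely illustrative: the field names (`temp_c`, `sst_kelvin`) and the schema itself are invented for the example, not actual AtlantOS formats.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """A common observation record: platform id, parameter name, value, unit."""
    platform: str
    parameter: str
    value: float
    unit: str

def normalise(record):
    """Map one provider-specific record onto the common schema.

    Each real feed would need its own mapping; two hypothetical
    layouts are shown here, with unit conversion applied on ingest.
    """
    if "temp_c" in record:       # a buoy feed reporting Celsius directly
        return Observation(record["id"], "sea_temperature",
                           record["temp_c"], "degC")
    if "sst_kelvin" in record:   # a satellite-style feed reporting Kelvin
        return Observation(record["id"], "sea_temperature",
                           record["sst_kelvin"] - 273.15, "degC")
    raise ValueError("unknown record layout")

buoy = {"id": "buoy-42", "temp_c": 18.2}
sat = {"id": "float-7", "sst_kelvin": 291.35}
print(normalise(buoy))
print(normalise(sat).value)  # ~18.2 degC after unit conversion
```

Once every feed passes through such a mapping, downstream analysis can treat a buoy reading and a satellite retrieval identically, which is the efficiency the project describes.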
Irish Water invests in wastewater flow monitoring

Irish Water has started work on a €10M (£9M) programme which will see flow monitoring and performance sampling equipment installed in 400 wastewater treatment plants across the country. Treatment plants in Kildare, Meath, Offaly, Westmeath, Wicklow, Laois, Louth and Fingal will be among the first to receive the equipment, as part of a nationwide project which will provide enhanced protection for Ireland's rivers and coastal waters.

The Flow Monitoring and Sampling Programme will also allow the utility to identify where investment is needed in the wastewater infrastructure to accommodate future population growth. Critical wastewater flow and load data will be made available on a consistent basis for the first time ever, helping to improve the performance of the treatment plants. It also helps protect the waterways into which treated wastewater is discharged. When the roll-out of the equipment is completed, plant operators and engineers will have the data and tools to better manage the treatment processes, measure performance and react more quickly to any sudden changes, such as a storm event. The instrumentation installed will include flow measurement devices, storm event recorders and sampling equipment. The project will also ensure compliance with the monitoring and sampling requirements of EPA Wastewater Discharge Authorisations, and will allow Irish Water to build flow and load profiles, which in turn will help form strategies for upgrading, maintaining and improving plant efficiency, and ensure it can identify early where investment is required to meet future demands on wastewater infrastructure.

Smart meter reveals 50,000 litre-per-day leak

A huge customer-side leak which was losing 50,000 litres of water a day has been repaired by Thames Water after being revealed by the installation of a smart meter. The leak - equivalent to 625 bathtubs every day - was discovered after the meter was installed on a customer's property in Greenwich, south London, as part of the water-saving smart meter programme. If the leak had not been spotted and fixed, it would have cost the homeowner £38,325 per year.

Stephanie Baker, smart metering programme manager, said the repair showed one of the many benefits of smart meters: "We're always looking at ways to help our customers save, and reduce leakage. This just goes to show the benefits of having a smart meter installed. It's a win-win: we've fixed a huge leak on our customer's supply pipe for free that, if allowed to run, could have cost the customer more than £38,000 per year in additional water consumption."

The leak, the largest yet discovered by a smart meter, was spotted during routine monitoring of hourly data from the device. The customer was informed and a team of engineers was tasked to repair it. Two leaks on the same stretch of pipe were ultimately identified, one in the garden and one in the basement. The customer is currently on a trial period for the smart meters, and the saving is measured against what they would have paid had the device not been installed.

The industry-leading programme, currently being rolled out across London, gives customers two years to take control of their metered usage before switching them to a metered bill. The aim of the smart metering programme is to reduce overall water use and improve leakage detection, as population growth and climate change put pressure on water resources. Meters will help achieve this aim by giving residents access to their water use information, online or over the phone, allowing them to see how efficient their home is and track how simple water-saving efforts - like four-minute showers and turning the tap off while brushing your teeth - can reduce bills.

Thames Water also offers free "smarter home" visits, checking how water-efficient a house is and installing water-saving gadgets. Since the launch of the metering programme in 2015, Thames Water has carried out more than 60,000 such visits, saving around 2.5 million litres every day.

Page 7
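The figures quoted in the smart-meter story are internally consistent, as a quick back-of-the-envelope check shows: 625 bathtubs a day implies an 80-litre bathtub, and £38,325 a year works out at £105 a day, or about £2.10 per cubic metre of water lost.

```python
# Sanity-check the figures quoted for the Greenwich leak.
LEAK_LITRES_PER_DAY = 50_000
BATHTUB_EQUIVALENTS = 625
ANNUAL_COST_GBP = 38_325

litres_per_bathtub = LEAK_LITRES_PER_DAY / BATHTUB_EQUIVALENTS
daily_cost = ANNUAL_COST_GBP / 365
cost_per_cubic_metre = daily_cost / (LEAK_LITRES_PER_DAY / 1000)

print(litres_per_bathtub)               # 80.0 litres per bathtub
print(daily_cost)                       # 105.0 GBP per day
print(round(cost_per_cubic_metre, 2))   # 2.1 GBP per cubic metre
```

The implied £2.10/m³ tariff is an inference from the article's own numbers, not a published Thames Water rate.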
Closer ties and AI could help tackle flooding

The impact and likelihood of floods could be reduced through closer ties between people living in areas prone to flooding and the relevant agencies. The idea was one of several to emerge from Northumbrian Water Group's recent NWG Innovation Festival. The week-long event brought together flood experts and victims, as well as academics and people from a wide range of businesses, with a series of new ideas developed and then presented to water industry leaders. The proposals will now be worked on by NWG and partners from business and academia that took part in the festival, including headline sponsor IBM, which helped lead the search for ideas. These ideas ranged from community champion figures, who would act as liaisons between relevant agencies and the community, to an artificial intelligence system that delivers bespoke information to people based upon their own specific experiences, locations and needs.

During the week, a five-day "sprint" saw around 80 people focusing on the issue of flooding, taking it from an outline of the problem to ideas that can be developed. Sprints apply leading design-thinking techniques to real-world issues. The sprint, entitled "Rain, Hail or Shine: How can we reduce flooding?", was one of six such activities carried out at the same time at the NWG Innovation Festival, which took place at Newcastle Racecourse.

Ideas that came out of the sprint, and which will be developed by NWG in partnership with a range of other organisations, include:

- Members of the public working closely with relevant agencies and helping to keep communities informed, to reduce flood risk and enable people to be better supported when they are affected
- The creation of an agency that links directly with customers to give and receive bespoke information on flooding, helping to reduce flood risk
- A collaborative approach to reducing the surface water that runs from the landscape into watercourses
- A system that utilises artificial intelligence (AI) technology to deliver bespoke flood information to users, so they are better informed about how to respond when problems occur

Chris Jones, NWG's research and development manager, said: "The NWG Innovation Festival has helped us to work together with a wide range of individuals and companies to take a fresh view on the very broad subject of flooding. We know that when someone experiences flooding in their home it can be one of the most devastating things that can happen to them. This is why we place such a strong focus upon reducing the chances of it happening, and also on the way that we, and others, can support people when flooding does happen. Some of the ideas that came out of the NWG Innovation Festival can make a really positive impact in these areas and we are already looking at how we can start developing and delivering them."

The NWG Innovation Festival was supported by IBM, Microsoft, Ordnance Survey, BT, CGI Group and Reece Innovation. It was also delivered in association with Newcastle University, Durham University, Genesys, Interserve in partnership with Amec Foster Wheeler, Costain Resources, PC1, Tech Mahindra, Mott MacDonald Bentley, Wipro, Virgin Media Business, Schneider, Wheatley Solutions, Sopra Steria, Accenture, 1Spatial, Infosys, Unify, ITPS, Esh-MWH, and Pen Test Partners.

s::can System Effectively Controls Aeration Blowers and Reduces Operating Costs in Wastewater Treatment Plant

Colorado Springs Utilities (USA) use s::can's ammo::lyser to monitor and control aeration blowers using NH4-N, improving the plant's energy efficiency and lowering operating costs. The J.D. Phillips Water Resource Recovery Plant came online in 2007 to help the city of Colorado Springs meet increasing service demands. The state-of-the-art facility is enclosed to help with odour control.

The plant decided to replace its old analyzers with s::can sensors. Since the entire plant is covered, traditional analyzer installations would have called for conduit piping to be installed under each basin cover to provide power and data communications. In addition, the plant required a dynamic installation, as the sensors would be moved between two separate basins every year for maintenance purposes. Installing new conduit was extremely costly and time-consuming, and would have prevented the plant operators from moving the analyzers between the two aeration basins. In an effort to reduce cost, avoid the need for construction, and create transportable analyzing stations, s::can provided the plant with a low-cost, low-maintenance, dynamic solution: as an alternative to installing new conduit under each basin cover, RS485 Modbus radios were used to transmit data back to a con::cube controller in real time. Using this system design, s::can was able to install the equipment at a fraction of the cost and time, while also providing the plant with a dynamic installation.

To effectively control the blowers using ammonia-based DO control, the key parameters NH4-N, pH, dissolved oxygen, TSS and NO3-N were measured. By measuring these parameters simultaneously, Colorado Springs Utilities is able to effectively control its aeration blowers and reduce overall operating costs for the plant. Moreover, the staff now has an easily adaptable system, allowing them to move the stations throughout the plant.

Page 8
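Ammonia-based DO control of the kind described above is typically a cascade: an outer loop trims the dissolved-oxygen setpoint from the measured NH4-N, and an inner loop (not shown) drives the blowers to hold that setpoint. The sketch below shows only the outer loop, with a simple proportional rule; every number in it is invented for illustration and is not a Colorado Springs Utilities tuning value.

```python
def do_setpoint_from_ammonia(nh4_mg_l, target_nh4=1.0, base_do=1.5,
                             gain=0.5, do_min=0.5, do_max=3.0):
    """Outer loop of an ammonia-based aeration cascade (illustrative only).

    When NH4-N is above target, raise the dissolved-oxygen setpoint so the
    blowers work harder; when nitrification is complete, lower it to save
    energy. The setpoint is clamped to a safe operating band.
    """
    setpoint = base_do + gain * (nh4_mg_l - target_nh4)
    return max(do_min, min(do_max, setpoint))

print(do_setpoint_from_ammonia(1.0))  # on target -> base setpoint 1.5 mg/L
print(do_setpoint_from_ammonia(4.0))  # high ammonia -> clamped to 3.0 mg/L
print(do_setpoint_from_ammonia(0.2))  # low ammonia -> about 1.1 mg/L, saving energy
```

The energy saving the article reports comes from exactly this behaviour: the blowers are throttled back as soon as the ammonia measurement shows the extra aeration is no longer needed.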
3D-printed water quality sensors set to revolutionize water industry

Researchers at the University of British Columbia in Canada have built inexpensive, tiny devices using a 3D printer to inspect water quality in the distribution system, a development which could revolutionize the water industry. The researchers at UBC’s Okanagan campus have designed a tiny device that can monitor drinking water quality in real time and help protect against waterborne illness. Professor Mina Hoorfar, Director of the School of Engineering, says new research proves their miniaturized water quality sensors are cheap to make, can operate continuously and can be deployed anywhere in the water distribution system. Online water quality monitoring is becoming an essential part of large water distribution systems (WDS) to ensure that contamination, whether accidental or deliberate, does not affect consumers. However, while most urban purification plants have real-time monitoring sensors upstream of the WDS, the placement of online water quality monitoring sensors throughout the WDS has not yet been feasible, primarily due to the high cost and low reliability of the sensors. Prof. Hoorfar commented: “Current water safety practice involves only periodic hand testing, which limits sampling frequency and leads to a higher probability of disease outbreak. “Traditional water quality sensors have been too expensive and unreliable to use across an entire water system.” Until now: the tiny devices created in her Advanced Thermo-Fluidic lab at UBC’s Okanagan campus are proving reliable and sturdy enough to provide accurate readings regardless of water pressure or temperature. The sensors are wireless, reporting back to the testing stations, and work independently, meaning that if one stops working, it does not bring down the whole system. And since they’re made using 3D printers, they are fast, inexpensive and easy to produce.
“Unique and effective technology that can revolutionize the water industry”

“This highly portable sensor system is capable of constantly measuring several water quality parameters such as turbidity, pH, conductivity, temperature, and residual chlorine, and sending the data to a central system wirelessly,” she added. “It is a unique and effective technology that can revolutionize the water industry.” While many urban purification plants have real-time monitoring sensors, they are upstream of the distribution system. According to Prof. Hoorfar, the pressure at which water is supplied to the customer is much higher than what most sensors can tolerate. The new sensors can be placed right at or within a customer’s home, providing a direct and precise layer of protection against unsafe water. “Although the majority of water-related diseases occur in lower- or middle-income countries, water quality events in Walkerton, for example, raise serious questions about consistent water safety in even developed countries like Canada,” Prof. Hoorfar continued. “Many of these tragedies could be prevented with frequent monitoring and early detection of the pathogens causing the outbreak.” A commercially available desktop 3D design printer was used to create the devices: the Mojo 3D Printer from US firm TRIMECH, which uses Fused Deposition Modelling™ technology developed by Stratasys Inc. The research, recently published in the journal Sensors, was partly funded by a Natural Sciences and Engineering Research Council of Canada Strategic Project Grant and Postgraduate Scholarship funding.

WRc launch new ‘One Stop Shop’ for water meter testing

WRc has commissioned a new water meter test rig to enhance their established meter testing capabilities. The new rig provides additional capacity for accuracy and performance testing of small meters, i.e. those sized 15mm to 30mm, which are used to measure water supplied to household customers and a large proportion of commercial premises.
The new rig complements the existing large meter rig, which is primarily used for meters from 40mm up to 150mm. Both rigs are fully instrumented and will provide test results to National Standards. This enables WRc to offer a “one stop shop” for meter testing that covers more than 99% of the meters used in the water industry. All test rigs are further supported by specialist facilities for investigating the effects of long-term wear on meters.

Page 9

Figure 1: Fabrication of the interface using 3D printing: (a) Comsol simulation of the interface; (b) CAD model of the interface; (c) fabricated interface; and (d) pH sensor fitted into the interface.
The 2014 Water Act specifically added a fifth clause to the water industry regulator Ofwat’s purpose, namely to secure: “(a) the long-term resilience of [water & sewerage] systems as regards environmental pressures, population growth and changes in consumer behaviour, and (b) that undertakers take steps… to meet, in the long term, the need for the supply of water and the provision of sewerage services to consumers.” Following the 2014 Water Act, Ofwat spent time interpreting its implications and, in its Towards Resilience document (December 2015), provided a definition: “Resilience is the ability to cope with, and recover from, disruption, and anticipate trends and variability in order to maintain services for people and protect the natural environment, now and in the future.” Given that avoiding “disruption” is pretty much the same as providing reliability, this signals that Ofwat wants to ensure reliability now and in the future. To do this we need the ability firstly to assess our reliability, and secondly to predict the future. For the legislators of the Water Act, there was perhaps an assumption that we were on top of reliability now, and they wanted to make sure we were taking a long-term view of it. The truth is that there is still work to do to ensure we are on top of our reliability now. In Towards Resilience, Ofwat also makes the point that measuring resilience (or reliability) would be a good idea, and recommends that the water companies consider how best to do this. But guidance on measuring resilience is scarce. There is a relevant British Standard (BS 65000:2014, Guidance on Organisational Resilience), which is mainly focused on the cycle of learning from your mistakes. However, estimating reliability is an area of engineering roughly a century old.

Safety

Reliability engineering was first used shortly after the First World War in the context of aeroplane safety.
Engineers working on the German V-1 missile programme worked out the basic theory during World War 2. Space research in the 1950s and ‘60s pushed the theory further forward and the first journal emerged in 1963 (the Institute of Electrical and Electronics Engineers Transactions). In 1965 Richard Barlow and Frank Proschan wrote the seminal text entitled Mathematical Theory of Reliability. Oil and gas and the nuclear industry employ the techniques of reliability routinely, but in our industry it is rarely spotted, with the notable exceptions of its close relative the HAZOP workshop, and the odd Safety Integrity Level (SIL) assessment for safety systems. So what is reliability engineering? Reliability engineering is a whole collection of techniques intended to help us determine whether an item or a system is going to function or not. It can be split between two approaches: the physical and the actuarial. The physical approach is to do with variation in “load”: if we understand the variation in load and we understand the load at which the system fails, then we understand the reliability. This is fairly comfortable terrain for engineers and fits well with the “variability” part of the resilience definition. Loads might be interpreted as water demand, wastewater load, weather and so on. The actuarial approach is more to do with lifetimes and deterioration, and captures the asset performance in terms of its “time to failure” or “failure probability”. Failures happen, especially when you have huge numbers of things. Predicting which ones are going to fail is really difficult, but predicting how many are going to fail is surprisingly easy: just look at a graph of the monthly pipe bursts for a water company.

Calculation

The last step is understanding how and whether individual asset failures escalate to system failures. When does the asset failure lead to an impact on the customer, and when does the standby just kick in so the customer never knows it happened?
How resilient are our systems and our networks to the inevitable breakdowns? Have we got enough standby, enough cross-connection, to be able to cope? Have we got too much? Techniques and software such as Reliability Block Diagrams and Fault Tree Analysis are designed to make precisely this calculation, turning an asset failure rate into a system failure rate: a quantity representing the reliability in failures per unit time. These techniques are available, and software is also available to help us. The failure rate estimates at asset level are not so commonly available. The big databanks that store data and regularly re-assess failure rates are mostly maintained by the oil industry. Water data is scarce and we generally have to assume the equivalents in the oil industry have similar failure rates: a weak assumption. We may get some supporting evidence from company failure records, but there is not enough of this data. Reliability is a means by which to measure our resilience both now and in the future, as required by Ofwat. Using standardised or observed rates of failure, and calculating the chances of the rare simultaneous failures needed to cause system failure, we can arrive at an objective measure. This helps with many decisions, including those difficult calls involving safeguarded systems. If the pump fails, the standby kicks in and there is no consequence, so there is no risk; so how can we justify replacing the pumps? Well, there is still system risk, which will increase as the pumps age, and reliability engineering can quantify it. So not only does reliability allow us to satisfy the requirements of the law, but it may help us see where we have enough standby and where we need more. And that is a benefit worth chasing.

This article, “Reliability engineering key to resilience”, was written by Alec Erskine, senior principal consultant at MWH, now part of Stantec.

Page 10
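The reliability block diagram calculation described above can be sketched in a few lines. The failure rate and mission time below are illustrative assumptions, not industry figures; the point is how blocks combine in series (everything must work) and in parallel (a duty/standby pair, where one survivor is enough).

```python
import math

def reliability(rate_per_hour, hours):
    """Probability an item survives `hours`, assuming a constant failure rate."""
    return math.exp(-rate_per_hour * hours)

def series(*blocks):
    """All blocks must work, e.g. inlet valve -> pump -> outlet valve."""
    out = 1.0
    for r in blocks:
        out *= r
    return out

def parallel(*blocks):
    """System works if at least one block works, e.g. duty + standby pump."""
    all_fail = 1.0
    for r in blocks:
        all_fail *= (1.0 - r)
    return 1.0 - all_fail

# Illustrative (made-up) figure: pumps failing ~once per 10,000 hours,
# assessed over one year of continuous duty.
year = 8760.0
pump = reliability(1e-4, year)       # single pump: ~0.42
duty_standby = parallel(pump, pump)  # 1-out-of-2 pair: ~0.66
print(round(pump, 3), round(duty_standby, 3))
```

Even this toy calculation shows the article’s point: the standby does not remove system risk, it only reduces it, and the residual risk can be put on an objective, quantified footing.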
Big Data is often characterized by the large volume of data, the wide variety of data types and the velocity at which the data must be processed. Data can come from many different sources, such as social media use, online purchases, licensed Twitter data streams or sensors used in the Internet of Things (IoT). Big Data is generated by everything around us at all times. Every interaction in ecommerce and social media produces it. Computer systems, sensors and mobile devices transmit it. Big Data comes from multiple sources at high velocity, volume, variety and complexity. Optimal processing power and analytics capabilities are needed to extract actionable information from Big Data. Businesses need analytics to convert large and complex data sets into actionable information in order to make better decisions and gain a business advantage over competitors. Big Data analytics is the process of collecting, organizing and analyzing large data sets to discover patterns and other useful information. It examines large amounts of data from various sources to find patterns, correlations, trends and other insights, and can help businesses better understand the information within the data and identify which data can help improve the effectiveness of business decisions. Analytics are developed by building models based on available data, then running simulations, iterating the values of data points and monitoring how they impact results. Current computing power can run millions of these simulations, iterating all the possible variables until it finds a pattern, correlation or insight that helps solve the problem. Data analytics are used extensively in consumer marketing. As most of us who carry mobile devices have experienced, analytics enable consumers to be targeted with specifically tailored advertising for products and services based on our individual preferences.
Data analytics are also used to optimize supply chains and other logistics for businesses. UPS, for example, analyzes data from a large number of sources to optimize vehicle routes to save time, lower fuel costs and support predictive maintenance on vehicles.

Legal Issues in Big Data

Privacy. The legal risks of Big Data begin with consumer privacy. Laws and regulations have focused on the privacy and security of personal information. In addition, most websites, online services and mobile apps have a privacy policy agreement and a terms of service agreement (also called terms of use, user agreement, etc.) that users accept by clicking or by continuing to use the service. Click-wrap type agreements are generally more enforceable than browse-wrap type agreements. Having a privacy policy is a good business practice, but it may also be required by law or by third-party services that collect information through a website. Both privacy policies and terms of service (TOS) should be periodically reviewed to be certain they accurately reflect business practices, particularly with respect to the collection, use and sharing of personal information. There is no single national law in the U.S. regulating the collection, use and sharing of personal information. There are federal and state laws and regulations that apply to certain types of personal information, such as financial or health information. There are also consumer protection laws that have been used to prohibit unfair or deceptive practices involving the disclosure of, and security procedures for protecting, personal information. An example of personal information that raises legal concerns is health information protected by the Health Insurance Portability and Accountability Act of 1996, as amended (HIPAA). Data analytics is being applied to electronic medical records (EMR) to identify trends in patient care, epidemiology, treatment effectiveness, operational effectiveness and for other purposes.
Predictive modelling using data from EMRs is being used for early diagnosis and to trigger warnings or reminders, such as when a patient should get a new lab test or take other actions. The Federal Trade Commission Act is a consumer protection law that prohibits unfair or deceptive practices and has been applied to both offline and online privacy and data security policies. The online collection of personal information of children under 13 may trigger the Children’s Online Privacy Protection Act. The Gramm-Leach-Bliley Act (GLBA) is a federal law that regulates how financial institutions must handle personal information. The FTC issued a report on Big Data to provide guidance to companies about their Big Data practices. The FTC limited its focus to the commercial use of consumer information and its impact on low-income and underserved populations. The FTC urged companies to apply Big Data analytics in ways that provide benefits and opportunities to consumers, while avoiding actions that may violate consumer protection or equal opportunity laws, or detract from core values of inclusion and fairness. California is the leader in state privacy laws. The California Online Privacy Protection Act applies to any person or company whose website, online service or mobile app collects personal information from California consumers. This law has broad geographical effect because of the widely accessible nature of online businesses. Excluding a California audience from access is not generally feasible.
The law requires the operator to have a conspicuous privacy policy containing the following:
• A list of the categories of personally identifiable information the operator collects;
• A list of the categories of third parties with whom the operator may share such information;
• A description of the process (if any) by which the consumer can review and request changes to his or her personally identifiable information as collected by the operator;
• A description of the process by which the operator notifies consumers of material changes to the operator’s privacy policy;
• Whether or not a “do not track” signal is honored; and
• The effective date of the privacy policy.

The law also requires the operator to comply with the privacy policy. There are also laws and regulations in other countries relating to data protection and privacy. Europe’s General Data Protection Regulation (GDPR), which becomes effective in May 2018, is a primary focus for business planning in 2017. This new EU data protection regulation will impose a greater compliance burden on businesses that offer goods and services to EU residents. A privacy policy also needs to contain the provisions required by the GDPR. The GDPR will apply unless the business does not offer goods or services to, or track or create profiles of, EU residents and does not have an “establishment” in the EU.

Security. The Security Standards for the Protection of Electronic Protected Health Information (the HIPAA Security Rule) provide standards for protecting personal health information. The HIPAA Security Rule requires appropriate administrative, physical and technical safeguards to ensure the confidentiality, integrity and security of electronic protected health information. The GDPR also has a security standard requirement.
California was the first state to enact a security breach notification law. The law requires any person or business that owns or licenses computerized data that includes personal information to disclose any breach of the security of the data to all California residents whose unencrypted personal information was acquired by an unauthorized person. Most of the early state security breach notification laws followed California’s law and established requirements for notification of a security breach rather than defining security standards. As of June 2017, 48 states, as well as the District of Columbia, Guam, Puerto Rico and the US Virgin Islands, have enacted laws requiring notification of security breaches involving personal information. Recently, some states have established requirements to avoid a security breach, such as the Massachusetts regulation which specifies a detailed list of technical, physical and administrative security standards for protecting personal information that must be implemented. HIPAA and the GLBA also have security breach notification requirements. While most attention has been on security threats to personal information, there are also security issues for non-personal information. Hackers changed chemical settings in a water treatment plant in a recently reported incident. The analyst firm Forrester predicted there will be a large-scale IoT security breach in 2017.

Control over Data. Ownership rights to Big Data can provide a competitive advantage, since the data owner controls how the data may be used and shared. For example, data licensing is Twitter’s fastest-growing revenue stream. Twitter sells its “firehose” of over 500 million daily tweets to various companies that try to turn the tweets into actionable information. Most of the business value in Big Data is in combining data from different sources. Ownership of data resulting from the analytics is also important. Rights to data are usually allocated in the privacy policy and TOS for websites, online services and mobile apps. Traditional signed agreements may be used in business-to-business transactions.

Article: Legal issues in Big Data 2017

Page 11
For example, a signed agreement might be used between an IoT provider and its farm customers in a smart agriculture application. Joint ownership is a middle ground for ownership allocations in some business-to-business transactions.

Intellectual Property Protection. Some data analytics software appears to remain patentable after the Alice court decision, but patent holders and applicants will face challenges if they rely on computer execution of nothing more than routine algorithms. Inventive steps will be needed to make Big Data analytics software patentable. Such a patent may lose its value over time, since the algorithm in use may improve over the one described in the patent and additional patent applications may be needed. IBM probably has the largest patent portfolio in the Big Data sector. Only some of the Big Data itself may be protected by copyright. Copyright applies to the form of expression, not the meaning of text written by human authors. If there is only one way to express content then there is no copyright protection, because there is no originality. Any data generated by machines or sensors will not be covered by copyright. That means a large amount of Big Data will fall outside of copyright protection. User-generated data such as a photo, video or other work posted to a social media site may be protected by copyright, but the TOS will likely provide that ownership is assigned to the site operator.

Terms of Service Agreement. A TOS is the legal agreement that establishes the obligations and restrictions for using a website, mobile app or online service. The TOS includes provisions that reduce the risk of claims from users and others. There may be liability exposure if the data analytics software provides erroneous or no actionable information. Such liability is limited in the TOS primarily by limited warranty, disclaimers of warranties and limitation of liability provisions, in the same way as for other contracts.
The TOS may also cover scope of permitted use, restrictions on activities, disclaimers regarding content, indemnification, term and termination, copyright and other intellectual property rights, governing law, jurisdiction, dispute resolution and other issues.

Conclusion

Big Data is generated by everything around us at all times and includes both personal information and non-personal information. There are laws and regulations on privacy and security for personal information in the U.S. and elsewhere around the world. Collection, use and sharing of personal information must be consistent with a privacy policy and applicable laws and regulations. TOS and other agreements are used to establish the rules for other Big Data ownership and control and to mitigate risk. Data analytics is used to convert Big Data into actionable information that can provide value in a wide range of both consumer and business-to-business transactions.

Acknowledgements

This article was originally printed in Water Online and was written by Fred Greguras, Attorney, Royse Law.

Leak-finding robot inspects pipes from the inside

A new system developed by researchers at MIT could provide a fast, inexpensive solution that can find even tiny leaks with pinpoint precision, no matter what the pipes are made of. The system, which has been under development and testing for nine years by professor of mechanical engineering Kamal Youcef-Toumi, graduate student You Wu, and two others, will be described in detail at the upcoming IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in September. Meanwhile, the team is carrying out tests this summer on 12-inch concrete water-distribution pipes under the city of Monterrey, Mexico. The system uses a small, rubbery robotic device that looks something like an oversized badminton birdie. The device can be inserted into the water system through any fire hydrant. It then moves passively with the flow, logging its position as it goes.
It detects even small variations in pressure by sensing the pull at the edges of its soft rubber skirt, which fills the diameter of the pipe. The device is then retrieved using a net through another hydrant, and its data is uploaded. No digging is required, and there is no need for any interruption of the water service. In addition to the passive device that is pushed by the water flow, the team has also produced an active version that can control its motion. Monterrey itself has a strong incentive to take part in this study, since it loses an estimated 40 percent of its water supply to leaks every year, costing the city about $80 million in lost revenue. Leaks can also lead to contamination of the water supply when polluted water backs up into the distribution pipes. The MIT team, called PipeGuard, intends to commercialize its robotic detection system to help alleviate such losses. In Saudi Arabia, where most drinking water is provided through expensive desalination plants, some 33 percent is lost through leakage. That’s why that desert nation’s King Fahd University of Petroleum and Minerals has sponsored and collaborated on much of the MIT team’s work, including successful field tests there earlier this year that resulted in some further design improvements to the system, Youcef-Toumi says. Those tests took place in a mile-long section of 2-inch rusty pipe provided by Pipetech LLC, a pipeline service company in Al Khobar, Saudi Arabia, that frequently uses the same pipe system for validating and certifying pipeline technologies. The tests, in pipes with many bends, T-joints and connections, involved creating an artificial leak for the robot to find. The robot did so successfully, distinguishing the characteristics of the leak from false alarms caused by pressure variations or changes in pipe size, roughness, or orientation. “We put the robot in from one joint, and took it out from the other.
We tried it 14 times over three days, and it completed the inspection every time.”

Page 12
Focus on: Electro-magnetic flow meters

Arguably, in the water industry as a whole the electro-magnetic flow meter is one of the most popular ways of measuring flow; in wastewater it is probably just behind open channel flow measurement. In this article we will focus on electro-magnetic flow meters and briefly cover the principle of how they work, their application and some of the things that have to be done to keep them working at their best.

The principles of electro-magnetic flow measurement

The working principle of an electromagnetic flow meter is based upon Faraday’s law of induction. Electromagnetic induction was discovered independently by Michael Faraday in 1831 and Joseph Henry in 1832; Faraday was the first to publish the results of his experiments. In Faraday’s first experimental demonstration of electromagnetic induction (August 29, 1831), he wrapped two wires around opposite sides of an iron ring (torus), an arrangement similar to a modern toroidal transformer. Based on his assessment of recently discovered properties of electromagnets, he expected that when current started to flow in one wire, a sort of wave would travel through the ring and cause some electrical effect on the opposite side. He plugged one wire into a galvanometer, and watched it as he connected the other wire to a battery. Indeed, he saw a transient current (which he called a “wave of electricity”) when he connected the wire to the battery, and another when he disconnected it. This induction was due to the change in magnetic flux that occurred when the battery was connected and disconnected. Bringing this concept forward to an electro-magnetic flow meter: if an electrical conductor is moved in a magnetic field which is perpendicular to the direction of motion and to the conductor, an electrical voltage is induced in the conductor whose magnitude is proportional to the magnetic field strength and the velocity of the movement.
From this we can see the analogy with the first electro-magnetic flow meters: the voltage induced across the fluid depends upon the velocity of the fluid through the pipe. This is characterised by the following equations:

U0 ∝ B · v · D

with the magnetic induction B, the flow velocity v and the pipe diameter D. The flow rate qv through the cross-section A under consideration is

qv = A · v = (π D² / 4) · v

and putting the two equations together you get

U0 ∝ qv

To put this into working practice requires that a magnetic field exists within the pipe and that the induced voltages can be measured without interference. Two coils generate the magnetic field, which extends through the pipe only. To achieve this, austenitic steel is typically used for the meter tube, as it does not hinder the magnetic field. To prevent shorting out of the measuring signal UE, the meter tube is provided with an insulating internal lining. The measuring voltage UE is measured by means of two metallic electrodes that are in electrical contact with the fluid in the pipe. This fluid must be able to act as an electrical conductor; in the water industry this normally isn’t a problem but, as an example, distilled water, which does not act as an electrical conductor, could not be measured with an electro-magnetic flow meter. This is illustrated in Figure 1.

Putting it into practice – what makes up an electro-magnetic flow meter

In practice, an electro-magnetic flow meter is made up of a number of different parts.
In short, these different parts are:

A flow meter sensor, which includes:
• A steel tube
• An outer housing
• A copper coil, which creates the magnetic field
• Measurement electrodes (which make the contact with the fluid)
• A liner material (lining the inside of the steel tube)
• An insulating material (which acts as the insulator)
• Some electronics

A transmitter, which includes:
• Some electronics
• A method of communication (e.g. an analogue output)
• A display

Figure 1: The basic principles of an EM flow meter

Figure 2: A cutaway model of an electro-magnetic flow meter

Page 13
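The relation qv = A · v = (πD²/4) · v given earlier is easy to check numerically. A small sketch, working in metres and cubic metres per second:

```python
import math

def flow_rate(velocity_m_s, diameter_m):
    """q_v = A * v for a full circular pipe of internal diameter D."""
    area = math.pi * diameter_m ** 2 / 4.0   # A = pi * D^2 / 4
    return area * velocity_m_s

# A 100 mm meter seeing a mean velocity of 1.5 m/s:
q = flow_rate(1.5, 0.100)              # ~0.0118 m^3/s
print(round(q * 1000.0, 2), "l/s")     # prints: 11.78 l/s
```

This is exactly the conversion the transmitter performs: the sensor yields a velocity (via the induced voltage), and the fixed pipe diameter turns it into a flow rate.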
The sensor itself is basically there to measure the velocity of the flow through the steel tube: the electro-magnetic force creates a voltage that is related to the velocity of the fluid passing through the pipe, and as the pipe diameter is fixed this can be used to work out the flow rate. What can change in a typical sensor are the measurement electrodes and the liner material, which will vary depending upon what is being measured. Typically in the wastewater industry the liner will be a hard rubber material and the electrodes made from stainless steel. However, for more challenging applications the electrodes can be made out of other metals including Hastelloy C, tantalum or platinum, with liner materials ranging from differing types of rubber to PTFE or ceramics.

Application & Installation of electro-magnetic flow meters in practice

Electro-magnetic flow meters are very versatile flow meters and in the wastewater industry are arguably the second most popular type of flow measurement after open channel flow measurement with flumes & weirs. They can be applied on almost any fully submerged closed pipe, carrying anything from wastewater up to thickened sludges. So, what are the basic parameters for what electro-magnetic flow meters will measure?
• It must be a conductive liquid/semi-liquid (i.e. it must be able to flow)
• It must be in a fully submerged pipe (although technology does exist for part-filled pipes, the application becomes more difficult)
• It must have a minimum velocity – this is dependent upon the sensor diameter, but as an indication a 100mm magmeter requires a minimum velocity of 0.002m/s and has a maximum velocity of 2.5m/s, giving a very large operational range.

Depending upon the layout of the pipework, and upon what is in the pipework, an electro-magnetic flow meter should be at least 5 upstream diameters and 2 downstream diameters from any pipe disturbance.
In the case of major disturbances such as pumps or valves this measurement space increases dramatically. Of course, there are also exceptions to the rule and some of these rules can be “bent”, although there is an implication to this. What cannot be changed is that the liquid must be conductive and must be able to flow. When a fluid becomes non-Newtonian there are options, up to a point. The pipe must also be fully submerged (unless using a partial-flow magmeter). These are the rules that cannot be broken. The others, perhaps, but the accuracy of the measurement technique will be affected.

Velocity

As we’ve seen above, the velocity operating range of an electro-magnetic flow meter (depending upon supplier, model, etc.) is roughly between 0.002m/s and 2.5m/s (this is taken from one supplier’s published recommendations), but the more the flow meter is operated at the lower end of the velocity spectrum, the more the accuracy of the measurement will be affected. From the graph from one of the leading suppliers of magmeters we can see that below 0.2m/s the accuracy of the flow meter is affected markedly, going from a 1% accuracy at 0.2m/s and doubling to 2% accuracy at 0.1m/s. Although electro-magnetic flow meters are one of the most accurate techniques for flow measurement, it is necessary to size them appropriately for the flow rate that is expected. The accuracy dependent upon velocity is of course only one part of the story, and the external influences on the electro-magnetic flow meter can be much more serious.

Installation effects

Historically, the advice on the installation of electro-magnetic flow meters has always been that they must be installed five upstream diameters and two downstream diameters from any pipe disturbance. People take this as a hard fact: as long as you install it like that, then everything is going to be OK. It’s right and it’s wrong… let me explain why.
Firstly, people tend to miss out an important couple of words, which are “at least”, and in fact some work done by TUV-NEL a few years ago came up with the following advice:

This type of meter generally requires over 10 diameters of straight pipe upstream for its installation, and there can be problems with large bore meters. On the plus side, electromagnetic meters are not affected by swirl as much as other types of meter are. However, that is not to say that they aren’t affected by swirl at all. Particles or bubbles present in a fluid can affect an electromagnetic flow meter as they tend to either rise to the top of the pipe or fall to the bottom. One way to reduce the effect would be to install the electrodes horizontally on the pipe wall.

Experience in this area shows that in some applications this advice needs to go further. Where a flow meter is downstream of a large pump set, it is more prudent to increase the upstream straight length to something akin to 15 pipe diameters. Where a butterfly valve can create a serious disturbance in the nature of the flow, a total of 20 diameters is more prudent to protect the accuracy of the flow meter. Of course, there are always exceptions to the rule, and most of the electro-magnetic flow meter manufacturers have devices that will measure with no upstream or downstream straight lengths. It is the “ideal” product for the engineer, who will always have such an application. Most people will have seen the photograph of an electro-magnetic flow meter fitted between two pipe bends. The question to ask is: although it’s available, would you actually fit it that way, and would you rely on the flow meter? Most people would say no.

Figure 3: Accuracy at a variety of flow meter velocities

The best way to think of the effects caused by pipe disturbances is to go back to the principle of electro-magnetic flow meters and how they work. They measure the velocity of the flow through a straight length of pipe. Taking into account friction from the walls of the pipe, the meter expects to see the peak velocity through the centre of the pipe. When there is a pipe disturbance such as a 90° elbow, the area of peak velocity changes temporarily, so rather than sitting in the centre of the pipe it is off-set and bent towards the inner radius of the bend. For a butterfly valve this effect is even worse. The disturbance is not what the electro-magnetic flow meter is expecting, and so the velocity reading is affected, creating errors in the flow measurement if the meter is installed too close to the disturbance in the pipe. The worst situation is when the disturbance is not stable and changes, as then certain techniques, such as off-setting the meter to mitigate the error, are not an option.

With installation effects it is of course best to ask the supplier of the flow meter. They will have experts within their business who can advise on particular applications and how to overcome particular problems, such as off-setting of the meter to correct for a particular disturbance.

Maintenance of electro-magnetic flow meters

So, an electro-magnetic flow meter is in position, it’s been sized correctly, there are no pipe disturbances and it has been recording correctly. Surely the job is done and we don’t have to worry about it until we unbury it in 20 years’ time when it’s approaching the end of its asset life? Not quite… it does need to be checked & verified and in some cases cleaned. For this, access is necessary, so if the flow meter is buried then you may have a problem.
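Pulling together the straight-length guidance from the installation discussion above (at least 5D upstream and 2D downstream in general, around 15D below a large pump set and around 20D below a butterfly valve), a minimal helper might look like this. The category names are illustrative assumptions; real installations should follow the supplier’s advice:

```python
# Rule-of-thumb straight lengths from the installation discussion above.
UPSTREAM_DIAMETERS = {
    "default": 5,           # "at least" 5D from any pipe disturbance
    "pump_set": 15,         # downstream of a large pump set
    "butterfly_valve": 20,  # severe disturbance to the flow profile
}
DOWNSTREAM_DIAMETERS = 2    # "at least" 2D downstream

def required_straight_lengths_mm(bore_mm: float, disturbance: str = "default"):
    """Return (upstream_mm, downstream_mm) of straight pipe suggested."""
    n_up = UPSTREAM_DIAMETERS.get(disturbance, UPSTREAM_DIAMETERS["default"])
    return (n_up * bore_mm, DOWNSTREAM_DIAMETERS * bore_mm)

# Example: a 300mm meter below a butterfly valve would want around 6m upstream.
up_mm, down_mm = required_straight_lengths_mm(300, "butterfly_valve")
```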
Calibration & Verification

There is some misconception as to the difference between calibration and verification. For electro-magnetic flow meters it’s simple: if you are doing it in the field, then unless you have hired a specific expert service to field-calibrate your flow meter (such a service does exist from one manufacturer), you aren’t calibrating it. You are verifying it.

Inside every single electro-magnetic flow meter is a calibration: a set of factors that are hard-coded into the flow meter at the factory, and whatever you do, don’t alter them. They are there for a reason, and changing them will irreparably change the way that your flow meter measures. They are far too complicated to redo in the field unless you are an expert. They are normally recorded on the manufacturer’s certificate which is produced when every electro-magnetic flow meter is made.

In the factory each and every flow meter is put onto the calibration rig and the calibration factors are measured in the tiniest detail. These are then coded into each flow meter. The rigs are usually huge and involve enough water to fill a swimming pool. The picture shows part of the electro-magnetic flow meter calibration rig (the outside part) at Stonehouse in England. Each of the rigs is calibrated regularly to ensure the accuracy of the flow measurement. As the rig cannot travel to each customer, for an electro-magnetic flow meter to be “calibrated” it has to be taken out, sent back to the manufacturer, put on the rig, calibrated and sent back. Although every supplier works as fast as they can, all of the new meters also have to go on the rig, so the process is not instantaneous. After each calibration a certificate is produced: this is the manufacturer’s certificate (Figure 5A), and this is calibration to a traceable standard.

The alternative is verification, of which there are two separate and distinct methods that complement each other.
The first is electronic or “dry” verification, which is simply testing that all of the different elements of the flow meter are working and within tolerance of the original manufacturer’s certificate. This is usually undertaken by instrumentation technicians on site and involves connecting to the flow meter either with a dedicated instrument or with a software programme, depending where the verification is undertaken. This runs the flow meter through a number of different checks to see if the flow meter itself is within tolerances. It produces a verification certificate (Figure 5B) and is not a calibration, just a check.

The second method of verification is wet verification, which checks that the flow meter is actually recording flows to within tolerance when compared to an alternative traceable flow meter. Although this uses the principles of a calibration it is something in between, as the calibration factors within the flow meter are not adjusted. This is typically undertaken using an ultrasonic time-of-flight flow measurement device which is traceable back to national standards.

Figure 4: CFD through a straight pipe, a 90 degree bend and a Butterfly Valve
Figure 5: The EM calibration rig at Stonehouse in England

Cleaning of electro-magnetic flow meters

When electro-magnetic flow meters first came out it was thought that they could be installed and basically left until the end of their asset life. More recent studies have shown that in some cases this is true but in others it is not. In reality it is still not fully understood where the highest risk factors lie that will necessitate a cleaning programme, but work undertaken by the Water & Sewerage Companies in the UK is starting to uncover where the risk factors are high and where they are not, and thus where an electro-magnetic flow meter can be left to record until the end of its asset life and where it needs more operational management.

The first factor is where the electro-magnetic flow meter sits within the process. If it is on the network or on the inlet to a wastewater treatment works, the risk factors are much higher. This seems an obvious point but it is not always so obvious. Take the example pictures in figure 6. Both are 300mm electro-magnetic flow meters, both are on the inlet to wastewater treatment works, and both have been in for five years of service. In figure 6 the pipework and sensor of the first meter are heavily fouled, while on the far right it is as if the meter had been freshly installed. Both meters are the same size, were installed at the same time and both are on the inlet to a works. In this particular case one of the sites doses iron salts for phosphorus removal and the other does not. So the immediate conclusion is that inlet flow meters with iron salts dosed upstream are at a higher risk of needing cleaning more often.
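The pattern just described (inlet duty raising the fouling risk, and upstream iron-salt dosing raising it further) could be captured as a rough screening helper. The bandings below are illustrative assumptions for prioritising inspections, not findings from the study:

```python
def cleaning_priority(location: str, iron_dosing_upstream: bool) -> str:
    """Rough cleaning-risk banding for an EM flow meter installation."""
    if location == "inlet" and iron_dosing_upstream:
        return "high"     # e.g. heavily fouled within five years (figure 6)
    if location == "inlet":
        return "medium"   # inlet/network duty carries higher risk in general
    return "review"       # outlet meters: the evidence so far is inconclusive
```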
In general, if a flow meter is on the inlet, cleansing of the flow meter and surrounding pipe should be considered every five years.

Where a flow meter is on the outlet of the works, the question of whether to clean the electro-magnetic flow meter is more tricky, and the studies so far show that the larger flow meters are more susceptible to build-up of debris within the pipework. The natural explanation would be that a self-cleansing velocity is not being achieved within the pipework of the meter. However, none of the flow meters in the study achieved self-cleansing velocities, and flow meters with similar operating velocities but different sizes had differing levels of fouling. In short, although the study has provided indications of where meter cleansing is required and where it is not, there is still insufficient evidence to provide a firm recommendation.

Summary

The use of electro-magnetic flow meters has been widespread within the water industry for many years. Originally it was thought to be a “fit & forget” technology and yet, like any measurement technique, care has to be taken in the way it is selected, installed, operated and maintained. With electro-magnetic flow meters this involves:

• Selecting the most appropriate bore size so that it operates within the correct velocity ranges.
• Installing it so that any disturbance from the surrounding pipework is minimised or eliminated, and ensuring that the pipe runs full
• Installing it so that it is accessible, so that it can be verified (both wet & dry verification), cleaned and replaced

If all of this is done then an electro-magnetic flow meter can be used with confidence in its accuracy for many years.

Figure 5: Methods of Electromagnetic Flow Meter Calibration & Verification (L-R): A - A manufacturer’s certificate; B - A site electronic verification; C - On-site wet verification against a meter that has been calibrated against a reference meter
Figure 6: Fouled and clean magmeters (L-R): A - Upstream pipework fouling; B - The electrode in the sensor fouled; C - A clean magmeter
Opinion: Ofwat’s PR19 methodology: Implications for resilience, asset health and leakage

In 2014, Ofwat introduced outcome-based regulation into their methodology for setting water prices for the current AMP6 period. Last week, Ofwat published their draft methodology for setting prices for the next five-year AMP cycle, which will begin in 2020. As anticipated, the water industry regulator will aim to ensure that companies are focused on delivering the improvements that matter most to customers, while sharpening accountability for performance.

Focus on resilience

As expected, building resilience ‘in the round’ is a key theme of the draft methodology. There will be significant rewards and penalties through Ofwat’s financial outcome delivery incentives (ODIs) for water companies who outperform or fail to deliver across a wide range of KPIs, and these are likely to include both common and company-specific measures of resilience. The methodology adopts the definition proposed by the Ofwat Resilience Task and Finish Group, as follows:

“Resilience is the ability to cope with, and recover from, disruption, and anticipate trends and variability in order to maintain services for people and protect the natural environment now and in the future.”

Resilience for water companies means planning for and anticipating risks, whether they are environmental, operational or virtual, so that any disruption to service is minimised and rectified as swiftly as possible.

Predict, prepare, respond

“Managing and being prepared for environmental and operational risks that are realised infrequently is a difficult task,” commented George Heywood, Technical Director at Servelec Technologies, following the publication of the methodology, who added: “Despite recent efforts to develop standard resilience performance commitments for use at PR19, Ofwat make it clear that there is still much work to be done in this area.
Companies will need to innovate to develop their own bespoke measures, and to quantify these using appropriate modelling tools and approaches.”

Long-term resilience and long-term affordability for customers must be balanced. It is clear from Ofwat’s consultation that strong affordability and cost-benefit challenges will remain for proposals that involve significant expenditure. Companies will need to make a very strong case for expenditure, backed up by robust analysis. This will require accurate quantification of the impact of disruptions on customers, accounting for aspects such as complex network constraints, a wide range of initial conditions, and practicable operational responses.

Servelec Technologies’ MISER software automatically optimises operation under failure events to ensure an auditable and objective analysis of supply interruptions and water quality impacts. Deeper understanding of risks is gained through Monte Carlo simulation to derive consequence distributions based on the statistics of the inputs. To support the case for investment, optimal sizing of new schemes and scheme selection provide sound cost-benefit appraisals. Operationally, MISER increases resilience through optimal network operation by maximising security of supply, taking account of outages, forecast demands, storage levels, licence usage and load balancing. Longer term, water resource yield assessments and supply/demand analysis ensure operation is sustainable and robust into the future. MISER, Servelec Technologies’ water network management advisory tool, is used by nine water companies in the UK, who benefit from its ability to help them manage their risks, investment and vulnerability.

Asset health

Although not always associated with the resilience agenda, the underlying health of water company assets is a key element of providing resilient services to customers now and in the future.
Ofwat’s methodology strengthens requirements in this area, informed by the findings of their recent horizontal review. To improve transparency, aggregated performance commitments will no longer be permitted, and a number of common measures are proposed to allow for more cross-industry benchmarking. There is a requirement for companies to forecast their performance commitments over at least a further ten years beyond the next price review period, to help customers and stakeholders engage on these longer-term issues.

George added: “If there is a need for increased investment to maintain asset health in the longer term then companies will need to build a broad consensus for this with customers. We are already working with a number of companies to support them in determining investment requirements for PR19, using our PIONEER software. This work will include forecasting of investment requirements over the required period. In most cases this is being done using company-specific models prepared by ourselves or by the company, but we also have standard models that we can use for this purpose.”

Leakage

Leakage will continue to have a high profile at PR19, as Ofwat challenge companies to set more ambitious targets for leakage reduction of at least 15% over the five-year period. Solutions exist that enable water companies to detect and locate events on their networks, such as bursts, before they become a problem, signalling a shift towards more proactive leakage reduction strategies. Portsmouth Water is the latest company to seek Servelec Technologies’ assistance in this area, adopting Servelec Technologies’ self-learning anomaly detection software, FlowSure. FlowSure has previously demonstrated six-figure net savings and typically pays for itself in less than 12 months; these savings can help finance extra network resilience in other areas.

“Ofwat’s draft methodology is challenging and ambitious, aiming to spur the industry on to greater innovation in pursuit of more affordable and resilient services. The clock is now ticking for companies to show how they will respond to these challenges at PR19 and beyond.”

About the Author

George Heywood is Technical Director at Servelec Technologies, a leading provider of end-to-end data collection, control and optimisation solutions to the UK water industry. George is a mathematician and environmental engineer who has spent over 20 years working in the water industry, mostly in asset management. He helps companies to improve their decision-making through evidence-based analysis, modelling and optimisation.

Servelec Technologies specialises in providing the water industry with expert modelling, optimisation and risk analysis consultancy, combined with software products and development services. Our team has extensive experience gained from supporting utility companies with regulatory price reviews and practical planning since 1997. We provide modelling and wide-ranging analysis services covering all utility asset types, and have provided our asset management software PIONEER and related services to 30% of UK water companies during the most recent price review.
Servelec Technologies is also supporting international customers with their own regulatory reviews and routine planning.

Thames Water uses satellite data to find forgotten London rivers

Thames Water is using innovative satellite technology to find forgotten rivers in London. Hundreds of lost rivers could soon be rediscovered and eventually restored thanks to the ground-breaking project.

The team of river hunters from Thames Water’s innovation team has been collating information from historical maps and records and combining it with data from modern satellite images to track down former rivers, in the hope that many of them can be restored. To date the innovation team has only been able to map North London and predict where the rivers might be. However, they have already discovered a potential 68km of pipes and tunnels that may once have been natural watercourses but were buried and so became lost from view. They have been able to do this by creating detailed spatial modelling maps, using all of the combined data plus the known sewer and river network.

David Harding, customer and stakeholder manager, who originally suggested the project, said: “Identification of piped watercourses offers opportunities to restore modified watercourses back to their natural state, known as ‘daylighting’.

“River daylighting is already taking place in towns and cities across the UK, and has many benefits including encouraging more wildlife. I work with many environmental groups and local authorities that are passionate about restoring lost rivers – they just need to know where they are.”

Although the team has only been able to map potential lost rivers in North London, they’re hoping that if they attract the necessary funding, the modelling can be used across the Thames Water region to track down lost rivers. There are a number of reasons the watercourses were buried, such as to allow for building developments, to help manage flooding risks or to conceal pollution.
Over time their original nature has been lost through the transfer of records between different organisations. Now they are invisible, both on the ground and on the maps that most people use. Even when pipes are recorded on maps, many of them are incorrectly recorded as sewers, which means the ownership of them, and the responsibility for their upkeep, is unclear. The rivers are also often a cause of flooding when pipes and tunnels become blocked or overloaded. If they’re restored to open river channels that can safely flood, it may help manage flooding in the affected areas.

Rachel Cunningham, from the innovation team, said: “Now the potential culverted watercourses have been identified for this trial area, correct ownership can be investigated so there’s a better understanding of the maintenance and cleaning that needs to take place.

“This is a really exciting project and we’re hoping to attract the funding that would allow for this to be carried out across the region.”

Model of the catchment studied with the lost rivers in pink
Introduction

Dissolved oxygen is a key ingredient in the efficient treatment of waste in water processes. A typical wastewater treatment plant uses four main stages of treatment: Primary, Secondary, Tertiary and Sludge. The secondary treatment stage is the point at which organic waste is oxidised to form carbon dioxide, water and nitrogen compounds. To achieve this, most modern plants use an activated sludge system, which uses a culture of bacteria and other organisms to feed on the organic materials in the sewage. Given the right temperature, these bacteria and organisms use dissolved oxygen to break down organic carbon into carbon dioxide, water and energy, clearing the water of harmful substances.

The importance of accurate dissolved oxygen control

As a key requirement for most types of life, oxygen is one of the most important parameters in water quality monitoring. As such, water operators need to keep a close eye on levels throughout the water treatment process, from the treatment of waste at the aeration stage through to the point of final discharge.

Optimizing aeration efficiency

The efficiency of the aeration process relies on dissolved oxygen levels being controlled as closely as possible. Under ideal conditions, dissolved oxygen levels should be maintained at between 1.5ppm and 2ppm. If not enough dissolved oxygen is available, the aeration basins will be deprived of the oxygen needed for effective bacterial growth, negatively affecting the rate of sewage breakdown and impairing treatment process efficiency. Too much dissolved oxygen can also have a detrimental impact: with aeration processes accounting for over half of a plant’s energy costs, it is vital that their efficiency is optimised as much as possible. Failing to ensure tight control of dissolved oxygen greatly increases the risk of operators incurring excessive energy costs.
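The control window described above can be illustrated with a simple dead-band check against the 1.5–2ppm target range. Real aeration control is a tuned closed loop; the function name and action labels here are assumptions for demonstration only:

```python
DO_LOW_PPM = 1.5   # below this, bacteria are starved of oxygen
DO_HIGH_PPM = 2.0  # above this, blower energy is being wasted

def blower_action(do_ppm: float) -> str:
    """Suggest a blower adjustment from the current dissolved-oxygen reading."""
    if do_ppm < DO_LOW_PPM:
        return "increase"   # under-aeration slows the breakdown of waste
    if do_ppm > DO_HIGH_PPM:
        return "decrease"   # over-aeration incurs excess energy costs
    return "hold"           # within the ideal operating window
```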
Minimizing environmental impact

With operators of water treatment plants facing ever stricter controls on the quality of the water they discharge, it is vitally important to control anything that could affect the health of watercourses and aquatic habitats. Where dissolved oxygen is concerned, it is particularly important that levels are controlled as closely as possible. Excessively low and excessively high levels of dissolved oxygen can be equally harmful to aquatic life, making it essential for water treatment plants to ensure that levels are as close to ideal as possible before water is discharged. An example is the potential creation of filamentous growths and ammonia during the wastewater treatment process, caused by insufficient aeration. If left unresolved, these harmful by-products can escape into the environment, damaging aquatic life and leading to potentially stiff financial penalties for water operators.

What are the main types of measurement techniques for dissolved oxygen?

As a key indicator of biological activity levels in water, dissolved oxygen has always been a critical measurement in wastewater treatment processes. Starting with in-situ manual collection techniques, the way in which dissolved oxygen levels are measured has evolved as technology has become more sophisticated. The following is a brief description of the key methods historically used to measure dissolved oxygen.

The Winkler Titration method

Originally developed by Ludwig Winkler in 1888, the Winkler Titration method is an in-situ test involving the mixing of a known chemical alkali solution with a known acid solution to assess the level of dissolved oxygen in a sample. The process starts with manganese sulphate being added to the water sample, followed by an alkali, iodide or azide reagent. The sample is then mixed, with the presence of any dissolved oxygen being indicated by the formation of a brown-coloured precipitate (shown below as MnO2(s)).
The equations for this process are shown below:

O2 + 4H+ + 4e- → 2H2O
2Mn2+ + 4OH- → 2MnO2(s) + 4H+ + 4e-
Overall: 2Mn2+ + 4OH- + O2(aq) → 2MnO2(s) + 2H2O

The sample is then mixed with a titrant of sodium thiosulfate, followed by a starch solution, to produce a blue colour. Extra titrant is then added until the sample turns clear. At this point, the level of dissolved oxygen can be calculated, with the level being proportional to the amount of titrant added in milliliters.

Although the test is relatively simple to perform, it nevertheless suffers from several disadvantages. Firstly, the sample has to be measured as quickly as possible in order to provide as accurate a reading as possible. Linked to this is the fact that a sample assessed using the method can only ever offer information on the level of dissolved oxygen at a specific moment in time. For wastewater treatment processes, this makes any data gathered of limited value in helping to achieve a consistent level of dissolved oxygen.

White Paper: Dissolved Oxygen Measurement in Wastewater
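The endpoint arithmetic behind the titration above (dissolved oxygen proportional to the millilitres of titrant used) can be sketched as follows. The factor 8000 comes from the conventional 8 g/equivalent weight of oxygen expressed in mg/L; the default normality and sample volume are illustrative assumptions, not fixed values from the article:

```python
def winkler_do_mg_per_l(titrant_ml: float,
                        titrant_normality: float = 0.025,
                        sample_ml: float = 200.0) -> float:
    """Dissolved oxygen (mg/L) from the thiosulfate volume at the endpoint."""
    return (titrant_ml * titrant_normality * 8000.0) / sample_ml
```

With 0.025 N titrant and a 200 mL sample, each millilitre of titrant corresponds to 1 mg/L of dissolved oxygen, which is why the result is read directly in millilitres.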
Portable dissolved oxygen meters

Another common method for collecting dissolved oxygen samples is to use portable meters. Combining a handheld meter with a choice of a galvanic or optical sensor, this method offers a number of advantages over the Winkler test. With pre-calibrated values automatically programmed into the device, a reading can be obtained almost immediately. By eliminating the time taken to conduct a test, more measurements can be made within a given timeframe.

However, as with the Winkler test, portable meters are designed for grab samples and thus only provide an indication of dissolved oxygen levels for a particular moment in time under a particular set of conditions. This is especially important in a wastewater treatment process if the readings are used to determine the settings of the dissolved oxygen blowers. If the conditions remain consistent, then the blowers will provide the correct level of dissolved oxygen for optimum aeration. However, if the conditions are subject to variation, then additional readings will need to be taken in order for the blowers to be reset.

Online measurement

Continuously measuring dissolved oxygen levels offers the best way of ensuring that the right levels of oxygen are being delivered for maximum aeration efficiency. When used in conjunction with modern sensing technology, an online dissolved oxygen measurement system can offer much tighter control of dissolved oxygen levels, matching them to actual oxygen demand. When coupled with automatic blower control, significant energy cost savings can potentially be realised through reduced air consumption.

Sensor technology

There are two main types of sensors available for dissolved oxygen monitoring – electrochemical and optical.

Electrochemical sensors

Electrochemical sensors work on either the polarographic or the galvanic cell principle.
Both work in a similar way, featuring a polarised anode and cathode with an electrolyte solution surrounded by an oxygen-permeable membrane. The measurement is derived from the difference in oxygen pressure outside and inside the membrane. Variations in the oxygen pressure outside the membrane affect the rate of diffusion of oxygen through the membrane itself. The cathode reduces the oxygen molecules, producing an electrical signal that is relayed first to the anode and then to a transmitter, which converts the signal into a reading. This process can be represented as:

At the anode: 2Pb → 2Pb2+ + 4e-
At the cathode: O2 + 2H2O + 4e- → 4OH-
Overall reaction: O2 + 2Pb + 2H2O → 2Pb(OH)2 (insoluble)

The consumption of oxygen at the cathode means that a constant flow of sample is needed for a reading to be as accurate as possible. In most cases this will require the sample to be stirred constantly at the sensor tip, in order to maintain a fresh supply of oxygenated sample at the membrane.

One potential drawback of polarographic sensors occurs during start-up. In contrast to galvanic sensors, where the probes are able to self-polarise unpowered, polarographic sensors require a ‘warm-up’ period, lasting from 5 to 15 minutes, for the probes to polarise. The requirement for a constant current means that the sensor consumes more power than other sensor types, making it comparatively less cost-effective.

In terms of performance, electrochemical sensors have been proven to offer similar levels of measurement accuracy to optical devices. However, their requirement for a constant flow and their susceptibility to fouling by filamentous growths such as algae, or clogging by fats, oils and grease, make them comparatively less reliable under non-ideal monitoring conditions. Where this occurs, the risk of inaccurate measurement and inefficient blower control is greatly increased.
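The cell current described above is typically converted to a concentration through a two-point calibration: a zero reading in oxygen-free solution and a span reading in air-saturated water of known dissolved oxygen. The sketch below is a generic illustration with made-up currents, not any particular transmitter’s method:

```python
def calibrate(zero_current_na: float, span_current_na: float,
              span_do_ppm: float):
    """Return a function converting sensor current (nA) to DO (ppm)."""
    slope = span_do_ppm / (span_current_na - zero_current_na)
    def to_do_ppm(current_na: float) -> float:
        return (current_na - zero_current_na) * slope
    return to_do_ppm

# Illustrative calibration points: 2 nA in zero solution,
# 82 nA in air-saturated water containing 8 ppm dissolved oxygen.
to_ppm = calibrate(zero_current_na=2.0, span_current_na=82.0, span_do_ppm=8.0)
```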
Continued sensor drift, coupled with fouling of the sensor membrane, also means that frequent maintenance, including calibration, is needed, ranging from once a month to once a day in extreme circumstances.

Optical sensors

Originally developed in the 1970s, optical sensors have evolved to overcome many of the limitations associated with their electrochemical counterparts. In contrast with electrochemical sensors, optical sensors have no membrane or chemical components. The most advanced dissolved oxygen sensors work on the ‘dynamic luminescence quenching’ principle, a light-based measurement technique. Optical sensors comprise lumiphore molecules embedded in a sensing element, plus blue and red LEDs and a photodiode. Figure 2 shows how an optical sensor works. The operating principles of optical sensors shown in the figure can be explained as follows:

Figure 2: The principle of Optical DO Measurement
Figure 1: Principle of electrochemical DO measurement
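In outline: the blue LED typically excites the lumiphore, oxygen molecules quench the resulting luminescence, and the degree of quenching tracks the oxygen concentration (with the red LED commonly used as a reference). Dynamic luminescence quenching is conventionally modelled by the Stern–Volmer relation; the sketch below uses an illustrative quenching constant, not a value from any real sensor:

```python
# Stern-Volmer relation: I0/I = 1 + Ksv * [O2], so the oxygen concentration
# can be recovered from the ratio of unquenched (i0) to quenched (i)
# luminescence intensity (or, equivalently, lifetime).
def do_from_quenching(i0: float, i: float, k_sv: float) -> float:
    """Oxygen concentration from unquenched (i0) and quenched (i) intensity."""
    return (i0 / i - 1.0) / k_sv
```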
