
WIPAC Monthly August 2018

In this month's edition of WIPAC Monthly, the magazine from the Linkedin Group Water Industry Process Automation & Control, we have articles on

What to expect from the new generation of sensors
The intersection of data analytics and data governance
Adding sensors to AMI
The emerging dichotomy of the Industrial Internet of Things
Focus on: Phosphorus in Wastewater

This edition also sees the launch of the new WIPAC website, which is a precursor to the launch of WIPAC as an organisation.

Have a good month

Oliver


WIPAC MONTHLY
The Monthly Update from Water Industry Process Automation & Control
www.wipac.org.uk
Issue 8/2018 - August 2018
Page 2

In this Issue

WIPAC Monthly is a publication of the Water Industry Process Automation & Control Group. It is produced by the group manager and WIPAC Monthly editor, Oliver Grievson. This is a free publication for the benefit of the water industry, so please feel free to distribute it to anyone who you feel may benefit. However, due to the ongoing costs of WIPAC Monthly, a donation website has been set up to allow readers to contribute to the running of WIPAC & WIPAC Monthly. Those wishing to donate should visit https://www.patreon.com/Wipac - all donations will be used solely for the benefit and development of WIPAC.

All enquiries about WIPAC Monthly, including from those who want to publish news or articles within these pages, should be directed to the publication's editor, Oliver Grievson, at olivergrievson@hotmail.com

The picture on the front cover is from the news story that Thames Water is using a variety of techniques, including the use of drones, to combat leakage from the water distribution network.

From the Editor............................................................................................................. 3

Industry News............................................................................................................. 4-11
Highlights of the news of the month from the global water industry, centred around the successes of a few of the companies in the global market, plus a report from the SWAN Forum conference in Barcelona.

What to expect from the new generation of sensors..................................................... 12-13
In this article from Rosa Richards of the Sensors for Water Interest Group, the latest developments in field- and laboratory-based sensors, from using Bluetooth to field-based stripping voltammetry, are discussed.

The intersection of data analytics & data governance.................................................... 14
Data analytics is the latest area of innovation in the water industry, but how is this governed? In this article by Infogix the intersection of data analytics and data governance is discussed.

Jumpstart smart infrastructure by adding sensors to AMI........................................... 15-16
AMI, or Advanced Metering Infrastructure, has been present on water meters for a few years now. In this article, adding sensors to an AMI network as an alternative communication route is discussed.

An emerging dichotomy in the industrial internet of things........................................... 17-18
In this article by Michael Henk of US Water, the development of the industrial internet of things, the way that we think of its uses and the way that we form the business case for its use are outlined.

Focus on: Phosphorus in wastewater........................................................................... 19-21
In this continuation of the “Focus on” series we look at the basics of phosphorus in wastewater, and why continuing pressures on phosphorus are driving the need for more advanced monitoring and control systems.

Workshops, Conferences & Seminars............................................................................ 22-23
The highlights of the conferences and workshops in the coming months.
Page 3

From the Editor

For me the exciting news this month is the “near” completion of the WIPAC website - it is at least up and public. It has been an interesting process, as I have been forced to develop new skills and tiptoe through the languages of HTML and CSS, mail hosts and where things should point to. It’s certainly not something that I am comfortable doing, but pushing yourself into areas that you don’t yet know is something to experiment with. The end result, as it stands at the minute, is near to what I wanted to achieve but not quite there yet. From here there is a lot more work to do, but this is the point at which I bring in other people, in the form of the WIPAC membership, which hopefully will develop over time.

It is a process, albeit on a much larger scale and of much greater importance, that the water industry is going to have to go through. The industry needs to develop new skills, and it needs to develop new ways of working in adopting the innovations that Water 4.0 offers. It is willing and able to make these changes. Since writing my recent article on Water 4.0 and wastewater, which was based on the Sensors for Water Interest Group workshop in July, some of the major engineering consultants in the water industry have been in touch to say “talk to us more about these concepts”, and the discussions so far have been interesting to say the least. The article was of course largely based upon the concepts that we have discussed in the WIPAC Group, and so the question has to be asked: why now? Well, it’s very much like the Gartner technology readiness curves, plus also the drivers of the next Asset Management Period in the water industry. It is a case of both the political environment and the technology being ready for the adoption of Water 4.0.
The work that the Smart Water Networks Forum (SWAN) has been doing around the potable water network has matured the “vertical” segment of the potable water distribution network, with particular reference to water consumption and leakage. Showing that this works in the area of the business where it is very easy to prove the financial and societal benefits gives more drive to see what else can be done. Of course, on the potable water networks the next area being worked upon is quality: seeing how we can improve water quality by operating networks in a more calm and controlled way, ensuring that the water age in the network doesn’t get too high and that the operation of the network doesn’t disturb the elements within it.

Where do we go with the other elements of the water industry? The next obvious stage is the wastewater network and wastewater treatment: bringing control to the network by maximising network storage to balance inputs into the wastewater treatment works when the weather allows. A calmer and more controlled wastewater treatment system is more efficient in the way it operates when weather permits. In the UK at least, with programmes such as the Event Duration Monitoring programme and the future FFT monitoring programme, the operation of an overflow will become very evident. All of this will easily detect where the system is failing to correctly control flows passing forward, and the areas where improvements are necessary due to incapacity of the system. The next step is to go from just detecting the way the system is operating to actively controlling the system from toilet to water-body. It is a concept that has been around for quite a while, and the industry has seen case studies of how to do it.
Detection of where the problems lie will ensure that the right areas are invested in, so that when there is an issue the obvious solution of expanding works and storm tanks isn’t the default position and the option of putting in a smart water network is considered.

The last thing to mention in this month’s editorial is the launch of the WIPAC website, which is the first stage in the development of WIPAC into a full membership organisation. There is an article on the next page about this development and, as it states in the article, please feed back to me what you think and what further developments you would like to see.

Have a good month,

Oliver
The Future of WIPAC - Stage one complete(ish)

A few days early and, if I am to be honest, not completely finished - as it probably never will be - the WIPAC website is launching with this edition of WIPAC Monthly. I took the decision to launch now, before everything is finished, as there are some important events coming up, and the site will grow as companies come on board with WIPAC as a membership group, join the WIPAC Directory and see what the WIPAC group has to offer. The other reason for launching the website now is that LinkedIn is cutting out group announcements, and so the weekly update email will be disappearing, at least for the short-term future, meaning communication with the group from the group management is going to be reduced. It is something that I have discussed in depth with LinkedIn (it was a long conversation one night), and it has been done to protect group members from the spam or sponsored emails that have been sent out by some group managers.

So, what does the new WIPAC website give and what functionality will it bring? Firstly, it’s all about industry news: who is releasing what and the interesting things that are happening in the water industry, very much along the lines of the feed in the LinkedIn Group. The next section is the long-awaited WIPAC Directory. It is a concept that I have considered for many years and have always struggled to deliver, as WIPAC has actually been costing money and was being paid for out of my own pocket. With the re-launch of the WIPAC website it seemed a natural time to put things into action, attempt some coding myself and get it up and running. The one thing to bear in mind with the WIPAC Directory is that it is currently biased: it has initially been filled with the one company that has joined WIPAC and has the full WIPAC Directory listing, plus some other major companies in the water industry (who I think are likely to join WIPAC, as they have always been great supporters in the past).
Over time it will grow, especially with engagement in the WIPAC Group. The next section of the WIPAC website is the undeveloped bit: the knowledge management area. Ties are being developed with the Sensileau platform and, once the WIPAC workshops and webinars start, the recordings of these sessions will be available here for at least a short period of time. I am also going to work on gathering some of the useful resources and downloads that are freely available on the internet but are not always the easiest things to find. The next few sections cover what WIPAC does and, in the future, will be the area for people to book onto the WIPAC workshops and webinars via embedded Eventbrite pages, as well as other significant events within the UK water industry. There are also details for companies that would like to join WIPAC, and of how to join.

Overall, feedback on the WIPAC website is wanted - it is only going to get better and more useful with the feedback of members of the WIPAC Group. For example, what do people want to see in the knowledge management system? The WIPAC Directory will only get stronger and more useful as more companies join, and membership fees will of course be used to fund the WIPAC workshops and webinars. So, what next? The first steps are to get the company details sorted and to turn WIPAC into a registered company, which will be not-for-profit and ideally a community interest company, which will be asset-locked. Then it will be all about increasing the number of member companies so that there are enough funds for the official launch, which is going to take place at the WWEM Conference and Exhibition this November. There is lots happening with WIPAC, and the first steps are in the WIPAC website.

Page 4

Industry News
Scientists Use Satellites To Measure Vital Underground Water Resources

The availability of water from underground aquifers is vital to the basic needs of more than 1.5 billion people worldwide. In recent decades, however, the over-pumping of groundwater, combined with drought, has caused some aquifers to permanently lose their essential storage capacity. With the hope of providing better tools to water resource managers to keep aquifers healthy, scientists funded by the National Science Foundation (NSF) and affiliated with Arizona State University (ASU) and the Jet Propulsion Laboratory (JPL) are using the latest space technology to measure this precious natural resource.

“Periods of drought have long-term effects on groundwater supplies and create major challenges for groundwater management,” says Maggie Benoit, a program director in NSF’s Division of Earth Sciences, which funded the research. “Now, scientists are developing new methods of monitoring groundwater levels using satellite-based measurements of Earth’s surface, providing a more comprehensive picture of the health of our nation’s groundwater resources.”

The researchers have focused their efforts on one of the world’s largest aquifer systems, located in California’s Central Valley, measuring both its groundwater volume and its storage capacity. The results of their findings are published in the American Geophysical Union journal Water Resources Research.

Peering underground from space

California’s Central Valley is a major agricultural hub covering an area of about 20,000 square miles. It produces more than 25 percent of U.S. agriculture, at an estimated value of $17B per year. The Central Valley aquifer system provides water for people and wetlands, supplying about 20 percent of the overall U.S. groundwater demand. Because of drought and the increase in the human population this aquifer serves, it is ranked one of the most stressed in the world.
While past studies on water resources and drought have focused mainly on low-resolution or local-scale measurements of groundwater dynamics, the research team for this study, which includes ASU scientists Chandrakanta Ojha, Manoochehr Shirzaei and Susanna Werth, and Donald Argus and Thomas Farr from JPL, took a more high-tech route. They used the data collection features of several satellite-based Earth remote sensing techniques to obtain a more consistent and higher-resolution view of the Central Valley aquifer system. “Ironically,” says Werth, “we had to go several hundred miles into space to see what was going on under the surface of our planet.”

Using these high-tech remote-sensing techniques, the team analysed data from the 2007 to 2010 drought and mapped the entire California Central Valley. “It’s great when we can use our high-tech, Earth-orbiting satellites to help solve real-world problems right here in California,” adds Farr.

An indicator for aquifers around the world

The team measured land subsidence (when land above and around an aquifer shifts downward) using space-borne Interferometric Synthetic Aperture Radar (InSAR) and added that to data on groundwater levels sampled at thousands of wells across the Central Valley. The researchers then used data from NASA’s twin satellite mission, the Gravity Recovery and Climate Experiment (GRACE), to estimate groundwater loss. “It’s this combination of literally terabytes of data that helped us get the best picture of what is happening below the surface,” says lead author Ojha.

The team found that between 2007 and 2010 there was a significant drop in ground levels in the southern area of the Central Valley – nearly 32 inches, a decrease that should normally take decades. “Groundwater overdraft in some parts of the Central Valley has permanently altered clay layers, causing rapid ground sinking that can be measured by radar satellites from space,” says Shirzaei.
The most startling result, however, is the permanent loss of water storage capacity in the aquifer system. During the 2007 to 2010 drought, up to 2 percent of storage capacity was lost entirely when the water level declined and the clay layers in the system were permanently compacted. “That storage capacity cannot be recovered through natural recharge,” says Ojha. “This means that during the wet season, when the Central Valley gets rain, there is not enough space to store the water, making groundwater supplies more scarce during future droughts.”

New satellites to measure the effects of drought

The next step for the team will be to focus on the drought in California from 2012 to 2016, a period that was more detrimental to the Central Valley aquifer than the 2007 to 2010 drought. The researchers plan to integrate radar measurements with additional data from the newly launched GRACE Follow-On (FO) satellites. The GRACE FO mission, which launched on May 22 of this year, consists of two nearly identical satellites that follow one another along the same orbit. The satellites continually measure the distance between them, which changes depending on the gravity field over which they are orbiting. Since oscillations of groundwater change the gravity field, scientists can use the data to map underground water location and volume change.

The work will not end there. The team hopes to extend the research to Arizona and other areas of the arid Southwest. “The whole region is affected by a long-term drought,” states Werth, “with differences in severity, climate conditions, groundwater geology and water management approaches. Our hope is that this research will enable decision-makers to accurately manage water resources and plan for future water allocations. Water managers need to know about the irreversible processes taking place and how to prevent future crises.”

Page 5
Thames using ‘eyes in the sky’ to combat leaks

Thames Water has launched a three-pronged aerial attack in its hunt to find and fix leaky pipes. The company, which has said it will aim for a 50 per cent leakage reduction by 2050, is using a fleet of drones, an aeroplane and a satellite to boost its battle against leakage. The technology has been used to spot dozens of possible leaks using state-of-the-art thermal imaging and infrared cameras, providing an invaluable tool in the fight against leakage.

Reducing leakage is a key priority for the company, which has pledged to get back on track with its targets by 2020 and then further reduce leakage by 15 per cent by 2025. Dozens of teams are fixing more than 1,000 leaks a week across the company’s 20,000-mile underground pipe network, with the ‘eyes in the sky’ giving them another helping hand.

Euan Burns, chief engineer at Thames Water, said: “Reducing leakage is one of our main priorities, and we know it’s really important to our customers too. We’re always looking for innovative ways to help solve operational issues, and this aerial approach with the latest technology will give us another perspective and another tool to help find leaking pipes. We’re in the early stages of introducing this at the moment, but the signs have been encouraging and we’re looking forward to seeing the results our eyes in the sky can bring.”

Currently, technicians use acoustic loggers on pipes to listen for water escaping, and also use data to track how much water is going through pipes compared to how much was produced at the treatment works. Visible leaks are also reported by both staff and members of the public. As reported by the BBC’s Reality Check team, leakage has fallen 38 per cent since privatisation, and Thames is determined to use every tool possible to get levels down even further. It has restructured its teams to tackle leakage, and is investing in both people and resources.
The company’s fleet of five drones, all manned by fully qualified pilots, can fly more than 100 metres high and a distance of 500 metres to survey huge landscapes. In July, they flew 28 times with on-board thermal imaging cameras beaming live footage back to screens where experts look for leaks. In three flights, they found leaks within 60 seconds of launching, which are now in the process of being fixed.

Another weapon in the company’s armoury against leakage is the use of a satellite, which takes high-resolution images of the ground. These pictures are then cross-referenced with maps of pipes and other aerial images, before teams are given an exact location to investigate. It is set to be used on a trial basis, and Thames Water has teamed up with the company Earth-i for the project. It was particularly useful during the prolonged dry and hot spell, with green areas in parched landscapes clearly showing a leak which might be missed during wetter conditions.

Finally, an aeroplane with an infrared camera has also been flying through the skies, predominantly over the rural areas of the Thames Valley and south London. As part of a joint project with the Water Research Centre (WRc), the camera on board the Vulcanair aircraft takes hundreds of pictures, which are then analysed by special software that can pinpoint areas where leaks can be found.

Page 6
Anglian Water in energy storage deal

Anglian Water has agreed a partnership with redT and Open Energi which will see energy storage facilities installed alongside solar panels at one of its water treatment works.

The utility has purchased a 60kW/300kWh redT energy storage machine to install alongside a 450kWp solar PV system at one of its works. This will enable the company to store excess solar power generated during the day and use it at other times, to reduce the site’s reliance on the grid. As the largest power consumer in the East of England, reducing reliance on volatile grid electricity will enable optimisation of a £77M energy bill, which is one of the company’s most significant operational costs.

Maximisation of renewables generation and consumption is part of Anglian Water’s strategy for delivering carbon neutrality by 2050. Over the next 18 months, the company will be building over 30MWp of solar under a 25-year PPA contract with HBS New Energies & Macquarie Principal Finance. This programme of work will reduce carbon emissions by 15,000 tonnes of CO2e and increase the company’s renewables generation by approximately 25 per cent, delivering annual savings in excess of £1 million. This is being followed by a second significant solar programme, which will shortly be out for tender. This additional solar generation will supplement the increasing amount of renewable power that Anglian Water is generating from its wind turbines and its fleet of Combined Heat and Power engines powered by biogas.

Investing in energy storage infrastructure will enable Anglian to increase onsite solar generation at the ‘pathfinder’ site in Norfolk by 80 per cent, from 248kWp to 450kWp. In parallel, redT’s energy storage machine will create additional value for Anglian Water by providing real-time balancing services to take advantage of wholesale energy price arbitrage. In total, the project is expected to reduce site electricity costs by 50 per cent per annum by 2040.
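The headline figures quoted for the pathfinder site hang together arithmetically: the machine's "at least 5 hours" of storage follows from its 60kW/300kWh rating, and the "80 per cent" solar uplift from the 248kWp-to-450kWp increase. A quick back-of-the-envelope check (variable names are mine; the numbers are from the article):

```python
# Headline figures from the article
power_kw = 60           # storage machine rated power
energy_kwh = 300        # storage machine capacity
solar_before_kwp = 248  # existing solar at the pathfinder site
solar_after_kwp = 450   # planned solar capacity

# Duration at full rated power: 300 kWh / 60 kW = 5 hours
duration_h = energy_kwh / power_kw

# Solar uplift: (450 - 248) / 248, roughly the quoted 80 per cent
uplift_pct = 100 * (solar_after_kwp - solar_before_kwp) / solar_before_kwp

print(duration_h, round(uplift_pct))  # → 5.0 81
```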
Anglian Water and redT are partnering with energy tech company Open Energi to ensure maximum benefits are derived from the pathfinder. The flow machine will be fitted with Open Energi’s Dynamic Demand 2.0 software, which harnesses artificial intelligence to optimise energy consumption and stack multiple demand-side value streams. The redT machine is sustainable and non-degrading and can provide at least 5 hours of energy storage, which makes it ideal for use alongside on-site generation such as solar PV. It is also fast-responding and flexible enough to react to real-time energy trading opportunities. These machines have a lifespan of 25 years and do not degrade like conventional lithium or lead-acid batteries, so there is no marginal performance cost to stacking multiple value streams or changing activities over the life of the project, delivering the greatest end-user benefit.

redT’s CEO, Scott McGregor, said: “Our machines work alongside on-site generation to give businesses their own local energy infrastructure and create the environment for ‘baseload’ renewables. This system will allow Anglian to harness more cheap solar on site and increase generation from 248kWp to 450kWp. Open Energi’s intelligent software means these assets can be flexibly managed to deliver the best possible outcome for businesses, cutting costs, creating revenue and making the most of renewable power generated on-site.”

David Hill, Commercial Director at Open Energi, added: “Energy storage puts businesses in control of their energy use like never before, and redT’s machines offer scope for businesses to stack value streams and shift consumption off-grid for significant periods of the day.
We’re excited to be working with customers to deliver value and create an energy system where renewable generation is delivering power 24/7.”

Jason Tucker, Director of Alliances and Integrated Supply Chain at Anglian Water, said: “We are very happy to be at the forefront of our industry by taking part in this ground-breaking project. This pathfinder integrates, rather than simply co-locates, storage and solar. The approach will enable us to develop future-proof solutions for managing energy more flexibly and efficiently, whilst increasing resilience. Using redT’s flexible energy storage infrastructure alongside Open Energi’s smart software will allow us to unlock more solar power, as well as allowing us to participate in grid services to further reduce our energy bills. Most importantly, this collaborative project will provide us with invaluable insight to support our future energy strategy, as one of the largest energy ‘prosumers’ in the East of England.”

Algorithm Provides Early Warning System For Tracking Groundwater Contamination

Groundwater contamination is increasingly recognized as a widespread environmental problem. The most important course of action often involves long-term monitoring. But what is the most cost-effective way to monitor when the contaminant plumes are large, complex and long-term, or when an unexpected event such as a storm could cause sudden changes in contaminant levels that may be missed by periodic sampling?

Scientists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and Savannah River National Laboratory have developed a low-cost method for real-time monitoring of pollutants using commonly available sensors. Their study, “In Situ Monitoring of Groundwater Contamination Using the Kalman Filter,” was published recently in the journal Environmental Science & Technology.
“Conventional methods of monitoring involve taking water samples every year or every quarter and analysing them in the lab,” said Haruko Wainwright, a Berkeley Lab researcher who led the study. “If there are anomalies or an extreme event, you could miss the changes that might increase contaminant concentrations or potential health risk. Our methodology allows continuous monitoring in situ using proxy measurements, so we can track plume movement in real time.”

“The autonomous in situ data can be rapidly analysed remotely using machine learning methods,” she added. “It can act as an early warning system – we can detect sudden changes in contaminant levels. These changes may indicate a need for more or less intervention in terms of the remediation strategy, ideally leading to improved as well as more cost-effective cleanup.”

Page 7
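As a rough illustration of the technique named in the study's title – and emphatically not the authors' actual implementation – a minimal one-dimensional Kalman filter over simulated proxy readings might look like the sketch below. The random-walk state model, the noise variances q and r, and the simulated plume signal are all assumptions for illustration:

```python
import random

def kalman_update(x_est, p_est, z, q=0.01, r=0.25):
    """One scalar Kalman filter step with a random-walk state model.

    x_est, p_est : previous state estimate and its variance
    z            : new proxy measurement (e.g. a sensor reading mapped
                   to contaminant concentration)
    q, r         : assumed process and measurement noise variances
    """
    # Predict: a random-walk state keeps its value; only variance grows
    p_pred = p_est + q
    # Update: blend prediction and measurement via the Kalman gain
    k = p_pred / (p_pred + r)        # gain between 0 and 1
    x_new = x_est + k * (z - x_est)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Simulated hourly proxy readings around a slowly rising plume signal
random.seed(1)
truth = [1.0 + 0.02 * t for t in range(50)]
readings = [v + random.gauss(0.0, 0.5) for v in truth]

x, p = readings[0], 1.0
for z in readings[1:]:
    x, p = kalman_update(x, p, z)

print(round(x, 2))  # filtered estimate, close to the final true level
```

The filtered estimate tracks the underlying trend while smoothing out the measurement noise, which is what makes sudden departures from the predicted level stand out as early warnings.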
Utileyes: Northumbrian’s award-winning ‘virtual visit’ app

The Utileyes app allows Northumbrian Water Group technicians to remotely view potential problems inside customers’ homes and elsewhere.

As Northumbrian Water Group embarked on its ‘unrivalled customer experience strategy’ in 2016, a three-day innovation sprint set about answering a question: how might the company make customers’ lives easier and work smarter as a business? The sprint participants had been exploring ways to improve customer journeys, and were discussing the fact that, when people called to report a problem such as a suspected leak, the company would usually have to send a distribution technician (DT) out to the property, which took seven days on average.

“Then somebody came up with a brilliant idea,” Northumbrian Water customer director Claire Sharp says. “They asked: ‘What if we could see inside customers’ homes?’”

When one of the participants mentioned a video-sharing tool that had been used at Northumbrian’s Information Services 2015 Conference, the potential was clear: customers could download an app and, using the camera on their device, allow the DTs to make a ‘virtual visit’. Northumbrian began work on making the idea a reality and, while an initial plan to make use of a pre-existing app had to be aborted due to data protection issues, the company was able to develop its own software successfully.

The Utileyes app was launched in July 2017 and, in its first year, the company was able to virtually validate in excess of 400 leaks. “Every visit that we do virtually saves us at least £50 in terms of a technician going out, and clearly it’s massively quicker – we can fix leaks in around a day-and-a-half, whereas normally it’s three days,” Sharp says.
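Sharp's figures imply a simple lower bound on the scheme's first-year benefit. A quick check, using only the numbers quoted above (variable names are mine):

```python
# Figures quoted in the article
savings_per_visit_gbp = 50   # minimum saving per avoided call-out
virtual_validations = 400    # leaks virtually validated in year one
fix_days_virtual = 1.5       # typical fix time with a virtual visit
fix_days_traditional = 3.0   # typical fix time without one

# Lower bound on first-year call-out savings: 400 x £50 = £20,000
min_savings_gbp = virtual_validations * savings_per_visit_gbp

# Time saved per leak: 3.0 - 1.5 = 1.5 days
days_saved_per_leak = fix_days_traditional - fix_days_virtual

print(min_savings_gbp, days_saved_per_leak)  # → 20000 1.5
```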
Positive customer response

Its benefits have not gone unnoticed: the judges at the 2018 Water Industry Awards declared Utileyes the “clear winner” in the Customer Service Initiative of the Year category, saying it is an example of “how innovation can be turned into real customer benefits”, and it has received a very positive response from the user base. Sharp adds: “If customers are at home when they call, we can ask them to download the app and almost instantly we can see the problem and explain what it is. They’re really quite wowed by it because it’s not what they expect. They often think they’re going to have to get an appointment and take time off work.”

One of the initial challenges for Northumbrian was convincing its own technicians that they could carry out their duties effectively with Utileyes, and the company had to ensure the app could support sufficiently high-quality video to make doing so viable. Once staff recognised that the software was up to the task, word quickly spread around the company. “We have this real innovation culture where we’re encouraging people across the business to share their ideas and the innovations they’ve come up with that are making a difference for customers,” Sharp says. “Utileyes was showcased at one of our regular team-talk sessions, and that just sparked off lots of discussions across the business.”

Staff in other departments realised that Utileyes, in its existing form, had potential for them too – for example, a member of a maintenance team contending with an engineering problem might use the app to connect with a senior technician back at base.

Trench inspections

Northumbrian has also been using it to carry out trench inspections. While the normal wait time for a trench inspection can be between five and ten days, a virtual trench inspection can take place either on the same day or the next working day. “We often find that once you’ve found an innovation, there’ll be other applications for it,” Sharp says.
“I think there will still be more that we can do with Utileyes. Our customers are becoming more tech-savvy all the time. Most people now use their mobile phones to access apps as opposed to using them as a telephone – that’s just how things are evolving – so as customers become savvier and more comfortable using this sort of technology, the possibilities are endless really.”

It is no secret that Northumbrian – which also took Water Company of the Year at the Water Industry Awards – is putting more stock in the value of new ideas, and it hosted its second Innovation Festival in July, attracting thousands of attendees and the involvement of over 500 businesses over the course of its five days.

“We look outside the business a lot. We have an innovation panel that has representatives from Apple, Amazon, National Grid and Microsoft working on it – we’re getting access to some other innovative organisations that are really pushing the boundaries and testing us to do more. It’s great when you have access to those organisations and the people within them, because often posing a problem or a challenge that we’ve got will spark something that they’re doing and we can work together.”

Sharp acknowledges that some Northumbrian customers will be more comfortable with technology than others, and that apps such as Utileyes may not be for everyone. As such, the company has no intention of abandoning traditional face-to-face services for those who want them. “Sometimes you can’t beat that,” she says. “We really pride ourselves on being customer-focused – building rapport and looking after them. Technology has a place but so does human contact, and we must never forget that. It’s just part of our evolution to look at how we can use innovation. It might be apps, it might be something else, but we want to make our customers’ lives better and make sure they’re getting an unrivalled service from Northumbrian and Essex & Suffolk.”

Page 8
Southern Water’s Peter Jackson on why data officers deserve a place in the C-suite

When Peter Jackson was appointed chief data officer (CDO) of Southern Water back in April 2017, he wasn’t only taking his place in a newly created role at the company, but he was one of the first CDOs in the UK water industry altogether. He explains that Southern Water had realised that data was important to its future, and that it hoped that by placing greater emphasis on data it could improve operations and customer experience. His role, unsurprisingly, has been to develop and deliver a data strategy across the organisation.

“One of the first things I wanted to do was rationalise the current data landscape, and to do that I wanted to create a governed data set across the organisation and to centralise data activity. Previously all of the data was siloed and while different departments were using the data, it wasn’t being shared,” Jackson says.

In order to get the data strategy right, Jackson has had to ensure that the business has the right capabilities – in terms of personnel, technology and processes. On the personnel front, the company has required data scientists, but like many other businesses, it has decided to retrain and upskill in-house resources because of the scarcity and expense of recruiting from elsewhere.

“Those employees who are already handling data can be upskilled with R and Python because they understand the business and data already. It’s hard to recruit externally – particularly if you’re not in London – and a data scientist salary is also harder to match than other roles, so upskilling in-house meets a lot of beneficial criteria,” Jackson explains.

On the technology side of things, Jackson states that Southern Water is in the second phase of adoption. The company has already created a cloud-based platform with the help of Google, and now it is introducing more business intelligence tools and investigating the potential of deep learning.
“A lot of that is about adding business value to our operations, and automating and moving into online dashboarding with Birst and a range of other vendors,” he says.

The approach Jackson is taking stems from his own book – the CDO Playbook – which he co-authored with Caroline Carruthers, CDO of the Lowell Group. “The concept is to have an immediate data strategy – fixing a lot of the data issues to make them better; this is a ‘business strategy’. But it’s also important to have a ‘target strategy’, which is focused on where you want to be in five to seven years, without building up too much technical debt. We will be moving into that strategy in 2020,” he says.

Jackson and Carruthers’ book has been well received by those in IT and business, and it has led to them forming what they call the first Chief Data Officer Summer School in conjunction with data governance company Collibra. Those who have registered interest in the free school, which has already begun, include current CDOs, those in similar roles without the CDO title, and those aspiring to be CDOs in the near or distant future. Some 300 people have put forward their names, and Jackson and Carruthers have worked with 110 of those so far. The aim – much like the book – is to break down what the role of the CDO is and what a data strategy should look like, and to provide practical tips on how to develop and deliver this successfully.

For Jackson, the fact that his role is a ‘C-level’ job shows how important Southern Water thinks data and data strategy are – and he says this is vital for a business to succeed today. In the water industry, the importance of data is likely to grow, he says, and in the years to come companies will be able to better predict water demand and supply accordingly. Much of that is because the companies’ ability to handle data is getting better, but none of that would be possible without a clear data strategy and a clear data leader.
LG Sonic installs MPC-Buoy units for algae management in Lake Qaraoun, Lebanon

In collaboration with the Litani River Authority and Dutch foundation World Waternet, LG Sonic has started an algae control project in Lebanon. As part of the project, LG Sonic has installed 11 MPC-Buoy units that will monitor and control algal blooms in Lake Qaraoun. The project marks the first time that LG Sonic will monitor water quality at different depth levels.

The goal of the project in Lake Qaraoun is to identify and control algal blooms in order to restore the lake’s ecosystem. To treat the lake, 11 MPC-Buoy algae control units have been installed. These systems combine real-time water quality monitoring and ultrasound technology to control algae with a chemical-free method.

The first results of the project look promising. After just three weeks there has been a significant improvement in water quality due to the treatment of the MPC-Buoy units. LG Sonic’s water quality software, MPC-View, which receives water quality parameters from the MPC-Buoy units, shows a decreasing trend in algae levels. Monitoring at different depth levels provides a complete overview of the water quality of the lake.

Page 9
Aqualogic to distribute Trimble Unity smart water platform

UK-based water specialist Aqualogic has become the distributor for the Trimble Unity smart water software platform for water companies in the UK.

Trimble Unity is a cloud-based, GIS-centric Software-as-a-Service (SaaS) solution that offers a suite of applications and tools for the water, wastewater, stormwater and environmental water industry. The solution, which is already being successfully employed by the water industry in the United States as well as by one water company in the UK, helps companies implement smart technology to save costs, reduce water loss and enhance the performance of their assets – all of which lead to improved regulatory compliance and customer service.

Trimble Unity provides situational awareness of water and wastewater utility asset performance, offering a single view of remote monitoring data, performance measurement reports, GIS, operational data, asset conditions and events. Customers can leverage Trimble Unity’s configurable web and mobile work management, analysis and data collection workflows for responding to alarms or events, assessing the condition of utility network assets and collecting authoritative asset data in the field. Trimble Unity allows customers to integrate the solution with their existing back-office customer service and asset management systems and provides a single GIS-centric field solution across an entire workforce.

Aqualogic managing director Ben Rice said: “Collaborating with Trimble builds on the work we have been doing across the UK water industry innovating with disruptive leak detection technology.

“The Trimble Unity platform will allow water companies to view and manage current and new assets in the field, all in one place using one platform – UK water companies see this as an ideal solution to significantly improve field operations.”

Saad Latif, Trimble Water’s regional business manager, added: “Aqualogic is a specialist in water management and field services as well as acoustic leak detection. Aqualogic is an ideal choice, with the expertise and contacts across the industry to launch the Trimble Unity platform in the UK.”

Pall Water Introduces Aria SMARTBOX, Real-Time System Monitoring

Pall Water, a world leader in water filtration, separation, and purification technologies, recently announced the launch of Aria SMARTBOX real-time system monitoring. This tool provides users with remote monitoring so they can quickly understand their system’s performance anytime, anywhere, and on any device.

The Aria SMARTBOX provides customers with insight into system performance through intuitive dashboards, customizable historic system trends, automatic reports, and real-time alerts. This powerful tool allows customers to proactively identify and resolve any potential issues with system performance to prevent downtime and maintain water quality.

According to Joe Carr, Licensed Operator at Newton Water Utility, “The SMARTBOX provides us with the ability to monitor our plant from remote locations. The Town of Newton utilizes the tool using the smart phone application. The SMARTBOX gives us peace of mind knowing that the plant is running properly while at home or away from the facility. The device also provides us with alerts when the plant is not running normally, which is beneficial to know before a much larger problem could arise. The Town of Newton’s experience has been 100% positive with the SMARTBOX.”

The Aria SMARTBOX is a new offering in Pall Water’s Service on Demand.

Page 10
New study uses cutting-edge technology to monitor water quality

Experts are testing cutting-edge techniques designed to enhance the monitoring of water worldwide as part of the €5 million MONOCLE project funded by the European Union’s Horizon 2020 programme.

Around 20 scientists from the University of Stirling, the Plymouth Marine Laboratory and colleagues from across Europe are gathering at Loch Leven, Kinross-shire, where they will study the feasibility of using drones and other in situ technology to monitor the quality of water. The work, which takes place over the next three days, will dovetail with a Stirling-led project that is using satellites to monitor water quality from space. Scientists hope that information gathered from drones or loch-side devices will help address gaps in conventional monitoring and support data collected with satellites.

Professor Andrew Tyler, Deputy Dean and Associate Dean for Research in the Faculty of Natural Sciences at Stirling, leads the £2.9m GloboLakes project, which uses European Space Agency satellites to monitor water quality, including the detection of algal concentrations, harmful algal blooms, and mineral and organic matter. While the project team believe the technology has the ability to help monitor the millions of lakes across the world, the latest study, MONOCLE, addresses specific gaps in data.

Professor Tyler said: “Only a small fraction of the world’s 100 million lakes are routinely monitored – largely due to their geographical spread and the logistical and political difficulties of monitoring water.

“The GloboLakes project has shown that, by using satellites, we can measure the constituents that contribute to water quality by their absorption and scattering characteristics within the water column of lakes, reservoirs, rivers and estuaries.”

However, there are often gaps in this data – for example, due to cloud cover, or because the bodies of water are too small to be monitored by the satellites, he continued.

The MONOCLE project is now seeking to fill the gaps in the data by using in situ and drone-based technologies. MONOCLE involves 12 partners and is led by Stefan Simis, Earth Observation scientist at Plymouth Marine Laboratory. He said: “It is essential to obtain regular and widespread measurements of water quality in lakes, estuaries and coastal waters, both to support satellite observations and in their own right – we use satellites to relate water colour to water quality, while measurements in the field are essential to monitor further chemical and biological properties.

“Deploying sensors is unfortunately still a costly effort and one of the aims of MONOCLE is to bring down this cost. Our international colleagues visiting lochs in Scotland this week are developing methods to use consumer drones and sensors which you can build yourself, alongside highly accurate measurement instruments.”

After trialling the technology at Loch Leven, further tests will take place in Sweden, Hungary, Romania and Tanzania – assessing and comparing both low- and high-cost solutions and promoting the engagement of citizens in the monitoring of water. The project at Loch Leven is the first in a series which will look at how different instruments work, how they compare and what factors influence the comparison.

Prof Tyler added: “We hope that, by the end of this project, both low- and high-tech solutions will be available to provide information that validates existing satellite technologies and provide solutions to the gaps in space and time from satellite data covering these dynamic yet vulnerable environments.”

‘Smart’ valves installed in Leeds

The underground network of water pipes in Leeds is getting a £1.8m 21st-century makeover to help cope with a huge spike in demand for water, with around 12 million litres of water consumed each day in the city centre alone.

It will involve six ‘smart’ valves being installed across six key areas of the water pipe network in the city centre, including Wellington Road, Great George Street, Pontefract Lane, Woodhouse Lane, Hunslet Lane and University Road. Cllr Richard Lewis, Executive Member for Regeneration, Transport and Planning at Leeds City Council, viewed one of the ‘smart’ valves before its installation in Great George Street over the next few weeks.

The remotely controlled ‘smart’ valves will be fitted to key strategic parts of the water network in Leeds to help control the flow of water, which will prevent pressure surges that can occasionally lead to burst pipes and supply disruptions.

Jayne Blackburn, project manager at Yorkshire Water, said: “This is an exciting new project for Yorkshire Water and will greatly benefit customers in Leeds and commuters, who won’t be inconvenienced as much if we have a burst or leak on our network.

“It was great to show Cllr Lewis the work we’re doing to create a ‘smart’ water network in Leeds – a first for Yorkshire.”

Cllr Richard Lewis said: “It’s exciting news that Leeds is the first city in Yorkshire to benefit from the latest technology in ‘smart’ valves. The new installations will play an important role in keeping the city centre moving by helping us avoid unnecessary disruption from burst water pipes.”

The project involves specialist sensors being installed in the pipes that are capable of remotely talking to the valves to open and close them and in doing so control water flow. The work is being carried out by Yorkshire Water’s contract partners Morrison Utility Services and aims to be finished by the end of September 2018.

Page 11
Article: What to expect from the latest generation of sensors?

In recent years, as with other technology, sensors have become smaller, more rugged and more stable, leading to more portable technology and more reliable measurements. Significant advances have been made in the measurement of species such as trace metals, nitrate, nitrite, ammonia and E. coli, using electrochemical techniques (such as stripping voltammetry) and optical techniques (such as fluorescence imagery). Improvements have also been made in the measurement of common physico-chemical parameters such as temperature, conductivity, turbidity, colour and pH. Applications for these physico-chemical sensors include wastewater, water quality, aquaculture, aquariums, agriculture, hydroponics and more.

So, is the latest generation of water sensors for commonly measured physico-chemical parameters better able to meet user requirements than previous designs? Typical users may be looking for low-cost sensors which are intuitive and easy to use without technical training. Sensors must be sensitive enough to meet regulatory requirements for monitoring (i.e. sufficiently low Limits of Detection) and be easy to calibrate. For ease of use most consumers expect connectivity, with familiar technologies like smart devices, tablets and cloud storage, to come as standard. Indeed, using a smart display for monitoring results (Figure 1) has multiple advantages, as extra functions can be added such as alarms, data logs, data sharing, step-by-step user guides and prompts, calibration reminders and instructions. Smart displays which combine water measurement with a user interface on a smartphone app are available from companies such as Übersicht, Ultrapen PTBT2, Hanna Instruments and Camlab’s TRUEscience system, amongst others.

“It was really easy to get started,” says Ben Witchell, winemaker at Flint Vineyard, Norfolk, who uses Camlab’s TRUEscience pH meter.
“I was quite surprised actually that out of the box it was very easy to get going. I found that I don’t have to read the instructions, I just get on with it. I’d say that it’s easier to use than a normal pH meter.”

Developed with input from users at every stage in order to design out problems and incorporate users’ requirements, Camlab’s TRUEscience system uses a cap which accepts standard electrodes with an S7 connector to measure different parameters. This means that there is no need to replace the whole unit when the electrode fails; you can also use electrodes from other manufacturers and select a specialized electrode for your application. The system has features typical of a high-end meter, but at a low price. Multiple probes can be connected and displayed on a smart device simultaneously in the TRUEscience system. Camlab has developed a range of electrodes including pH, Ion Selective Electrodes (ISE), Dissolved Oxygen (DO) and Redox, and an electrode for conductivity is currently under development. A significant advantage of using an Android platform is that software can be updated over time to upgrade the system with improvements in the interpretation of data without the need to buy new hardware. The TRUEscience system can expand over time as electrodes to measure new parameters are added.

The Camlab TRUEscience pH sensor (Figure 2) was launched in November 2016 and is claimed to be the world’s first laboratory Bluetooth pH meter and QR-coded buffer solution system. The pH sensor provides the accuracy of laboratory-grade sensors added to the computing power of an Android tablet by using an app, which adds numerous benefits for the user. Scanning the QR codes on buffer solutions aids keeping track of expiry dates, for example. For health and safety purposes, Material Safety Datasheets (MSDS) and certificates of analysis can be obtained via the app. Reagents can be quickly ordered online when needed, with automatic reminders.
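The buffer calibration that smart pH meters such as this automate amounts to fitting a straight line between electrode millivolts and pH. The sketch below shows the generic two-point method only – it is not Camlab's app code, and the function name and readings are invented for illustration (an ideal glass electrode responds at roughly -59.16 mV per pH unit at 25 °C):

```python
# Hedged sketch of generic two-point pH buffer calibration -- not any
# vendor's implementation. A glass electrode's output is linear in pH,
# so readings in two known buffers fix the calibration line.

def calibrate_ph(mv_buffer4, mv_buffer7):
    """Return a function converting electrode millivolts to pH,
    from readings taken in pH 4.01 and pH 7.00 buffer solutions."""
    slope = (mv_buffer7 - mv_buffer4) / (7.00 - 4.01)   # mV per pH unit
    return lambda mv: 7.00 + (mv - mv_buffer7) / slope

# Example with near-ideal electrode readings (~ -59.16 mV/pH, zero at pH 7):
to_ph = calibrate_ph(176.9, 0.0)
print(round(to_ph(-118.3), 2))  # -> 9.0
```

A real meter adds temperature compensation (the Nernst slope scales with absolute temperature), which is one reason these instruments carry a temperature sensor.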
Customers have reported that the pH sensor is well designed and user-friendly. “Two pH probes can be used at the same time from the same tablet, so two types of media can be made up at the same time,” says Sarah Barber, Head of Media at the John Innes Centre for plant science research, who has found the TRUEscience pH sensor useful for other reasons as well: “It is useful having a tablet instead of the box meter simply because we can use it across the whole bench, and so you can be pHing something at one end and somebody else can be measuring it at the other end. Having no wires makes it cleaner and tidier and so that does make it easier to use.”

In response to market demand, future developments for the TRUEscience app include the ability to sync data to the cloud (to move calibration between smart devices), artificial intelligence, voice commands, Alexa remote display and connection with additional mV-signal electrodes. The TRUEscience system is currently not completely waterproof, which would be useful for field work, so this would also be a useful improvement for the future.

Advances have also been made in the measurement of heavy metals in water bodies. Heavy metal pollution from mining, smelting, military activities and effluent discharges has traditionally been analysed using accurate, reliable but expensive laboratory techniques such as Atomic Absorption Spectrometry (AAS), Inductively Coupled Plasma Mass Spectrometry (ICP-MS) and Inductively Coupled Plasma Optical Emission Spectroscopy (ICP-OES). These techniques have low limits of detection (LOD) but use extremely expensive specialist equipment which is not portable. Collection, stabilization and transport of water samples for lab analysis take time in addition to the nominal time for the actual analysis in the lab, so results are not available quickly.
There are various techniques available for much faster analysis in the field using portable equipment: photometry, ion-selective electrodes, X-ray fluorescence, DNA-based biosensors and stripping voltammetry. Each technique has advantages and disadvantages; most are expensive and/or can only measure a limited range of parameters.

Figure 1: Using a smart device for displaying results
Figure 2: Laboratory pH meters with Bluetooth & QR codes

Page 12
Trace2O have developed a low-cost heavy metal sensor (‘the Metalyser’) to measure heavy metals in natural water samples in the field, originally for monitoring water quality in developing countries. Stripping voltammetry was chosen as a well-established technique to measure the heavy metals present, with the potential to measure a wide range of metals with a linear response.

Simply put, stripping voltammetry uses the known redox potentials of metal ions (Figure 3a) (most metal ions are positively charged) to selectively reduce metal ions of interest onto electrodes. The potential is then reversed to oxidise and strip the ions off the electrodes. By measuring the current during oxidation, the concentration of metal ions can be determined, as the peaks correspond to the original concentration in solution (Figure 3b). The accuracy and LOD are comparable with lab-based methods such as AAS and ICP. No pre-treatment is required for simple samples, but it may be necessary for more complex samples. The sensor uses glassy carbon electrodes, which are low cost, modified with a mercury or gold film (minuscule amounts are used to form a film to create a more effective electrode).

Different versions of the Metalyser are available, the most popular being the HM1000 portable (Figure 4), which can measure arsenic, cadmium, chromium, copper, mercury, manganese, nickel, lead and zinc in natural water. Step-by-step instructions are provided but analysis is automatic; the design uses a sonde containing the electrodes, temperature probe and stirrer, and a hand-held unit containing a potentiostat and data processor. Temperature affects the reading so it must be measured and automatically corrected for. In addition to being low cost and portable, other advantages of the HM1000 are that it is easy to use, results are rapid, no fume extraction is required, false results can be easily identified and turbid samples can be analysed.
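The quantification step described above – relating oxidation peak currents back to concentration – is commonly done by standard addition: known spikes of a standard solution are added and the response is extrapolated back to zero current. The sketch below shows the general method only; it is not the Metalyser's actual firmware, and the function name and values are invented for the example:

```python
# Sketch of the standard-addition method (illustrative only -- not any
# instrument's firmware). Peak current is assumed proportional to metal
# concentration: i = k * (C_sample + C_added), so the fitted line's
# x-intercept magnitude recovers the unknown sample concentration.
import numpy as np

def standard_addition(added_ug_per_l, peak_current_ua):
    """Estimate the unknown sample concentration (ug/L) from peak
    currents measured after each known spike of standard solution."""
    slope, intercept = np.polyfit(added_ug_per_l, peak_current_ua, 1)
    # The line crosses zero current at -C_sample.
    return intercept / slope

# Synthetic example: a 25 ug/L lead sample with k = 0.4 uA per ug/L.
spikes = np.array([0.0, 10.0, 20.0, 30.0])       # standard added, ug/L
currents = 0.4 * (25.0 + spikes)                 # ideal responses, uA
print(round(standard_addition(spikes, currents), 1))  # -> 25.0
```

Because every sample carries its own calibration line, matrix effects that change the sensitivity k cancel out – which is why the article notes that, in effect, every sample is calibrated each time.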
The challenge of calibration is solved by using standard addition – in effect, every sample is calibrated each time. Limitations include interference from organics, but this can be overcome using a UV digester, which assists in breaking down larger organic molecules and releasing complexed metal ions for subsequent analysis. Interference from other metals can also be a problem, and a relatively small range of metals can be measured compared with other techniques. Other versions of the Metalyser have lowered the LOD by adding photometric analysis alongside stripping voltammetry and increasing the range of metals measured to include non-heavy metals (aluminium, boron, iron – in the HM2000 deluxe).

“The Metalyser Deluxe HM2000 is very easy to use; it can be mastered after the few hours’ training that Trace2O provides, and any further questions or doubts were always quickly answered by Trace2O afterwards,” said Dr Peter Bartl, Senior Chemical Design Engineer at Pyropure Ltd, who used the Metalyser to detect heavy metals in wastewater for an R&D project.

“We needed to make sure our waste water did not contain significant amounts of heavy metals, and the only way of doing that was via frequent analysis of the water. At first I was sending samples for external analysis, but that was both expensive and time consuming; also, sometimes I needed a quick answer. The Metalyser Deluxe HM2000 was the perfect solution. Although designed for detecting very low concentrations of metals in apparently clean water, it could handle our waste water with no problem, only sometimes requiring some dilution. I have found it to be reliable, accurate, and robust with just a bit of maintenance and common sense.
Also, it can be easily stored away in two small cases when I’m not using it, and is easily taken outdoors for field measurements.”

The HM3000 Field Pro is able to increase deposition times on the electrodes and incorporates a tablet PC to perform linear regression on multiple data points to further lower the LOD. It also adds more metals to the suite which can be analysed: gold, bismuth, selenium, tin and thallium.

The latest generation of physico-chemical water sensors are increasingly user-friendly, sensitive and accurate, and are often able to connect with smart devices, and therefore better meet user needs and expectations than previous generations of sensors. It will be interesting to see whether new technologies like lab-on-a-chip, perhaps combined with amplification techniques, can make a further step change in future. Perhaps the miniaturization of technology will allow low-cost sensors to measure minute samples using minuscule amounts of reagent using microfluidics or micro cavities, and detection may even be reduced down to the presence of a single molecule of a pollutant in a droplet!

The above information was presented at a Sensors for Water Interest Group (SWIG) workshop held at Clare College, Cambridge that discussed the latest developments in water sensors.

Figure 3a: Stripping voltammetry; Figure 3b: typical results for metals
Figure 4: Field-based metal analysis

Page 13
Article: The Intersection of data analytics and data governance

Are you suffering from data exhaustion? The symptoms can vary widely, but it generally stems from the unrelenting influx of data and your organization’s continued struggle to define a full-scale strategy to organize, understand and leverage that data for maximum advantage. There’s certainly no shortage of treatment options being marketed; whatever data obstacle you’re facing, there are literally scores of solutions available to overcome it and ease your pain.

You’re not alone in this affliction, and it is entirely understandable. The fact is, the era of big data arrived in fits and starts for most companies, with many organizations forced to take a reactionary approach to data management. As the value and volume of data grew, projects arose and fires erupted, and tools to address a specific need would be purchased or built as the constraints of both budget and resources allowed. Today, this has left some organizations with fatigue and frustration from the patchwork of tools that loosely comprise their data management strategy. Such an approach was borne out of necessity, but unfortunately it isn’t strategic, it isn’t scalable, and it won’t serve as a long-term solution for managing data and maximizing data value for increased profitability and competitive advantage.

It’s time for organizations to consolidate their data management strategies, and find ways to increase efficiencies and synergies among tools and solutions. Identifying symbiotic relationships is a good first step towards developing an integrated data management plan to serve both immediate needs and future goals.

Finding Data Management Symbiosis

Symbiotic relationships generally occur in nature, and interactions such as mutualism represent species who work together for reciprocal benefit.
Yet this concept can also be applied to an organization’s data ecosystem—where seemingly disconnected or disparate data tools can serve a shared purpose. A good example of this is the connection between data analytics and data governance. Data governance is increasingly recognized as a foundational component of any strong data management plan, and analytics can improve the performance and efficiency of an organization’s governance efforts. But organizations don’t accrue data for data’s sake; the end goal of raw data is the insights it can reveal and the improved decision-making and outcomes that can result. Analytics are the critical enabler of this process, and data governance also plays an important role in helping organizations maximize the value of their analytics and data assets.

Analytics-enabled Data Governance

Data governance is the formal orchestration of people, processes, and technology to enable an organization to leverage their data as an enterprise asset. It serves a critical function in business to support regulatory compliance, but it is also crucial to ensuring a common understanding of organizational data assets across an enterprise. Key components include a business glossary, data dictionaries, data lineage and metadata management, all of which inform users about the source, use, relationships and definitions related to data—including business terms, attributes, and dependencies. It also assigns data owners and stewards to data assets to increase accountability and enable access to data resources, encouraging data usage. And it can have a direct impact on data quality by providing monitoring, quality dimensions and scores.

With analytics-enabled data governance, machine learning algorithms can monitor and improve data quality across an enterprise, self-learning as issues are resolved. Improved data quality increases user trust in data reliability, and therefore increases data utilization for analysis.
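To make the monitoring idea concrete: even a very simple statistical rule can flag when a governed quality metric, such as daily record completeness, drifts from its baseline. This is a minimal sketch of the concept only – not any platform's implementation – and the function name, threshold and scores below are invented for the example:

```python
# Minimal sketch of automated data-quality monitoring: flag days whose
# completeness score is a statistical outlier against the series baseline.
# (Illustrative only -- production governance tools use richer models.)
import statistics

def quality_alerts(daily_scores, z_threshold=2.0):
    """Return indices of days whose data-quality score deviates from the
    series mean by more than z_threshold standard deviations."""
    mean = statistics.mean(daily_scores)
    stdev = statistics.pstdev(daily_scores)
    if stdev == 0:
        return []  # perfectly stable series: nothing to flag
    return [i for i, s in enumerate(daily_scores)
            if abs(s - mean) / stdev > z_threshold]

# A completeness metric that is stable until an upstream feed breaks:
scores = [0.98, 0.97, 0.99, 0.98, 0.97, 0.98, 0.62]
print(quality_alerts(scores))  # -> [6]
```

A self-learning system would go further, tightening or relaxing the threshold as flagged issues are confirmed or dismissed by data stewards.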
Machine learning can also play a vital role in compliance efforts, with automatic monitoring for potential non-compliance. Without analytics, governance programs simply confirm compliance with regulatory requirements. With analytics and active monitoring, organizations can proactively identify areas where they may have violations—a capability that will only grow more important as regulations like the General Data Protection Regulation (GDPR) go into effect.

Enhanced Analytics through Data Governance

Analytics can clearly improve organizational data governance efforts, but equally apparent is the impact that data governance can have on an organization’s analytics efforts. Data governance is all about increasing data understanding across a business enterprise, and encouraging collaboration to get the most from your data assets. It shouldn’t reside in IT as a technical undertaking; it needs to be business focused to drive data utilization. Engaged users who trust and understand data are far more likely to then use analytics to look for intelligence and insight. Strong data governance creates educated users who can rely on data assets and will frequently turn to analytics to help solve business issues and uncover unseen issues and opportunities.

Elements of governance also make analytics possible. Metadata management is oversight of “data about data,” and was previously mentioned as an important aspect of a comprehensive data governance approach. But it is also the key to true predictive analytics for deriving valuable business insights, and predictive analytics are in turn the key to gaining competitive advantage and insight into future outcomes and events.

Self Service is Essential

The mutually beneficial relationship of data governance and analytics should be clear, but it must be said that if an organization delegates all responsibility for analytics to IT or an elite group of data scientists, it is largely a moot point.
Data management and data governance today should be focused on empowering business users with a self-service approach to accessing and leveraging data assets, as they’re the power users looking to use the data. IT resources are scarce and overburdened, and their time can be better spent than juggling a constant stream of competing requests from myriad business users. Likewise, business users can’t afford to wait weeks or even months for a data request to be processed; by then, the issue may be irrelevant or the data may be out-of-date.

As organizations re-examine their data strategies, they need to give users access to self-service analytics that will allow them to routinely rely on analytics, increase their speed to insights, and put them in the driver’s seat as they look for innovative ways to gain competitive advantage and improve the bottom line.

This article was originally published as a blog on the Infogix website. The original is available by clicking here.

Infogix Data3Sixty™ Platform applies advanced analytics to improve your customer experience, drive corporate profitability and streamline operational efficiency. This process improves data quality, ensures effective transaction monitoring, provides balancing and reconciliation, identifies fraud and operationalizes predictive models.

Page 14
Article: Jumpstart Smart Infrastructure By Adding Sensors To AMI

Smart water networks today do more than read meters. They also collect data from sensors on distribution networks to help reduce non-revenue water losses, monitor and control pressures in water mains, and prevent unwanted sewage discharges. These new smart infrastructure solutions help water utilities expand the definition of smart water, going beyond applications aimed at improving billing accuracy and efficiency.

Just as smart metering made billing more efficient, smart infrastructure solutions, which incorporate the characteristics of the Internet of Things (IoT), help utilities meet regulatory requirements, prevent pipe bursts, reduce energy costs, and identify hard-to-detect sources of water loss that result in lost revenue. Underground leaks, for example, are a primary concern for utilities worldwide and a leading cause of non-revenue water losses. The Asian Development Bank reports that Asia loses around 29 billion cubic meters of treated water each year to leaks, at a cost of approximately $9 billion. The European Environment Agency also estimates that drinking water losses from the distribution system average around 30 percent in most countries, with losses in some urban networks reaching 70 to 80 percent. In the UK, even consumers are voicing concerns, with about 70 percent believing that their utility is not doing enough to reduce leaks, as reported in The Guardian.

Pinpointing Underground Leaks

These unapparent leaks can go unnoticed for years, washing away dirt and gravel under roads and buildings until a sinkhole appears and causes major infrastructure damage. However, utilities can extend the benefits of their advanced metering infrastructure by adding sensors that use acoustic sensing to identify and locate leaks in water pipes.
These acoustic logging solutions help utilities:

• Avoid high-cost catastrophic pipe failures
• Extend infrastructure and treatment plant lifecycles
• Conserve water and power
• Meet consumer and regulatory service expectations

A variety of acoustic logging solutions are available on the market today, but unless the data collected from the loggers is properly analysed, utilities will see too many false positives that identify leaks where there are none, or worse, false negatives that miss leaks altogether. The right software solution combined with acoustic logging will eliminate the noise and electrical interference that renders some leak-detection solutions ineffective. What's more, an acoustic logging solution should take into account pipe materials such as metal, concrete and PVC to provide more accuracy in identifying leak locations, as different pipe materials have different acoustic characteristics. In addition, solutions should install in standard valve stacks to give the fastest and most direct access to pipes.

Other factors to consider when evaluating acoustic leak detection are:

• Application software that remotely correlates data and provides visual identification of high-probability leak locations
• Diagnostics software that ensures the system is operating at peak performance and notifies operators when problems arise
• Time synchronization of analyses and sound recordings to provide system-wide correlation remotely, without the use of handheld devices

The Aclara leak detection solution is a smart infrastructure solution that provides automatic, accurate, and effective acoustic data logging and correlation. It leverages both the proven capabilities of the Aclara RF network advanced metering infrastructure (AMI) solution and acoustic loggers from Gutermann, a global technology leader and innovator in intelligent water loss technologies and leak detection technology based in Baar, Switzerland.
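Time-synchronised correlation of this kind estimates a leak's position from the difference in arrival time of the leak noise at two loggers, combined with a wave speed appropriate to the pipe material. A minimal sketch of the calculation follows; the wave-speed figures, function name and logger interface are illustrative assumptions, not any vendor's implementation:

```python
import numpy as np

# Approximate acoustic wave speeds by pipe material (m/s); real
# correlators use calibrated figures for each pipe class.
WAVE_SPEED = {"metal": 1200.0, "concrete": 1100.0, "pvc": 480.0}

def locate_leak(sig_a, sig_b, fs, spacing_m, material):
    """Estimate the leak position between two time-synchronised acoustic
    loggers A and B, `spacing_m` metres apart along the pipe, from noise
    recordings sampled at `fs` Hz. Returns distance (m) from logger A."""
    v = WAVE_SPEED[material]
    a = sig_a - sig_a.mean()
    b = sig_b - sig_b.mean()
    # The peak of the full cross-correlation gives the arrival-time lag.
    corr = np.correlate(a, b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)  # samples; >0 means A hears it later
    dt = lag / fs
    # A hearing the noise later means the leak sits further from A.
    return (spacing_m + v * dt) / 2.0
```

In practice correlators also band-pass filter the recordings and reject low-coherence correlation peaks, which is how the false positives mentioned above are kept in check.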
Aclara leak detection technology operates on private, not public, radio frequencies over the Aclara RF fixed-network AMI. It can be implemented as a standalone fixed-network solution for leak detection or as a valuable add-on to an Aclara RF meter-reading network. The Aclara leak detection system is an example of a sensor that can be used effectively on an AMI network: the solution can identify the location of underground leaks to within a few feet.

Guarding Against Sanitary Sewer Overflows

Preventing sanitary sewer overflows (SSOs), which can unintentionally discharge raw sewage into the environment, is a national enforcement priority for the Environmental Protection Agency (EPA) in the U.S. An estimated 40,000 SSOs each year are caused by events such as severe weather, vandalism, and improper system operation and maintenance. The regulatory agency is vigorously moving to eliminate SSOs from municipal collection systems and to ensure that wastewater is conveyed to treatment plants in accordance with the requirements of the Clean Water Act. To eliminate SSOs, the EPA uses a mix of compliance and enforcement tools, including traditional administrative and judicial penalties. These penalties can run
into the millions of dollars. And although the U.S. EPA has led the world in recognizing the problem presented by SSOs, other jurisdictions such as the European Union (EU) are also evaluating the problem. A report on the impact of SSOs in the 28 EU member states, issued by Milieu Law & Policy Consulting, outlined the state of current legislation and regulatory policies and recommended implementing strategies similar to those used in the U.S. for managing SSOs.

As a result of this government focus, water and wastewater utilities are looking at a variety of technologies to prevent SSOs. One is the use of sensors to determine when water and sewage levels are rising in sewers or when manhole tampering has occurred. These sensors, when operating on a fixed network, reliably provide near real-time monitoring of manholes and other key sewer locations. Whether used for early warning of overflows, informing maintenance schedules, compliance reporting, or deeper analytics (such as capacity modelling and performance reporting), sewer monitoring is a crucial solution for managing SSOs.

Aclara has designed an SSO solution that installs in manholes and integrates into an Aclara AMI environment. It is available in two configurations: as a level alarm to provide warnings of impending overflow, or as level monitoring for tracking system efficiency and cleaning/maintenance effectiveness.
The Aclara SSO solution also offers:

• Manhole level trends over time that indicate signs of decreased flow capacity, triggering a need for a clean-out
• A report overlaying water consumption data on sewer level versus time, to separate normal wastewater generation from inflow and infiltration (I&I) of stormwater
• Four user-configurable alert level thresholds to maximize operational flexibility
• An event mode in which intelligence in the endpoint adapts to changing conditions (reading and transmitting more often when an alarm is tripped, for example)

Monitoring Pressure To Prevent Leaks

Another way to identify leaks and keep a water system operating optimally is through pressure monitoring. Pressure sensors located in the water distribution network can identify anomalies that could indicate leaks or the potential for leaks. Having pressure sensors monitored automatically on a fixed network increases their effectiveness and efficiency in making pressure adjustments. Pressure monitoring is especially useful in district metering areas (DMAs) to:

• Perform flow and pressure calculations to show leaks
• Compute water balance and determine minimum night flow
• Identify where pressure can be lowered to reduce leaks
• Control DMA gate valves

Automatic pressure monitoring on an AMI network also can help utilities reduce the damage caused by leaks and lower the chances of burst pipes by allowing operators to quickly reduce pressure when necessary. It is also useful in helping operators maintain minimum service pressures and determine where problems exist when they receive low-pressure complaints. Pressure monitoring is also helpful in collecting the data necessary to calibrate hydraulic models.

Aclara integrates with a variety of pressure sensors with standard fittings to provide the data needed to understand how pressure is affecting the distribution system. The solution operates at multiple pressure levels and offers configurable alarms.
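Minimum night flow, one of the DMA calculations mentioned above, is the classic leakage indicator: in the small hours legitimate demand is at its lowest, so flow in excess of an assessed legitimate night use is largely leakage. A hedged sketch of the calculation, where the function names and the 02:00–04:00 window are illustrative choices rather than a standard:

```python
from datetime import datetime, time

def minimum_night_flow(readings, night_start=time(2, 0), night_end=time(4, 0)):
    """readings: iterable of (datetime, flow in m3/h) for one day.
    Returns the lowest flow seen in the night window, when legitimate
    demand is at its minimum and leakage dominates the measurement."""
    night = [q for t, q in readings if night_start <= t.time() < night_end]
    if not night:
        raise ValueError("no readings fall inside the night window")
    return min(night)

def net_night_flow(mnf, legitimate_night_use):
    """Leakage estimate: minimum night flow less assessed legitimate use."""
    return max(mnf - legitimate_night_use, 0.0)
```

Tracked day by day, a step change in net night flow is one of the simplest signals that a new leak has opened in the DMA.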
By harnessing AMI technologies to build out smart infrastructure solutions, water utilities can have an impact on their bottom lines and reclaim lost water revenue. Aclara powers the data-driven information and advanced applications utilities need to leverage their AMI investment to improve services and manage water distribution networks.

CTG ALGAE-Wader Pro Systems For Flanders

The Flanders Environment Agency (VMM) has taken delivery of a number of CTG ALGAE-Wader Pro systems to assist in its work assessing algae levels in water environments within Belgium. CTG's representatives within the Benelux countries, Notra BV, assisted in arranging the demonstrations and the subsequent supply.

The CTG ALGAE-Wader Pro system was chosen after successful demonstrations at the swimming lake at Blaarmeersen Sports and Recreation Park, a typical example of the sites that will be monitored for algae concentrations. A VMM spokesman stated: "If these sensors show too high a cyanobacterial chlorophyll content and microscopic analysis shows that these are toxic species, the necessary measures will be taken to protect the public."

The systems supplied to VMM are configured for measurements of Chl-a, phycocyanin and phycoerythrin and come with a handheld unit which provides both graphical and digital real-time data to the operator, with red/amber/green thresholding, and logs the data for post-mission extraction. All data is both time- and position-stamped. The three channels provided by the CTG TriLux fluorometer within the system will allow the VMM not only to measure concentrations of Chl-a but also to help inform on the group type of the algae present.
Although industry was using the internet for communications to "things" long before the advent of the Internet of Things (IoT) terminology, much of the early focus was on the consumer market as the general IoT began to develop. This could be due to innovators focusing on the perceived higher volume of "things" required for the consumer market, or to the average innovator having a deeper understanding of consumer requirements. Regardless of the reasoning, as the Industrial Internet of Things (IIoT) diverged from the IoT, the technology sector generally treated "industrial" as anything not considered to be in the consumer IoT. This is a short-sighted view, as the industrial markets' technology demands vary by market segment and, further still, by application within each segment. In this article, we discuss some methods IIoT marketers and product managers can use to position their solutions for success in the industrial market segments.

Market Segmentation Considerations for IIoT

The concept of industrial market segmentation is highly subjective and complex, and perhaps does not get the depth of attention it deserves. The rapid growth of IoT and the later development of the IIoT provided attractive opportunities for many innovators competing within consumer IoT to migrate towards IIoT, many underestimating the depth of knowledge required for a technology solution to take off in the industrial markets. Perhaps this is the reason for the generalization of the industrial term in the IIoT in the first place.

Segmenting industrial markets is complex because dynamic market drivers emerge due to pressure from demographics, regulation, innovation, and other economic variables. Consider, for a moment, the dynamics of the food and beverage (F&B) market. The F&B market can broadly be separated into several main segments such as prepared foods, dairy, meats, beverage, agriculture, etc.
Further segmentation might separate, for example, "beverage" into craft and large-scale, due to the varying market drivers between them. Traditional drivers surrounding quality, brand protection, and environmental sustainability still exist for many large-scale producers. However, with consumer awareness increasing, trends to purchase local, organic, and/or non-GMO products have presented opportunities for small, startup craft producers. Craft producers will place very different priorities on their drivers and on the factors in their purchasing decisions, often early in the startup phase. Because of these variations, an IIoT product may not have the same fit across market sub-segments, even if it is functionally the same.

Tailoring IIoT Solutions to Market Needs

I suggest that, in general, IIoT solutions will fall towards one side of an application continuum, with "high volume / reproducibility" on one end and "implementation-intensive and variable customization" on the other. There are a number of qualitative attributes that might factor into the ends of the continuum (see figure below). Understanding your target market's position on the continuum will help you tailor the product's commercial model for success. You might find that the product can simply have enough flexibility built into the model to supply the needs of all of your target markets. However, you might also find that the best approach is to brand and package a product completely differently for each segment, despite the functional technology being the same or very similar.

A continuum for your business might contain the same or different attributes, and approaches to market segmentation will vary. Regardless, placing the market segments and products on the continuum will likely result in improved decision-making throughout product development and lifecycle management. Be cautious not to generalize market segmentation too much.
As with the beverage example discussed earlier, stopping at "beverage" would leave a product potentially positioned very poorly in a large part of the market.

Once your solution passes the ideation phase, and presumably you have generated enough business justification for it to provide value to the targeted market(s), the hard work of development begins. Chances are that the idea for the product originated from a realized opportunity to fill a value gap in an application. It is unlikely, however, that the details of how that product should be positioned and transacted have been completely vetted, or that all possible applications for the product have been identified. It is rare for all of the detailed marketing required for a full commercial model to be complete prior to any development; this would also cripple a product's ability to reach the market in a timely fashion. As development progresses, the details of positioning and modelling the solution in the target markets should be moving in parallel. It is important not to underestimate the amount of effort required to build a complete model for a new product.

Marketers and product managers must consider some key items during the development process. Placing these items on the application continuum will identify with clarity what gaps might need to be filled prior to commercial launch. Some considerations:

• The target market's segments and sub-segments positions
• Other (untargeted) markets' positions
• The product's intended position (at least initially)
• Competing products

Article: An Emerging Dichotomy In The Industrial Internet of Things

Figure 1: IIoT Application Continuum Attributes
• Organizational considerations (such as the ability to finance, transactional preferences, etc.)

If development is moving in parallel, then feedback should flow to the development team to adjust the product's position to align with the target markets. Let's exercise this by furthering the example of the craft and large-scale beverage markets. Ideation may have identified a need your organization could fill in the large-scale market. However, through analysis, you find that the craft market will move higher volume and that your organization does not have the resources to assist throughout the difficult implementations in the large-scale market. While the outcome in this example might seem obvious, when adding multitudes of markets, segments, and sub-segments, the complexity increases.

Market repositioning throughout product lifecycle

As markets and the IIoT evolve, new opportunities may present themselves for existing products. It is as important to consider the changes in the marketplace as it is to manage the dynamics of the product itself. This outward-facing viewpoint may present new opportunities for an otherwise ageing product. It is just as likely for a market's dynamics to change, causing a detrimental impact to an otherwise healthy product's revenue. Marketers and product managers might periodically evaluate and reposition the following items on the continuum:

• Targeted market segments, for changes
• Market segments not targeted, for emerging changes
• The product's position, based on emerging technological enhancements, changes in supply chain, organizational changes, etc.
• Competing products and emerging technologies

This creates a dynamic analysis of the product's position against the targeted markets and competing products. As gaps appear in the completed continuum, actions can be taken by marketers and product managers either to protect or enhance the product's position or to evaluate an end-of-life process.
Conclusion

The speed of change for IoT and IIoT is rapid. Industrial market trends do not typically change quickly, but their depth is extensive and dynamics impacting them do exist. Whether or not you subscribe to the theory that the IIoT is beginning to display a dichotomy, the reality is that the needs of the industrial market segments are variable in a very dynamic environment. The reason for any lag in uptake of an existing or emerging IIoT solution likely goes far beyond the value the technology could bring on its own. For any IIoT solution to gain traction, the entire end-to-end experience must be tailored for the clients in the target markets.

Figure 2: Application Continuum for Beverage Market Segments at Ideation

Michael Henk is the Director of Digital Solutions at U.S. Water. U.S. Water was founded in 1997 by three individuals with a unified mission: to be universally recognized as the most innovative and flexible provider in the marketplace by focusing on safe, economical and environmentally sound solutions for its customers. In the following years, U.S. Water quickly grew to include multiple locations and production facilities, with representation and distribution centres nationwide and internationally. In 2015, U.S. Water joined the ALLETE family of companies. In business for over 100 years, ALLETE (NYSE:ALE) is well-positioned as a reliable provider of competitively-priced energy in the upper Midwest, and invests in transmission infrastructure and other energy-centric businesses. With a shared purpose, the ALLETE family of companies works to provide clean, safe, efficient and affordable energy products and services that fuel modern necessities and enrich quality of life.
Introduction

Phosphorus is one of the key regulated parameters in wastewater treatment, and the newest investigations from the Chemical Investigation Programme in the UK have shown that the best available technology is capable of removing the pollutant down to a concentration of 0.25 mg/L of total phosphorus. Achieving these ultra-low concentrations is going to need an unprecedented degree of control over the treatment system. So what is the problem with phosphorus? The key questions we need to ask are:

• Why is it regulated to the level that it is?
• How do we treat it at the moment?
• How do we measure it, and what are the problems with measuring it?
• How do we control it?
• Where is phosphorus removal going, and how are we going to monitor and control the removal system?

Phosphorus is one of the major nutrients, along with nitrogen and potassium. It is present in fertilisers and is used globally in agriculture. As a result it is quite often washed into rivers through diffuse pollution, where it becomes an aquatic pollutant. Along with nitrogen in the form of nitrate, it is the root cause of eutrophication, which is a major aquatic pollution problem and a root cause of algal blooms. Phosphorus is often regulated because it is usually the limiting factor in eutrophication: its absence, even when nitrogen is present in excess, will prevent algal growth. Basically, without phosphorus present you won't get eutrophication. Phosphorus has been cited as the major reason why water bodies in the UK have failed to achieve good ecological status. So it is important to remove phosphorus, and the impracticality of reducing the pollutant load from diffuse pollution makes the wastewater treatment system the most convenient place to remove it.
As a result, the removal of phosphorus from the wastewater stream has become a priority, and the regulated levels are approaching 10 times lower than the 1989 Potable Water Quality Standard (where phosphorus was regulated to 2.2 mg/L P). The difficulty is that phosphates are sub-categorized into:

• Orthophosphates
• Condensed phosphates (metaphosphates, pyrophosphates and polyphosphates)
• Organophosphorus compounds

If samples are not digested, it is always orthophosphate that is determined, as only orthophosphate can be detected directly by photometric means. This is also known as determination of the "reactive" phosphorus. The measurement results can be expressed in a variety of ways:

• PO₄, phosphate
• PO₄-P, phosphate-phosphorus
• P₂O₅, phosphorus pentoxide

The way we tend to treat phosphorus at the current time is either by chemical precipitation methodologies (in the main) or by biological techniques such as Enhanced Biological Phosphorus Removal (EBPR) in activated sludge. The former method, chemical precipitation with iron or aluminium salts, is much simpler and cheaper but has the limitation of using chemicals. These chemicals ideally need to be controlled, and to control the chemical dosing we need to measure the concentration of phosphorus.

Measuring Phosphorus

Measuring phosphorus is where the problems start to creep in, with special reference to "what phosphorus are you measuring?" When phosphorus is regulated, it is regulated as the total phosphorus present in the water. This is very simple to regulate but much more difficult to measure, and so often the soluble reactive phosphorus is actually measured, with a safety factor applied to account for the insoluble, non-reactive phosphorus. This is possible because phosphorus is regulated as an annual average: if the annual average is running too close for comfort, then greater treatment can be applied.
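The three reporting conventions above differ only by a molar-mass factor, so converting between them is mechanical. A small sketch, with molar masses rounded and function names that are purely illustrative:

```python
# Approximate molar masses (g/mol)
M_P = 30.974
M_O = 15.999
M_PO4 = M_P + 4 * M_O            # phosphate, ~94.97
M_P2O5 = 2 * M_P + 5 * M_O       # phosphorus pentoxide, ~141.94

def p_to_po4(p_mg_l):
    """Convert phosphate-phosphorus (PO4-P) to phosphate (PO4)."""
    return p_mg_l * M_PO4 / M_P          # factor ~3.07

def p_to_p2o5(p_mg_l):
    """Convert phosphorus to the phosphorus pentoxide equivalent
    (each P2O5 contains two phosphorus atoms)."""
    return p_mg_l * M_P2O5 / (2 * M_P)   # factor ~2.29
```

The familiar rules of thumb fall out directly: 1 mg/L of PO₄-P corresponds to roughly 3.07 mg/L of PO₄ and roughly 2.29 mg/L of P₂O₅.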
The reverse is also true, although it is not a popular operational strategy from an environmental point of view. When you look at the laboratory method and the different fractions of phosphorus and their analysis (figure 1), an appreciation of the complications can be seen. It is a case of pick a fraction, any fraction, and see what you come up with. In reality, the fraction that is regulated in wastewater is total phosphorus, although quite often total reactive phosphorus will be measured in the field, as the practicality of filtering samples in the field is usually the limiting factor. So what is the basic methodology?

Focus on: Phosphorus and its control in wastewater
For measuring total phosphorus, the methodology is to oxidise the sample to soluble reactive phosphorus, generally (but not exclusively) using acid digestion. Even here more than one method is available, as the standard methods list:

• The perchloric acid method, for the most difficult samples that need an aggressive digestion technique
• The nitric acid – sulphuric acid method, for most samples
• Persulphate oxidation with UV, as the most convenient method, as long as results comparable to the other methods are obtained

This is basically to convert the total phosphorus to reactive phosphorus. If the partitioning between total and reactive phosphorus needs to be understood, then the reactive test needs to be run both with and without digestion.

The colourimetric analysis of reactive phosphorus is not straightforward either, with three main methods available:

• The vanadomolybdophosphoric acid method
• The stannous chloride method
• The ascorbic acid method

Before the recent changes in wastewater regulation around phosphorus, the vanadomolybdophosphoric acid method would have been the most appropriate, as it has a range of 1–20 mg/L P. However, with increasingly tight standards and regulated phosphorus levels dropping below 1 mg/L, dilution of the sample will become necessary, and with this the chances of error and interference increase. The vanadomolybdophosphoric acid method, with the potential for dilution, is still the most applicable. The principle of this method is that ammonium molybdate reacts under acid conditions to form a heteropoly acid, molybdophosphoric acid. In the presence of vanadium, a yellow vanadomolybdophosphoric acid is formed. The intensity of the yellow colour is proportional to the phosphate concentration.
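Because the colour intensity is (ideally) linear in concentration, laboratory and online photometers alike reduce to fitting a calibration line through known standards and inverting it for unknown samples. A hedged sketch of that step; the standard concentrations and absorbance figures below are made-up illustrative numbers, not published calibration data:

```python
import numpy as np

# Hypothetical calibration standards: concentration (mg/L P) vs absorbance.
standards_conc = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
standards_abs = np.array([0.002, 0.101, 0.199, 0.302, 0.398])

# Beer-Lambert behaviour: absorbance is linear in concentration,
# so fit A = m*C + b by least squares.
m, b = np.polyfit(standards_conc, standards_abs, 1)

def concentration(absorbance):
    """Invert the calibration line to read concentration from absorbance."""
    return (absorbance - b) / m
```

Drift in the fitted slope between calibrations is itself a useful diagnostic, since it can flag degraded reagents or a fouled optical cell before results go visibly wrong.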
Iron and sulphide both interfere with this method, the former only at concentrations above 100 mg/L, whereas the latter is a more general problem, so this should be taken into account where septicity is an issue and hydrogen sulphide concentrations are high.

All of these variations in the laboratory methods bring complications when moving to an online methodology of analysis. The most common method for online measurement of soluble reactive phosphorus is a conversion of the high-range laboratory method using the vanadomolybdophosphoric acid chemistry. For the measurement of total phosphorus, the total phosphorus is converted to reactive phosphorus using persulphate oxidation with UV, or either of the acid digestion methods, followed by the ascorbic acid molybdenum blue method.

The principal components of the soluble reactive measurement system are:

• The sample collection system
• The sample filtration system
• The reagent addition system
• The photometric detector

When it comes to the total phosphorus method, the additional complication is in the digestion methodology, be it an oven system with acid-phase digestion or a UV system with the oxidation methodology. The key potential sources of error for any of these methods are:

Sampling – The sampling methodology is key to the success of any online analytical method. Be it a vacuum sampling method or a peristaltic pump method, the key is to ensure that the sampling system does not block (especially with crude sewage) and samples consistently. The sources of error can be limited by locating the analyser as close as possible to the medium being analysed and potentially using a method of pre-filtration.

Reagent addition – Adding the correct amount of reagent to the sample is also crucial.

Digestion process – If the digestion process is incomplete, there will be errors in the amount of total phosphorus measured.

Errors in the photometric detection are rare and will not have a significant effect.
Controlling phosphorus

For phosphorus removal, the most common method is chemical precipitation using either aluminium or iron salts, of which the latter is the most prevalent due to the toxicity of aluminium in the aquatic environment. There is a well-defined stoichiometric relationship between iron and phosphorus, with
7 parts of iron required to remove 1 part of phosphorus. This is often used when designing chemical phosphorus removal systems. What is important is how the chemical dosing system is controlled, if it is controlled at all. Without any chemical dosing control system there is the potential for either under- or over-dosing iron salts. This can actually cause damage to structures within the treatment works; this is especially the case with flow measurement flumes, which are sensitive to any damage. As a result of the risk of under- or over-dosing, especially with the ultra-low phosphorus consents that are being put in place, there is a need for increasing control over chemically based phosphorus removal systems.

The simplest method of control is flow-based control, which presumes that the concentration of phosphorus is stable. In this methodology the amount of iron is dosed proportionally to the flow rate. The method is simple and does not require online phosphorus analysis, but it does need a form of flow measurement.

The next method is to manually establish phosphorus concentrations over time, work from an assumed concentration profile, and use flow measurement to establish an assumed load profile. In this way a more advanced dosing control system is put in place, still without the need for online phosphorus measurement.

The last method is to measure the phosphorus load online and use the results to calculate the amount of chemical that needs to be added, with the potential for a downstream feedback control loop working on a nudge-and-wait system. This is obviously the most accurate and best control system, but it comes with additional complexity and cost.
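The flow-proportional approach described above reduces to a simple mass-balance calculation: phosphorus load from flow and concentration, iron demand from the 7:1 ratio, then coagulant product from its iron content. A sketch, taking the article's 7:1 iron-to-phosphorus ratio as a mass ratio; the function name and the coagulant iron fraction are illustrative assumptions, not a product specification:

```python
FE_TO_P_RATIO = 7.0  # mass of iron per mass of phosphorus, per the article

def ferric_dose_rate(flow_m3_h, p_conc_mg_l, fe_fraction=0.138):
    """Flow-proportional dosing sketch.

    flow_m3_h   : measured works inflow (m3/h)
    p_conc_mg_l : assumed or measured phosphorus concentration (mg/L)
    fe_fraction : mass fraction of iron in the coagulant product
                  (the 13.8% default is a hypothetical figure)

    Returns kg/h of coagulant product to dose.
    """
    # mg/L equals g/m3, so flow * concentration gives g/h of P.
    p_load_kg_h = flow_m3_h * p_conc_mg_l / 1000.0
    fe_needed_kg_h = p_load_kg_h * FE_TO_P_RATIO
    return fe_needed_kg_h / fe_fraction
```

The load-profile and online-measurement control methods use the same arithmetic; they differ only in where `p_conc_mg_l` comes from, an assumed diurnal profile in the former case and a live analyser reading, trimmed by downstream feedback, in the latter.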
However, this additional complexity and cost, especially for a chemical dosing system, is worth it when there is the potential for multiple chemical dosing stages. This is common when ultra-low consents are in place, necessitating tertiary treatment processes that will be sensitive to the incoming pollutant load (phosphorus in this particular case).

The future of phosphorus treatment

What is clear from the current trends within the water industry is that the permitted levels of phosphorus are going to get lower, and this is where the factory approach and the circular economy become a more feasible solution, either through the use of phosphorus in sewage sludges or through the extraction of phosphorus from sewage sludge and its conversion into a usable product. Up until now, recovering phosphorus from all but the largest wastewater treatment works has not been financially viable. However, rising treatment costs, the technical development of phosphorus recovery technologies, and improvements in monitoring and control technologies mean that phosphorus loading can now be measured accurately in an online format. By measuring phosphorus through the treatment works, what can be a problem substance can actually be converted into a raw material produced in a factory-based system approach.

Control System Simulator Helps Operators Learn To Fight Hackers

A simulator that comes complete with a virtual explosion could help the operators of chemical processing plants – and other industrial facilities – learn to detect attacks by hackers bent on causing mayhem. The simulator will also help students and researchers better understand the security issues of industrial control systems. Facilities such as electric power networks, manufacturing operations and water purification plants are among the potential targets for malicious actors because they use programmable logic controllers (PLCs) to open and close valves, redirect electricity flows and manage large pieces of machinery.
Efforts are underway to secure these facilities, and helping operators become more skilled at detecting potential attacks is a key part of improving security. "The goal is to give operators, researchers and students experience with attacking systems, detecting attacks and also seeing the consequences of manipulating the physical processes in these systems," said Raheem Beyah, the Motorola Foundation Professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology. "This system allows operators to learn what kinds of things will happen. Our goal is to make sure the good guys get this experience so they can respond appropriately."

Details of the simulator were presented on August 8 at Black Hat USA 2018, and on August 13 at the 2018 USENIX Workshop on Advances in Security Education. The simulator was developed in part by Atlanta security startup company Fortiphyd Logic, and supported by the Georgia Research Alliance.

The simulated chemical processing plant, known as the Graphical Realism Framework for Industrial Control Simulations (GRFICS), allows users to play the roles of both attackers and defenders – with separate views provided. The attackers might take control of valves in the plant to build up pressure in a reaction vessel and cause an explosion. The defenders have to watch for signs of attack and make sure security systems remain operational.

Of great concern is the "man-in-the-middle" attack, in which a bad actor breaks into the facility's control system and also takes control of the sensors and instruments that provide feedback to the operators. By gaining control of sensors and valve position indicators, the attacker could send false readings that would reassure the operators while the damage proceeded. "The pressure and reactant levels could be made to seem normal to the operators, while the pressure is building toward a dangerous point," Beyah said.
Though the readings may appear normal, a knowledgeable operator might still detect clues that the system has been attacked. "The more the operators know the process, the harder it will be to fool them," he said.

The GRFICS system was built using an existing chemical processing plant simulator, as well as a 3D video gaming engine running on Linux virtual machines. At its heart is the software that runs PLCs, which can be changed out to represent different types of controllers appropriate to a range of facilities. The human-machine interface can also be altered as needed to show a realistic operator control panel monitoring reaction parameters and valve controller positions.

"This is a complete virtual network, so you can set up your own intrusion detection rules and play on the defensive side to see whether or not your defences are detecting the attacks," said David Formby, a Georgia Tech post-doctoral researcher who has launched Fortiphyd Logic with Beyah to develop industrial control security products. "We provide access to simulated physical systems that allow students and operators to repeatedly study different parameters and scenarios."

GRFICS is currently available as a free, open-source download for use by classes or individuals. It runs on a laptop but, because of its heavy use of graphics, requires considerable processing power and memory. An online version is planned, and future versions will simulate the electric power grid, water and wastewater treatment facilities, manufacturing facilities and other users of PLCs.
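The operator intuition Beyah describes, knowing the process well enough to spot readings that disagree with the physics, can be mechanised as a residual check between reported sensor values and a model's prediction. A toy sketch of the idea; the threshold and persistence figures are illustrative, and real ICS intrusion detection is considerably more involved:

```python
def spoof_check(reported, predicted, threshold=2.0, persistence=3):
    """Flag a possible man-in-the-middle attack when reported sensor
    values disagree with a physics-based prediction for several
    consecutive samples.

    reported, predicted: equal-length sequences of readings (e.g. pressure).
    Returns the index at which the alarm first raises, or None.
    """
    run = 0
    for i, (r, p) in enumerate(zip(reported, predicted)):
        # Count consecutive samples whose residual exceeds the threshold;
        # requiring persistence avoids alarming on one-off sensor noise.
        run = run + 1 if abs(r - p) > threshold else 0
        if run >= persistence:
            return i
    return None
```

A spoofed feed that holds the display flat while the modelled pressure climbs produces exactly the growing residual this check looks for.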
