This document discusses using stream computing approaches to better analyze large amounts of smart meter data from power grids. It proposes moving away from centralized data processing models towards more distributed event processing models. This would allow utilities to create real-time insights from operational data and improve demand response management. The document also explores using cloud platforms and complex event processing techniques to more efficiently handle smart meter data streams in real-time at large scales.
Analytics is being used in data centers, buildings, and municipalities to provide streamlined, proactive services. The deeper understanding of daily operations that analytics provides enables companies and governments to address problems that stretch across their systems, and makes it easier for them to simplify infrastructures and reduce costs.
The fact that a Meter Data Management (MDM) system is the single, secure repository for the millions of data points collected by an AMI makes it the logical solution for data analytics such as validation, editing and estimation that improve the accuracy of billing information. Yet, as a single-source system of record, the MDM also is the starting point for integration of meter-read data with other enterprise systems to improve real-time efficiency of network operations and business processes.
The MDM with meter modelling components and standardized connectivity can integrate with the utility geodatabase (GIS) and outage management system (OMS) to significantly streamline outage detection and restoration verification.
MDM integrated with the utility supervisory control and data acquisition (SCADA) system or distribution management system (DMS) allows comparison of information at substation/net-stations with aggregated meter data to detect potential theft or network loss during distribution. Similar aggregate comparison helps analyse power quality, identify demand trending and forecast demand. These network analysis capabilities empower accurate asset planning and the utility’s ability to meet demand without adding more capacity.
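The substation-versus-aggregate comparison described above can be pictured as a simple reconciliation check. The sketch below is illustrative only: the feeder names, the 5% threshold and the data layout are assumptions, not part of any particular MDM or SCADA product.

```python
from collections import defaultdict

def estimate_feeder_loss(substation_kwh, meter_reads, threshold_pct=5.0):
    """Compare energy delivered at each feeder head with the sum of
    downstream meter reads; flag feeders whose gap exceeds a threshold,
    which may indicate theft or technical loss during distribution."""
    totals = defaultdict(float)
    for feeder, kwh in meter_reads:
        totals[feeder] += kwh
    flagged = {}
    for feeder, delivered in substation_kwh.items():
        metered = totals.get(feeder, 0.0)
        loss_pct = 100.0 * (delivered - metered) / delivered if delivered else 0.0
        if loss_pct > threshold_pct:
            flagged[feeder] = round(loss_pct, 1)
    return flagged

# Feeder F2 shows a 12% gap between delivered and metered energy.
delivered = {"F1": 1000.0, "F2": 500.0}
reads = [("F1", 980.0), ("F1", 10.0), ("F2", 440.0)]
print(estimate_feeder_loss(delivered, reads))  # {'F2': 12.0}
```

A production system would of course correct for known technical losses and metering error before raising a theft alert; the point here is only the aggregate-comparison pattern itself.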
In all of these enterprise-level functions, MDM integration with the GIS provides valuable visualization that facilitates operator and analyst identification of areas of concern or opportunity.
The real-time network intelligence possible with such a powerful MDM solution can return substantial benefits to several utility operations and business processes — well beyond the initial-level billing accuracy improvement.
IJRET: International Journal of Research in Engineering and Technology is an international peer-reviewed online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together scientists, academicians, field engineers, scholars and students of related fields of Engineering and Technology.
Service oriented cloud architecture for improved performance of smart grid ap... (eSAT Journals)
Abstract: An effective and flexible computational platform is needed for the data coordination and processing associated with real-time operational and application services in the smart grid. A server environment where multiple applications are hosted by a common pool of virtualized server resources demands an open source structure for ensuring operational flexibility. In this paper, an open source architecture is proposed for real-time services involving data coordination and processing. The architecture enables secure and reliable exchange of information and transactions with users over the internet to support various services. Prioritizing applications by complexity enhances the efficiency of resource allocation in such situations. A priority-based scheduling algorithm is proposed for application-level performance management in the structure. An analytical model based on queuing theory is developed for evaluating the performance of the test bed. The implementation is done using an OpenStack cloud, and the test results show a significant gain of 8% with the algorithm.
Index Terms: Service Oriented Architecture, Smart grid, Mean response time, OpenStack, Queuing model
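The priority-based scheduling idea in this abstract can be illustrated with a minimal sketch. The complexity scores and job names below are invented for illustration; the paper's actual algorithm and queuing model are not reproduced here.

```python
import heapq

class PriorityScheduler:
    """Dispatch application requests so that higher-priority (here,
    lower-complexity) jobs are served first, as a sketch of
    priority-based resource allocation; ties break by arrival order."""
    def __init__(self):
        self._heap = []
        self._seq = 0  # monotonic counter for FIFO tie-breaking

    def submit(self, job, complexity):
        heapq.heappush(self._heap, (complexity, self._seq, job))
        self._seq += 1

    def next_job(self):
        if not self._heap:
            return None
        return heapq.heappop(self._heap)[2]

sched = PriorityScheduler()
sched.submit("billing-batch", complexity=9)
sched.submit("outage-alert", complexity=1)
sched.submit("meter-read", complexity=3)
print(sched.next_job())  # outage-alert
```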
This checklist explores some fundamental aspects of the data architecture necessary for IoT success. It will examine what is required to enable an environment that can rapidly adapt to the dynamic nature of massive numbers of connected sensors and other end-point devices, communication and data streaming, ingestion and analysis, and deployment of developed analytics models for automated decision making.
ESB PLATFORM INTEGRATING KNIME DATA MINING TOOL ORIENTED ON INDUSTRY 4.0 BASE... (ijaia)
This paper discusses results from an industrial project on integrating data mining tools into an Enterprise Service Bus (ESB) platform. WSO2 ESB has been implemented for data transactions and to interface a client web service connected to a KNIME workflow acting as a flexible data mining engine. Two tests were performed to validate the implementation: the first concerns the data management of two relational database management systems (RDBMS) merged into one database, whose data were processed by the KNIME dashboard statistical tool, thus proving the data transfer of the prototype system; the second concerns a simulation of sensor data from two distinct production lines connected to the same ESB. Specifically, in the second example a practical case was developed by processing the temperatures of two milk production lines with a Multilayer Perceptron (MLP) neural network and providing information for predictive maintenance. The prototype platform is suitable for data automation and Internet of Things (IoT) scenarios related to Industry 4.0, and for innovative hybrid systems embedding different hardware and software technologies integrated with the ESB, the data mining engine and client web services.
Real Time Dynamics Monitoring System (RTDMS): Phasor Applications for the Co... (Power System Operation)
The electric power grid in the US has evolved from a vertically integrated system to a mixture of regulated and deregulated competitive market systems. Grid oversight is transitioning from local utilities to an assortment of transmission companies, regional Independent System Operators (ISOs) and Regional Transmission Organizations (RTOs). Regulatory and economic pressures have caused new transmission construction to lag the growth in demand. These forces have increased pressure on electricity markets and caused operators to maximize the utilization of the system. The result is an operating environment where operators are faced with quickly changing and previously unseen power flow patterns and operational conditions, with limited information available for real-time operation and decision-making. Furthermore, the aging
Decision Making Framework in e-Business Cloud Environment Using Software Metr... (ijitjournal)
Cloud computing is among the most important technologies in the IT industry, enabling providers to offer access to their system and application services on a pay-per-use basis. As a result, several enterprises including Facebook, Microsoft, Google and Amazon have started offering such services to their clients. Software quality is critical to market competition. This paper presents a hybrid framework based on the goal/question/metric paradigm to evaluate the quality and effectiveness of previous software products at the project, product and organization levels in a cloud computing environment. The approach supports decision making at these levels using neural networks and three families of metrics: project metrics, product metrics and organization metrics.
ENERGY EFFICIENT COMPUTING FOR SMART PHONES IN CLOUD ASSISTED ENVIRONMENT (IJCNCJournal)
In recent years the use of smartphones has increased enormously, and they have become a pervasive part of everyday life. Smartphones can support an immense range of complex and intensive applications, which results in limited battery capacity and reduced performance. Mobile cloud computing is an emerging paradigm that integrates the features of cloud computing and mobile computing to overcome the constraints of mobile devices. It employs computational offloading, which migrates computations from mobile devices to remote servers. In this paper, a novel model is proposed for dynamic task offloading to achieve energy optimization and better performance for mobile applications in the cloud environment. The paper proposes an optimal offloading algorithm that introduces new criteria, such as benchmarking, for the offloading decision. It also supports partitioning, dividing a computing problem into sub-problems that can be executed in parallel on the mobile device and the cloud. Performance evaluation results show that the proposed model can reduce energy consumption by around 20% to 53% for low-complexity problems and by up to 98% for high-complexity problems.
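The offloading decision described here can be sketched with a common textbook energy model: execute locally when transmitting the task's input would cost more radio energy than the CPU would spend computing it. The parameter values below are illustrative assumptions, not measurements from the paper.

```python
def should_offload(cycles, data_bits, cpu_hz, cpu_watts,
                   uplink_bps, radio_watts):
    """Simple energy comparison for computational offloading:
    offload when sending the input data costs less energy than
    executing the task locally (a generic model, not the paper's
    actual algorithm)."""
    local_energy = cpu_watts * (cycles / cpu_hz)           # joules to compute locally
    offload_energy = radio_watts * (data_bits / uplink_bps)  # joules to transmit input
    return offload_energy < local_energy

# A compute-heavy task with a small input favours offloading.
print(should_offload(cycles=5e9, data_bits=1e6, cpu_hz=1e9,
                     cpu_watts=0.9, uplink_bps=5e6, radio_watts=1.3))  # True
```

Real offloading frameworks also weigh cloud execution time, idle power while waiting, and network variability; this sketch captures only the core trade-off.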
Delivering IT as a Utility: A Systematic Review (ijfcstjournal)
Utility computing has facilitated the creation of new markets, making it possible to realize the long-held dream of delivering IT as a utility. Even though utility computing is in its nascent stage today, its proponents envisage that it will become a commodity business in the coming years, with utility service providers meeting all the IT requests of companies. This paper takes a cross-sectional view of the emergence of utility computing along with the different requirements needed to realize the utility model. It also surveys current trends in utility computing, highlighting diverse architecture models aligned towards delivering IT as a utility. Different resource management systems for proficient allocation of resources are listed, together with the various resource scheduling and pricing strategies they use. Further, generic key perspectives closely related to the concept of delivering IT as a utility are reviewed, citing Grid and Cloud Computing as the contenders for future enhancements of this technology.
This paper presents a study of an Insurance Management System. The developed system manages all information regarding the insured and the policies offered by life insurance companies. It includes an integrated voice-enabled appointment scheduler that alerts an agent to his daily activities, along with features such as a smart data backup system, a provisioning system, policy records and commission reports. The application creates proposal and policy entries and is helpful for agents. It is designed to offer easy access to all records, provide better maintainability, and enable the user to make the required modifications as and when necessary. Execution of this project enables the user to seek, use and manipulate the records pertaining to every client.
Stream Processing Environmental Applications in Jordan Valley (CSCJournals)
Database system architectures have gone through innovative changes, especially the unification of algorithms and data via the integration of programming languages with the database system. Such innovation is needed for stream-based applications, since they have different requirements from conventional data processing systems. For example, the monitoring component requires the query processing system to detect user-defined events in a timely manner, as in a real-time monitoring system. Furthermore, stream processing fits a large class of new applications for which conventional DBMSs fall short, since many stream-oriented systems are inherently geographically distributed, and distribution offers scalable load management and higher availability. This paper presents statistical information about meteorological data, such as weather, soil and evapotranspiration measurements, collected by weather stations distributed across different locations in the Jordan Valley. In addition, it shows the importance of stream processing in some real-life applications, and how database systems can help researchers build prototypes that can be implemented and used in a continuous monitoring system.
Implementing Oracle Utility Meter Data Management for Power Consumption (IJERDJOURNAL)
ABSTRACT: In today's digital, mobile world, there is a pressing need to streamline and increase the efficiency of business processes such as data collection, measurement, automatic validation, editing and estimation of measurement data, analysis, dashboards for forecasting, and just-in-time end-user accessibility. This paper follows two methodologies: CEMLI, an extensive framework for developing and implementing Oracle customizations, and OUM, a business-process and use-case driven method that covers products, tools, technologies and documentation. The paper focuses on analytical data and system automation functionality, along with prototype design. For this, analysts and administrators collect and define calculation rules for data collection and measurement, deployment methods, dashboards and security features. The paper provides an understanding of cloud technologies and their features, such as software as a service (SaaS), deployment methods, security, and the ability to reduce overhead cost and downtime and automate business processes with 360-degree review and analysis. The approach consolidates data in one system, with volumes of analog and interval data, which facilitates effective programs and offerings for new customers. It also maximizes return on investment and protects revenue through comprehensive exception management.
IoT / Data press review, 26/03/2017 (Romain Bochet)
Contents:
- From the Edge To the Enterprise
- The Internet of Energy: Smart Sockets
- Google's big data calculates US rooftop solar potential
- Energy management: Oracle Utilities launches smart grid and IoT device management solution in the cloud
- Are vehicles the mobile sensor beds of the future?
Preparing for the Future: How Asset Management Will Evolve in the Age of Smar... (Schneider Electric)
Most utilities struggle to organize information about their distribution network assets. Operations, engineering, accounting, and other business functions all use different tools and systems, forcing grid operators to synchronize separate databases. This paper presents an improved approach to managing grid assets by establishing a ‘single source of the truth,’ eliminating special-purpose databases, utilizing spatial databases, and incorporating a workflow management tool to support database updates.
Advanced Automated Approach for Interconnected Power System Congestion Forecast (Power System Operation)
System operators need a solution that will allow them to keep the electrical grid secure in spite of frequent changes in network loadings. The day-ahead congestion forecast (DACF) is a part of the congestion management process that is becoming more and more important. This paper describes an approach to automating the DACF for an interconnected power system network. Using existing industrial tools and a workflow automation system, the congestion forecast runs in fully automatic mode, significantly reducing the need for specialist resources in the operational congestion management process. The realisation of this advanced automated approach provides a quick, efficient and cost-effective method for energy management that could easily be adopted by transmission system operators.
This presentation examines how AMI data, its collection, and the creation of tools to use it have dramatically changed, and continue to change, metering operations. We will look at some of the challenges we face as we learn how to do business most effectively with this information and these tools.
Redefining Smart Grid Architectural Thinking Using Stream Computing (Cognizant)
Using stream computing, power utilities can capture and analyze data generated by smart meters to achieve new thresholds of performance, while building better consumer relationships.
Cognizant 20-20 Insights

Executive Summary
After an extended pilot phase, smart meters have moved into the mainstream for measuring the performance of a multiplicity of business functions across the power utilities industry. Moving forward, the next objective is to create new ways of handling large data sets for constructing actionable responses to smart-meter-generated data, particularly for processes such as validation, estimation and editing, demand response and load management.

As smart meters proliferate across power grids, energy utilities are dealing with huge packets of data coursing through their IT networks. More and more granular data holds the promise of enabling faster and more informed decision making that drives operational improvements and, perhaps, enables consumers to better manage their own power consumption. To get there, however, utilities must first conquer growing network latency challenges caused not only by the huge profusion of smart-meter-generated data but also by processing inefficiencies created by their dependence on more centralized models.

Forward-thinking utilities need more distributed and virtual complex event processing models that transform real-time operational data into applied insights. Creating real-time operational knowledge can drive better demand response management, improve quality of service, and preempt fraud and service outages before they inflict reputational damage. Rethinking their basic information architecture will help utilities transform their power grids into adaptive and intelligent infrastructures that inform continuous improvements in operational efficiency and business effectiveness.
This white paper explores the challenges and benefits of Smart Grid creation and offers concrete thinking on new architectural approaches built on emerging software standards that more effectively leverage established forms of stream computing.[1] It examines new thinking on ways to capture and analyze data generated by smart meters that can help power utilities achieve new thresholds of performance over the near- and long-term, while building better relationships with consumers. We examine how stream data[2] aids usage forecasts (predicted by converting historic data coupled with real-time events into operational KPIs) and identifies anomalies and patterns in an ever-changing, high-transaction environment. In our view, when operational data is transported on a pervasive communication infrastructure (and coupled with two-way communication between utilities and consumers), data integration challenges can be overcome, setting the stage for a brighter and more energy-efficient future.
Using Cloud Platforms for Smart Meter Infrastructure
One way to unlock the data treasure trove enabled by smart meters is by tapping virtual data processing infrastructure delivered via cloud computing. Clouds offer the advantages of scalable and elastic resources to build software infrastructure that supports such dynamic, always-on applications. But the unique needs of energy informatics applications also highlight the challenges of using cloud platforms, such as the need to support efficient and reliable streaming, low-latency scheduling and scale-out, as well as effective data sharing.

cognizant 20-20 insights | june 2011
Cloud platforms are an intrinsic component in creating a software architecture to drive more effective use of Smart Grid applications. The primary reason: cloud data centers can accommodate the large-scale data interactions that take place on Smart Grids and are better architected than centralized systems to process the huge, persistent flows of data generated across the utility value chain. Figure 1 shows how this might work within a power utilities company.
The computational demand for demand-response applications will be a function of the energy deficit between supply and demand, which typically oscillates based on the time of day and weather conditions. This translates into a need for compute-intensive, low-latency response at midday and limited analysis in off-peak evening hours. The elastic nature of cloud resources makes it possible for utilities to avoid costly capital investment for their peak computation needs, and it enables information sharing on real-time energy usage and power pricing. As Figure 1 shows, Smart Grid applications that span smart meters (distributed at the consumer level), cloud platforms (for data integration by service providers) and clusters (at energy utilities) introduce systems heterogeneity, which efficient streaming is positioned to rationalize.
The need to perform complex processing with minimal latency over large volumes of data has led to the evolution of various data processing paradigms. Industry majors such as IBM, Oracle, Microsoft and SAP have developed event-oriented application development approaches to decrease the latency in processing large volumes of data. These efforts reveal the following:
• Since smart meters generate interval data that is time-series in nature, companies need efficient ways of processing queries incrementally and via in-memory technologies. They then need a way to apply the results to their emerging dynamic business process models.
• Since this buffered data is also stored offline for static analysis, mining, tracing and back-testing, companies need a means of managing and accessing these stores efficiently.
As Smart Grids proliferate, businesses require greater data availability rates. Companies can no longer afford to collect all time-series data, load it into a database and then build database indexes for query efficiency. Instead, businesses need
Figure 1: Consumers and Smart Meters: Interactions on a Cloud Stream. Power consumption data streams from residential and commercial consumers, weather data and power production data feed a cloud-hosted pipeline (historian, pattern recognition, hourly consumption prediction), which returns active feedback of pricing and load curtailment signals.
these queries to be continuously and incrementally computed and updated as new relevant data arrives from smart meter sources, avoiding the need to re-process existing data. Incremental computation is necessary to create a low-latency response to continuously flowing time-series data.
Complex event processing (CEP) is a widely used technique in smart meter data processing, where data is continuously monitored, verified and acted upon, given ongoing and changing conditions. With this approach, data, including the event streams from multiple sources, is processed based on a declarative query language. Importantly, all of this is accomplished with near-zero latency.
Event-Driven Data Processing Challenges
The key attributes of complex event processing include:
• Express fundamental query logic: Incorporate windowed processing and time progress as a core component of query logic.
• Handle erroneous or delayed data: Either delay processing until it is guaranteed that no late-arriving events remain, which increases latency, or aggressively process events and produce tuples.[3]
• Extensibility: Given the complexity of meter data and event operations, custom-built streaming logic must be supported as libraries.
• Universal specification: The semantics of query logic need to be independent of when and how events are physically read and understood. Application time, rather than system time, is used to enable a universal timeline.
These attributes ensure that with complex event processing, query logic is kept generic regarding how events are read and how their output is interpreted. Tuples should follow universal time, which can be read and processed on any system.
Performance Implications
In-stream processing does not write data back to disk for later processing; instead, it operates on internal state held in main memory. With smart meter data, an event queue fills to capacity once the arrival rate exceeds the processing capability of the system. The metrics used to manage the data stream are latency, throughput, correctness and memory usage.
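The queue-saturation behavior described above can be simulated with a bounded in-memory queue (an illustrative sketch with hypothetical names, not a production design): once arrivals outpace processing, new events are rejected and the correctness/memory trade-off becomes visible.

```python
from collections import deque


class BoundedEventQueue:
    """In-memory event queue that fills once arrivals outpace processing."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.queue = deque()
        self.dropped = 0  # a correctness metric: events lost under pressure

    def offer(self, event):
        """Accept an event if there is room; otherwise count it as dropped."""
        if len(self.queue) >= self.capacity:
            self.dropped += 1  # arrival rate > processing rate
            return False
        self.queue.append(event)
        return True

    def poll(self):
        """Hand the oldest buffered event to the processing stage."""
        return self.queue.popleft() if self.queue else None


q = BoundedEventQueue(capacity=2)
results = [q.offer(e) for e in ("e1", "e2", "e3")]
print(results, q.dropped)  # [True, True, False] 1
```

Tracking `dropped` alongside queue depth is one simple way to observe the latency/correctness/memory tension the metrics above describe.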
Ease of Management
To effectively deploy smart meters and the data they generate, a number of factors need to be addressed, including query composability and ease of deployment over a variety of environments, such as single servers and clusters. Query composability requires the ability to "publish" query results, as well as the ability for a Continuous Query (CQ) to consume the results of existing CQs and streams.
Typical meter streaming queries entail rules such as:
• Present the top three values every 10 minutes.
• Compute a running average of each sensor value over the last 20 seconds.
• Filter out sensor readings taken while the device was in a maintenance period.
• Detect when event "A" was followed by event "B" within three minutes.
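The second rule above (a running average over the last 20 seconds) could be expressed, in plain Python rather than a CEP query language, as a time-based sliding window. This is a sketch under the assumption that readings arrive in timestamp order; the class and parameter names are illustrative.

```python
from collections import deque


class SlidingWindowAverage:
    """Time-based sliding window: average of readings from the last `span` seconds."""

    def __init__(self, span_seconds=20.0):
        self.span = span_seconds
        self.window = deque()  # (timestamp, value) pairs, oldest first

    def add(self, timestamp, value):
        self.window.append((timestamp, value))
        # Evict readings older than the window span.
        while self.window and self.window[0][0] <= timestamp - self.span:
            self.window.popleft()
        return sum(v for _, v in self.window) / len(self.window)


avg = SlidingWindowAverage(span_seconds=20.0)
avg.add(0.0, 10.0)
avg.add(10.0, 20.0)
result = avg.add(25.0, 30.0)  # the reading at t=0 has aged out of the window
print(result)  # 25.0
```

A CEP engine evaluates the equivalent declarative query continuously, so the application never issues the computation explicitly.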
OSIsoft's PI System provides power utilities with the leading operational data management infrastructure for Smart Grid components that encompass power generation, transmission and distribution. The software provides capabilities for energy management, condition-based maintenance, operational performance monitoring, curtailment programs, renewable energy monitoring and phasor monitoring of transmission lines, among other functionalities.
OSIsoft MDUS integrates a utility's meter system with SAP's AMI Integration for Utilities to perform tasks such as billing, and it provides the ability to integrate meter data with other operational data. It serves as a real-time data collector that is head-end system vendor-independent.
Integration of meter data into business systems such as billing requires data validation and other forms of aggregation. OSIsoft has chosen to leverage CEP to accomplish this task. CEP provides the scalability required by SAP AMI and utilizes a SQL-based language for defining the validation rules. OSIsoft uses Microsoft's StreamInsight CEP engine, which enables utilities to define and implement the required meter data validation. This puts an important facet of regulatory compliance requirements into the hands of the utility's IT and business specialists.
There are two ways streaming can be adopted in current meter data systems:
• Server-side streaming: The stream is processed against the (OSIsoft) PI snapshot, and the processed results are streamed back to the PI server (see Figure 2).
• Edge processing: In this model, the CQs are applied at the data source (at the PI interface level), and only the processed results are stored (see Figure 3).
Cloud and Adaptive Rate Control
The growing importance for utilities to process and analyze thousands of meter data streams suggests that they should consider the adoption of cloud platforms to achieve scalable, latency-sensitive stream processing. One approach to consider is adaptive rate control, the process of restricting the stream rate to meet accuracy requirements for Smart Grid applications. This approach consumes less bandwidth and computational overhead within the cloud for stream processing. Experiments with a Smart Grid stream processing pipeline, modeled using IBM InfoSphere Streams and deployed on a Eucalyptus[4] private cloud,[5] show 50% bandwidth savings resulting from adaptive stream rate control.
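A minimal sketch of the adaptive rate control idea, with hypothetical names and thresholds (not IBM's or anyone's implementation): the controller widens a meter's reporting interval while the utility's estimate of usage stays within an accuracy tolerance, and narrows it when the estimate drifts.

```python
def adjust_rate(current_interval_s, deviation, tolerance,
                min_interval_s=60, max_interval_s=900):
    """Adaptive rate control sketch.

    deviation: |estimated usage - actual usage| from the latest comparison.
    tolerance: the accuracy requirement for the Smart Grid application.
    Returns the next reporting interval in seconds: stream faster when
    accuracy suffers, back off to save bandwidth when it does not.
    """
    if deviation > tolerance:
        # Accuracy requirement violated: halve the interval (stream faster).
        return max(min_interval_s, current_interval_s // 2)
    # Within tolerance: double the interval (stream slower, save bandwidth).
    return min(max_interval_s, current_interval_s * 2)


print(adjust_rate(600, deviation=0.5, tolerance=0.2))  # 300 -- speed up
print(adjust_rate(600, deviation=0.1, tolerance=0.2))  # 900 -- back off
```

The doubling/halving policy here is arbitrary; the point is the feedback loop between observed accuracy and stream rate.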
Low-latency stream processing is a key component of the software architecture required to support demand-response applications. The stream processing system ingests smart meter data arriving from consumers and acts as a first responder to detect local and global power usage skews and to alert the utility operator. At 1KB per event generated each minute, a fleet of roughly 1.4 million meters will stream about 2TB of data each day. Processing such large-scale streams can be compute- and data-intensive; public or private cloud platforms provide a scalable and flexible infrastructure for building such Smart Grid applications.
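The ~2TB/day figure follows from simple arithmetic; the fleet size below is an assumption inferred from the cited volume, not a number from the source:

```python
# Back-of-envelope check of the stream volume cited above.
event_size_kb = 1          # ~1KB per event
events_per_day = 24 * 60   # one event per minute per meter
meters = 1_400_000         # assumed fleet size implied by the ~2TB/day figure

daily_bytes = event_size_kb * 1024 * events_per_day * meters
daily_tb = daily_bytes / 1024**4
print(round(daily_tb, 2))  # 1.88 -- roughly the 2TB/day cited
```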
However, computational and bandwidth constraints at the consumer and utility levels mean that power usage data streamed at static rates from smart meters to the utility can arrive either at too high a latency to detect usage skews in a timely manner or at too high a rate, computationally overwhelming the system. Smart meters connect to the utility over heterogeneous networks, ranging from low-bandwidth power line carriers at ~20Kbps, to ZigBee at ~250Kbps, to 3G cellular networks at ~2Mbps. Network bandwidth can thus be a scarce resource at the consumer end. Smart meter traffic can also be bursty, since data is sent independently, causing instantaneous bandwidth needs to spike.
In the case of high power demand, meters emit a large volume of information, which requires a throttle controller to respond to these events and control latency.
Applying InfoSphere Streams
IBM InfoSphere Streams is a stream processing system that continuously analyzes massive volumes of streaming data for business activity monitoring and active diagnostics. It consists of a runtime environment that contains stream instances running on one or more hosts. Within InfoSphere is the Stream Processing Application Declarative Engine (SPADE), a stream programming model (executed by the runtime environment) that supports stream data sources that continuously generate tuples containing typed attributes.
Figure 2: Server-side streaming. Data from a foreign device system reaches the PI Server through a PI Interface Node; queries (via .NET LINQ) run against the complex event processing engine (StreamInsight) through input and output adapters, and the processed results are streamed back to the PI Server.
Figure 3: Edge processing. The same StreamInsight CEP components (input/output adapters, .NET LINQ queries) run at the PI Interface Node, next to the data source, so that only processed results reach the PI Server.
Figure 5 shows smart meters on the public Internet that generate power usage data streams accessible over TCP sockets. Here, InfoSphere Streams runs on a cluster and does not support out-of-the-box deployment on a cloud platform. To instantiate a stream processing environment on a Eucalyptus private cloud, a customized VM image that supports InfoSphere Streams must be built. Communication to the stream instance is activated when the VM instances are online; this communication is initiated externally by a SPADE application started on a stream instance and configured with a list of named stream instances on specific hosts.
Each smart meter is a stream source whose tuples carry the identity of the smart meter, the power used within a time duration and the timestamps of the measurement interval. Additional metadata about the smart meter and consumer is part of the payload but is ignored for the purposes of this discussion. Each tuple is about 1KB in size. The pipeline first checks whether each individual power usage tuple reports usage that exceeds a constant threshold, Umax, defined by the utility.
Crossing this threshold triggers a critical notification to a utility manager. Next, a relative condition checks whether the user's consumption has increased by more than 25% over his or her previous consumption, which triggers a less critical notification. The pipeline then archives the tuple into a sink file and proceeds to compute a running sum of the daily usage by the consumer. Subsequently, the running average over a tumbling window is updated. These operations are performed for each smart meter stream (shaded in brown in Figure 4).
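The per-meter stage just described can be sketched in Python (an illustrative reconstruction with hypothetical names and an assumed example threshold, not the SPADE application itself):

```python
def process_tuple(meter_state, usage, u_max=5.0):
    """Per-meter stage: absolute threshold check, 25% relative check,
    running daily sum and archival. Returns any notifications raised."""
    notifications = []
    if usage > u_max:
        notifications.append("critical")  # exceeds the utility threshold Umax
    prev = meter_state.get("prev")
    if prev is not None and usage > 1.25 * prev:
        notifications.append("warning")   # >25% jump over previous reading
    meter_state["prev"] = usage
    meter_state["daily_sum"] = meter_state.get("daily_sum", 0.0) + usage
    meter_state.setdefault("archive", []).append(usage)  # sink-file stand-in
    return notifications


state = {}
print(process_tuple(state, 2.0))  # []
print(process_tuple(state, 3.0))  # ['warning'] -- 3.0 > 1.25 * 2.0
print(process_tuple(state, 6.0))  # ['critical', 'warning']
```

In the real pipeline each of these checks is a separate stream operator; collapsing them into one function is purely for illustration.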
Next, the pipeline aggregates smart meter tuples across all streams using a tumbling window to calculate the cumulative consumer energy usage within a 15-minute time window. This stream operator (shaded blue in Figure 4) calculates the total load on the utility. It can be used to alert the utility manager when, say, total consumption reaches 80%, 90% or more than 100% of available power capacity at the utility. Operators shown in dotted lines (Figure 4) are not part of the application logic; they form the adaptive throttling introduced next. This core model could be used in demand response management.
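The fleet-level aggregation stage can be sketched the same way (hypothetical names; the capacity figure is an assumed example): total the usage in a 15-minute tumbling window and compare it against available capacity.

```python
def utility_load_alerts(window_usage_kwh, capacity_kwh):
    """Fleet-level stage: total load in one 15-minute tumbling window,
    checked against the utility's available capacity thresholds."""
    total = sum(window_usage_kwh)      # aggregate across all meter streams
    ratio = total / capacity_kwh
    for threshold in (1.0, 0.9, 0.8):  # >100%, >90%, >80% of capacity
        if ratio > threshold:
            return total, f">{int(threshold * 100)}%"
    return total, None


total, alert = utility_load_alerts([30.0, 45.0, 10.0], capacity_kwh=100.0)
print(total, alert)  # 85.0 >80%
```

Checking the highest threshold first means the alert reported is always the most severe one crossed.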
SAP Event Insight
The emergence of smarter grids powered by stream computing has made clear the need for more robust processing at the enterprise systems level. These systems typically struggle to keep pace with high data volumes, a large number of heterogeneous and widely dispersed data sources, and changing data requirements. This is being resolved by enterprise software systems such as mySAP ERP, which have begun to adopt in-memory processing algorithms for this new architectural proposition. The result is that SAP can now deliver an event insight application that understands the impact of operational events in real time. In-memory processing not only brings just-in-time rhyme and reason to real-time business events; it also does so with significantly less effort and with reduced reporting, operational and opportunity costs, which can power competitive advantage.
Tracking Energy Consumption
Figure 4: A stream processing pipeline is used to continuously monitor energy usage; processing elements in dotted lines show the addition of throttle logic. Per-meter tuples (superscript = meter ID, subscript = time) flow through condition operators that compare usage against the absolute threshold Umax and against the consumer's running average, update a running daily sum and 15-minute averages per AMI and for the utility as a whole, archive tuples to a database/file, raise notifications, and increase or decrease the AMI stream rate based on network conditions.
Looking Down the Road
We see stream computing as a key element of the future of work that could be applied broadly across the power utilities industry. In our view, its deployment would minimize network latency and function as a key component of demand response management. Moreover, we are planning to investigate stream computing on the cloud platform. Our research will appraise the throughput of a stream processing system and its latency in processing each tuple as the stream rates adapt. This approach will help utilities that are adopting Smart Grids in their mainstream business with network optimization and intelligent processing, saving money by automating their demand response and load management processes. Standardizing these processes saves IT maintenance expense, freeing capital to be invested in other core business activities.
In a business context, this approach will help utilities with energy efficiency programs and grid management by providing a mechanism to redirect dollars saved by eliminating inefficient energy generation and distribution toward more effective asset management.
Architecture of Stream Processing and the Throttle Controller
Figure 5: Electric and gas AMI streams from industrial/commercial and residential buildings reach IBM InfoSphere Streams over TCP/IP as input streams; stream processing writes data files and responses, while a throttle controller issues control feedbacks that adapt the AMI stream rates.
Footnotes
[1] Stream computing is a high-performance computing approach that analyzes multiple data streams from many sources, live. Stream computing uses software algorithms to analyze data in real time, increasing speed and accuracy in data handling and analysis.
[2] Stream data is a sequence of digitally encoded coherent signals (packets of data) used to transmit or receive information.
[3] A tuple is an ordered set of energy data to be processed and is an effective way of representing data in in-stream computing.
[4] Eucalyptus is a software platform for the implementation of private cloud computing on computer clusters.
[5] Private clouds are internal clouds that, according to some vendors, emulate cloud computing on private networks. These (typically virtualization automation) products offer the ability to host applications or virtual machines on a company's own set of hosts. They provide the benefits of utility computing, such as shared hardware costs, the ability to recover from failure and the ability to scale up or down depending on demand.
References
“IBM Infosphere Streams Version 1.2.1: Programming Model and Language Reference,” IBM, Oct. 4,
2010, http://publib.boulder.ibm.com/infocenter/streams/v1r2/topic/com.ibm.swg.im.infosphere.streams.
product.doc/doc/IBMInfoSphereStreams-LangRef.pdf.
D. J. Abadi, Y. Ahmad, M. Balazinska, U. Cetintemel, M. Cherniack, J. H. Hwang, W. Lindner, A. Maskey,
A. Rasin, E. Ryvkina, N. Tatbul, Y. Xing and S. B. Zdonik, “The Design of the Borealis Stream Processing
Engine,” Proceedings of the Second Biennial Conference on Innovative Data Systems Research, pp
277-289, January 2005.
D. J. Abadi, D. Carney, U. Cetintemel, M. Cherniack, C. Convey, S. Lee, M. Stonebraker, N. Tatbul and S.
Zdonik. “Aurora: A New Model and Architecture for Data Stream Management,” The VLDB Journal, Vol
12, Issue 2, August 2003.
A. Arasu, S. Babu and J. Widom. “The CQL Continuous Query Language: Semantic Foundations and
Query Execution.” The VLDB Journal, Vol 15, Issue 2, June 2006.
A. M. Ayad, J. F. Naughton. “Static Optimization of Conjunctive Queries with Sliding Windows Over Infinite
Streams,” Proceedings of the International Conference on Management of Data, SIGMOD 2004, ACM.
C. Ballard, D. M. Farrell, M. Lee, P. D. Stone, S. Thibault and S. Tucker, “IBM InfoSphere Streams Harnessing
Data in Motion,” IBM, September 2010.
A. Biem, E. Bouillet, H. Feng, A. Ranganathan, A. Riabov, O. Verscheure, H. Koutsopoulos and C. Moran,
“IBM InfoSphere Streams for Scalable, Real-Time Intelligent Transportation Services,” Proceedings of
the International Conference on Management of Data, SIGMOD 2010, pp 1,093-1,104, ACM.
S. Chandrasekaran, O. Cooper, A. Deshpande, M. J. Franklin, J. M. Hellerstein, W. Hong, S. Krishnamurthy,
S. Madden, V. Raman, F. Reiss and M. A. Shah, “TelegraphCQ: Continuous Dataflow Processing for an
Uncertain World,” SIGMOD 2003, ACM.
StreamBase, http://www.streambase.com/
“Why IP is the Right Foundation for the Smart Grid,” Cisco Systems, Inc., January 2010.
“The Role of the Internet Protocol (IP) in AMI Networks for Smart Grid,” National Institute of Standards
and Technology, NIST PAP 01, Oct. 24, 2009.
D. Zinn, Q. Hart, B. Ludaescher and Y. Simmhann, “Streaming Satellite Data to Cloud Workflows for
On-Demand Computing of Environmental Products,” Workshop on Workflows in Support of Large-Scale
Science (WORKS), 2010.
A. Arasu, S. Babu and J. Widom, "CQL: A Language for Continuous Queries over Streams and Relations," Database Programming Languages, 9th International Workshop, DBPL 2003, Potsdam, Germany, Sept. 6-8, 2003.
“Pattern Detection with StreamInsight” Microsoft StreamInsight blog, Sept. 2, 2010, http://tinyurl.
com/2afzbhd
“InfoSphere Streams,” IBM, http://www.ibm.com/software/data/infosphere/streams