1) The document discusses how big data, analytics, and physics-based modeling can transform industrial sectors like power, manufacturing, and transportation by making machines more intelligent and efficient.
2) It argues that connecting millions of industrial machines to collect massive amounts of data, and applying advanced analytics, will improve productivity, optimize operations, and reduce costs across industries.
3) A key enabler is developing "software-defined machines" that can easily connect to the internet, run analytics apps in the cloud to become self-aware, and update capabilities without hardware changes.
What is big data in the context of energy & utilities, and how and where can utilities find value in the data? In this C-level presentation we discussed three prime areas: grid operations, smart metering, and asset & workforce management. A section on cognitive computing for utilities has been omitted from the presentation due to confidentiality, but it offers a mind-blowing perspective on how IBM Watson will help utilities plan and optimize their operations in the near future!
See more on http://www.ibmbigdatahub.com/industry/energy-utilities
AWEA Cognitive Analytics for Predictive Futures (SparkCognition)
Machine Learning helps make complex systems more efficient. By applying advanced Machine Learning techniques such as Cognitive Fingerprinting™, wind project operators can learn from collected data, detect regular patterns, and optimize their own operations.
Research has demonstrated the value of Machine Learning in delivering next generation analytics to improve safety, performance, and reliability in today’s modern wind turbines.
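As a minimal illustrative sketch (not SparkCognition's proprietary Cognitive Fingerprinting technique), learning a "regular pattern" from turbine sensor data can be as simple as fitting a per-sensor baseline and flagging readings that deviate sharply from it; all sensor values below are invented:

```python
from statistics import mean, stdev

def fit_baseline(readings):
    """Learn a simple per-sensor baseline (mean and spread) from history."""
    return mean(readings), stdev(readings)

def is_anomaly(value, baseline, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mu, sigma = baseline
    return abs(value - mu) > threshold * sigma

# Hypothetical gearbox-temperature history (deg C) from a turbine's SCADA feed
history = [58.1, 59.4, 57.8, 60.2, 58.9, 59.7, 58.5, 60.0, 59.1, 58.7]
baseline = fit_baseline(history)

print(is_anomaly(59.3, baseline))  # typical reading -> False
print(is_anomaly(75.0, baseline))  # overheating -> True
```

Production systems learn multivariate baselines across many sensors, but the core idea of "learn normal, alert on deviation" is the same.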
SMi Group is bringing a new masterclass training course to London this December, entitled Big Data for Utilities: combining and creating value from transactional, geospatial, and real-time domain information. Don't miss this must-attend course, presented in association with Alliander and SAP UK & Ireland.
From grid infrastructure analytics to consumer analytics, the true power of data is starting to be realized. Greentech Media co-founder and president Rick Thompson sets the stage for the day's presentations and panels.
Data Science for Energy Efficiency, Dmytro Mindra, Technology Stream (IT Arena)
Lviv IT Arena is a conference designed for programmers, designers, developers, top managers, investors, entrepreneurs, and startup founders. It takes place annually at the beginning of October at the Arena Lviv stadium in Lviv. In 2016 the conference gathered more than 1,800 participants and over 100 speakers from companies such as Microsoft, Philips, Twitter, Uber, and IBM. More details about the conference at itarena.lviv.ua.
Research results indicating steps and success factors for integrating operational and information technologies, particularly for organisations with critical asset infrastructure, such as power stations, that is controlled and managed remotely using technologies such as SCADA.
Data as the New Oil: Producing Value in the Oil and Gas Industry (VMware Tanzu)
Oil and gas exploration and production activities generate large amounts of data from sensors, logistics, business operations and more. Given the data volume, variety and velocity, gaining actionable and relevant insights from the data is challenging. Learn about these challenges and how to address them by leveraging big data technologies in this webinar.
During the webinar we will dive deep into approaches for predicting drilling equipment function and failure, a key step towards zero unplanned downtime. In the process of drilling wells, non-productive time due to drilling equipment failure can be expensive. We will highlight how the Pivotal Data Labs team uses big data technologies to build models for predicting drilling equipment function and failure. Models such as these can be used to build essential early warning systems to reduce costs and minimize unplanned downtime.
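As a minimal sketch of the kind of early-warning classifier described above (this is not Pivotal's actual model; the feature names and readings are invented), a nearest-centroid rule can separate healthy from failing equipment using labeled sensor history:

```python
from math import dist

def centroid(rows):
    """Average each feature column across a set of readings."""
    return [sum(col) / len(rows) for col in zip(*rows)]

def train(healthy, failing):
    """'Train' by computing one centroid per class."""
    return {"healthy": centroid(healthy), "failing": centroid(failing)}

def predict(model, features):
    """Label a new reading by its nearest class centroid."""
    return min(model, key=lambda label: dist(model[label], features))

# Invented (vibration g, torque kN·m) readings from drilling equipment
healthy = [(0.2, 10.1), (0.3, 9.8), (0.25, 10.4)]
failing = [(1.4, 14.2), (1.7, 13.8), (1.5, 14.9)]
model = train(healthy, failing)

print(predict(model, (0.28, 10.0)))  # -> healthy
print(predict(model, (1.6, 14.0)))   # -> failing
```

Real predictive-maintenance models use far richer features and algorithms, but the workflow (label historical sensor data, fit a model, score live readings) is the same.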
Panelist:
Rashmi Raghu, Senior Data Scientist, Pivotal
Hosted by:
Tim Matteson, Co-Founder -- Data Science Central
Video replay is available to watch here: http://youtu.be/dhT-tjHCr9E
Emergence of ITOA: An Evolution in IT Monitoring and Management (HCL Technologies)
IT operations analytics (ITOA) plays a key role by providing intelligence that makes business sense out of the real-time data generated by infrastructure components and applications.
Supercharging Smart Meter BIG DATA Analytics with Microsoft Azure Cloud - SRP ... (Mike Rossi)
Explosive growth of Smart Meter (SM) deployments has presented key infrastructure challenges across the utility industry. The huge volumes of smart meter data have led the industry to a tipping point that requires investments in modernizing existing data warehouses. Typical modernization efforts lead to huge capital expenditures for data warehouse appliances and storage. Sizing this new infrastructure is tricky and can lead to underutilized or poorly performing hardware.
The Cloud is the catalyst to solving these Big Data challenges.
Utilizing a Cloud architecture delivers huge benefits by:
Maximizing use of existing architecture
Minimizing new CapEx expenditures
Lowering overall storage costs
Enabling scale on demand
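The scale pressure behind these points is easy to verify: 15-minute interval metering produces 96 reads per meter per day, and cloud-side aggregation is one common way to tame storage growth. The numbers below are back-of-envelope illustration, not SRP's actual figures:

```python
# A year of 15-minute interval reads is 4 * 24 * 365 = 35,040 rows per meter;
# a million-meter deployment therefore produces ~35 billion rows per year,
# which is the data-warehouse pressure described above.
reads_per_meter_per_year = 4 * 24 * 365
print(reads_per_meter_per_year)              # 35040
print(reads_per_meter_per_year * 1_000_000)  # 35040000000

def daily_totals(interval_kwh):
    """Collapse each day's 96 interval reads into one daily total (96x fewer rows)."""
    assert len(interval_kwh) % 96 == 0
    return [sum(interval_kwh[i:i + 96]) for i in range(0, len(interval_kwh), 96)]

two_days = [0.25] * 192  # invented flat 0.25 kWh per 15-minute slot
print(daily_totals(two_days))  # [24.0, 24.0]
```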
It seems that everyone is talking about Big Data these days. As the Industrial Internet evolves and continues to feed the Big Data machine, companies are finding it increasingly critical to develop strategies for turning data into information and information into intelligence. There is certainly no shortage of technologies in the marketplace for working with the petabytes of data coming from within and outside of the enterprise.
This is the PowerPoint presentation used by our guest speaker Barbara (Barb) Kruetzkamp, IT Leader in Data Management at GE Aviation, to discuss approaches and frameworks for enhancing business intelligence capabilities by linking industrial and enterprise (internal) data. We also compared traditional vs. transformational IT execution models and discussed how to put data first.
Barb was born and raised in Cincinnati, Ohio. She attended Thomas More College in Kentucky, graduating with a B.A. in Computer Science and Business Administration. She built technical depth leading infrastructure architecture, then as a Chief Enterprise Architect at GE Corporate. Barb returned to GE Aviation in 2014 to lead the master data management initiative. Barb enjoys volunteering with the developmentally disabled, STEM and high school band students. She also likes to cook, jazzercise, and travel abroad. Her 3 kids keep life active and fun.
This survey was conducted in January 2016 among 400 U.S. facility leaders in establishments including data centers, commercial and industrial buildings, retail, healthcare, education, government and other building environments. Respondents have responsibility related to purchasing energy and technology solutions, and their biggest responsibilities included facility management and operations management. Facility managers are increasingly adapting their building maintenance strategies in response to the Internet of Things (IoT).
Machine learning's impact on utilities webinar (SparkCognition)
Navigant Research estimates that utility companies will spend almost $50 billion on asset management and grid monitoring technology by 2023. Today many organizations face budgetary challenges as they work to increase reliability, uptime, and safety within their facilities.
The industry is adapting to new technologies including utilization of advanced sensors and sensor fusion, edge devices, artificial intelligence, and machine learning to create the maintenance center of the future.
Bernie Cook, former Director of Maintenance and Diagnostics at Duke Energy and now VP at Woyshner Service Consulting, will join us to provide practical guidance and examples of how utilities can begin adopting these next-generation technologies within their facilities to drive significant reductions in maintenance costs.
Following Bernie, Stuart Gillen, Director of Business Development at SparkCognition, will give examples of how machine learning technologies are augmenting current practices that make maintenance engineers more efficient at predicting critical asset failure.
Join this webinar to learn about:
- Real examples of ways utilities are moving to more advanced monitoring and diagnostic capabilities and the technologies involved.
- How machine learning can improve equipment reliability and performance, and reduce operational and maintenance costs.
- How machine learning can augment or even supplement human subject matter experts by providing significant advance notice of asset performance issues.
A Technical Introduction to Big Data Analytics (Pethuru Raj, PhD)
This presentation details the sources of big data, the value of big data, what to do with it, and the platforms, infrastructures, and architectures for big data analytics.
Gain New Insights by Analyzing Machine Logs using Machine Data Analytics and BigInsights.
Half of Fortune 500 companies experience more than 80 hours of system downtime annually. Spread evenly over a year, that amounts to approximately 13 minutes every day. As a consumer, the thought of online bank operations being inaccessible so frequently is disturbing. As a business owner, when systems go down, all processes come to a stop. Work in progress is destroyed, and failure to meet SLAs and contractual obligations can result in expensive fees, adverse publicity, and loss of current and potential future customers. Ultimately, the inability to provide a reliable and stable system results in financial loss. While the failure of these systems is inevitable, the ability to predict failures in a timely manner and intercept them before they occur is now a requirement.
A possible solution to the problem can be found in the huge volumes of diagnostic big data generated at the hardware, firmware, middleware, application, storage, and management layers, indicating failures or errors. Machine analysis and understanding of this data is becoming an important part of debugging, performance analysis, root cause analysis, and business analysis. In addition to preventing outages, machine data analysis can also provide insights for fraud detection, customer retention, and other important use cases.
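A toy sketch of the idea (not BigInsights itself; the log lines and threshold are invented): scan machine logs for a burst of errors from one subsystem, the kind of signal an early-warning system would escalate before an outage.

```python
import re
from collections import Counter

LOG = """\
2016-03-01T10:00:01 INFO  app started
2016-03-01T10:00:05 WARN  disk latency high
2016-03-01T10:00:06 ERROR storage timeout
2016-03-01T10:00:07 ERROR storage timeout
2016-03-01T10:00:09 ERROR storage timeout
"""

# Count log levels across the window; a spike in ERRORs is the raw signal.
levels = Counter(re.findall(r"\b(INFO|WARN|ERROR)\b", LOG))
print(levels["ERROR"])  # 3

def error_burst(levels, threshold=3):
    """Crude outage predictor: too many ERRORs in the observation window."""
    return levels["ERROR"] >= threshold

print(error_burst(levels))  # True
```

Real machine-data analytics correlates levels, timestamps, and components across layers, but counting and thresholding anomalous events is the first building block.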
Why Your Data Science Architecture Should Include a Data Virtualization Tool ... (Denodo)
Watch full webinar here: https://bit.ly/35FUn32
Presented at CDAO New Zealand
Advanced data science techniques, like machine learning, have proven an extremely useful tool to derive valuable insights from existing data. Platforms like Spark, and complex libraries for R, Python, and Scala put advanced techniques at the fingertips of the data scientists.
However, most architecture laid out to enable data scientists miss two key challenges:
- Data scientists spend most of their time looking for the right data and massaging it into a usable format
- Results and algorithms created by data scientists often stay out of the reach of regular data analysts and business users
Watch this session on-demand to understand how data virtualization offers an alternative that addresses these issues and accelerates data acquisition and massaging, and to hear a customer story on the use of machine learning with data virtualization.
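The core idea of data virtualization can be sketched in a few lines (this toy is illustrative only, not Denodo's product; the source names and records are invented): expose one logical view over several sources, resolved at query time rather than copied into a warehouse.

```python
# Two "sources" standing in for separate backend systems (names invented).
crm = {101: {"name": "Acme Power"}, 102: {"name": "Grid Co"}}
billing = {101: {"balance": 1200.0}, 102: {"balance": 340.5}}

def logical_view(customer_id):
    """Resolve one logical record by federating both sources on demand.

    Nothing is copied ahead of time; each lookup hits the live sources,
    which is what spares data scientists the manual extract-and-massage step.
    """
    record = {"customer_id": customer_id}
    record.update(crm.get(customer_id, {}))
    record.update(billing.get(customer_id, {}))
    return record

print(logical_view(101))
# {'customer_id': 101, 'name': 'Acme Power', 'balance': 1200.0}
```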
Unlocking the Power of Data: Data Driven Product Engineering, Evren Eryurek, ... (Zinnov)
We live in a data-rich world - almost everything we do is being captured and stored somewhere. There are algorithms crunching the data every millisecond and conveying unknown and untapped information. At an enterprise level, data analytics provides us a 360-degree view of our customers, products and the business landscape to make effective, smart decisions. This presentation delves into how the traditional business philosophy of ‘proximity to customer’ will lose its significance and how data will drive product decisions.
Artificial Intelligence Application in Oil and Gas (SparkCognition)
Visit http://sparkcognition.com for more information.
To access and listen to the on-demand version of the webinar, go here:
http://sparkcognition.com/ai-oil-and-gas-webinar-video/
Learn how Artificial Intelligence and Machine Learning are being effectively applied in Oil & Gas right now, how they will become even more prevalent, and how they can impact your bottom line and transform your business.
We'll cover:
• Fundamentals of Artificial Intelligence and Machine Learning
• Understanding why Artificial Intelligence and Machine Learning are revolutionary in how they can help the Oil & Gas industry. This technology is already being used to prevent downhole tool failures and events like stuck pipe, pinpoint ideal drilling locations during exploration and discovery, predict pipeline pump failures, identify frack truck pump failures, and more.
• Real world examples of how other clients are using AI/ML today
How Data Virtualization Puts Machine Learning into Production (APAC) (Denodo)
Watch full webinar here: https://bit.ly/3mJJ4w9
Advanced data science techniques, like machine learning, have proven an extremely useful tool to derive valuable insights from existing data. Platforms like Spark, and complex libraries for R, Python and Scala put advanced techniques at the fingertips of the data scientists. However, these data scientists spend most of their time looking for the right data and massaging it into a usable format. Data virtualization offers a new alternative to address these issues in a more efficient and agile way.
Attend this session to learn how companies can use data virtualization to:
- Create a logical architecture to make all enterprise data available for advanced analytics exercise
- Accelerate data acquisition and massaging, providing the data scientist with a powerful tool to complement their practice
- Integrate popular tools from the data science ecosystem: Spark, Python, Zeppelin, Jupyter, etc.
Check out this presentation from Pentaho and ESRG to learn why product managers should understand Big Data and hear about real-life products that have been elevated with these innovative technologies.
Learn more in the brief that inspired the presentation, Product Innovation with Big Data: http://www.pentaho.com/resources/whitepaper/product-innovation-big-data
Digital technologies for improved performance in cognitive Production Plants (Mário Gamas)
Developing new technologies to realise cognitive production plants, with improved efficiency and sustainability, through smart and networked sensor technologies, intelligent handling and online evaluation of various forms of data streams, and new methods for self-organizing processes and process chains.
In Short: Go from Smart to Smarter (Cognitive).
Big Data is an emerging technology in Information Management that holds promising returns on investment, as it can provide advanced analytics capabilities. It is well suited for large enterprises, and when used properly, it can lead to breakthroughs in analytics, deriving information from data that was previously not possible.
However, a Big Data project cannot be approached using traditional IT system design and methods. Its success relies on teamwork and collaboration among petroleum engineering subject matter experts, senior IT professionals, and data scientists. To ensure that Big Data initiatives do not deliver poor results or disappoint, Big Data projects require significant preparation, which dramatically increases the chances of success.
This presentation provides practical information about how to get started and what to consider in your plan, and it gives useful tips and examples for planning and executing a Big Data project. At the end of the presentation, attendees will know what Big Data is, what it offers, how to plan such projects, what the roles and responsibilities are for the key project members, and how these projects should be implemented to benefit their organization.
Big Data analytics offers enterprises a chance to move beyond simply gathering data to analyzing, mining, and correlating results for insights that translate into business solutions.
2. What's this all about?
Industries that are all about data + IT see outsized productivity and performance gains
• Telecom, financial services, …
Making industrials all about data + IT will transform how the world works
• Power, water, aviation, rail, mining, oil & gas, manufacturing, …
And Big Data + Physics is the enabler
5. Cornerstone of IoT Transformation is Software-Defined Machines (SDMs)
(Spectrum: consumer, commercial, and industrial machines)
• Easily connect machines to the Internet
• Embed apps and analytics into machines and the cloud, making them intelligent and self-aware
• Change and update capabilities of machines and devices without changing hardware
• Deliver intelligence to users, providing continuously better outcomes
• Extend the Industrial Internet platform via APIs and an ecosystem
7. The Value to Customers is Huge
Efficiency and cost savings, new customer services, risk avoidance – 1% improvements cut $276B in waste across industries

Industry      Segment                       Type of savings                        Estimated value over 15 years
Aviation      Commercial                    1% fuel savings                        $30B
Power         Gas-fired generation          1% fuel savings                        $66B
Healthcare    System-wide                   1% reduction in system inefficiency    $63B
Rail          Freight                       1% reduction in system inefficiency    $27B
Oil and Gas   Exploration and development   1% reduction in capital expenditures   $90B

Note: Illustrative examples based on potential one percent savings applied across specific global industry sectors. Source: GE estimates
GESoftware.com | @GESoftware | #IndustrialInternet
8. 4 Big Data Forces shaping the Industrial Internet
1. Internet of Things – a living network of machines, data, and people
2. Intelligent, SW-defined machines – increasing system intelligence through embedded software
3. Big Data Analytics – transforming massive amounts of data into intelligence, generating data-driven insights, and enhancing asset performance
4. Physics + analytics – employing deep physics and engineering models to leap-frog what’s possible with data-driven techniques
9. Reference Architecture
A platform for the Industrial Internet must bridge OT and IT:
• Single record of asset
• Business process management
• Industrial big data management (industrial data lake)
• Event processing
• Analytics and modeling
• PaaS and SaaS
• Integration with ERP / CRM
• Device management: M2M, M2H, M2C
• Insight to action: maintenance, SW upgrades, machine control
• Mobility and collaboration
• Cyber-security and operational reliability
• Connecting any machine, any device
11. Two ways of seeing a data set* (and the world)
Computer Scientist: “get the knowledge locked in the data”
• The data set is a record of everything that happened, e.g.,
  – All customer transactions last month
  – All friendship links between members of a social networking site
• Goal is to find interesting patterns, rules, and/or associations.
Physical Scientist: “get the knowledge” about the underlying phenomenon
• The data set is a partial, and often very noisy, reflection of some underlying phenomenon, e.g.,
  – Emission spectra from stars
  – Battery voltage varying with current, time, and temperature
• Goal is better understanding or ability to predict aspects of that phenomenon, often through a mathematical model
For certain kinds of problems, there is immense power in the combination.
(*See D. Lambert, or R. Mahoney, e.g.)
12. Example: Statistical Translation

Regular Science approach – use of language is infinitely complex, but you can teach a computer all the rules and content.
• Employ language experts to codify rules, exceptions, vocabulary mappings, etc.
• Apply the transformation to the user’s query.
• Costly, hard to scale
• Can translate nearly any statement (but accuracy variable)
• In theory, could be better than a human.
• Too expensive and difficult to deploy comprehensively

Statistical (data-driven) approach – people say the same kind of things over and over, and somebody has already translated it.
• Gather and classify lots of translated docs (websites, UN, books, …)
• Identify match patterns
• Map to the user’s translation query.
• Incrementally low cost, highly scalable.
• Limited in scope to digitized docs that have been translated before
• Limited by the skill of human translators
• Will flop with innovative use of language (new poetry, …)
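A toy sketch of the statistical approach: instead of encoding grammar rules, memorize previously translated phrase pairs and reuse the most frequent translation. The tiny "corpus" and phrases below are invented for illustration.

```python
# Statistical translation in miniature: count how often each human
# translation was observed for a source phrase, then reuse the most
# frequent one. Coverage is limited to phrases seen before.
from collections import Counter, defaultdict

corpus = [  # (source phrase, observed human translation) -- invented
    ("guten tag", "good day"),
    ("guten tag", "good day"),
    ("guten tag", "hello"),
    ("wie geht es", "how are you"),
]

# Build the phrase table from the translated documents.
table = defaultdict(Counter)
for src, tgt in corpus:
    table[src][tgt] += 1

def translate(phrase):
    """Return the most frequently observed translation, if any."""
    candidates = table.get(phrase)
    if not candidates:
        return None  # limitation: never-translated phrases fail
    return candidates.most_common(1)[0][0]

print(translate("guten tag"))    # -> good day
print(translate("neue poesie"))  # -> None (innovative use of language)
```

Note how the two limitations from the slide fall out directly: quality is capped by the human translations in the corpus, and anything outside the corpus returns nothing at all.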
13. Three basic components of Industrial Data Science
Physics/engineering-based models
• Need much less data
• Powerful, but difficult to maintain and scale
Empirical, heuristic rules and insights
• Straightforward to understand
• Capture the accumulated knowledge of your experts
Data-driven techniques – machine learning, statistics, optimization, advanced visualization, …
• Often not enough data in the industrial domain
• Bias: limited to the regions of parameter space traversed in normal operation
• But easiest to maintain and scale
15. Industrial Example: improving rule-based systems
Many equipment operators have a system something like this, with rules derived from experience and intuition:
Low-latency operational data → rule sets implemented in an analytics engine → alerts
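A minimal sketch of such a rule system, with rule names, sensor fields, and thresholds invented for illustration:

```python
# Hand-written threshold rules, derived from operator experience,
# applied to one low-latency sensor reading at a time.
# All rule names, field names, and limits below are invented.

RULES = [
    # (rule name, sensor field, threshold) -- alert when value exceeds it
    ("high_temp", "bearing_temp_c", 95.0),
    ("high_vibration", "vibration_mm_s", 7.1),
]

def evaluate(reading):
    """Apply every rule to one sensor reading; return the fired alerts."""
    alerts = []
    for name, field, limit in RULES:
        if reading.get(field, 0.0) > limit:
            alerts.append(name)
    return alerts

reading = {"bearing_temp_c": 98.2, "vibration_mm_s": 3.4}
print(evaluate(reading))  # -> ['high_temp']
```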
16. Industrial Example: improving rule-based systems
Combine ML plus rule-based alerts with outcome data to produce better alerts:
Low-latency operational data → rule sets implemented in an analytics engine → alerts
Alerts + outcome data → pattern, sequence, and association mining, etc. → more actionable alerts
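One simple way to sketch the "combine alerts with outcome data" step: score each rule by how often its past alerts were followed by a confirmed failure, then rank new alerts by that score so the most actionable surface first. The alert history below is invented for illustration.

```python
# Mine outcome data to make rule-based alerts more actionable:
# per-rule precision = confirmed failures / alerts fired.
# The history records below are invented.

history = [  # (rule that fired, did a failure actually follow?)
    ("high_temp", True), ("high_temp", True), ("high_temp", False),
    ("high_vibration", False), ("high_vibration", False),
    ("high_vibration", True),
]

def rule_precision(history):
    """Fraction of each rule's past alerts confirmed by outcome data."""
    fired, confirmed = {}, {}
    for rule, failed in history:
        fired[rule] = fired.get(rule, 0) + 1
        confirmed[rule] = confirmed.get(rule, 0) + int(failed)
    return {r: confirmed[r] / fired[r] for r in fired}

scores = rule_precision(history)

# Rank incoming alerts so operators see the most actionable first.
incoming = ["high_vibration", "high_temp"]
ranked = sorted(incoming, key=lambda r: scores.get(r, 0.0), reverse=True)
print(ranked)  # -> ['high_temp', 'high_vibration']
```

In practice this precision score would be one feature among many fed to the pattern- and sequence-mining step the slide describes.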
17. Industrial Example: improving rule-based systems
Use ML and outcome data to refine and extend the rule base, providing yet further actionability and resulting in substantial improvements in operational outcomes:
Low-latency operational data → rule sets implemented in an analytics engine
Alerts + outcome data → recommendation engine → tune parameters of existing rules and create new rules → actionable recommendations
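A sketch of how a recommendation engine might tune the parameter of an existing rule: sweep candidate thresholds against labeled outcome data and recommend the one that best separates failures from normal operation. The samples and the notion of an original hand-picked limit are invented for illustration.

```python
# Tune an existing threshold rule against outcome data: pick the
# cut that classifies past outcomes most accurately.
# (Observed values, labels, and units below are invented.)

samples = [  # (observed bearing temperature, failure followed?)
    (88.0, False), (91.0, False), (93.0, True),
    (96.0, True), (99.0, True), (90.0, False),
]

def best_threshold(samples):
    """Return the candidate cut with the highest outcome accuracy."""
    candidates = sorted(v for v, _ in samples)
    def accuracy(t):
        # A sample is classified correctly when "value > t" matches
        # whether a failure actually followed.
        return sum((v > t) == failed for v, failed in samples) / len(samples)
    return max(candidates, key=accuracy)

print(best_threshold(samples))  # -> 91.0
```

The same loop extended over unused sensor fields is one way to "create new rules", not just retune old ones.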
18. 18
Another Industrial Example: use advanced physical
models to create new features for ML approaches
Sensor Data
Predicted Values
and Δs
Variety of Machine
Learning
Techniques
Outcome
data
Using as ML features the:
1. Deviations from
expected physics,
2. Inferred or hidden
parameter estimates
provides much richer and
effectively less noisy
data, resulting in much
stronger predictions and
models.
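A sketch of the physics-residual idea, echoing the battery-voltage example earlier: the deviation between the measured value and what a physics model predicts is often a far less noisy ML feature than the raw signal. The linear battery model and all constants below are invented toys.

```python
# Physics-based feature engineering: feed the ML model the residual
# (measured minus physics-predicted) instead of only the raw reading.
# The model form and every constant here are invented for illustration.

def predicted_voltage(current_a, temp_c):
    """Toy physics model: open-circuit voltage minus the IR drop,
    with a small linear temperature correction."""
    v_oc, r_internal, k_temp = 4.0, 0.05, 0.002
    return v_oc - r_internal * current_a + k_temp * (temp_c - 25.0)

def features(measured_v, current_a, temp_c):
    """Raw reading plus the physics residual, for a downstream ML model."""
    expected = predicted_voltage(current_a, temp_c)
    return {
        "measured_v": measured_v,
        "residual_v": measured_v - expected,  # deviation from expected physics
    }

f = features(measured_v=3.80, current_a=2.0, temp_c=25.0)
print(round(f["residual_v"], 3))  # -0.1: a larger drop than physics expects
```

A persistent negative residual like this is exactly the kind of "hidden parameter" signal (e.g. rising internal resistance) that is invisible in the raw voltage trace.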
19. Climbing up the value chain toward Condition-based Performance Management and Business Optimization
Value ladder: fix it when it breaks → Condition-based Maintenance (“now”) → Predictive Maintenance (“future”) → prescriptive recommendations (multi-channel) → fleet/operation-wide optimization, with trade-offs to optimize business performance and new levers for optimization across the operation or business.
Corresponding maintenance drivers: time-driven → work-driven → model-driven.
Need:
• Earlier detection
• Root cause
• Scaling to more equipment types and instances
“Equipment health is not a given, but a variable”
20. Capability / Impact Ramp
(Chart: capability complexity vs. data completeness, breadth, and quality)
Basic reporting → advanced reporting → alerts → data augmentation → anomaly detection → rules → predictive analytics → prescriptive analytics → operational optimization
The payoff climbs from highly-actionable management info, to high-value guidance for operations, to sophisticated, optimized management of the business.
21. Broad range of deep Data Science capabilities needed
Industrial Data Science draws on:
• Optimization & Management Science – optimizes the design and operations of complex business and physical systems, extracting more value at lower risk
• Applied Statistics – innovates new ways of performing reliability analysis, statistical modeling of large data, biomarker discovery, and financial risk management
• Machine Learning – develops scalable, cross-disciplinary machine learning and predictive capabilities to derive actionable insights from big data
• Computer Vision – focuses on developing algorithms and systems for real-time video analysis
• Image Analytics – research in algorithms and software systems that analyze and understand images to produce actionable insights
• Sensor & Signal Analytics – models complex system and noise processes to detect subtle deviations and estimate critical system parameters
• Physics & expert-based Modeling – employs deep physical and engineering understanding of equipment and processes to generate normative models
• Knowledge Discovery – delivers data- and knowledge-driven decision support via semantic technologies and big data systems research
22. “Industrial Data Science”
What is it?
① Outcome-oriented application of mathematical and physics-based analysis models to real-world problems in industrial operations.
② Tools and processes needed to do that continually at scale.
Why do we do it?
Improve the performance of industrial operations, e.g.,
• Higher equipment uptime and utilization
• Lower maintenance/shop costs, longer component life
• Fleet-level optimization trade-offs
• Business optimization (linking to financial and customer data)
• Service / contract management
What’s needed?
Combination of:
• Physical and expert modeling experience and depth
• Installed base of industrial equipment and data
• Big Data, Machine Learning, and statistical capabilities