Visit http://sparkcognition.com for more information.
To access and listen to the on-demand version of the webinar, go here:
http://sparkcognition.com/ai-oil-and-gas-webinar-video/
Learn how Artificial Intelligence and Machine Learning are being effectively applied in Oil & Gas right now, how they will become even more prevalent, and how they can impact your bottom line and transform your business.
We'll cover:
• Fundamentals of Artificial Intelligence and Machine Learning
• Understanding of why Artificial Intelligence and Machine Learning are revolutionary in how they can help the Oil & Gas industry. This technology is already being used to prevent downhole tool failures and events like stuck pipe, pinpoint ideal drilling locations during exploration and discovery, predict pipeline pump failures, identify frack truck pump failures, and more.
• Real world examples of how other clients are using AI/ML today
Artificial Intelligence has caught the attention of the oil and gas industry. This presentation sets out 4 use cases that are live in oil and gas as of October 2017. The speaker notes include specific solution names and vendors, including Stream Systems, Veerum, Osprey Analytics and IBM.
Data as the New Oil: Producing Value in the Oil and Gas Industry -- VMware Tanzu
Oil and gas exploration and production activities generate large amounts of data from sensors, logistics, business operations and more. Given the data volume, variety and velocity, gaining actionable and relevant insights from the data is challenging. Learn about these challenges and how to address them by leveraging big data technologies in this webinar.
During the webinar we will dive deep into approaches for predicting drilling equipment function and failure, a key step towards zero unplanned downtime. In the process of drilling wells, non-productive time due to drilling equipment failure can be expensive. We will highlight how the Pivotal Data Labs team uses big data technologies to build models for predicting drilling equipment function and failure. Models such as these can be used to build essential early warning systems to reduce costs and minimize unplanned downtime.
Panelist:
Rashmi Raghu, Senior Data Scientist, Pivotal
Hosted by:
Tim Matteson, Co-Founder -- Data Science Central
Video replay is available to watch here: http://youtu.be/dhT-tjHCr9E
IIoT + Predictive Analytics: Solving for Disruption in Oil & Gas and Energy &... -- DataWorks Summit
The electric grid has evolved from linear generation and delivery to a complex mix of renewables, prosumer-generated electricity, and electric vehicles (EVs). Smart meters are generating loads of data. As a result, traditional forecasting models and technologies can no longer adequately predict supply and demand. Extreme weather, an aging infrastructure, and the burgeoning worldwide population are also contributing to increased outage frequency.
In oil and gas, commodity pricing pressures, resulting workforce reductions, and the need to reduce failures, automate workflows, and increase operational efficiencies are driving operators to shift analytics initiatives to advanced data-driven applications to complement physics-based tools.
While sensored equipment and legacy surveillance applications are generating massive amounts of data, just 2% is understood and being leveraged. Operationalizing it along with external datasets enables a shift from time-based to condition-based maintenance, better forecasting and dramatic reductions in unplanned downtime.
The session includes plenty of real-world anecdotes. For example, how an electric power holding company reduced the time it took to investigate energy theft from six months to less than one hour, producing theft leads in minutes and an expected multi-million dollar ROI. How a global offshore contract drilling services provider implemented an open source IIoT solution across its fleet of assets in less than a year, enabling remote monitoring, predictive analytics and maintenance.
Key takeaways:
• How are new processes for data collection, storage and democratization making it accessible and usable at scale?
• Beyond time series data, what other data types are important to assess?
• What advantage are open source technologies providing to enterprises deploying IIoT?
• Why is collaboration important across industrial verticals to increase IIoT open source adoption?
Speaker
Kenneth Smith, General Manager, Energy, Hortonworks
“The Digital Oilfield”: Using IoT to reduce costs in an era of decreasing oi... -- Karthikeyan Rajamanickam
Executive Summary:
• We decided to create this point of view after seeing many abstract presentations and esoteric concepts on Digital Oilfield, IoT, Big Data and Analytics.
• This is our attempt to bring a practical implementation view to IoT by combining Digital Oilfield and IoT.
• Here, we also envisage sharing our IoT experience and lessons learnt in implementing Digital Oilfield solutions around IoT.
• The following comprise our fundamental business case for Finance:
- Production Forecast
- Fault Compartments
- Well Location Optimization
PENNGLEN FIELD Development Plan (GULF of MEXICO) -- PaulOkafor6
An FDP designed to define the development scheme that optimizes hydrocarbon recovery at minimal cost for project sanction.
This was designed by MSc Students from the Institute of Petroleum Studies, UNIPORT/ IFP School, France
The breadth and depth of Azure products that fall under the AI and ML umbrella can be difficult to follow. In this presentation I’ll first define exactly what AI, ML, and deep learning are, and then go over the various Microsoft AI and ML products and their use cases.
Analytics play a critical role in supporting strategic business initiatives. Yet despite the value analytics professionals provide to these initiatives, many executives question the economic return of analytics, as well as of data lakes, machine learning, master data management, and the like.
Technology professionals need to calculate and present business value in terms business executives can understand. Unfortunately, most IT professionals lack the knowledge required to develop comprehensive cost-benefit analyses and return on investment (ROI) measurements.
This session provides a framework to help technology professionals research, measure, and present the economic value of a proposed or existing analytics initiative, no matter what form the business benefit takes. The session will provide practical advice on how to calculate ROI, the formulas involved, and how to collect the necessary information.
Introduction; alphabetical principles of Petroleum Geology; classification of fossil fuels as hydrocarbon resources and hydrocarbon-producing resources; oil/gas generation and diagenesis; types of oil & natural gas plays; occurrence of oil and gas; umbrella terms given to petroleum: conventional oil and unconventional oil; associated gas and non-associated gas; in-situ oil and gas resources versus supply; natural gas resource and quality types; natural gas; oil and gas process; oil/gas field life cycle; oil field pyramid; giant oil fields
What’s shale gas?
Finding the sweet spot?
How is shale gas formed?
How to produce shale gas?
Why we fracture shale gas wells
Shale gas; the shale gas revolution; main shale gas reservoir characterization; producing shale gas; shale gas plays; roadmap to shale gas; shale gas evaluation; shale gas production cost curve
Be a part of the modern world by integrating digital technologies into Oil & Gas operations. Doing so will not only keep you digitally connected but also reduce the cost and risk involved in day-to-day industry activities. Download our free whitepaper: https://www.bluemailmedia.com/oil-gas-a-definitive-path-towards-digitalization.php
Oil 101 - A Free Introduction to Oil and Gas
Introduction to Drilling
Today we’re going to talk about the Drilling function of Upstream. If you missed our previous podcasts on Upstream Fundamentals and Exploration, be sure to go check them out. We’ll put the relevant links in the program notes.
In this drilling overview we touch on the wildcat well and current drilling capabilities, offer more insight into the role of oilfield services, and give some historical perspective on this segment of upstream oil and gas.
Drilling Details
As we discussed in the Exploration podcast, the first step in adding value is locating the oil and gas reservoirs that are often far below the surface, and in ever deeper offshore prospects.
Even with the latest seismic technology and computer modeling, many characteristics of a prospect remain unknown until an exploratory or “wildcat” well is drilled. Repeating from that podcast, “you can’t find oil if you don’t drill wells.”
Oil 101: Introduction to Oil and Gas - Upstream -- EKT Interactive
Oil 101: Introduction to Oil and Gas - Upstream
What is Upstream? This Upstream content is derived from our Oil 101 Upstream ebook and can be found in our oil and gas learning community.
This Upstream module includes the following sections (use the links below for quick access):
-Introduction to Upstream
-Upstream Business Characteristics
-Oilfield Services
-Reserves – Formation and Importance
-Production – The First Step in Adding Value
-The Unconventional Future of Upstream
Upstream
What is Upstream? Most oil and gas companies’ business structures are segmented and organized according to business segment, assets, or function.
The upstream segment of the business is also known as the exploration and production (E&P) sector because it encompasses activities related to searching for, recovering and producing crude oil and natural gas.
The upstream segment is all about wells: where to locate them; how deep and how far to drill them; and how to design, construct, operate and manage them to deliver the greatest possible return on investment with the lightest, safest and smallest operational footprint.
Exploration
The exploration sector involves obtaining a lease and permission to drill from the owners of onshore or offshore acreage thought to contain oil or gas, and conducting necessary geological and geophysical (G&G) surveys required to explore for (and hopefully find) economic accumulations of oil or gas.
Drilling
There is always uncertainty in the geological and geophysical survey results. The only way to be sure that a prospect is favorable is to drill an exploratory well. Drilling is physically creating the “borehole” in the ground that will eventually become an oil or gas well. This work is done by rig contractors and service companies in the Oilfield Services business sector.
Production
The production sector of the upstream segment maximizes recovery of petroleum from subsurface reservoirs.
Natural Gas Process and Production course
https://www.youtube.com/watch?v=_9HHJ-AjQUY&t=27s
http://www.mediafire.com/file/zu640mv8rpj257w/1.%20Natural%20Gas%20Overview.pdf
How to Choose the Right Database for Your Workloads -- InfluxData
Learn how to make the right choice for your workloads with this walkthrough of a set of distinct database types (graph, in-memory, search, columnar, document, relational, key-value, and time series databases). In this webinar, we will review the strengths and qualities of each database type from their particular use-case perspectives.
Review Schneider Electric’s innovative and efficient upstream oil and gas offering and learn how to optimize remote assets. Benefit from industry expertise and live demonstrations that highlight reducing total cost of ownership and turning data into reliable information to drive business.
Machine learning’s impact on utilities webinar -- SparkCognition
Navigant Research estimates that utility companies will spend almost $50 billion on asset management and grid monitoring technology by 2023. Today many organizations face budgetary challenges as they work to increase reliability, uptime and safety within their facilities.
The industry is adapting to new technologies including utilization of advanced sensors and sensor fusion, edge devices, artificial intelligence, and machine learning to create the maintenance center of the future.
Bernie Cook, former Director of Maintenance and Diagnostics at Duke Energy and now VP of Woyshner Service Consulting, will join us to provide practical guidance and examples of how utilities can begin adopting these next-generation technologies within their facilities to drive significant reductions in maintenance costs.
Following Bernie, Stuart Gillen, Director of Business Development at SparkCognition, will give examples of how machine learning technologies are augmenting current practices that make maintenance engineers more efficient at predicting critical asset failure.
Join this webinar to learn about:
- Real examples of ways utilities are moving to more advanced monitoring and diagnostic capabilities and the technologies involved.
- How machine learning can improve equipment reliability and performance, and reduce operational and maintenance costs.
- How machine learning can augment or even supplement human subject matter experts by providing significant advance notice of asset performance issues.
Intelligent Production: Deploying IoT and cloud-based machine learning to opt... -- Amazon Web Services
Alex Robart, CEO of Ambyint, presents their AI-driven production optimization platform for the Oil and Gas Industry.
Their innovative IoT-based hardware and software solution delivers a revolutionary approach to monitoring Oil and Gas production operations by updating traditional SCADA-based telemetry systems, cloud-enabling them, and bringing in Artificial Intelligence capabilities. Presented at the AWS Oil and Gas Industry Day in Calgary, 2017.
Big Data, IoT, data lake, unstructured data, Hadoop, cloud, and massively parallel processing (MPP) are all just fancy words unless you can find use cases for all this technology. Join me as I talk about the many use cases I have seen, from streaming data to advanced analytics, broken down by industry. I’ll show you how all this technology fits together by discussing various architectures and the most common approaches to solving data problems, and hopefully set off light bulbs in your head on how big data can help your organization make better business decisions.
Seal the Formation Surface With Optimized Bridging Blend -- pvisoftware
This white paper discusses how to use BridgePRO, a Bridging Agent Size Selection software that aids in the determination of the optimum calcium carbonate blend to achieve maximum bridging of sandstone reservoirs. The software optimization is based on specific formation characteristics and the particle-size distribution of available grades of calcium carbonates.
Torque and Drag: Concepts that Every Drilling and Completion Engineer Should ... -- pvisoftware
This white paper talks about torque and drag concepts that every drilling and completion engineer should know. With TADPRO, the risks associated with drilling and completing a well can be assessed and much of the risk can be remediated during pre-job planning.
This course examines the difficulties, challenges and problems facing today’s deepwater drilling programme designers and personnel, where wells are being drilled in increasingly difficult downhole and environmental conditions, and where ever-increasing cost and legislation further contribute to design and operational pressures.
The Science of Predictive Maintenance: IBM's Predictive Analytics Solution -- Senturus
Overview of IBM’s Predictive Maintenance and Quality (PMQ) solution. View the webinar video recording and download this deck: http://www.senturus.com/resources/science-predictive-maintenance/.
We show you how the PMQ solution can keep manufacturing processes, infrastructure and field equipment running to maximize use and performance while minimizing costs.
We show how you can use powerful analytics and data integration to help: anticipate asset maintenance and product quality problems; reduce unscheduled asset downtime; spend less time solving production machinery and field asset problems; improve asset productivity and process quality; and monitor how assets are performing in real time and predict what will happen next.
Senturus, a business analytics consulting firm, has a resource library with hundreds of free recorded webinars, trainings, demos and unbiased product reviews. Take a look and share them with your colleagues and friends: http://www.senturus.com/resources/.
Amidst an industry cloud of confusion about what “AIOps” is and what it can do, these slides--based on the webinar from EMA research--delineate a clear path to victory for business and IT stakeholders seeking to use machine learning to optimize the performance of critical business services.
The last 18+ months have proven to be like no other time in modern history, and it has had a profound effect on the supply chain in the manufacturing industry. This disruption has meant many restless nights worrying about supply chains, workforce agility, capacity planning, resource allocation, and much more for manufacturers. Manufacturers have realized that better planning and preparedness are crucial to adapting to the rapid changes in demand seen in today's current climate.
In this webinar, you will learn how to address these challenges head-on as we discuss how your organization can become more agile and scale to your specific business requirements and how Cloud ERP systems can support better planning and preparedness for what's next.
About The Presenter
Steve Canter - Director of Global Service Delivery
Steve Canter has over 25 years of experience in the information technology industry. Steve has been responsible for delivering solutions to many medium-sized and large companies in a variety of industries as a consultant and project manager. Steve also brings a unique perspective to SmartERP, having spent over ten years as the CIO for a manufacturing and distribution company. During that period, he also helped shape product and customer service strategy at Microsoft and Oracle as a member of several customer advisory boards.
Intelligent Maintenance: Mapping the #IIoT Process -- Dan Yarmoluk
A presentation about Industrial IoT, the value chain and real-world use cases; how to create value with IoT at your organization with an emphasis on predictive maintenance (bearing fault detection).
OpsRamp and Mystic River are joining forces to bring you this interactive webinar. Can we create gateways between IT and OT? IT/OT convergence has been defined as the integration of information technology (IT) systems used for data-centric computing with operational technology (OT) systems used to monitor events, processes and devices and make adjustments in enterprise and industrial operations. But what will this look like, and what is the new role of IT operations management? What is changing in IT and OT to align these worlds? This Tech Talk features our partner Mystic River Consulting, a firm with proven, repeatable service delivery methodologies and a proprietary platform that drives transformational IT project results at game-changing speed. We’ll discuss the convergence of IT and OT and dive into a demonstration to show what’s possible with OpsRamp.
Watch the recording: https://www.brighttalk.com/webcast/17791/416457
Learn more at https://www.opsramp.com
Also, follow us on social media channels to learn about product highlights, news, announcements, events, conferences and more:
Twitter - https://www.twitter.com/OpsRamp
LinkedIn - https://www.linkedin.com/company/opsramp
Facebook - https://www.facebook.com/OpsRampHQ/
Future-Proofing Asset Failures with Cognitive Predictive Maintenance -- Anita Raj
The industry is reeling under the explosion of data generated by smart sensors, motors, actuators, machines, and other “things”. With the pace at which production is happening currently, the last straw would be an asset breakdown. Statistics show that the automotive industry deals with an alarming 800 hours of downtime every month. The cost of such downtime is a staggering US$22,000 per minute, or US$12.6 million a month.
Additionally, data shows that 20% of these breakdowns are common or predictable and that a majority – a shocking 80% – of them are seemingly random instances and cannot be predicted.
According to McKinsey, the Industrial IoT (IIoT) market is worth $11 trillion, and predictive maintenance solutions can help companies save $630 billion over the next 15 years. So, how can manufacturers tap these savings and benefits?
Learn how manufacturer and suppliers can experience the power of Cognitive Predictive Maintenance (CPdM) to avoid unplanned downtimes and drive greater efficiencies.
In medicine, an MRI can quickly reveal a hidden ailment and provide actionable insight for getting better. For IT and business leaders whose key concerns with the mainframe are platform costs and lean operations, CA Mainframe Resource Intelligence reveals multiple sources of hidden mainframe costs and operational inefficiencies, along with actionable recommendations. View this slideshare to understand how this new SaaS offering from CA brings together automation, speed, analytics and 40+ years of mainframe expertise. CA Mainframe Resource Intelligence reports answer your CIO’s toughest questions about mainframe optimization and the potential for digital transformation.
For more information, please contact your account director or mainframe specialist at:
http://ow.ly/PALG50htHgF
These slides--based on the webinar featuring Shamus McGillicuddy, senior analyst at leading IT analyst firm Enterprise Management Associates (EMA)--provide insights into leading Enterprise Hybrid Infrastructure Management (EHIM) solutions.
The slides also cover:
- The evolving EHIM requirements of the modern NOC and cross-domain operations center
- The points to consider when comparing enterprise EHIM vendors
- The 13 vendors included in this EMA Radar report
The CSC Big Data Analytics Insights service enables clients who do not have an analytics capability to implement the business, data and technology changes to gain business benefit from an initial set of analytics based on a roadmap of changes created by CSC or provided from a compatible set of inputs.
CSC Analytic Insights Implementation has four phases:
Stage 1: Analytic Engagement
Stage 2: Analytic Discovery
Stage 3: Implementation Planning
Stage 4: Embedding Analysis.
ERP Optimization: How to Save Cost And Gear Up For Business Innovation Simult... -- CAST
This slide presentation features Liz Herbert, Principal Analyst with Forrester Research Inc, on how to take control of your ERP system to enable growth and revenue generation by learning to:
-Find the balance between cost control and business innovation
-Manage the TCO of ERP systems by reducing complexity
-Identify performance and stability vulnerabilities by focusing on the quality of customizations
To watch the complete webinar, visit http://www.castsoftware.com/news-events/event/erp-customization-kills-innovation?gad=ss
Presented at the IndicThreads.com Software Development Conference 2016 held in Pune, India. More at http://www.IndicThreads.com and http://Pune16.IndicThreads.com
Future-Proof Your Streaming Analytics Architecture - StreamAnalytix Webinar -- Impetus Technologies
Future-Proof Your Streaming Analytics Architecture - StreamAnalytix Webinar
View the webcast on http://bit.ly/1HFD8YR
The speakers from Forrester and Impetus talk about the options and the optimal architecture for incorporating real-time insights into your apps while also positioning you to benefit from future innovation.
Digital Transformation through Product and Service Innovation - Session Spons... -- Amazon Web Services
Today large enterprises are under pressure to innovate faster than ever, drive down costs, and deliver increased value to their organisations through more responsive and flexible IT. Organisations that are shifting to a data-driven, insight-powered culture will be in the best position to defend, differentiate and disrupt in their respective industries, potentially expanding their business with new products and revenue sources. Learn how some leading companies are leveraging data – from IoT sources, social sources, enterprise, partners, competitors, and consumers – to unlock new sources of insight.
Speaker: Amit Bansal, Digital Delivery & Analytics Lead, APAC, Accenture
Similar to Artificial Intelligence Application in Oil and Gas
How to Use Artificial Intelligence to Minimize your Cybersecurity Attack Surface -- SparkCognition
Cybercrime is an exponentially growing threat to the world’s businesses, governments, and citizens. Estimates of the annual impact of cybercrime on the global economy reach as high as $600 billion, and attackers have been easily evading signature-based antivirus solutions for years. Making matters worse, the exponential growth in both connected devices and malware is overwhelming the capacity of enterprise security teams. Organizations are being asked to secure IT, Mobility, IoT, and OT assets all while staying on top of the latest zero-day and polymorphic threats.
Using a Cognitive Analytic Approach to Enhance Cybersecurity on Oil and Gas O... -- SparkCognition
IoT has revolutionized processes throughout oil and gas operations, but the increased connectivity it provides also leaves systems more vulnerable to cyberattacks than ever before. To sufficiently combat the growth of threats in both number and sophistication, combined with the scarcity of security talent, the oil and gas industry needs a stronger approach to cybersecurity. AI-based solutions for cybersecurity can monitor and protect not only the IT infrastructure, but also the OT network.
SparkSecure adds a cognitive layer to traditional security solutions, increasing the operational efficiency and knowledge retention of your incident response and security analyst teams. Essentially, SparkSecure does much of what a human security analyst can do, but at machine speed and Big Data scale.
Cognitive Security: How Artificial Intelligence is Your New Best Friend -- SparkCognition
For more information, visit http://sparkcognition.com
For all that you hear about artificial intelligence and machine learning, how can it help you keep your networks safer and more secure?
In this new era of computing, we will explore how artificial intelligence is being used to super charge human intelligence in threat detection, evidence gathering and remediation.
In this webinar we will discuss how this new, cutting edge cognitive security is being utilized to:
- Increase speed, accuracy, and data processing capabilities to unparalleled levels
- Reduce false alarms
- Provide sub-second malware detection
- Retain knowledge in a self-learning environment
- Provide signature-free security and zero-day threat detection
AWEA Cognitive Analytics for Predictive Futures -- SparkCognition
Machine Learning helps make complex systems more efficient. By applying advanced Machine Learning techniques such as Cognitive Fingerprinting™, wind project operators can utilize these tools to learn from collected data, detect regular patterns, and optimize their own operations.
Research has demonstrated the value of Machine Learning in delivering next generation analytics to improve safety, performance, and reliability in today’s modern wind turbines.
For more information, visit http://sparkcognition.com
Cybersecurity is the number one priority for both industry and government sectors. In fact, $90 trillion could be lost by 2030 due to cyberattacks if cybersecurity fails to accelerate. What are we doing now and what’s the future?
Machine Learning and Cognitive Fingerprinting -- SparkCognition
Source: http://sparkcognition.com
Machine Learning helps make complex systems more efficient. By applying advanced Machine Learning techniques such as Cognitive Fingerprinting, wind project operators can utilize these tools to learn from collected data, detect regular patterns, and optimize their own operations.
In his session at 18th Cloud Expo, Stuart Gillen, Director of Business Development at SparkCognition, discussed how research has demonstrated the value of Machine Learning in delivering next generation analytics to improve safety, performance, and reliability in today's modern wind turbines.
GraphRAG is All You need? LLM & Knowledge Graph -- Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... -- Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. However, fostering a culture of innovation takes real work. It takes vision, leadership and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
UiPath Test Automation using UiPath Test Suite series, part 4 -- DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
- Execution from the test manager
- Orchestrator execution result
- Defect reporting
- SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Essentials of Automations: Optimizing FME Workflows with Parameters -- Safe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Search and Society: Reimagining Information Access for Radical Futures -- Bhaskar Mitra
The field of Information retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs while dismantling the artificial separation between the work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build inspired by diverse explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies needs to be explicitly articulated, and we need to develop theories of change in context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
UiPath Test Automation using UiPath Test Suite series, part 3 -- DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
- UI automation introduction
- UI automation sample
- Desktop automation flow
Speakers:
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Kubernetes & AI - Beauty and the Beast !?! @KCD Istanbul 2024 -- Tobias Schneck
As AI technology pushes into IT, I was wondering, as an “infrastructure container Kubernetes guy”, how does this fancy AI technology get managed from an infrastructure operations view? Is it possible to apply our lovely cloud native principles as well? What benefits could both technologies bring to each other?
Let me take these questions and provide you a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need to apply it to our own infrastructure and get it to work from an enterprise perspective. I want to give an overview of infrastructure requirements and technologies, and of what could be beneficial for or limiting to your AI use cases in an enterprise environment. An interactive demo will give you some insights into which approaches I have already gotten working for real.
Accelerate your Kubernetes clusters with Varnish Caching -- Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Transcript: Selling digital books in 2024: Insights from industry leaders - T... -- BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
PHP Frameworks: I want to break free (IPC Berlin 2024) -- Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
Let's dive deeper into the world of ODC! Ricardo Alves (OutSystems) will join us to tell all about the new Data Fabric. After that, Sezen de Bruijn (OutSystems) will get into the details on how to best design a sturdy architecture within ODC.
13. Common Approaches (Category / Key Features)
Business Intelligence (BI)
• Centralized analysis
• Uniform data collection
• Average visualizations
Rules Based Modeling
• Fixed rules must account for all types of transactions in all types of conditions; leads to rule proliferation and management challenges
• May be good measures for some simple situations, but average (or even sub-par) measures for others
Statistical Analysis
• Identifies deviations from “normal”
• More a platform for model building and data scientists than an alert-generating solution
• Not automated to account for changing conditions
Physics Based Modeling
• Asset-type specific
• Model building is a very hands-on process involving laboratory experiments
• Domain experts apply these physical models universally to assets
14. Benefits of Cognitive Analytics
• Enables machines to penetrate the complexity of data to identify associations
• Presents powerful techniques to handle unstructured data
• Continuously learns not only from previous insights, but also from new data entering the system
• Provides NLP (Natural Language Processing) support to enable human-to-machine and machine-to-machine communication
• Does not require rules; instead relies on hypothesis generation using multiple data sets which might not always appear connected or relevant
Cognitive Analytics is inspired by the way the human brain operates: it processes information, draws conclusions, and codifies instincts and experience into learning.
15. Cognitive Algorithms: SparkArtemis™ & SparkPythia™
[Diagram: Artemis pipeline, from neural net / genetic computation to prediction based on a function]
Artemis features:
• Captures the state of, and evolution to, failure/event, including subtle influencers
• Significantly advanced compared to existing algorithms
• Feature selection: automatically finds significant data
• Adaptive & self-learning
• Defines relationships
• Identifies multiple top performers
18. Machine Learning & Cognitive Analytics can deliver several benefits
• External Factors: can incorporate external factors (e.g. environmental issues such as birds & bats)
• Scalability: automated model building capability does not require manual model building for every asset/component
• In-context Remediation: an advisor that understands natural language to help technical teams
• Security: out-of-band, symptom-sensitive approach beyond IT security
• Adaptability: adapts to new and changing conditions automatically
• Higher Accuracy: automated feature enrichment and extraction that can deliver better insights and higher accuracy
20. About Flowserve
• Largest pump manufacturer in the United States
• Market cap of $6.22B
• Founded in 1790 (26 years after America)
• Over 18,000 employees in 56 countries
[World map legend: World Headquarters; Sales Offices; Service Centers & Quick Response Centers; Manufacturing Plants & Regional Operations Centers]
21. Pump Monitoring Application Trial
Desired results:
• Predict failures with 1 day advance notice
• Zero or minimal false positives
• “Dummy light” output
Data provided:
• 3 years of historical data
• Pre-filtered FFT data from production assets
• 10-second time resolution
• Major component failure logs
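For readers unfamiliar with FFT-based condition data, here is a generic sketch of turning one raw vibration window into the kind of spectral features referred to above; the sampling rate and band edges are invented for illustration and are not from the trial:

```python
import numpy as np

def spectral_features(signal, fs=1000.0):
    """Summarize one vibration window as coarse FFT band energies.

    signal: 1-D array of accelerometer samples; fs: sample rate in Hz (assumed).
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2           # power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    bands = [(0, 50), (50, 150), (150, 400)]              # made-up band edges in Hz
    return [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]
```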
22. Trial Outcome
[Chart: a derived feature plotted from 02/01 to 04/01 against a dynamic threshold, with early-warning and point-of-failure markers]
• Early warning: a dynamic and adaptive threshold that continues to learn and adjust with more data; the threshold can be optimized to reduce false positive alerts
• Point of failure: predicted failures 5 to 6 days in advance (a 20x improvement); the previous method predicted only 3-6 hours in advance
• Completed with less than 2% false positive rates
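The slides do not spell out how the adaptive threshold is computed. As a purely illustrative sketch, a rolling-statistics threshold over a derived feature might look like this; the window size, multiplier, and feature are assumptions, not SparkCognition’s actual method:

```python
import numpy as np

def adaptive_threshold(feature, window=360, k=3.0):
    """Flag points where a derived feature exceeds a rolling mean + k*std.

    feature: 1-D array of derived-feature values (e.g. at 10 s resolution).
    window:  number of trailing samples used to estimate "normal".
    k:       multiplier; raising it trades earlier warnings for fewer false positives.
    """
    feature = np.asarray(feature, dtype=float)
    alerts = np.zeros(len(feature), dtype=bool)
    for i in range(window, len(feature)):
        recent = feature[i - window:i]
        # The threshold is recomputed from recent data, so it adapts as the signal drifts.
        alerts[i] = feature[i] > recent.mean() + k * recent.std()
    return alerts
```

Raising k is the tuning knob the slide alludes to when it says the threshold can be optimized to reduce false positive alerts.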
23. Next Steps
• Deploy on Azure cloud instances
• Pending results of expanded sites, committed to enterprise-wide roll-out
• Explore predictive models for other major components
• Explore the adjacent Intelligent Documentation problem
The main takeaway is that the “lower for longer” environment is creating opportunity for, and forcing, owner-operators to consider new technologies and change mindsets. The strong value proposition of new technologies is now being heard, and adoption is growing, since in many cases financial survival (not just shareholder value) is not assured.
The lower-for-longer environment is forcing companies to think and act differently, using disruptive technologies that can help improve all facets of the value chain.
What particular tech do you plan on calling out?
- Intelligent pumps
- Intelligent completion
Read as is but add “… and information systems that are increasingly employing cognitive computing, machine learning, etc.
This slide is a bit hard to read. Have Nicholas look at it.
Key components of IoT are listed on this slide as is (not much detail discussed except):
Key focus will be on the Analytics and Software components and how AI/machine learning can play a role
Talking about drilling with closed-loop systems. Analysis provides options and expedites decision making.
Focus is on the major drivers: increasing asset uptime/availability (minimizing downtime) and increasing production of existing wells; increasing employee productivity (helping deal with the Great Crew Change); and enabling operational excellence.
Any thoughts on how the lack of new wells means production must increase in existing wells?
Not much here, since it reiterates much of the previous slide.
Quickly reiterate benefits:
- Improve performance/uptime of assets
- Lower costs
- Platform for innovation/operational excellence
Can we make text white?
Provide some examples – either generic anecdotally or with specific owner operator’s names and some of the potential ROI, paybacks, etc.
Summary – I’ve shown some examples as to how people are doing things today. Passing to Stuart to explain the current methods of doing this and how machine learning is pushing the state of the art to really augment human capabilities (help you do what you’ve been trying to do for 20 years).
– 37% of non-productive drill time is spent on stuck pipe [BP]
http://www.slb.com/~/media/Files/resources/oilfield_review/ors91/oct91/5_perspective.pdf
I want to start this section out by first discussing typical ways in which organizations or users approach a data analytics project. These are loosely in order of least to most advanced.
The first category is that of Business Intelligence software. For many of us, Excel is the first tool we turn to when trying to solve a data science problem. Excel is OK, but as most of us know it has its limitations in the amount of data it can ingest. For example, yesterday I was trying to load one day’s worth of OSI PI data and Excel was unable to load even a small number of tags. Additionally, it can be difficult to create and manage rules, formulas, etc., and then create appealing dashboards.
Second, there are rules-based modeling engines. These can be considered “if this, then that” types of engines. While these types of systems can be effective, they often require a large amount of subject matter experts’ time and knowledge, and they quickly become unmanageable because of the sheer number of changing variables in any “real” system.
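To make the rule-proliferation point concrete, here is a minimal, hypothetical “if this, then that” engine in Python; every sensor name and limit below is invented, and notice how each newly discovered operating condition demands yet another entry in the rule table:

```python
# Each rule is (condition, alert message); all names and limits are made up.
RULES = [
    (lambda s: s["pump_temp_c"] > 90, "Pump over-temperature"),
    (lambda s: s["vibration_mm_s"] > 7.1, "Excessive vibration"),
    (lambda s: s["pump_temp_c"] > 80 and s["load_pct"] < 20, "Hot while idle"),
    # ...in a real system this list grows with every edge case discovered.
]

def evaluate(sample):
    """Return the alert messages whose conditions fire for one sensor sample."""
    return [msg for cond, msg in RULES if cond(sample)]

print(evaluate({"pump_temp_c": 95, "vibration_mm_s": 3.0, "load_pct": 10}))
# -> ['Pump over-temperature', 'Hot while idle']
```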
Next there are statistical, sometimes called Advanced Pattern Recognition, tools. These tools typically look at standard deviations from a “normal” operating condition and alert operators when unaccounted-for deviations occur. These systems work well in use cases where the normal is known and fluctuations are rare, such as nuclear generation, but struggle when there are many dynamically changing signals. Very infrequently do they account for drift or capture degradation over time.
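In code, the deviation-from-normal idea behind these tools is essentially a z-score test against a baseline. This sketch assumes a baseline estimated once from “normal” data; because it never updates, it also illustrates the drift problem just described:

```python
import numpy as np

def zscore_alerts(values, baseline, z_limit=3.0):
    """Alert when a signal deviates more than z_limit sigmas from a baseline
    estimated once from "normal" operation (note: no drift handling)."""
    mu, sigma = np.mean(baseline), np.std(baseline)
    z = (np.asarray(values) - mu) / sigma
    return np.abs(z) > z_limit
```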
Finally, there is physics based modeling where users (or more typically consultants, manufacturers, etc.) provide a model of the asset, in this case a turbine, which is then used to predict failures. These models require extensive time from domain experts to ensure the accuracy of the model. More importantly, these models are often created in lab settings, making them static to a particular location or operating regime, meaning they don’t dynamically adapt to changing conditions well.
At SparkCognition we take a cognitive analytics approach. This is really a fancy way of saying we approach the problem similarly to how the human brain would: “cognitively”. We don’t just apply one approach or algorithm to the problem, but rather use several different techniques to identify associations in the data.
One of the central tenets of the cognitive approach, and what differentiates us from all of the “common approaches”, is the ability for users who are not PhDs in machine learning to easily create dynamic, probabilistic models which continuously adapt to and learn from new inputs to the system.
The tools aren’t restricted to rigid structured data, but can also reach out and look at unstructured data in conjunction with Natural Language Processing and IBM Watson to bring context to the information. As a refresher on structured vs. unstructured data: structured data, as the name implies, refers to information with a high degree of organization. Unstructured data is essentially the opposite. As an example, think of Wikipedia. In Wikipedia there is structured information including things like the “entry ID”, “Publication Date”, “Title”, and “Authors”. Anyone wanting to search on these fields can do so in a fairly straightforward, “structured” way. But think about the actual article content. This is far larger in volume and would be considered unstructured data. The strength of NLP is in being able to parse the unstructured data and combine it with structured data to find unique patterns.
An example we have worked on in the O&M world is to take maintenance manuals, correlate the information to the problem the operator is experiencing, and return dynamic results. Imagine a customer with manuals which refer to procedures in other manuals, which might refer to steps in a third or fourth manual. Wouldn’t it be great if, given a particular fault code, instead of manually trying to deduce which manual holds the procedure, the maintenance engineer simply typed in her fault code and the software, using Natural Language Processing and SparkCognition algorithms, looked at all the maintenance manuals, ingested their data, and dynamically presented her with the correct procedures in one place?
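As a rough sketch of the retrieval step only, here is what matching a fault code against manual text could look like using TF-IDF similarity; the fault codes, snippets, and the use of scikit-learn are illustrative assumptions, not the actual system, which uses SparkCognition algorithms and IBM Watson NLP:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical snippets pulled from several maintenance manuals.
procedures = [
    "Fault E42: high gearbox oil temperature. Check cooler fan, see manual B section 3.",
    "Stuck pipe mitigation: reduce weight on bit, work pipe up and down.",
    "Fault E17: pump seal leak. Isolate pump, replace seal per QRC procedure.",
]

query = "fault code E42 gearbox oil temperature"
vec = TfidfVectorizer().fit(procedures + [query])
scores = cosine_similarity(vec.transform([query]), vec.transform(procedures))[0]
print(procedures[scores.argmax()])  # prints the most relevant procedure
```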
To really explain how cognitive analytics can be used in context, I want to walk you through an example of how two algorithms of ours, SparkArtemis and SparkPythia, can be used to predict stuck pipe. First, Artemis needs to ingest all of the feature data. In this case, we were pulling 70 inputs from an EDR system. The second step is that SparkArtemis does something called feature expansion, which takes those 70 inputs and creates second-, third-, and even fourth-order derivations of those inputs. This results in thousands of new features to look at.
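Generic polynomial feature generation gives a feel for how quickly such an expansion grows; this is a stand-in for, not the actual, SparkArtemis implementation:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

# 100 time steps of 70 EDR channels (random stand-in data).
X = np.random.rand(100, 70)

# Second-order expansion alone already yields thousands of derived features;
# third- and fourth-order derivations grow the space much further.
expanded = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
print(expanded.shape)  # (100, 2555): 70 originals + 70 squares + 2415 pairwise products
```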
That’s where Pythia comes in. We need some way to automatically find the “right features” to use for our models. Pythia digs through the thousands of features at our disposal to find those that are most differentiated and relevant to actually identifying stuck pipe. After the features have been selected, a model needs to be created. We don’t just pick a neural network or another algorithm with a fancy name; our method is far more advanced. We build several different models utilizing a variety of data science algorithms and then ensemble them to create the most accurate model for the given data set. In this case, we were able to build a globally optimal model that predicted stuck pipe with over two hours of advance warning!
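A generic select-then-ensemble flow can be sketched with scikit-learn stand-ins; the data is random and nothing here is the proprietary SparkPythia algorithm:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Stand-in data: thousands of expanded features, binary "stuck pipe" label.
X, y = make_classification(n_samples=500, n_features=2000,
                           n_informative=15, random_state=0)

model = make_pipeline(
    SelectKBest(mutual_info_classif, k=50),   # keep only the most relevant features
    VotingClassifier(                          # ensemble several base learners
        [("rf", RandomForestClassifier(random_state=0)),
         ("lr", LogisticRegression(max_iter=1000))],
        voting="soft",
    ),
)
model.fit(X, y)
print(model.predict(X[:5]))
```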
To get a better understanding of how this works, I want to break it down in a different way…
Imagine you have several blindfolded people in a room with an elephant. One feels the tail and thinks he has a rope. Another feels the tusk and thinks she has a spear. The third feels the side and thinks he has a wall. Individually, they all perceive different things, but if you ensemble those responses in the correct way, you can put together a bigger picture and realize you have an elephant. This is, in a simple way, what SparkPythia does, but it goes one step further and continues learning as new data is ingested.
Now that SparkPythia has created a model, it needs to be deployed to the asset; but how is this done…
In conclusion: the SparkCognition algorithms provide users a simple way to create dynamic models without having a PhD in data science, and they can adjust in real time to changing parameters. We have utilized the algorithms in a variety of applications where we have taken vast amounts of data and been able to improve the safety, performance, and reliability of modern machinery.
Before we dive into the specific techniques SparkCognition employed to solve the gearbox problem Maggie described, I thought I would spend the next few slides discussing some machine learning basics.
The first concept is that of Unsupervised learning. Imagine you are given a bag of marbles and asked to label or sort them.
With unsupervised learning we don’t know what the “buckets” or labels are, and therefore need to implement a model which can cluster the marbles and put them into sensible buckets. In this example, for instance, our model may try to identify the marbles based on their size (small, medium or large) and group them into the appropriate bin. As you can see from this simple example, there are numerous other ways the model could try to interpret the marbles (pattern, color, etc.). The power of machine learning is to employ all of these techniques and find the “best fit”.
With supervised learning we are given the “buckets” and labels, and now need to implement techniques which can appropriately classify the marbles into the corresponding bins. In this simple example we classify the marbles by color.
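The marble analogy maps directly onto the two standard workflows; in this illustrative sketch the features (diameter, a numeric colour code) are invented:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

# Each marble: [diameter_mm, colour_code]  (made-up data)
marbles = np.array([[10, 0], [11, 0], [25, 1], [24, 1], [40, 2], [41, 2]])
colours = ["red", "red", "blue", "blue", "green", "green"]

# Unsupervised: no labels given; KMeans invents three buckets on its own.
buckets = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(marbles)

# Supervised: labels given; the classifier learns to reproduce them.
clf = KNeighborsClassifier(n_neighbors=1).fit(marbles, colours)
print(buckets, clf.predict([[26, 1]]))  # cluster ids, and 'blue' for a new marble
```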
A more familiar example is a spreadsheet which contains information about an asset, like ID, Date/Time, and Value. In this case, because we don’t have any labelled failure data, we have to use unsupervised learning techniques and try to detect outliers or anomalies in our datasets.
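For that unlabeled, spreadsheet-style case, one widely used unsupervised technique is an isolation forest; this is a generic sketch with made-up sensor readings, not the method used in any trial described here:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
values = rng.normal(50.0, 2.0, size=(1000, 1))   # normal operating readings
values[500] = 90.0                                # inject one anomalous reading

detector = IsolationForest(contamination=0.01, random_state=0).fit(values)
labels = detector.predict(values)                 # -1 = outlier, 1 = normal
print(np.where(labels == -1)[0])                  # indices flagged as anomalies
```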
However, if we are then given failure information, such as the component and the action taken, we can apply supervised techniques, allowing us to classify and predict failures. When the data is available, supervised learning can provide deeper insights into what might be going on within your system; however, as you can see from the previous example, even unlabeled data can lead to impactful results.
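Once failure records supply labels, the same kind of readings support a supervised classifier; again a hedged, generic sketch in which the features and failure labels are hypothetical:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 4))   # e.g. temperature, vibration, flow, pressure
# Hypothetical labels derived from maintenance logs: which component failed.
y = np.where(X[:, 1] > 1.0, "bearing", np.where(X[:, 0] > 1.0, "seal", "healthy"))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
print(f"holdout accuracy: {clf.score(X_te, y_te):.2f}")
```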
For the Invenergy use case we had labelled data and were therefore able to utilize supervised learning to build our model, but for other applications we have had to utilize unsupervised learning.