In a new era of heightened oil-price volatility, data and technology are crucial in helping operators cut costs and maximise value: ten real-world examples of oil and gas innovators using data for economic effect.
IRJET - Analysis of Well Head Pressure Sensor Data for Anomaly Detection in Oi... (IRJET Journal)
This document summarizes a research paper that analyzes well head pressure sensor data from oil wells to detect anomalies using unsupervised machine learning techniques. It discusses how industrial internet of things (IIOT) technologies and machine learning algorithms can be applied to large amounts of sensor data to predict issues in oil wells early. Specifically, it explores using principal component analysis, Mahalanobis distance, and isolation forest algorithms on pressure sensor time series data to identify anomalies and notify customers before problems occur in their oil wells. The goal is to help improve oil well maintenance and reduce downtime through early defect prediction.
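Purely as an illustration of the kind of technique the paper describes (not its actual code), Mahalanobis-distance anomaly scoring over windowed pressure features can be sketched in a few lines of NumPy; the feature layout, the synthetic data, and the 99th-percentile alert threshold are all assumptions:

```python
import numpy as np

def mahalanobis_scores(X, train_idx):
    """Score each row of X by Mahalanobis distance to the training distribution."""
    mu = X[train_idx].mean(axis=0)
    cov = np.cov(X[train_idx], rowvar=False)
    cov_inv = np.linalg.pinv(cov)          # pseudo-inverse guards against singular covariance
    diff = X - mu
    return np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))

# Synthetic wellhead pressure features: [mean pressure, pressure variance] per time window
rng = np.random.default_rng(0)
normal = rng.normal([200.0, 4.0], [5.0, 0.5], size=(200, 2))   # healthy operation
anomaly = np.array([[150.0, 25.0]])                            # sudden drop + high variance
X = np.vstack([normal, anomaly])

scores = mahalanobis_scores(X, train_idx=slice(0, 200))
threshold = np.percentile(scores[:200], 99)    # alert above 99th percentile of normal scores
print(scores[-1] > threshold)                  # the injected anomaly scores above threshold
```

The paper pairs this kind of distance scoring with PCA and isolation forests; the same loop would simply run on PCA-reduced features or alongside an isolation-forest score.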
Pipeline and Gas Tech April 09 - SCADA Evolution (smrobb)
The document discusses the evolution of SCADA systems from early systems that collected small amounts of data from remote field devices to modern enterprise operations platforms that integrate field data across business systems. It describes how one large oil and gas company overhauled its SCADA system, replacing 14 separate data silos with a single system to collect daily updates from across operations and transform data into useful information for various business user groups. The new platform improved productivity by reducing time spent on tasks like production monitoring and regulatory reporting and allowing the company to handle more work with existing staff.
Companies around the world are developing strategies for the Industrial Internet of Things. Predictive maintenance, using analytics, is becoming the first step for many on this voyage. Mike Kanellos at OSIsoft explains why.
The document discusses integrated knowledge and a single database for Sonatrach that allows for multi-discipline collaboration and sharing of best practices. It promotes linking local intranet networks to a global network to share production data, equipment/spare part management, human resources skills, and unified standards across the organization. Real-time monitoring of oilfield assets using intelligent networks is also mentioned to improve decision making and performance. The challenges of a human resource shortage and how big data analytics can help develop digital oilfields are summarized as well.
Industry4.0 Oil & Gas - Exploration & Production / Upstream (Drew Sparrow)
This document discusses how digital innovations are reshaping the oil and gas exploration and production industry. It outlines several macro trends impacting supply and demand forces and also key digital trends like internet of things, big data/analytics, mobile devices, and cloud computing. The document then focuses on how these digital technologies can be applied across the asset lifecycle to create significant value through initiatives like operations optimization, advanced analytics and modeling, connected worker technologies, new era of automation including autonomous operations and remote operations centers, predictive maintenance, and more. It provides estimates on the potential financial and operational benefits these initiatives could provide as well as estimated impacts on reducing costs, accidents, emissions and more.
The article looks at how new technologies will lead to an increasingly integrated approach within the O&G sector, citing specifics such as the IoT and robotics and the radical impact they will have on optimising production within the sector.
LNG is a proven, reliable, and safe process, and natural gas is quickly becoming the world’s cleanest burning fossil fuel as it emerges as the environmentally preferred fuel of choice. For this reason, LNG facilities are under more pressure than ever to meet the world’s natural gas demand. Keeping your plants running at an efficient pace is vital to production, and condition monitoring plays a critical part in this process. Condition monitoring provides a proactive approach so that maintenance can be planned, eliminating unscheduled outages and optimizing machine performance. In addition, condition monitoring helps to avoid breakdowns, with their subsequent secondary damage and loss of production revenue, meaning that today’s LNG facilities simply can’t afford to be without a reliable condition monitoring solution.
Big Data Analytics for Commercial aviation and Aerospace (Seda Eskiler)
globalaviationaerospace.com
An opportunity for insight in the changing commercial aerospace business
Vision for New Applications of Analytic Insight in Commercial Aerospace
Benefit of Big Data Analytics for the Airline Operator
Modern, Mobile Experience
Big Data Analytics In Action
Predictive Analytics To Prevent Engine Events
Predictive Analytics Improves Safety and Quality
Predictive Analytics Keeps More Planes in the Air
Using Machine Learning to Quantify the Impact of Heterogeneous Data on Transf... (Power System Operation)
Using large-scale distributed computing and a variety of heterogeneous data sources including real-time sensor measurements, dissolved gas measurements, and localized historical weather, we construct a predictive model that allows us to accurately predict remaining useful life and failure probabilities for a fleet of network transformers. Our model is robust to highly variable data types, including both static and dynamic data, sparse and dense time series, and measurements of internal and external processes (such as weather). By comparing the predictive performance of models built on different combinations of these data sources, we can quantify the marginal benefit of including each additional data source in our model.
In order to relate each type of data to the risk of failure across a fleet of transformers, we have developed a novel class of survival models, the convex latent variable (CLV) model. This type of specialized survival model has several advantages. Rather than an opaque and subjective "health index", it produces interpretable predictions like the probability of failure within a given time window or the expected RUL of an asset. Our framework supports accurate estimates of the risk of equipment failure across a wide range of time-scales, from a few weeks to many years in the future, and can model not just the instantaneous risk of failure due to an event like a storm, but also the long-term impact on the risk of failure.
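The CLV model itself is specific to that paper, but the style of output it produces (the probability of failure within a given time window, rather than an opaque health index) can be illustrated with a plain Kaplan-Meier estimator; the fleet lifetimes below are invented for the sketch:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve from failure times (event=1) and censored times (event=0)."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    n_at_risk = len(times)
    surv, curve = 1.0, {}
    for t, e in zip(times, events):
        if e:                                # a failure observed at time t
            surv *= 1.0 - 1.0 / n_at_risk
        curve[float(t)] = surv
        n_at_risk -= 1                       # one fewer unit at risk after time t
    return curve

# Hypothetical transformer fleet: years in service until failure (1) or still running (0)
times  = [2.0, 3.5, 4.0, 5.0, 6.5, 7.0, 8.0, 9.0]
events = [1,   0,   1,   1,   0,   1,   0,   0]

curve = kaplan_meier(times, events)
s5 = min(v for t, v in curve.items() if t <= 5.0)   # survival probability at 5 years
print(f"P(failure within 5 years) = {1 - s5:.2f}")
```

A fleet-level survival model like the CLV extends this idea by conditioning the curve on covariates such as sensor readings, dissolved-gas measurements, and weather.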
K-IMS is a communications platform developed by KONGSBERG for the maritime industry. It collects data from vessel systems and sensors and distributes it to fleet management offices and suppliers to create a full operational picture. K-IMS provides a single portal for reduced costs, increased performance awareness, and improved decision-making through data analysis. It allows secure access to information anywhere through its interactive web interface and efficient data sharing capabilities.
Ensuring Resilience: Robust Backup Strategies for the Oil and Gas Sector (MaryJWilliams2)
Explore robust backup strategies designed for the oil and gas industry with our detailed PDF submission. Uncover the challenges and considerations specific to data protection in the oil and gas sector. Gain practical insights into implementing resilient backup solutions to safeguard critical operations and assets. Equip yourself with the knowledge needed to ensure data resilience in the dynamic environment of the oil and gas industry. To know more: https://stonefly.com/white-papers/requirement-reliable-robust-backup-oil-gas-industry/
The document discusses the evolution of the digital oilfield concept from early data gathering to modern integrated operations enabled by technology advances. It provides an overview of key developments like early logging data transmission and downhole sensors. The modern digital oilfield uses real-time data in continuous optimization cycles and allows centralized decision making. Integrated operations aim to improve effectiveness through cross-discipline collaboration. An example is provided in the Kuwait Intelligent Digital Field project.
Industry 4.0: Merging Internet and Factories (Fabernovel)
Industrial IoT and connected objects for factories are part of our research at FABERNOVEL OBJET, our activity dedicated to IoT.
The future of industry is at the crossroads of internet and factories. Some call it INDUSTRY 4.0 or FACTORY 4.0 in reference to the upcoming fourth industrial revolution. Governments and private companies in Germany, UK and the USA have acknowledged the importance of industrial IoT and its central role in future industrial transformation.
The adoption of Industrial Internet has both near-term and long-term impacts and will be characterized by the emergence of new models such as the “Outcome Economy” and the “Autonomous, Pull Economy”.
We believe that INDUSTRY 4.0 is a growth opportunity for industrial companies, and have decrypted this very phenomenon in the following presentation.
Phillips 66 has developed a reliability program to manage equipment health across its 11,000 miles of pipelines, 144 pump stations, and 22 product terminals. The program uses a proactive maintenance model incorporating condition monitoring, including vibration analysis of over 700 drive trains. By correlating vibration and operational data, the program identifies defects early, avoiding costly repairs. It has realized over $2 million in annual maintenance savings through reduced downtime and repairs. The comprehensive program analyzes gaps, benchmarks performance, and continually evaluates technologies to optimize reliability cost-effectively and ensure safe, efficient operations across Phillips 66's diverse assets.
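As a sketch of the kind of vibration screening such a program relies on (not Phillips 66's actual tooling), an RMS-velocity check against an alarm level can be written as follows; the synthetic signals and the 4.0 mm/s alarm level are illustrative, loosely in the spirit of ISO 10816-style severity zoning:

```python
import numpy as np

def vibration_rms(signal):
    """Root-mean-square velocity of a vibration signal window."""
    return float(np.sqrt(np.mean(np.square(signal))))

# Synthetic velocity waveforms (mm/s): a baseline machine vs one with a developing defect
t = np.linspace(0, 1, 2000, endpoint=False)
healthy = 2.0 * np.sin(2 * np.pi * 50 * t)               # 50 Hz running-speed component
faulty  = healthy + 6.0 * np.sin(2 * np.pi * 150 * t)    # added harmonic from a defect

ALARM_MM_S = 4.0   # illustrative alarm level; real limits depend on machine class
for name, sig in [("healthy", healthy), ("faulty", faulty)]:
    rms = vibration_rms(sig)
    status = "ALARM" if rms > ALARM_MM_S else "ok"
    print(f"{name}: {rms:.2f} mm/s -> {status}")
```

In a program like the one described, an RMS trend crossing the alarm level would be correlated with operational data (load, speed, temperature) before scheduling a repair.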
The power of the industrial internet has transformed the use of equipment and given a boost to technology. In the case of parallel flow regenerative technology used in lime production, it has enabled lime producers to improve efficiency, reduce power demand and yield significant cost benefits.
The document discusses how several companies have saved significant amounts of money through data-based maintenance using the PI System. Dong Energy believes it can save 20 million Euros per year by remotely monitoring offshore wind farm maintenance. Syncrude estimates savings of $20 million per year and an 85% reduction in certain injuries through data-based maintenance. Columbia Pipeline Group saved $9.6 million since 2002 by using the PI System to manage its natural gas network and avoid outages. Petronas collects real-time data to remotely monitor offshore platforms and prevented five unplanned shutdowns in the first year.
LEVERAGING BIG DATA FOR INTELLIGENT WATER MANAGEMENT (wle-ss)
The document discusses how water utilities can leverage big data and artificial intelligence using integrated software from AVEVA and OSIsoft. It describes how their software helps water utilities improve efficiency, optimize assets, ensure safety and quality, and meet regulatory reporting requirements. The software allows utilities to collect data from various sources, store and analyze it in a centralized data hub, and gain insights to improve performance. Case studies show benefits like reducing water loss and deferred capital expenditures.
Achieve Higher Quality Decisions Faster for a Competitive Edge in the Oil and... (Hitachi Vantara)
Hitachi next-generation unified storage solutions meet the challenges of today’s data-intensive oil and gas exploration and production activities. For more information on Hitachi Unified Storage and Hitachi NAS Platform 4000 series please visit: http://www.hds.com/products/file-and-content/network-attached-storage/?WT.ac=us_mg_pro_hnasp
Application of Big Data Systems to Airline Management (IJLT EMAS)
The business world is in the midst of the next revolution following the IT revolution: the Big Data revolution. The sheer volume of data produced is a major reason for it. Aviation and aerospace are typical areas that can apply big data systems due to the scale of data produced, not only by the plane sensors and passengers, but also by prospective passengers. Data that need to be considered include, but are not limited to, aircraft sensor data, passenger data, weather data, aircraft maintenance data and air traffic data. This paper aims at identifying areas in aviation where big data systems can be utilized to enhance operational performance, improve customer relations, and thereby aid the ultimate goal of increased profits at reduced costs. An improved management model built on a strong big data infrastructure will reduce operating costs, improve safety, bring down the cost and time spent on maintenance and drastically improve customer relations.
The white paper discusses the concept of the "connected oilfield", which involves using information and communication technology to integrate business processes, assets, data, and stakeholders across organizations and geographies in order to improve decision making. A connected oilfield requires an intelligent network that can connect everyone and everything, allowing remote monitoring and management of equipment and collaboration regardless of location. Benefits of a connected oilfield include increased production and reserves, reduced costs, and improved safety and environmental performance. The key is providing seamless connectivity and integration of data, workflows, and people.
Access to large amounts of seismic data is essential for oil and gas companies to make timely decisions about new prospects and reduce the time to discovery. As energy demands increase, more sophisticated analysis of greater volumes of data is needed. Speed and access to rapidly expanding datasets is key to accelerating analysis workflows and high-quality decision making within project deadlines.
A new age of industrial production: The Internet of Things, Se.docx (ransayo)
The document discusses the new age of industrial production enabled by connecting industrial equipment, systems, and processes to the Internet. This industrial Internet of Things allows for improved productivity and efficiency through data collection and analytics, predictive maintenance, remote monitoring and control, and more flexible automation from order to delivery. It highlights examples of ABB solutions that utilize sensors and data to optimize operations in various industries like manufacturing, mining, shipping, and energy.
BIG DATA AND BIG OIL – GE’S SYSTEMS AND SENSORS DRIVE EFFICIENCIES FOR BP (GE Software)
With energy demands rising and reserves of oil and gas becoming more challenging to access, the productivity revolution promoted by the Industrial Internet is of vital importance to the oil and gas sector. By combining decades of manufacturing expertise with its rapidly expanding software engineering capability, GE is leading the big data revolution so that its customers can operate both more effectively and efficiently.
This is an analytics solution for the oil and gas industry, providing business intelligence for management and deep operational insights.
Differentiation through digital innovation and transformation (Glen Koskela)
1) The document discusses how digital innovation and transformation can impact various industries like transportation, manufacturing, and maritime. It provides examples of how technologies like IoT, data analytics, and AI can optimize operations and efficiency.
2) Specific applications mentioned include using real-time data to optimize transportation systems and traffic management, creating digital twins to monitor industrial assets and processes, and analyzing ship data to reduce fuel consumption and improve maintenance.
3) The document emphasizes that digital transformation requires focusing technologies to drive value for customers in each industry by repositioning offerings in the market.
Similar to OSIsoft White Paper "Impacting the Bottom Line" in O&G
Big Data Analytics for Commercial aviation and AerospaceSeda Eskiler
globalaviationaerospace.com
An opportunity for insight in the changing commercial aerospace business
Vision for New Applications of Analytic Insight in Commercial Aerospace
Benefit of Big Data Analytics for the Airline Operator
Modern, Mobile Experience
Big Data Analytics In Action
Predictive Analytics To Prevent Engine Events
Predictive Analytics Improves Safety and Quality
Predictive Analytics Keeps More Planes in the Air
Using Machine Learning to Quantify the Impact of Heterogeneous Data on Transf...Power System Operation
Using large-scale distributed computing and a variety of heterogeneous data sources including real-time sensor measurements, dissolved gas measurements, and localized historical weather, we construct a predictive model that allows us to accurately predict remaining useful life and failure probabilities for a fleet of network transformers. Our model is robust to highly variable data types, including both static and dynamic data, sparse and dense time series, and measurements of internal and external processes (such as weather). By comparing the predictive performance of models built on different combinations of these data sources, we can quantify the marginal benefit of including each additional data source in our model.
In order to relate each type of data to the risk of failure across a fleet of transformers, we have developed a novel class of survival models, the convex latent variable (CLV) model. This type of specialized survival model has several advantages. Rather than an opaque and subjective "health index", it produces interpretable predictions like the probability of failure within a given time window or the expected RUL of an asset. Our framework supports accurate estimates of the risk of equipment failure across a wide range of time-scales, from a few weeks to many years in the future, and can model not just the instantaneous risk of failure due to an event like a storm, but also the long-term impact on the risk of failure.
K-IMS is a communications platform developed by KONGSBERG for the maritime industry. It collects data from vessel systems and sensors and distributes it to fleet management offices and suppliers to create a full operational picture. K-IMS provides a single portal for reduced costs, increased performance awareness, and improved decision-making through data analysis. It allows secure access to information anywhere through its interactive web interface and efficient data sharing capabilities.
Ensuring Resilience: Robust Backup Strategies for the Oil and Gas SectoMaryJWilliams2
Explore robust backup strategies designed for the oil and gas industry with our detailed PDF submission. Uncover the challenges and considerations specific to data protection in the oil and gas sector. Gain practical insights into implementing resilient backup solutions to safeguard critical operations and assets. Equip yourself with the knowledge needed to ensure data resilience in the dynamic environment of the oil and gas industry. To Know more: https://stonefly.com/white-papers/requirement-reliable-robust-backup-oil-gas-industry/
The document discusses the evolution of the digital oilfield concept from early data gathering to modern integrated operations enabled by technology advances. It provides an overview of key developments like early logging data transmission and downhole sensors. The modern digital oilfield uses real-time data in continuous optimization cycles and allows centralized decision making. Integrated operations aim to improve effectiveness through cross-discipline collaboration. An example is provided in the Kuwait Intelligent Digital Field project.
Industry 4.0: Merging Internet and FactoriesFabernovel
Industrial IoT and connected objects for factories are part of our research at FABERNOVEL OBJET, our activity dedicated to IoT.
The future of industry is at the crossroads of internet and factories. Some call it INDUSTRY 4.0 or FACTORY 4.0 in reference to the upcoming fourth industrial revolution. Governments and private companies in Germany, UK and the USA have acknowledged the importance of industrial IoT and its central role in future industrial transformation.
The adoption of Industrial Internet has both near-term and long-term impacts and will be characterized by the emergence of new models such as the “Outcome Economy” and the “Autonomous, Pull Economy”.
We believe that INDUSTRY 4.0 is a growth opportunity for industrial companies, and have decrypted this very phenomenon in the following presentation.
Phillips 66 has developed a reliability program to manage equipment health across its 11,000 miles of pipelines, 144 pump stations, and 22 product terminals. The program uses a proactive maintenance model incorporating condition monitoring, including vibration analysis of over 700 drive trains. By correlating vibration and operational data, the program identifies defects early, avoiding costly repairs. It has realized over $2 million in annual maintenance savings through reduced downtime and repairs. The comprehensive program analyzes gaps, benchmarks performance, and continually evaluates technologies to optimize reliability cost-effectively and ensure safe, efficient operations across Phillips 66's diverse assets.
The power of the industrial internet has trasformed the use of equipment and given a boost to technology. In the case of paralell flow regenerative technolgy used in lime production, it has enabled lime producers to improve efficiency, reduce power demand and yeld significant cost benefits.
The power of the industrial internet has trasformed the use of equipment and given a boost to technology. In the case of paralell flow regenerative technolgy used in lime production, it has enabled lime producers to improve efficiency, reduce power demand and yeld significant cost benefits.
The document discusses how several companies have saved significant amounts of money through data-based maintenance using the PI System. Dong Energy believes it can save 20 million Euros per year by remotely monitoring offshore wind farm maintenance. Syncrude estimates savings of $20 million per year and an 85% reduction in certain injuries through data-based maintenance. Columbia Pipeline Group saved $9.6 million since 2002 by using the PI System to manage its natural gas network and avoid outages. Petronas collects real-time data to remotely monitor offshore platforms and prevented five unplanned shutdowns in the first year.
LEVERAGING BIG DATA FOR INTELLIGENT WATER MANAGEMENTwle-ss
The document discusses how water utilities can leverage big data and artificial intelligence using integrated software from AVEVA and OSIsoft. It describes how their software helps water utilities improve efficiency, optimize assets, ensure safety and quality, and meet regulatory reporting requirements. The software allows utilities to collect data from various sources, store and analyze it in a centralized data hub, and gain insights to improve performance. Case studies show benefits like reducing water loss and deferred capital expenditures.
Achieve Higher Quality Decisions Faster for a Competitive Edge in the Oil and...Hitachi Vantara
Hitachi next-generation unified storage solutions meet the challenges of today’s data-intensive oil and gas exploration and production activities. For more information on Hitachi Unified Storage and Hitachi NAS Platform 4000 series please visit: http://www.hds.com/products/file-and-content/network-attached-storage/?WT.ac=us_mg_pro_hnasp
Application of Big Data Systems to Airline ManagementIJLT EMAS
The business world is in the midst of the next
revolution following the IT revolution – the Big Data revolution.
The sheer volume of data produced is a major reason for the big
data revolution. Aviation and aerospace are typical areas that
can apply big data systems due to the scale of data produced, not
only by the plane sensors and passengers, but also by the
prospective passengers. Data that need to be considered include,
but are not limited to, aircraft sensor data, passenger data,
weather data, aircraft maintenance data and air traffic data.
This paper aims at identifying areas in aviation where big data
systems can be utilized to enhance operational performances
improve customer relations and thereby aiding the ultimate goal
of increased profits at reduced costs. An improved management
model built on a strong big data infrastructure will reduce
operation costs, improve safety, bring down the cost and time
spent on maintenance and drastically improve customer
relations.
The white paper discusses the concept of the "connected oilfield", which involves using information and communication technology to integrate business processes, assets, data, and stakeholders across organizations and geographies in order to improve decision making. A connected oilfield requires an intelligent network that can connect everyone and everything, allowing remote monitoring and management of equipment and collaboration regardless of location. Benefits of a connected oilfield include increased production and reserves, reduced costs, and improved safety and environmental performance. The key is providing seamless connectivity and integration of data, workflows, and people.
Access to large amounts of seismic data is essential for oil and gas companies to make timely decisions about new prospects and reduce the time to discovery. As energy demands increase, more sophisticated analysis of greater volumes of data is needed. Speed and access to rapidly expanding datasets is key to accelerating analysis workflows and high-quality decision making within project deadlines.
A new age of industrial production: The Internet of Things, Se.docx (ransayo)
The document discusses the new age of industrial production enabled by connecting industrial equipment, systems, and processes to the Internet. This industrial Internet of Things allows for improved productivity and efficiency through data collection and analytics, predictive maintenance, remote monitoring and control, and more flexible automation from order to delivery. It highlights examples of ABB solutions that utilize sensors and data to optimize operations in various industries like manufacturing, mining, shipping, and energy.
BIG DATA AND BIG OIL – GE’S SYSTEMS AND SENSORS DRIVE EFFICIENCIES FOR BP (GE Software)
With energy demands rising and reserves of oil and gas becoming more challenging to access, the productivity revolution promoted by the Industrial Internet is of vital importance to the oil and gas sector. By combining decades of manufacturing expertise with its rapidly expanding software engineering capability, GE is leading the big data revolution so that its customers can operate both more effectively and efficiently.
It is an analytics solution for the oil and gas industry, providing business intelligence for management and deep insights.
Differentiation through digital innovation and transformation (Glen Koskela)
1) The document discusses how digital innovation and transformation can impact various industries like transportation, manufacturing, and maritime. It provides examples of how technologies like IoT, data analytics, and AI can optimize operations and efficiency.
2) Specific applications mentioned include using real-time data to optimize transportation systems and traffic management, creating digital twins to monitor industrial assets and processes, and analyzing ship data to reduce fuel consumption and improve maintenance.
3) The document emphasizes that digital transformation requires focusing technologies to drive value for customers in each industry by repositioning offerings in the market.
OSIsoft White Paper "Impacting the Bottom Line" in O&G
Impacting the bottom line
10 real-world examples of oil and gas innovators using data for economic effect

In a new era of heightened oil-price volatility, data and technology are crucial in helping operators cut costs and maximise value.
Achieving more with less

As everybody in the oil industry knows, the sector is battling low crude prices, tougher regulations, alternative forms of fuel and power, and rising upstream costs. From China to Libya, operators in many of the major oil-producing nations are struggling to achieve breakeven costs of production, and most analysts believe things will get worse before they get better.

This is why many operators are sweating every last dollar of value from their physical infrastructure just to break even on the cost of production. But there is only so much value that can be wrung out of physical assets, and the smartest operators are turning to a new kind of infrastructure to face today’s and tomorrow’s challenges. It’s called data infrastructure. And it’s this technology which relays vital information from great depths, provides 24/7 monitoring of the condition of mission-critical equipment, watches for safety issues and much else.

But important as this information is, operators are drowning in fragmented, critical, time-series data pouring out of multiple and diverse systems that don’t talk to each other. The burning imperative now is how to harness all this vital information and convert it into data you can use.

The OSIsoft PI System does this, and it has been doing it for years. The PI System provides a platform that converts an ever-mounting deluge of data into a coherent whole, allowing operators to save money, meet regulatory requirements and improve safety. Some E&P users of the PI System have cut barrels-of-oil-equivalent-per-day (boe/d) costs by 2-5%, while others have cut their controllable margin in logistics by 1-5%. In the hard-hit North Sea, some operators have halved average unit operating costs in the last two years, from $29.70 per barrel ($/b) to $15.30/b. In this paper you’ll find enlightening case studies of what the PI System can do.
Damage limitation

When the Polar Vortex hits the Midwest and Eastern Seaboard of America, driving temperatures down to 40F below zero, Columbia Pipeline (now part of TransCanada) is ready, despite its more than 15,000 miles of pipelines spanning 16 states from New York to the Gulf of Mexico. That’s because the TransCanada-owned group relies on enterprise analytics, in the form of the PI System, to keep the gas flowing to its domestic and industrial customers.

As the company reports, the consequent gains in time, money and man-hours from PI System-based enterprise analytics are considerable. The cost savings between 2006 and 2016 from the kind of preventive actions enabled by its data infrastructure reached a cumulative $10m.

But, as Emily Rawlings, the group’s engineering manager of system reliability, acknowledged at the time of one of the biggest Polar Vortex events, the confidence that comes from being able to weather the storms goes far beyond the bottom line. “There are great intangible benefits”, she told a presentation. “The customer confidence [that was created] was huge.”

And Columbia has endured the worst Polar Vortex events. In November 2014, before temperatures fell to the lowest levels since 1976, the group had installed the PI System, which gathers, absorbs, processes and analyses the data that enables engineers to monitor in split seconds the way the gas is flowing. As the hostile weather loomed, Columbia Pipeline had every reason to be concerned. As AccuWeather chief meteorologist Elliot Abrams forecast at the time, “the Arctic blast will have the greatest shock in the central states.” And yet there was no interruption to gas flows.

How does enterprise analytics work? Essentially, it employs advanced digitised technology to extract the data that companies such as Columbia Pipeline need. By harnessing and presenting the data in ways that allow operators to “see” into the efficient functioning of the entire system, enterprise analytics helps forestall blips in operations, among other applications.
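The screening such a platform performs on a live time-series stream can be pictured with a toy example. The sketch below is purely illustrative (it is not OSIsoft’s API; the tag name, readings and three-sigma threshold are invented), but it shows the basic idea of flagging a reading that deviates sharply from recent history:

```python
from statistics import mean, stdev

def flag_anomaly(history, latest, n_sigma=3.0):
    """Return True if `latest` deviates more than n_sigma from recent history."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return False  # flat signal: nothing to compare against
    return abs(latest - mu) > n_sigma * sigma

# hypothetical discharge-pressure readings (psi) for one compressor tag
history = [812.0, 815.5, 810.2, 813.9, 811.7, 814.3]
print(flag_anomaly(history, 813.0))  # → False (within normal band)
print(flag_anomaly(history, 870.0))  # → True  (operator should look now)
```

A real system would run thousands of such checks per second across every tagged asset; the point is that the raw deluge becomes a short list of things worth acting on.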
As a result, the quality of decision-making is greatly enhanced, as engineers can access a wider range of real-time information that enables them to respond to events, major and minor, with confidence.

It’s all about certainty. As Chevron Pipe Line Company, another OSIsoft customer, puts it, enterprise analytics closes the gap between “I think” and “I know”. And when a company is producing more than 1.3m b/d of crude, refined products and chemicals through 4,100 miles of pipes, as Chevron Pipe Line Company does, it’s a lot more comforting to “know” rather than to “think”.
Safeguards

Enterprise analytics is about a lot more than protection against storms, however. No oil and gas installation, whatever its nature, can run efficiently 100% of the time and, if problems occur, it’s important to know what has happened and why.

In mid-2016, Columbia Pipeline engaged OSIsoft again to deliver a real-time system for “intelligent maintenance”. With a compression fleet grown to 1.2m horsepower, maintenance and operational efficiency had become paramount.

Today, executives and front-line staff can see what’s happening on dashboards. More than 7,000 mission-critical streams of data are tapped to provide the big picture of the compression assets. Visibility has increased steadily to 98%.
As Hungary’s MOL Group, another long-term PI System user, recognises, visibility is everything. The PI System has become a vital tool in the safe operation of its refineries, which produce around 417,000 barrels a day. As the group’s head of process and automation, Tibor Comroczki, explains, in the event of a potentially dangerous malfunction the system dispatches operationally critical data about what happened and automatically logs it for subsequent analysis. As Comroczki points out, “it’s extremely important to have up-to-date information about the state of the safeguard system, which safety function was activated, when and why.”

Over the years, the entire group has replaced its cumbersome paper-based reporting processes with PI’s electronic format, until today the system is the backbone of MOL’s event management architecture.
Eyes on the ground

In the battle to lower production costs, all-round situational awareness is essential. And one of the breakthroughs in enterprise analytics is the ability it provides to visualise precisely what’s going on inside operations, however big and complex those operations are.
Kongsberg Maritime

Norway-based Kongsberg Maritime is a specialist in deep-sea exploration for the oil and gas industry. With 58 offices in 20 countries, it also provides services to 18,000 vessels. It’s a big and diverse company that has been in business for 70 years, and in all that time reliability has been Kongsberg’s watchword. The PI System is fundamental to that long-standing goal.
At the heart of Kongsberg’s operations is a purpose-designed information management system―the data infrastructure―that enables operational staff to see in real time what’s happening in its most critical functions around the world. For example, in its offshore drilling operations the PI System enables Kongsberg to monitor wellhead fatigue in real time. It maximises the operational efficiency of the drilling riser. And it adjusts work programs according to the weather, sometimes on a minute-by-minute basis. And those are just a few of the virtues that the PI System brings to Kongsberg.

Taking wellhead fatigue as an example, the PI System effectively provides a running commentary on its status by absorbing a mass of information from undersea sensors and transmitting the data to the control room.
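That running fatigue commentary ultimately boils down to turning measured stress cycles into an accumulated damage estimate. A simplified sketch follows, assuming a textbook S-N curve of the form N = a·S^-m and Miner’s linear damage rule; the curve parameters and cycle counts are purely illustrative, not Kongsberg’s actual model:

```python
def cycles_to_failure(stress_range_mpa, a=1e12, m=3.0):
    """S-N curve N = a * S**-m (a and m are illustrative placeholders)."""
    return a * stress_range_mpa ** -m

def miner_damage(observed_cycles):
    """Miner's rule: sum n_i / N_i over (stress_range_mpa, count) pairs.

    Damage approaching 1.0 indicates the component nears its fatigue life.
    """
    return sum(n / cycles_to_failure(s) for s, n in observed_cycles)

# hypothetical cycle counts binned from undersea strain-sensor data
damage = miner_damage([(50.0, 100_000), (80.0, 20_000)])
print(f"accumulated damage fraction: {damage:.4f}")  # → about 0.0227
```

In a live system the cycle counts would be updated continuously from the sensor stream, so the control room always sees the current damage fraction rather than a periodic estimate.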
Similarly, in the case of Kongsberg’s sea-going fleet, the PI System constantly analyses cylinder status, leakage in the piston rings and exhaust valves, and the performance of the injection system, among other vital functions. And because it’s of little use to be overloaded with data and not know what it all means, the system “visualises” the status of the engines and, for good measure, notifies the nature and pattern of any faults.

In short, Kongsberg Maritime’s control room has a running prognosis of the state of engine components, in much the same way that racing cars are monitored during an F1 grand prix. And it all shows up on a dashboard that, to the trained eye, is as informationally friendly as the dials on a family saloon. This is the kind of coherently organised data that enables big savings.
That’s why California’s Pacific Gas and Electric opted for the PI System. With 43 miles of distribution and 4,500 remote terminal units (RTUs) continuously transmitting data to a central control system, it’s essential to get a system-wide picture of all operations.

To give PG&E that picture, OSIsoft came up with a video wall. Tools such as video walls constitute one of the most valuable advances in the technology of exploration and production because they provide actionable knowledge: the stuff operators need to know. In PG&E’s case, the video wall was installed in the gas distribution control centre, where it converts a non-stop avalanche of data into valuable insights. For instance, operators can see at a glance how well the compressors are operating and issue instructions in the confidence that the data on which they’re basing decisions is accurate.

This is situational awareness. Because they can see pretty much everything that’s going on, operators can fix things before they cause problems.
Long way from anywhere…

When Tullow Oil turned on its technology-assisted production system, known as TAPS, in the remote Jubilee field in the Gulf of Guinea, offshore West Africa, it unleashed a stream of actionable data that the group has exploited ever since to extract maximum value from these isolated assets. So useful has TAPS been that it has given the company the confidence to develop the deepwater TEN field in the same area.

As the group acknowledges, the level of comfort provided by TAPS is vital to operations. “Without the real-time data and analysis tools provided by TAPS it would not have been possible to optimise the Jubilee wells and carry out continuous effective well and reservoir management,” says Tullow’s project manager Mark Whitehouse.

The gains have come right across the board. The group estimates that TAPS has saved millions of dollars in the maintenance of the multi-phase flow metering system alone. Based on the PI System, Tullow’s digital solution was developed, from conception to execution, within just 12 months.

But what exactly is TAPS? In summary, it’s a system for delivering data you can’t do without, especially when the data is gathered from a depth of 1,250m, roughly 50km off the coast.

Tullow attributes a range of benefits to TAPS. By automating a lot of routine jobs, such as alerting engineers about choke settings and well status, it frees up time for more creative work that adds value to the field. And in such a challenging location, there’s a lot of this kind of work. With many of the inspection chores done for them, engineers can put their heads together to resolve technical problems that, if not fixed, could reduce productivity.

But how does TAPS work? With the PI System at its heart, it’s essentially a ship-to-shore tool that gathers raw real-time data from strategically placed sensors and transmits it to programmable logic controller (PLC) application modules. The data passes through a screen at the control centre before being relayed through a server and collected by PI. These modules process the resulting stream of information―for example, about the state of the wellheads―and organise it into a shape that delivers important insights. All this happens offshore.
Next, the data is transmitted by satellite onshore to Accra, Ghana’s capital, where it flashes up on seven dedicated screens in easily visualised form. At this point Tullow has data it can use―and data is something it uses a lot of. Because the information is so accessible and designated employees see the same data, the company says TAPS facilitates collaboration between different functions.
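The ship-to-shore flow described above (sensors feeding offshore processing, with results relayed to onshore dashboards) can be sketched in miniature. Everything here is hypothetical: the tag names, the limits, and the JSON round-trip standing in for the satellite hop are stand-ins for the hardware and links the paper describes:

```python
import json

def read_sensors():
    # stand-in for polling subsea instrumentation
    return [{"tag": "wellhead_1.choke_pct", "value": 42.0},
            {"tag": "wellhead_1.pressure_bar", "value": 310.5}]

def process_offshore(readings, limits):
    # enrich each raw reading with an in-limit flag before transmission
    return [dict(r, ok=r["value"] <= limits[r["tag"]]) for r in readings]

def transmit(batch):
    # stand-in for the satellite hop: serialise, send, deserialise
    return json.loads(json.dumps(batch))

LIMITS = {"wellhead_1.choke_pct": 100.0, "wellhead_1.pressure_bar": 350.0}
onshore = transmit(process_offshore(read_sensors(), LIMITS))
for reading in onshore:
    print(reading["tag"], "OK" if reading["ok"] else "ALERT")
```

The design point the Tullow story illustrates is that the heavy processing happens offshore, so only compact, already-interpreted records cross the expensive satellite link.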
Armed with the lessons the group has learned from TAPS in the Jubilee field, Tullow is applying the system in the TEN area, comprising Tweneboa, Enyenra and Ntomme. There won’t be any new drilling in the area pending the resolution of a two-year maritime dispute between the Ivory Coast and Ghana, now before the International Tribunal for the Law of the Sea, but meantime Tullow is using TAPS to boost production from existing wells.
Dynamic data

Plagued by unplanned shutdowns and production losses in the Erb West field, 60km off the coast of Malaysia, Petronas Carigali turned to the PI System a few years ago. The oil major’s main problem in troubleshooting these interruptions was what it called stagnant data, which continually frustrated attempts to restore production.

It used to take three weeks before engineers onshore received information about an incident in the field and could present a report to management. Once a decision could be made, travel to the site had to be arranged. By then, the situation might have worsened, with consequent increases in the cost and difficulty of repair.

Also, since the technical information engineers were given was often inadequate, troubleshooting became more challenging. Lacking a digital database of historical information that could provide a perspective on which they could rely, engineers often went into action inadequately equipped. Sometimes there was no process data available at all, or it was hard to access because it was trapped in isolated control systems.

“[They were] not able to predict the potential problem [affecting] critical equipment or process instrumentation as there was no real-time monitoring system available,” explains Petronas instrumentation engineer Musreen Azwan. Thus engineers often had to work blind.
That has now all changed. A full panoply of PI System tools was deployed, including interfaces, notifications, servers, system management and learning channels. As a result, troubleshooting has become faster, easier and more effective. With much of the work automated―real-time process information and safe and critical operating limits, for instance, no longer have to be gathered by hand―a lot of highly paid hours have been saved. “[The system] has given significant return on investment in terms of savings in labour hours and unplanned shutdown time,” the group confirms.

Above all, engineers can now identify mounting problems and fix them before they affect production. Petronas Carigali’s capacity for mitigation has increased exponentially. As Azwan puts it, “stagnant data has been turned into a flow of profitable information.” Call it dynamic data.
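Notifications keyed to safe and critical operating limits, as described above, reduce to a simple grading of each live process value. A minimal sketch, with limit values invented purely for illustration:

```python
def classify(value, safe_max, critical_max):
    """Grade a live process value against its safe and critical operating limits."""
    if value > critical_max:
        return "critical"  # e.g. notify operations for immediate intervention
    if value > safe_max:
        return "warning"   # e.g. alert the engineer on duty
    return "normal"

# hypothetical separator pressure limits, in bar
print(classify(88.0, safe_max=90.0, critical_max=100.0))   # → normal
print(classify(95.0, safe_max=90.0, critical_max=100.0))   # → warning
print(classify(104.0, safe_max=90.0, critical_max=100.0))  # → critical
```

Run continuously against every monitored tag, this kind of check is what turns a three-week reporting lag into an alert that arrives before the situation worsens.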
So impressed has Petronas Carigali been by the results that it is progressively installing the PI System in remote sites across all its offshore operations.
Safer, cleaner, greener

As Dolphin Energy, Abu Dhabi’s biggest gas company, is well aware, one of the biggest challenges faced by energy providers is environmental, health and safety (EHS) compliance. Get it wrong and penalties are certain.

But Dolphin Energy has the extra pressure of satisfying the ideals of its chairman, His Highness Sheikh Hamdan Bin Zayed Al Nahayan. As he puts it: “We cherish our environment because it is an integral part of our country, our history and our heritage.”

This worthy goal requires an advanced informational infrastructure based on complex calculations and detailed reporting that involves all functions. Comprehensive documentation must be available for third-party auditors. All methodologies employed must be accurate and verifiable. And the data must be rock-solid.

Not surprisingly, until a few short years ago Dolphin Energy fell short of the chairman’s goals. The original EHS system, which was based on real-time monitoring of continuous emissions and fuel gas flow meters, was labour-intensive, consuming far too many man-hours with disappointing results. Also, as the company’s Dr. Rola Atiyeh, senior manager for environment and sustainability, acknowledged, it was unreliable for reporting purposes, particularly as regulations became more onerous and production increased. Dolphin Energy delivers an average daily volume of 2bn cubic feet (cf) of gas to customers such as Abu Dhabi Water and Electricity, the latter alone taking an average 99m scf a day.

OSIsoft stepped into the breach with the PI System. It deployed its latest-generation tools, including performance equations, system management, tag configurators, data links and the process book, the latter being a solution-oriented service that allows users to review multiple sets of data and quickly identify ways of enhancing data quality.

Working with Trinity Consultants and Dolphin Energy’s environmental department, automation engineers, operational groups, process and application engineers, the instrument department, laboratory technicians and outside consultants, OSIsoft developed a system that today delivers all the data relating to emission rates, carbon footprints, gas flares, gas temperatures, fuel usage and general efficiency, among other requirements.
Game of tags
Around the same time as Dolphin Energy was developing
its EHS architecture, on the other side of the world in
Canada, Suncor Energy was faced with a health and safety
issue that was wasting important operational time.
The group had been using bypass tags to monitor and
audit the safety-critical movement of personnel at two
key operations―Firebag and MacKay River that employ
steam-assisted gravity drainage techniques.
The system had however got out of control with
thousands of bypass tags issued, resulting in too many false
alarms that took key staff away from more important work.
Worried that that health and safety was being compromised,
Suncor management called in OSIsoft to see what it could
do. “It is very important to continuously monitor and audit
safety-critical bypasses as they compromise the functions to
protect humans and the environment,” underlines Suncor’s
manager of applications and infrastructure, Tripto Somani.
A few months later, the PI System was deployed with
the desired results and has been applied more widely ever
since. Now spread over four sites, it also covers Suncor’s
oil sands operations. As Somani explains, the company’s
key performance indicators show that the group complies
fully with energy and environmental regulatory reporting
as well as safety issues.
And staff aren’t caught up in wild goose chases.
The benefits fell straight to the bottom line. “We did the
project to ensure the air emissions calculation system can
produce reliable data for all air emission reporting including
corporate key performance indicators,” explains
Dr. Atiyeh. “The new system will save us a lot of time
in quality assurance and quality control. It is accurate,
verifiable and documented by scientific methodologies.”
As Trinity Consultants’ Sue Sung explains: “In short,
[Dolphin Energy] has 100%-validated data that meets
regulator’s requirements in the compliance and reporting
of environmental health and safety.”
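At their core, air emissions calculations of the kind described in this section multiply metered fuel flows by pollutant emission factors. A minimal sketch of that style of calculation follows; the factors, flow readings and pollutant list are invented for illustration and are not Dolphin Energy's figures:

```python
# Sketch of an emission-rate calculation of the kind performance
# equations automate: emissions = fuel gas flow x pollutant emission factor.
# All factors and readings below are illustrative only.

EMISSION_FACTORS_KG_PER_MSCF = {  # hypothetical kg of pollutant per thousand scf of fuel gas
    "CO2": 54.4,
    "NOx": 0.14,
}

def hourly_emissions_kg(fuel_flow_mscf_per_hr: float) -> dict:
    """Return kg/hr of each pollutant for a given fuel gas flow."""
    return {
        pollutant: fuel_flow_mscf_per_hr * factor
        for pollutant, factor in EMISSION_FACTORS_KG_PER_MSCF.items()
    }

# Three hourly fuel-gas flow readings (mscf/hr, made up) summed into a daily total.
readings = [120.0, 118.5, 121.2]
daily = {pollutant: 0.0 for pollutant in EMISSION_FACTORS_KG_PER_MSCF}
for flow in readings:
    for pollutant, kg in hourly_emissions_kg(flow).items():
        daily[pollutant] += kg

print({pollutant: round(kg, 2) for pollutant, kg in daily.items()})
```

In a production reporting system the factors would come from regulator-approved tables and the flows from the plant's continuous fuel gas meters, with full audit documentation of both.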
As the oil and gas industry searches at ever greater depths for the energy that sustains the global economy, it becomes proportionately more important, and more challenging, to know exactly how effectively equipment is functioning far below the earth's surface. And that's particularly true of drilling rates. Here's how Marathon Oil halved the average time it takes to get from spud to total depth, from 24 days to 12.
Marathon Oil had been working the difficult Eagle Ford
field in Texas for many years. At Eagle Ford, oil is produced
at depths of 5,000 to 8,000 feet to the northwest while dry
gas is found at depths of 10,000 to 12,000 feet to the
southeast, making it difficult to drill and complete wells.
So tough are the technical challenges that over the years
even experienced operators ran into insurmountable
problems and went exploring in easier locations.
In 2011, Marathon first began to apply the PI System tool box, including MaraDrill and Spotfire. The results weren't long in coming: within two years the crews had cut average drilling time by 50% and penetration rates were approaching 15,000 feet a day, significantly better than industry norms.
Up-to-the-minute data
The secret is knowing what's happening in real time at the sharp end. Every second, crews get a stream of 20 to 30 data points telling them how quickly and effectively they are drilling. Rate of penetration, weight on the bit, revolutions per minute, rate of mud flow, among other measurements: the data appears on screens in easily understood form. If the drill's slipping, they know. If it's sticking, they know.
Sensors provide the vital feedback that creates this running commentary of progress from a long way down. Integrated with Spotfire visualisation, the crews acquire a picture of how they're doing at any moment of the day or night, and can react quickly if anomalies show up on the screen.
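The simplest form of the per-second screening described above is a range check over each incoming channel. A minimal illustration, assuming hypothetical channel names and operating limits rather than Marathon's actual values:

```python
# Screen one second of drilling telemetry against expected operating
# ranges; channel names and limits are illustrative, not Marathon's.

LIMITS = {
    "rop_ft_per_hr": (50.0, 800.0),   # rate of penetration
    "wob_klbs":      (5.0, 40.0),     # weight on bit
    "rpm":           (40.0, 200.0),   # rotary speed
    "mud_flow_gpm":  (300.0, 900.0),  # mud flow rate
}

def flag_anomalies(sample: dict) -> list:
    """Return the channels in this one-second sample that are out of range."""
    flagged = []
    for channel, (lo, hi) in LIMITS.items():
        value = sample.get(channel)
        if value is not None and not (lo <= value <= hi):
            flagged.append(channel)
    return flagged

sample = {"rop_ft_per_hr": 620.0, "wob_klbs": 48.0, "rpm": 120.0, "mud_flow_gpm": 610.0}
print(flag_anomalies(sample))  # weight on bit is above its limit here
```

A real deployment would derive limits per well section and combine fixed bounds with trend-based checks, but the shape of the problem is the same.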
As a bonus, management gets much better visibility. Not only is it now possible to measure the relative performance of crews; executives can better understand the reasons for any production delays and idle drilling time.
Visibility at 2,500 metres…
A step change for Talisman
Talisman Energy is another company with a long and beneficial relationship with the PI System, one that saves many days of lost production every year.
Back in 2001, Talisman realised its North Sea operations were vulnerable to failures in its ever-growing portfolio of rotating equipment, much of which operated at considerable depths. Important though the drilling process is,
oil and gas companies need to know how well all their hard-
worked equipment is functioning. Because there are so many
different and interconnected types of machinery in offshore
exploration, a malfunction in any one of them can quickly
lead to knock-on failures and lost production. As far as
Talisman was concerned, prevention was the best cure.
After installing its first PI System on a single sub-sea
well, Talisman was so impressed by the results that the group
began to apply it more widely. Under a project called REEP
(for rotating equipment excellence programme), Talisman
used PI to monitor the performance of precisely 2,831 pieces
of machinery ranging from diesel-driven fire pumps and
emergency power generation packages to ballast pumps,
gas turbines and compressors and main oil line pumps.
In short, the purpose was to deliver the data that enables engineers to keep all these complex pieces of engineering up and running.
Reaping the rewards
Once the PI System was in full operation, Talisman quickly reaped the rewards. When a gas filter became fouled and threatened to blow out, Spotlight alerted the onshore base, which deployed a crew to replace it with a standby filter; the estimated saving was 14 days of lost production. Similarly, the system picked up an oil tank seal registering nearly twice its correct temperature, and the resulting fix saved about ten days of lost production.
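Catching a seal running at nearly twice its normal temperature is, at its simplest, a deviation check against a per-asset baseline. A minimal sketch in that spirit, with invented tag names, baselines and threshold rather than Talisman's:

```python
# Flag equipment whose latest reading deviates too far from its baseline,
# in the spirit of fleet-wide condition monitoring. Tags and values invented.

BASELINES_DEG_C = {
    "tank7.seal.temp": 45.0,        # hypothetical oil tank seal baseline
    "mol_pump2.bearing.temp": 60.0, # hypothetical main oil line pump bearing
}

def deviation_alerts(latest: dict, max_ratio: float = 1.5) -> list:
    """Return tags whose latest reading exceeds max_ratio times the baseline."""
    return [
        tag for tag, value in latest.items()
        if tag in BASELINES_DEG_C and value > max_ratio * BASELINES_DEG_C[tag]
    ]

latest = {"tank7.seal.temp": 88.0, "mol_pump2.bearing.temp": 61.5}
print(deviation_alerts(latest))  # the seal is running at nearly twice its baseline
```

The 1.5x ratio here is arbitrary; real condition monitoring would tune thresholds per equipment class and watch trends, not just single readings.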
In today’s depressed oil market, it’s insights like these that can make the difference between producing at breakeven and producing at a loss.
Coherency counts
As the examples in this paper show, the industry needs all the data it can get in the battle to break even in a largely hostile commercial environment.
But, as OSIsoft has learned over the years from working
alongside these and many other companies, a flood of
conflicting information deriving from diverse systems and a
heterogeneous array of units, protocols and formats can be
overwhelming and often leads to confused decision-making.
Put bluntly, data has to be banged into shape before
it’s useful. The PI System is a platform on which literally
hundreds of data-driven applications and services can be
mounted, all of them contributing to coherent insights that
provide executives and other levels of management with the
certainty they need to make sound commercial decisions.
And as data is piled on data, it often overwhelms systems
built for less information-rich times. That’s why we made
the PI System scalable so that it can grow with the demands
made on it.
Over the years the industry’s need for data has constantly grown and evolved. Data must be delivered in a common language, accessible anywhere on any kind of device, predictive in a way that forestalls hostile events, and organised so that it facilitates collaboration instead of being buried unproductively in silos.
It’s this knowledge that underpins the PI System’s
universality—that’s why it’s deployed at over 17,000 sites
around the world.