The document discusses the history of big data in the energy industry. It describes how early well logging in 1927 and the first seismograph in 1921 helped advance oil exploration by providing more data about subsurface conditions. Over time, technology improvements like 2D and 3D seismic imaging generated exponentially larger datasets. Today's datasets can exceed 100 terabytes from sources like coil seismic surveys. Advanced data collection and reservoir modeling are needed to optimize extraction from unconventional resources and maximize recovery rates from existing wells. Data now impacts the entire oil and gas value chain and will continue shaping the future of the energy industry.
NVIDIA is working on tackling climate change through the development of digital twins of Earth using AI and high performance computing. They are collaborating with various partners on initiatives like Destination Earth, which envisions an interactive digital twin platform for modeling and simulation. NVIDIA technologies like Omniverse, AI, and upcoming CPUs like Grace could help make a fully realized digital twin a reality. This would allow researchers to better understand climate systems and explore different scenarios to help mitigate and adapt to climate change.
This document discusses NVIDIA's efforts to move AI and accelerated computing technologies from research applications to real-world deployments across various domains. It outlines NVIDIA's hardware and software stack, including GPUs, DPUs, CPUs, and frameworks that can rearchitect data centers for AI. It also highlights several application areas, such as climate science, drug discovery, and cybersecurity, where NVIDIA is working to apply AI at scale using technologies like accelerated computing and graph neural networks.
Being the text of the ministerial address by
H.E. Dr. Kayode Fayemi, CON
Minister of Mines and Steel Development
Federal Republic of Nigeria
at the
iPAD Nigeria Mining Week
Miners Association of Nigeria;
in partnership with Dolf Madi Consulting;
and in association with PricewaterhouseCoopers and Spintelligent
Sheraton Hotel, Abuja, Nigeria | Tuesday, October 25, 2016
Podcasting is the distribution of multimedia files over the internet that users can download and listen to whenever they want. It is similar to a subscription to a radio program or spoken podcast that users receive regularly to enjoy in their free time. Podcasts can contain various types of content, such as news, documentaries, music, or interviews.
Shell uses big data analytics to explore for oil and gas reserves more efficiently. Sensors collect over a million readings during seismic surveys to identify potential drilling locations, and these readings are analyzed against global data to assess the probability of productive wells. Equipment sensors also monitor performance to forecast maintenance needs. This approach has increased Shell's rate of drilling productive wells by 1%, equivalent to roughly three additional years of global energy supply. Shell uses large-scale AWS infrastructure and dedicated analytics teams to optimize exploration and extraction costs in the face of limited resources.
Private Cloud Delivers Big Data in Oil & Gas v4 | Andy Moore
Santos implemented a private cloud solution to address the challenges of increasing data volumes and mobility needs of geoscientists. This centralized their data storage, allowed remote access from any location, and provided more processing power. It saved $5 million over 5 years. Santos is now exploring big data analytics to gain insights from their vast amounts of seismic, well, and production data through techniques like predictive modeling and self-organizing maps. This private cloud provides a platform to leverage increasing computing power and take advantage of rising data volumes and analysis capabilities into the future.
Digital oil field (DOF) systems have mainly focused on wells, production, and operations but are now expanding into field decisions and management. Integration of production-system models with reservoir models is growing in order to optimize production and recovery. Digital oil fields integrate technology, information, people, and processes to maximize asset performance and value across the oil and gas production life cycle. The purpose of digital or smart oilfield solutions is straightforward: to optimize production, improve operational efficiency, and increase productivity through an integrated workforce. This paper presents an introduction to the digital oil field. Paul A. Adekunte | Matthew N. O. Sadiku | Janet O. Sadiku, "Digital Oil Field", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-7, Issue-6, December 2023. URL: https://www.ijtsrd.com/papers/ijtsrd61268.pdf Paper URL: https://www.ijtsrd.com/engineering/other/61268/digital-oil-field/paul-a-adekunte
Reprocessing old seismic data from the 1970s and 1980s offshore Mexico provided new insights but lacked details on deep basin architecture. To address this, new modern seismic data was acquired starting in 2015. Direct comparison of the old and new data showed that the new data imaged much deeper reflectors around 8-13 km below the seafloor, revealed critical features like the Wilcox Trend and Cretaceous formations, and had almost double the usable spectral content. While reprocessing old data can still provide useful information, acquiring new seismic data using modern techniques allows for a more comprehensive understanding of regional geology through higher resolution and deeper imaging of basin structures.
Solving Geophysics Problems with Python | Paige Bailey
This document discusses using Python for solving problems in geophysics. It begins by defining geophysics as the application of physics to the study of the Earth, its environments, and its processes. It then discusses various geophysical themes like gravity, heat flow, electricity, fluid dynamics, magnetism, radioactivity, and vibration. The rest of the document focuses on different geophysical libraries and software that can be used with Python, applications of geophysics to energy exploration and production, and challenges of dealing with big data in upstream oil and gas.
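The geophysical themes listed above lend themselves to small numeric sketches. As a hedged illustration (not taken from the presentation itself), the following pure-Python snippet computes the gravity effect of an infinite horizontal Bouguer slab, a textbook formula from gravity surveying; the density and thickness values are assumed for the example.

```python
import math

G = 6.674e-11  # gravitational constant (m^3 kg^-1 s^-2)

def bouguer_slab_mgal(density_kg_m3, thickness_m):
    """Gravity effect of an infinite horizontal slab, in milligals."""
    g_si = 2 * math.pi * G * density_kg_m3 * thickness_m  # m/s^2
    return g_si / 1e-5  # 1 mGal = 1e-5 m/s^2

# Effect of a 100 m slab of average crustal rock (2670 kg/m^3): ~11.2 mGal
print(round(bouguer_slab_mgal(2670, 100), 2))
```

This matches the standard Bouguer correction of about 0.1119 mGal per meter for 2670 kg/m^3 rock.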
The document summarizes an interview with Rafael Salmi, president of Richardson RFPD, about their latest RF products. Some key points:
- Richardson RFPD is a leading supplier of RF products worldwide.
- Rafael Salmi discusses their latest RF products and technologies, including wireless gas tank monitoring and Wi-Fi identification technology.
- The interview provides an overview of Richardson RFPD's product offerings and their role as a global leader in RF solutions.
This document discusses new techniques for reducing exploration risk in challenging subsurface environments like deep water and beneath salt or basalt. It describes how 3D seismic surveys have improved drilling success rates from 25% to 50%, although success rates remain low in some areas, such as 10% for deep Gulf of Mexico wells. New methods of acquiring seismic data from multiple azimuths and with vertically aligned sources and receivers are providing higher quality images to further reduce risk by illuminating subsurface targets from more directions. Examples from various basins demonstrate how these improvements in seismic technology decrease the chance of drilling dry holes.
Bem presentation London Envirosec 2008 | Iphimedia LC
Presentation by Bem to defense industry technologists on the opportunities created by climate change. See EnviroSec'08 - London http://www.dynamixx-e2d.com/index.php/conferences/
This document discusses the importance of geologists relying on their geological knowledge and understanding, rather than solely on technological tools like workstations, when interpreting data and defining prospects. While technologies like workstations and nail guns can improve efficiency, they do not replace the expertise and skills of geologists or carpenters. An overreliance on workstations can lead interpreters to make inaccurate maps and assessments, resulting in unnecessary dry wells. True success comes from integrating technology with solid geological knowledge of basins and structural patterns.
GIS in the Rockies Geospatial Revolution | Peter Batty
GIS in the Rockies keynote presentation, September 15 in Loveland, CO. Much common content but slightly longer than the one I gave at NSGIC a couple of days previously.
The document provides a summary of Guy Tel-Zur's experience at the SC10 supercomputing conference. It outlines the various talks, panels, and presentations Tel-Zur attended over the course of the conference related to topics like computational physics, GPU computing, climate modeling, earthquake simulations, and the future of high performance computing. It also mentions visiting the exhibition and learning about technologies like Eclipse PTP, Elastic-R, Python for scientific computing, Amazon Cluster GPU instances, and the Top500 list of supercomputers.
Solving Geophysics Problems with Python - Speaker Notes | Paige Bailey
This document provides a summary of a presentation about solving geophysics problems with Python. The presentation introduces geophysics and lists popular Python libraries for working with geophysical data. It discusses the history of technologies in the oil and gas industry like well logs, seismography, and how advances in data acquisition and analysis have improved oil discovery. The presentation outlines the iterative workflow for subsurface characterization and notes how data impacts the entire oil and gas value chain. It predicts the current decade will be one of increased sensing and data collection as mobility, IoT, and analytics bring more value to the industry.
Daniel Bochicchio, Skybernetics - “Valuable Insights from On High: Drone use ... | Michael Hewitt, GISP
This document discusses how Skybernetics uses drone technology to provide measurement and data services that help environmental engineering companies adapt to changes in their industry. It highlights how drones can reduce costs, save time and effort, and generate new opportunities through capabilities like high-resolution aerial mapping, thermal imaging, and continuous monitoring. The document argues that digital transformation is necessary for organizations to survive and thrive, and that Skybernetics' solutions are designed to integrate seamlessly with customers' existing workflows.
This is the keynote talk fkw gave at cloudnet 2020. It covers all three cloudbursts we did. As of early 2021, slides 26ff are still the most detailed documentation of the 3rd cloudburst. This material will be covered in a future conference paper.
NRP Engagement webinar - Running a 51k GPU multi-cloud burst for MMA with Ic... | Igor Sfiligoi
NRP Engagement webinar: description of the 380 fp32 PFLOPs, 51k GPU multi-cloud burst using HTCondor to run IceCube photon propagation simulations.
Presented January 27th, 2020.
Digital Infrastructure in a Carbon Constrained World | Larry Smarr
09.01.15
Invited Presentation to the
West Coast Leadership Dialogue
Stanford University
Title: Digital Infrastructure in a Carbon Constrained World
Palo Alto, CA
Low Tg Underfill: The Good, The Bad, and The Ugly | Craig Hillman
Most state-of-the-art component packaging uses underfill to improve assembly and reliability. Unfortunately, most designers and users of component packaging have a poor understanding of what works and what doesn't. In this comprehensive primer, DfR Solutions provides the historical and scientific background for engineers to make practical decisions regarding underfill selection and qualification.
This document notes that producing 2.5 kg of a product results in 460 kg of carbon dioxide equivalent (CO2e) emissions if the product is sent to landfill. It also lists Gavin Starks' contact information.
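For context, the two figures above imply an emission factor that is easy to derive; this small sketch (the variable names are mine, not the document's) computes it:

```python
product_mass_kg = 2.5
landfill_emissions_kg_co2e = 460.0

# implied emission factor: kg CO2e per kg of product sent to landfill
factor = landfill_emissions_kg_co2e / product_mass_kg
print(factor)  # 184.0
```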
This document discusses how established industries can be disrupted by discontinuous innovation. It provides two examples: the ice harvesting industry in the late 19th century was replaced by the artificial ice making industry enabled by refrigeration technology. In the late 20th century, the computer disk drive industry catering to mini-computers faced disruption from lower-cost disk drives enabling the emergence of the personal computer market. The document argues this pattern of steady innovation punctuated by dramatic shifts leading to industry replacement is common and discusses how new business models can also drive discontinuous change.
1) The document discusses easy and low-cost energy conservation measures that were proposed in the 1980s but largely ignored, such as installing insulation, adjusting lighting schedules, and maintaining steam traps.
2) While the author's report on these measures was well-received, no significant action was taken due to budget cuts. Instead of implementing an energy efficiency program, staff was laid off.
3) The document argues that seriously pursuing energy conservation, even simple measures, has been largely talked about rather than implemented at the operating level since the 1970s. Future economic constraints may make transitioning to more sustainable practices more difficult.
Comparative analysis between traditional aquaponics and reconstructed aquapon... | bijceesjournal
The aquaponic system of planting is a method that does not require soil. It needs only water, fish, lava rocks (a substitute for soil), and plants. Aquaponic systems are sustainable and environmentally friendly: they enable planting in small spaces, reduce the use of artificial chemicals, and minimize excess water use, as aquaponics consumes 90% less water than soil-based gardening. The study applied a descriptive and experimental design to assess and compare conventional and reconstructed aquaponic methods for propagating tomatoes. The researchers created an observation checklist to determine the significant factors of the study. The study aims to determine the significant difference between traditional and reconstructed aquaponics systems propagating tomatoes in terms of height, weight, girth, and number of fruits. The reconstructed aquaponics system's higher growth yield results in a much more nourished crop than the traditional system: it is superior in number of fruits, height, weight, and girth. Moreover, the reconstructed system is shown to eliminate the hindrances present in the traditional system, namely overcrowding of fish, algae growth, pest problems, contaminated water, and dead fish.
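The height/weight/girth comparison described above is the kind of question a two-sample test answers. As a hedged sketch (the sample values below are hypothetical, not the study's data), Welch's t-statistic can be computed in pure Python:

```python
import math

def welch_t(a, b):
    """Welch's t-statistic for two independent samples (unequal variances)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# hypothetical plant heights (cm) for illustration only
trad = [42.0, 45.5, 40.1, 43.8]
recon = [55.2, 58.7, 53.9, 57.1]
print(round(welch_t(recon, trad), 2))
```

A large positive statistic would suggest the reconstructed system's plants are taller than chance alone explains; a real analysis would also compute degrees of freedom and a p-value (e.g. with `scipy.stats.ttest_ind(..., equal_var=False)`).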
Generative AI Use cases applications solutions and implementation.pdf | mahaffeycheryld
Generative AI solutions encompass a range of capabilities from content creation to complex problem-solving across industries. Implementing generative AI involves identifying specific business needs, developing tailored AI models using techniques like GANs and VAEs, and integrating these models into existing workflows. Data quality and continuous model refinement are crucial for effective implementation. Businesses must also consider ethical implications and ensure transparency in AI decision-making. Generative AI's implementation aims to enhance efficiency, creativity, and innovation by leveraging autonomous generation and sophisticated learning algorithms to meet diverse business challenges.
https://www.leewayhertz.com/generative-ai-use-cases-and-applications/
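The summary above names GANs and VAEs, which require a deep-learning stack; as a deliberately tiny stand-in for the core idea of "learning patterns from data and generating new samples", here is a character-level Markov chain in pure Python (a toy illustration, not a GAN or VAE):

```python
import random

def train_markov(text, order=2):
    """Build a character-level Markov model: context -> list of next chars."""
    model = {}
    for i in range(len(text) - order):
        ctx, nxt = text[i:i + order], text[i + order]
        model.setdefault(ctx, []).append(nxt)
    return model

def generate(model, seed, length=40, rng=None):
    """Sample new text one character at a time from the learned contexts."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    out = seed
    for _ in range(length):
        choices = model.get(out[-len(seed):])
        if not choices:
            break
        out += rng.choice(choices)
    return out

corpus = "generative models learn patterns from data and generate new samples "
model = train_markov(corpus * 3)
print(generate(model, "ge"))
```

The same train-then-sample loop, scaled up from character counts to learned neural distributions, is the shape of the generative models the document discusses.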
Introduction: e-waste – definition, sources of e-waste, hazardous substances in e-waste, effects of e-waste on environment and human health, need for e-waste management, e-waste handling rules, waste minimization techniques for managing e-waste, recycling of e-waste, disposal and treatment methods of e-waste, mechanism of extraction of precious metals from leaching solution, global scenario of e-waste, e-waste in India, case studies.
Applications of artificial Intelligence in Mechanical Engineering.pdf | Atif Razi
Historically, mechanical engineering has relied heavily on human expertise and empirical methods to solve complex problems. With the introduction of computer-aided design (CAD) and finite element analysis (FEA), the field took its first steps towards digitization. These tools allowed engineers to simulate and analyze mechanical systems with greater accuracy and efficiency. However, the sheer volume of data generated by modern engineering systems and the increasing complexity of these systems have necessitated more advanced analytical tools, paving the way for AI.
AI offers the capability to process vast amounts of data, identify patterns, and make predictions with a level of speed and accuracy unattainable by traditional methods. This has profound implications for mechanical engineering, enabling more efficient design processes, predictive maintenance strategies, and optimized manufacturing operations. AI-driven tools can learn from historical data, adapt to new information, and continuously improve their performance, making them invaluable in tackling the multifaceted challenges of modern mechanical engineering.
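Predictive maintenance, mentioned above, often starts far simpler than deep learning. As a hedged sketch with made-up vibration readings (none of this comes from the document), a trailing-window threshold can flag a developing fault:

```python
def flag_anomalies(readings, window=5, threshold=1.5):
    """Flag readings that exceed the trailing-window mean by `threshold`x."""
    flags = []
    for i in range(window, len(readings)):
        baseline = sum(readings[i - window:i]) / window
        flags.append(readings[i] > threshold * baseline)
    return flags

# hypothetical vibration amplitudes; the spike mimics a developing fault
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 0.95, 1.1, 2.4, 1.0]
print(flag_anomalies(vibration))  # [False, False, False, True, False]
```

Production systems replace the fixed threshold with a model learned from historical failure data, which is where the AI techniques the document describes come in.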
Advanced control scheme of doubly fed induction generator for wind turbine us... | IJECEIAES
This paper describes a speed control device for generating electrical energy on an electricity network based on the doubly fed induction generator (DFIG) used for wind power conversion systems. First, a doubly fed induction generator model was constructed. A control law is formulated to govern the flow of energy between the stator of a DFIG and the energy network using three types of controllers: proportional integral (PI), sliding mode controller (SMC), and second order sliding mode controller (SOSMC). Their results are compared in terms of power reference tracking, reaction to unexpected speed fluctuations, sensitivity to perturbations, and resilience against machine parameter alterations. MATLAB/Simulink was used to conduct the simulations for the preceding study. Multiple simulations have shown very satisfying results, and the investigations demonstrate the efficacy and power-enhancing capabilities of the suggested control system.
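Of the three controllers compared, the PI controller is the simplest to sketch. The following toy loop (the gains and the first-order plant below are assumptions for illustration, not the paper's DFIG model) shows a discrete PI controller driving a power output to its reference:

```python
def pi_controller(kp, ki, dt):
    """Return a stateful discrete PI controller mapping error -> control output."""
    integral = 0.0
    def step(error):
        nonlocal integral
        integral += error * dt  # accumulate the integral term
        return kp * error + ki * integral
    return step

# track a unit power reference against a crude first-order plant (toy model)
ctrl = pi_controller(kp=0.8, ki=2.0, dt=0.01)
power, reference = 0.0, 1.0
for _ in range(500):
    u = ctrl(reference - power)
    power += 0.05 * (u - power)  # first-order plant response
print(round(power, 3))
```

The integral term removes the steady-state error a proportional-only controller would leave; the SMC and SOSMC variants studied in the paper trade this simplicity for robustness to parameter changes.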
Software Engineering and Project Management - Software Testing + Agile Method...Prakhyath Rai
Software Testing: A Strategic Approach to Software Testing, Strategic Issues, Test Strategies for Conventional Software, Test Strategies for Object -Oriented Software, Validation Testing, System Testing, The Art of Debugging.
Agile Methodology: Before Agile – Waterfall, Agile Development.
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...IJECEIAES
Climate change's impact on the planet forced the United Nations and governments to promote green energies and electric transportation. The deployments of photovoltaic (PV) and electric vehicle (EV) systems gained stronger momentum due to their numerous advantages over fossil fuel types. The advantages go beyond sustainability to reach financial support and stability. The work in this paper introduces the hybrid system between PV and EV to support industrial and commercial plants. This paper covers the theoretical framework of the proposed hybrid system including the required equation to complete the cost analysis when PV and EV are present. In addition, the proposed design diagram which sets the priorities and requirements of the system is presented. The proposed approach allows setup to advance their power stability, especially during power outages. The presented information supports researchers and plant owners to complete the necessary analysis while promoting the deployment of clean energy. The result of a case study that represents a dairy milk farmer supports the theoretical works and highlights its advanced benefits to existing plants. The short return on investment of the proposed approach supports the paper's novelty approach for the sustainable electrical system. In addition, the proposed system allows for an isolated power setup without the need for a transmission line which enhances the safety of the electrical network
DEEP LEARNING FOR SMART GRID INTRUSION DETECTION: A HYBRID CNN-LSTM-BASED MODELijaia
As digital technology becomes more deeply embedded in power systems, protecting the communication
networks of Smart Grids (SG) has emerged as a critical concern. Distributed Network Protocol 3 (DNP3)
represents a multi-tiered application layer protocol extensively utilized in Supervisory Control and Data
Acquisition (SCADA)-based smart grids to facilitate real-time data gathering and control functionalities.
Robust Intrusion Detection Systems (IDS) are necessary for early threat detection and mitigation because
of the interconnection of these networks, which makes them vulnerable to a variety of cyberattacks. To
solve this issue, this paper develops a hybrid Deep Learning (DL) model specifically designed for intrusion
detection in smart grids. The proposed approach is a combination of the Convolutional Neural Network
(CNN) and the Long-Short-Term Memory algorithms (LSTM). We employed a recent intrusion detection
dataset (DNP3), which focuses on unauthorized commands and Denial of Service (DoS) cyberattacks, to
train and test our model. The results of our experiments show that our CNN-LSTM method is much better
at finding smart grid intrusions than other deep learning algorithms used for classification. In addition,
our proposed approach improves accuracy, precision, recall, and F1 score, achieving a high detection
accuracy rate of 99.50%.
2. Slide 2
WARNING!(or disclaimer, rather)
The views expressed in this program do not represent the views of my employer. In fact, they would
probably be really disturbed by the amount of cursing (if I curse) or by anything I mess up.
I’m also not able to tell you anything specific about how we structure data in our environment, or to
appear to endorse anything.
5. Slide 5
First well log?
Explain what a well log is
Squiggly thing
well logging parameters:
- resistivity
- image / dipmeter
- porosity
- density
- neutron porosity
- gamma ray
- self potential
- caliper
- NMR
6. Slide 6
- 1927 by Conrad Schlumberger, though he’d been formulating the idea since 1919
- He sent down a sonde (sensor attached to a wire) into a 500m deep well in the Alsace region of France
and started collecting information
- “Electrical resistivity log”
- All measurements were made by hand
8. Slide 8
- 1921 by J. Clarence Karcher, who was an Electrical Engineer
- This is the means by which the majority of the world’s oil reserves have been discovered
- Founded Geophysical Service Incorporated in 1930, which eventually turned into Texas Instruments
- Got the idea because his assignment in World War I, the assignment that took him out of grad school,
was to locate heavy artillery batteries in France by studying the acoustic waves the guns generated in the
air.
- He noticed an unexpected event in his research and switched his focus to seismic waves in the
earth
- Karcher thought it would be possible to determine the depths of the underlying geologic strata by
vibrating the earth’s surface while precisely recording and timing the waves of energy
10. Slide 10
- Earliest known oil wells were drilled in China, in 347 AD
- These wells had depths of up to about 790 feet, and were drilled using bits attached to bamboo poles
Asphalt was used more than 4,000 years ago in the construction of the walls of Babylon. Ancient
Persians used petroleum for medicinal and lighting purposes. The first streets of Baghdad were paved with
tar.
Early drilling was a befuddled “shoot the ground and a gusher comes up” affair. Wells produced dozens of barrels a day, maybe
hundreds, but recovery rates were exceptionally low, and you weren’t really finding anything interesting.
Production eventually got up to millions of barrels in the early 1900s, but oil still wasn’t the primary fuel source.
Oil as “unwanted byproduct” when drilling for salt wells
11. Slide 11
Drilling has been around for a long
time, but it’s only been successful due
to data acquisition methods.
I guess the point that I’m trying to make is that…
[read slide]
Advances in technology create a marked step change in petroleum exploration. Those advances are
primarily in terms of better hardware / equipment, which give explorers better data about the subsurface.
The data is the key.
12. Slide 12
Now, I’m a geophysicist – so those advances are the ones I’m best at spotting.
- Point out the upticks for 2D seismic, better resolution for 3D seismic
80’s: 2D data acquired, pre-stack and post-stack imaging, Cray supercomputers
90’s: 3D narrow azimuth data, 3D post-stack and pre-stack imaging, Unix
00’s: 3D wide azimuth data, imaging, reverse time migration; Linux clusters
Now: coil shooting, continuous machine-generated sensory data
Mathematical insights – mention that last night you found out that the guy who first discovered the FFT was
a Chevron employee, was just doing his job, ain’t no thing
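Since the FFT gets a shout-out here, it's worth noting that it is the workhorse behind routine seismic processing such as frequency filtering of traces. A minimal sketch (assuming NumPy is available; the synthetic trace, 2 ms sample interval, and 60 Hz cutoff are illustrative values, not from any real survey):

```python
import numpy as np

# Synthetic seismic trace: a 35 Hz reflection-band signal plus
# higher-frequency noise, sampled at 2 ms (a common seismic rate).
dt = 0.002                                  # sample interval, seconds
t = np.arange(0, 1.0, dt)                   # one second of data
trace = np.sin(2 * np.pi * 35 * t) + 0.5 * np.sin(2 * np.pi * 180 * t)

# FFT-based low-pass filter: transform, zero the bins above 60 Hz,
# transform back. This is the kind of operation run on billions of
# traces in a modern seismic processing flow.
spectrum = np.fft.rfft(trace)
freqs = np.fft.rfftfreq(len(trace), d=dt)
spectrum[freqs > 60.0] = 0.0
filtered = np.fft.irfft(spectrum, n=len(trace))
```

The same idea shows up in bandpass filtering, deconvolution, and migration, all of which lean on the FFT's O(n log n) cost to stay tractable at survey scale.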
13. Slide 13
Point out fracking boom, mention that the crazy upward tick has continued, though the steepness of the
slope has decreased a bit due to the drop in oil prices
When oil prices are high, advances are primarily in terms of engineering: getting the stuff out of
the ground more quickly
When prices drop, the emphasis shifts towards analytics, more nuanced ways of optimizing
production
15. Slide 15
World’s largest public, state-owned,
and private businesses
Shamelessly stolen from wikipedia
16. Slide 16
World’s largest public, state-owned,
and private businesses
7 out of 10
7 out of 10 of the largest public, state-owned, and private businesses – and a huge proportion of the overall
list. Trillions of dollars of revenue.
Direct link to reserves and success of a company. We’re selling a thing; the margins on the beef jerky you
buy in a gas station are higher than the margins for a barrel of oil
17. Slide 17
Profitability for oil companies is
directly tied to reserves.
Oil companies are all in the business of getting barrels out of the ground – so characterizing the subsurface
is incredibly important. Both of those bits of data that I mentioned before – that came so late in the game –
were huge technological step changes for the industry, and drastically impacted oil discovery.
Improved resolution within the reservoir is critical because deepwater wells cost a lot - $100 million or
more – and fully exploiting assets is essential
18. Slide 18
- Mapping
- Reservoir Characterization
- Cross-sections
- Petrophysics
- Reservoir Simulation
- Well Planning & Drilling Simulation
- Stratigraphic Modeling
- Seismic Interpretation
The oil industry is a bit like an ecosystem. This particular piece is subsurface characterization – the earth
science-y and engineering bits
- Every image you see here has a data type (or more!) associated with it, and, though it’s getting better, a
shortage of standards
19. Slide 19
So these components of the energy ecosystem, and this subsurface data workflow, can be grouped into
“earth science-y bits” and “engineering bits”, with a kind of fuzzy area in between around petrophysics.
Earth scientists record millions and billions of data points called “seismic”, and they don’t trust any of them
unless you put them all together.
Engineers trust pressure readings in the well – the stuff they can measure with sensors – and they trust
those readings everywhere, extrapolating them across the whole reservoir.
20. Slide 20
Something that I should also mention is that this is an iterative process. I put a loop here, but in reality, all
of these steps can feed back into one another – and a change to one component of the subsurface model
drastically impacts all other components
New sorts of geology: horizontal drilling and hydraulic fracturing combined have been revolutionary.
For example: “unconventional resources” such as shale gas and tight oil supply 20% of the gas used
in the USA, and their use is expanding rapidly around the globe.
But I want to hammer this in: currently, recovery rates are only about 50%. The biggest risk is finding the oil; the
second biggest risk is getting it out of the ground safely.
21. Slide 21
How big is “big”?
Seismic industry has evolved over the last decade by increasing the volume of data that is typically
acquired and processed by about an order of magnitude every five years (2000)
But that’s changed
It’s exponential growth
In the ’80s, seismic was gigabytes in size; some people were still hand-interpreting on paper.
5D interpolation can produce file sets that exceed 100 TB in size.
Chevron’s internal IT traffic alone exceeds 1.5 TB a day – and that’s a 2013 number.
Shell is using fiber-optic cables created in a special partnership with HP for their sensors, and this
data is transferred to AWS servers – 1 TB a day.
Coil seismic has replaced lines and grids – explain why, and explain why that impacts the size of the
data that you’re looking at.
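The growth claim above – roughly an order of magnitude every five years – is easy to sanity-check with a little arithmetic. A sketch (the gigabyte-scale 1985 starting point is an assumption based on the ’80s figure in these notes):

```python
# Back-of-the-envelope projection of seismic survey sizes under the
# "10x every five years" growth rule quoted in the notes.
START_YEAR, START_GB = 1985, 1.0   # assumed gigabyte-scale 1980s surveys
GROWTH_PER_5YR = 10.0

def survey_size_gb(year):
    """Projected survey size in GB under the 10x-per-5-years rule."""
    return START_GB * GROWTH_PER_5YR ** ((year - START_YEAR) / 5)

# Thirty years of that growth turns gigabytes into petabytes, which is
# consistent with 100 TB+ file sets from 5D interpolation.
for year in (1985, 1995, 2005, 2015):
    print(year, survey_size_gb(year), "GB")
```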
22. Slide 22
CAT scanning of cores
What you’re seeing here is a subsection of the well – A&M has the largest set of core samples in the
world, housed in a refrigerated warehouse on campus, if you’re dying to go see it.
Pore-scale imaging (0.01 to 10 microns) can generate large data sets as well: a cubic centimeter
can exceed 10 GB, and when you take into account that you’re measuring 1,000 meters of core,
the total can reach exabyte scale.
Reducing the approximations, improving the equations
Images taken from Schlumberger
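The core-imaging arithmetic can be sketched the same way. The core diameter below is an assumed illustrative value (the notes give only the per-cubic-centimeter figure); under that assumption the total lands in the tens of petabytes, and finer voxel resolution or larger cores push it toward the exabyte scale mentioned:

```python
import math

GB_PER_CM3 = 10          # from the notes: ~10 GB per cm^3 of pore-scale imagery
CORE_LENGTH_M = 1000     # 1000 m of core, as in the notes
CORE_DIAMETER_CM = 10.0  # assumed; real cores are typically several cm across

# Treat the core as a cylinder and work out its volume in cm^3.
area_cm2 = math.pi * (CORE_DIAMETER_CM / 2) ** 2
volume_cm3 = area_cm2 * CORE_LENGTH_M * 100   # 100 cm per meter

total_gb = volume_cm3 * GB_PER_CM3
total_pb = total_gb / 1e6                     # 1 PB = 1e6 GB
print(f"{total_pb:.0f} PB of pore-scale imagery")
```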
23. Slide 23
Data impacts the entire value chain.
All that I mentioned before was earth sciences or drilling related – impacting the “upstream” components
of the oil industry.
But in reality, data impacts every single component of the oil and gas value chain. And what’s more: it’s a
variety of data, coming in at asynchronous rates.
24. Slide 24
How we get it, how we transport it, how we process it, how we use it – and all of these components
have the opportunity to be honed by analytics insights.
Streamlining the transport, refinement, and distribution of O&G is vital.
Just a few examples:
Refineries have limited capacity, and fuel needs to be produced as close as possible to its point of
end use to minimize transportation costs. Complex algorithms take into account the cost of
producing the fuel as well as diverse data such as economic indicators and weather patterns to
determine demand, allocate resources and set prices at the pumps.
- With projects demanding more expensive drilling and production technology, and profound
changes in government regulations and commodity markets, companies need to exercise operational
prudence and strategic foresight to ensure success.
- Greater competition for assets, and a smaller margin for error
- Studies show that a gradual shift to a data and technology-driven oilfield is expected to tap
into 125 billion barrels of oil, equal to the current estimated reserves of Iraq
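The refinery example above – cost, economic indicators, and weather feeding a demand estimate and a pump price – can be caricatured in a few lines. Everything here (function names, weights, inputs) is invented for illustration; real pricing systems are far more elaborate:

```python
# Toy demand-and-pricing model combining the kinds of inputs mentioned
# above. All coefficients are hypothetical illustrative values.

def estimate_demand(base_demand, econ_index, avg_temp_c):
    """Rough linear demand adjustment (hypothetical weights)."""
    econ_effect = 0.05 * (econ_index - 100)      # demand rises with the economy
    weather_effect = -0.01 * (avg_temp_c - 15)   # mild weather trims fuel use
    return base_demand * (1 + econ_effect + weather_effect)

def price_at_pump(cost_per_gal, demand, capacity):
    """Cost-plus price with a scarcity premium as demand nears capacity."""
    utilization = min(demand / capacity, 1.0)
    return cost_per_gal * (1.10 + 0.25 * utilization)   # 10% base margin

demand = estimate_demand(base_demand=1_000_000, econ_index=104, avg_temp_c=25)
price = price_at_pump(cost_per_gal=2.40, demand=demand, capacity=1_200_000)
```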
26. Slide 26
2000 - 2010 :
Decade of “Big Data”
So this past decade, the first of the new millennium, 2000 – 2010, has been the decade of “big data”.
Kind of a buzzword, right? Like “in the cloud”. “In the cloud” is just a server in a warehouse somewhere.
How “big” does data have to be before it’s big data?
27. Slide 27
2000 - 2010 :
Decade of “Big Data”
2010 - 2020 :
Decade of Sensing
- and if you thought there was a lot of data in this first decade, you realize there's going to be a heck of a
lot more in the second.
- In a recent study (May 2015) from Microsoft and Accenture, 86–90% of respondents said that
increasing their analytics, mobile, and internet-of-things capabilities would increase the value of
their business
- In the near term, during the current low crude price cycle, approximately 3 out of 5 respondents
said they plan to invest the same amount (32%) or more to significantly more (25%) in digital
technologies
- Mobility, infrastructure, and collaboration technologies are currently the biggest investment
areas
- In the next three to five years, investments are expected to increase in big data, the industrial
IoT, and automation
- 89% noted that leveraging more analytics capabilities would add business value
- 90% felt more mobile tech in the field would add business value
- 86% said leveraging more IIoT and automation would boost value
28. Slide 28
“Oil and gas industry leaders continue to look to digital technologies as a way to address
some of the key challenges the industry faces today in this lower crude oil price cycle.
Making the most of big data, IIoT and automation are indeed the next big opportunities for
energy and oilfield services companies, and many are already starting work in these areas.
They are increasing investments in enabling people and assets, with a growing emphasis on
developing data supply chains to support analytics projects that can improve efficiencies,
manage cost and provide a competitive edge.
Companies who do not continue to invest in
digital technologies risk being left behind.”
- Rich Holsman, Accenture (global head of digital in Accenture’s energy industry group)