27. 2000 - 2010 :
Decade of “Big Data”
2010 - 2020 :
Decade of Sensing
28. “Oil and gas industry leaders continue to look to digital technologies as a way to address
some of the key challenges the industry faces today in this lower crude oil price cycle.
Making the most of big data, IIoT and automation are indeed the next big opportunities for
energy and oilfield services companies, and many are already starting work in these areas.
They are increasing investments in enabling people and assets, with a growing emphasis on
developing data supply chains to support analytics projects that can improve efficiencies,
manage cost and provide a competitive edge.
Companies who do not continue to invest in
digital technologies risk being left behind.”
The views expressed in this program do not represent the views of my employer. In fact, they would probably be really disturbed by the amount of cursing (if I curse) or if I mess up on anything
I’m also not able to tell you anything specifically about the way we structure data in our environment, or appear to endorse anything
Insert logo for PyLadies-HTX, Rice University, mention that you’re a geophysicist who works full-time for an oil company in downtown Houston, say something about the toolkit that you use at work (Python, R, Hadoop)
Go into a bit of an explanation of what a well log is
well logging parameters:
- resistivity
- image / dipmeter
- porosity
- density
- neutron porosity
- gamma ray
- self potential
- caliper
- NMR
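If there's time, show the audience what a log curve looks like in Python. This is a toy sketch: a synthetic gamma-ray curve plus the classic linear gamma-ray index, often used as a rough first-pass shale-volume estimate. Every number here (depths, API values, endpoints) is made up for illustration, not real log data:

```python
import numpy as np

# Synthetic gamma-ray curve sampled every half foot -- purely illustrative.
depth = np.arange(5000.0, 5100.0, 0.5)   # measured depth, feet
rng = np.random.default_rng(0)
gr = 45 + 60 * rng.random(depth.size)    # gamma ray, API units

# A classic first-pass use of the gamma-ray log: the linear gamma-ray
# index, often taken as a rough shale-volume estimate.
gr_clean, gr_shale = 30.0, 120.0         # assumed sand/shale endpoint picks
v_shale = np.clip((gr - gr_clean) / (gr_shale - gr_clean), 0.0, 1.0)

print(f"{depth.size} samples, mean Vshale = {v_shale.mean():.2f}")
```

Worth mentioning that the clean-sand and shale endpoints are picks, not constants – in practice they come from the log itself.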
1927 by Conrad Schlumberger, though he’d been formulating the idea since 1919
He sent down a sonde (sensor attached to a wire) into a 500m deep well in the Alsace region of France and started collecting information
“Electrical resistivity log”
All measurements were made by hand
Go into a bit of an explanation on what seismic is
1921 by J. Clarence Karcher, who was an Electrical Engineer
This is the means by which the majority of the world’s oil reserves have been discovered
Founded Geophysical Service Incorporated in 1930, which eventually turned into Texas Instruments
Got the idea because his assignment in World War I, the assignment that took him out of grad school, was to locate heavy artillery batteries in France by studying the acoustic waves the guns generated in the air.
He noticed an unexpected event in his research and switched his concentration to seismic waves in the earth
He thought it would be possible to determine the depths of the underlying geologic strata by vibrating the earth’s surface while precisely recording and timing the waves of energy
Earliest known oil wells were drilled in China, in 347 AD
These wells had depths of up to about 790 feet, and were drilled using bits attached to bamboo poles
Egyptians were using asphalt more than 4000 years ago, in the construction of the walls of Babylon. Ancient Persians were using petroleum for medicinal and lighting uses. The first streets of Baghdad were paved with tar.
Early exploration was full of befuddled “shoot the ground and a gusher comes up” situations. Producing dozens of barrels a day, maybe hundreds, but recovery rates were exceptionally low, and you weren’t really finding anything interesting.
I guess the point that I’m trying to make is that…
[read slide]
Advances in technology create a marked step change in petroleum exploration. Those advances are primarily in terms of better hardware / equipment, which give explorers better data about the subsurface. The data is the key.
Now, I’m a geophysicist – so those advances are the ones I’m best at spotting.
Point out the upticks for 2D seismic, better resolution for 3D seismic
80’s: 2D data acquired, pre-stack and post-stack imaging, Cray supercomputers
90’s: 3D narrow azimuth data, 3D post-stack and pre-stack imaging, Unix
00’s: 3D wide azimuth data, imaging, reverse time migration; Linux clusters
Now: coil shooting, continuous machine-generated sensory data
Mathematical insights – mention that last night you found out that the guy who first discovered the FFT was a Chevron employee, ain’t no thing
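If anyone asks why the FFT matters to us: a huge amount of seismic processing, starting with simple filtering, happens in the frequency domain. A minimal sketch on a synthetic trace (the sample interval and frequencies are just typical-looking values, not real acquisition parameters):

```python
import numpy as np

# A toy seismic trace: a 40 Hz "signal" plus broadband noise, sampled at
# 2 ms (a common seismic sample interval). Entirely synthetic.
dt = 0.002
t = np.arange(0, 1.0, dt)
rng = np.random.default_rng(1)
trace = np.sin(2 * np.pi * 40 * t) + 0.5 * rng.standard_normal(t.size)

# FFT-based band-pass: zero everything outside 10-60 Hz and invert.
spec = np.fft.rfft(trace)
freqs = np.fft.rfftfreq(t.size, d=dt)
spec[(freqs < 10) | (freqs > 60)] = 0
filtered = np.fft.irfft(spec, n=t.size)

print(f"dominant frequency: {freqs[np.argmax(np.abs(spec))]:.0f} Hz")
```

Real band-pass filters taper their edges rather than hard-zeroing, but the point for the audience is just how naturally seismic lives in the frequency domain.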
Point out fracking boom, mention that the crazy upward tick has continued, though the steepness of the slope has decreased a bit due to the drop in oil prices
Shamelessly stolen from wikipedia
7 out of 10 of the largest public, state-owned, and private businesses – and a huge proportion of the overall list. Trillions of dollars of revenue.
Direct link to reserves and success of a company. We’re selling a thing; the margins on the beef jerky you buy in a gas station are higher than the margins for a barrel of oil
Oil companies are all in the business of getting barrels out of the ground – so characterizing the subsurface is incredibly important. Both of those bits of data that I mentioned before – that came so late in the game – were huge technological step changes for the industry, and drastically impacted oil discovery.
Improved resolution within the reservoir is critical because deepwater wells cost a lot - $100 million or more – and fully exploiting assets is essential
The oil industry is a bit like an ecosystem. This particular piece is subsurface characterization – the earth science-y and engineering bits
Every image you see here has a data type (or more!) associated with it, and, though it’s getting better, a shortage of standards
So these components of the energy ecosystem, and this subsurface data workflow, can be grouped into “earth science-y bits” and “engineering bits” with this kind of fuzzy area in between with petrophysics
Earth scientists record millions and billions of data points called “seismic” and they don’t trust any of them unless you put them all together
Engineers trust pressure readings in the well, the stuff they can measure with sensors – and trust it everywhere, and extrapolate everywhere
Something that I should also mention is that this is an iterative process. I put a loop here, but in reality, all of these steps can feed back into one another – and a change to one component of the subsurface model drastically impacts all other components
New sorts of geology: horizontal drilling and hydraulic fracturing combined have been revolutionary
For example: “Unconventional resources” such as shale gas and tight oil supply 20% of the gas used in the USA and are expanding rapidly around the globe.
But want to hammer in: currently, recovery rates are only about 50%. The biggest risk is finding the oil; the second biggest risk is getting it out of the ground safely.
Seismic industry has evolved over the last decade by increasing the volume of data that is typically acquired and processed by about an order of magnitude every five years (2000)
But that’s changed
It’s exponential growth
In the 80’s, seismic was gigabytes in size; some people were still hand-interpreting on paper
5D interpolation: can produce file sets that exceed 100 TB in size
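A toy compounding check of that “order of magnitude every five years” rate – the gigabyte-scale survey in 1985 is an assumed starting point, purely for illustration:

```python
# Toy compounding of the "order of magnitude every five years" growth rate.
# The 1 GB starting size and the 1985 start year are assumptions.
size_gb, year = 1.0, 1985
while year < 2015:
    size_gb *= 10
    year += 5
print(f"by {year}: ~{size_gb:,.0f} GB")
```

That compounds to roughly a petabyte by the mid-2010s, within an order of magnitude of the 100 TB file sets mentioned above, which is about as close as a back-of-envelope growth rate gets.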
Chevron’s internal IT traffic alone exceeds 1.5 TB a day – and that’s 2013 numbers.
Shell is using fiberoptic cables created in a special partnership with HP for their sensors, and this data is transferred to AWS servers – 1TB / day
Coil seismic has replaced lines and grids – explain why, and explain why that impacts the size of the data that you’re looking at
CAT scanning of cores
What you’re seeing here is a subsection of the well – A&M has the largest set of core samples in the world, housed in a refrigerated warehouse on campus, if you’re dying to go see it
Pore-scale imaging (.01 to 10 microns) can generate large data sets, as well: a centimeter cubed can exceed 10GB, and when you take into account that you’re measuring 1000 meters of core, that’s 1 exabyte
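Worth having the arithmetic handy if someone asks how a centimeter cube turns into gigabytes: it’s all voxel resolution. A back-of-envelope sketch over the .01 to 10 micron range mentioned above, assuming 1 byte per voxel (that byte count is an assumption, not a vendor spec):

```python
# Back-of-envelope voxel counts for pore-scale core imaging,
# assuming 1 byte per voxel.
CM_IN_MICRONS = 10_000

def bytes_per_cm3(resolution_microns):
    voxels_per_edge = CM_IN_MICRONS / resolution_microns
    return voxels_per_edge ** 3  # 1 byte per voxel

for res in (10, 1, 0.1, 0.01):
    gb = bytes_per_cm3(res) / 1e9
    print(f"{res:>5} micron voxels: {gb:,.0f} GB per cm^3")
```

At 10 micron voxels a cm³ is about a gigabyte; at .01 microns the same cube is an exabyte, so per-well totals swing wildly with resolution.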
Reducing the approximations, improving the equations
Images taken from Schlumberger
All that I mentioned before was earth sciences or drilling related – impacting the “upstream” components of the oil industry.
But in reality, data impacts every single component of the oil and gas value chain. And what’s more: it’s a variety of data, coming in at asynchronous rates.
How we get it, how we transport it, how we process it, how we use it – and all of these components have the opportunity to be honed by analytics insights.
Streamlining the transport, refinement, and distribution of O&G is vital.
Just a few examples:
Refineries have limited capacity, and fuel needs to be produced as close as possible to its point of end use to minimize transportation costs. Complex algorithms take into account the cost of producing the fuel as well as diverse data such as economic indicators and weather patterns to determine demand, allocate resources and set prices at the pumps.
- With projects demanding more expensive drilling and production technology and profound changes in government regulations and commodities, companies need to exercise operational prudence and strategic foresight to ensure success.
- Greater competition for assets, and a smaller margin for error
- Studies show that a gradual shift to a data and technology-driven oilfield is expected to tap into 125 billion barrels of oil, equal to the current estimated reserves of Iraq
So this past decade, the first of the new millennium, 2000 – 2010, has been the decade of “big data”.
Kind of a buzzword, right? Like “in the cloud”.
and if you thought there was a lot of data in this first decade, you realize there's going to be a heck of a lot more in the second.
In a recent study (May 2015) from Microsoft and Accenture, 86 – 90% of respondents said that increasing their analytical, mobile, and internet of things capabilities would increase the value of their business
In the near term during the current low crude price cycle, approximately 3 out of 5 respondents said they plan to invest the same amount (32%) or more or significantly more (25%) in digital technologies
Mobility, infrastructure, and collaboration technologies currently are the biggest investment areas
In the next three to five years, investments are expected to increase in big data, the industrial IoT, and automation
- 89% noted that leveraging more analytics capabilities would add business value
- 90% felt more mobile tech in the field would add business value
- 86% felt leveraging more IIoT and automation would boost value
Rich Holsman, Accenture (global head of digital in Accenture’s energy industry group)