Alan Lindsey, founder and CEO, PetroDE, speaks during a session at DUG Permian 2015. Other panelists are Ben Shattuck, upstream analyst, Wood Mackenzie; Luther Birdzell, CEO, OAG Analytics; and James Yockey, president, Oseberg. (Source: Hart Energy)
Originally published on Oil and Gas Investor by Scott Weeden, Hart Energy
Given the amount of seismic shot and the number of wells drilled since the mid-1950s, the oil and gas industry has amassed enormous volumes of data. Long before the concept of “Big Data” was popular, the E&P industry was searching for ways to tap that resource.
“Fundamentally, Big Data is about transforming data from a cost into a revenue-generating asset,” said Luther Birdzell, CEO, OAG Analytics. “It can be measured as incremental barrels of oil per dollar deployed. I think Big Data would be broadly defined as data storage requirements that exceeded commercially available technology.”
Birdzell participated in a roundtable on “Big Data Solutions” at the DUG Permian Conference on May 20, along with Ben Shattuck, upstream analyst, Wood Mackenzie; Alan Lindsey, founder and CEO, PetroDE; and James Yockey, president, Oseberg.
The complexity of unconventional development is much greater than that of conventional development, and it comes with far more data from 3-D and 4-D seismic and microseismic technology. Computing capacity doubles every two years, adding to the complexity, Birdzell explained. “The industry can benefit most quickly and with the greatest magnitude by approaching this problem from the place of maximizing value to the user by minimizing complexity.”
Shattuck pointed out that his company “fundamentally changed the way we look at things like forecasting. It is really due to the advent of Big Data. How do you take four million wells and separate the noise and distraction from what really matters and roll that up into a meaningful industry-level view?
“It is wrapping your mind around what that dataset means, what are the components of that dataset, and understanding―particularly as we roll this out to people who are not very technically minded―what are the capabilities and just as importantly the limitations of that dataset,” he said.
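To make the roll-up Shattuck describes concrete, the sketch below shows one minimal way to collapse a large, noisy well-level dataset into a play-level view, trimming outliers before aggregating. It is an illustration only, not Wood Mackenzie’s method, and the column names (play, ip_boed) are hypothetical.

```python
# A minimal sketch, assuming a flat well-header table with hypothetical
# columns "play" (play name) and "ip_boed" (initial production, boe/d).
import pandas as pd

def rollup_play_view(wells: pd.DataFrame) -> pd.DataFrame:
    """Collapse noisy well-level IP rates into a play-level summary."""
    clean = wells.dropna(subset=["play", "ip_boed"])
    # Clip the extreme tails so a handful of misreported wells
    # don't distort the industry-level view.
    lo, hi = clean["ip_boed"].quantile([0.01, 0.99])
    clean = clean[clean["ip_boed"].between(lo, hi)]
    return (clean.groupby("play")["ip_boed"]
            .agg(well_count="count", median_ip="median", mean_ip="mean")
            .sort_values("median_ip", ascending=False))

# Usage (hypothetical file): rollup_play_view(pd.read_csv("wells.csv"))
```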
Maximizing Data Analysis
With all of the microseismic, geochemical, geomechanical and production information coming out of the wellbore, there is increased focus on getting the most value out of that data. Most companies take the information on a well-by-well basis, analyze it, subject it to intense scrutiny and then put it on a shelf, Lindsey said.
“What is available now through the Cloud and data technologies is the ability to keep that data alive and on low-cost systems for instant access. Our industry is always worrying about how to optimize these completions, and someone will come up with a new technique. What you need to do essentially is reprocess all that old data to understand what it means,” he continued.
“The ultimate is to take us from these expensive well-spacing tests, where we’re drilling physical wells, to determining that spacing from our microseismic, geochemical and geomechanical measurements so we can get very close to the correct answer right out of the starting gate,” he emphasized. “That will end up saving you millions of dollars.”
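As a deliberately simplified illustration of the spacing idea Lindsey describes, not PetroDE’s workflow, the sketch below infers a lateral spacing from microseismic event offsets alone; a real analysis would also integrate the geochemical and geomechanical measurements he mentions.

```python
# Toy sketch: estimate well spacing from microseismic event offsets
# (distance of events from the wellbore, in feet). All parameters
# (percentile, buffer) are illustrative assumptions.
import numpy as np

def spacing_from_microseismic(event_offsets_ft, pct=85, buffer_ft=50):
    """Take the pct-th percentile offset as an effective stimulated
    half-width, then set spacing so neighboring fracture networks
    just meet, plus a buffer against well interference."""
    half_width = np.percentile(np.abs(event_offsets_ft), pct)
    return 2 * half_width + buffer_ft

# Example with synthetic offsets (ft):
rng = np.random.default_rng(7)
offsets = rng.normal(0, 220, size=2000)
print(round(spacing_from_microseismic(offsets)))  # several hundred ft
```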
Fastest-Growing Segment
Yockey agreed with Lindsey that oil companies have been capturing a wealth of great information for years. To put into context how this fits in the industry, he described an investment conference he attended in Houston on May 19, where one of the speakers was a managing director and head of the Houston office for McKinsey & Co.
“He was talking about innovation and technology across the entire O&G industry. In a study that McKinsey published recently, their view was that data and data analytics is going to be the fastest-growing segment of technology in the O&G industry, and with the highest margins, which is pretty interesting,” Yockey added.
Also at the conference, an unconfirmed rumor was circulating that a major oil company had signed a $1 billion contract with a company known for helping the U.S. Defense Department and intelligence agencies deal with insurgents and terrorists. The contract was said to be for helping the oil company make sense of the data it was sitting on, he continued.
Collaborating On Data
The data available to O&G companies are not limited to private datasets; information is available from public and regulatory sources as well. The next wave for the industry is learning what to do, and what not to do, in aggregating and analyzing disparate data to generate actionable business insight. Companies will be focused on comparative intelligence gleaned from regulatory and public filings.
“Frankly, it is the integration of those pieces that brings a lot of added value. There are mountains of public data out there but each company carries its own mountain of more detailed information. There are lots of opportunities on both sides of that,” Yockey said.
As Birdzell explained, “We’re able to materially reduce the uncertainty of key reservoir and completion practices using advanced statistical methods from public data. We are able to do even more with proprietary data.”
One way to cost-effectively reduce the uncertainty is to “contribute to pools at the same level of data that is required to be disclosed publicly in North Dakota. Those operators that are willing to do that in those other plays will very quickly enjoy that value proposition. Ultimately, the most value comes from combining public and proprietary data, with even more value for every company involved in pooling that data,” he said.
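The statistical effect Birdzell describes can be seen in a toy bootstrap: the confidence interval around a completion metric tightens as operator-contributed wells are pooled with public filings. The sketch below uses synthetic data and is not OAG Analytics’ method.

```python
# Illustrative only: bootstrap confidence intervals on mean EUR (Mboe)
# for public filings alone vs. a larger pooled dataset. Synthetic data.
import numpy as np

rng = np.random.default_rng(42)

def bootstrap_ci(samples, n_boot=5000, alpha=0.10):
    """90% bootstrap confidence interval for the mean of `samples`."""
    means = [rng.choice(samples, size=len(samples), replace=True).mean()
             for _ in range(n_boot)]
    return np.percentile(means, [100 * alpha / 2, 100 * (1 - alpha / 2)])

# Public filings alone vs. a pool that adds operator-contributed wells
# disclosed at the same level as North Dakota's requirements.
public_only = rng.lognormal(mean=6.0, sigma=0.5, size=80)
pooled = np.concatenate([public_only,
                         rng.lognormal(mean=6.0, sigma=0.5, size=320)])

print("public-only CI:", bootstrap_ci(public_only))
print("pooled CI:    ", bootstrap_ci(pooled))  # noticeably narrower
```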
Mining Public Data
There is a lot of value in analyzing public data; a company can see anomalies and outliers in it. Yockey gave the example of Continental Resources and the South Central Oklahoma Oil Province (SCOOP). In September 2014, Continental announced its next big play in the Anadarko Basin. Were it not for the collapse of oil prices, this would be a play that everyone was talking about.
“What we saw at Oseberg in September 2013 was that the Springer would be its next big play, a year ahead of Continental’s big announcement. You could identify this was a new formation that Continental was going to be talking about by mining the details of the formation-specific IPs that Continental and other operators were releasing through filings with the Oklahoma Corporation Commission.
“You could have leveraged a bunch of other data, like spacing and increased-density filings, not just to have your attention drawn to the Springer but also to delineate the fairway and Continental’s, Marathon’s and Newfield’s net positions in the play months before this announcement. That’s worth a lot to geologists, engineers and business executives in the E&P industry,” he continued.
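A minimal version of the screen Yockey describes might look like the sketch below: grouping formation-specific IP filings by formation and quarter, then flagging formations whose filing activity is accelerating. The record schema is hypothetical and this is not Oseberg’s pipeline; real regulatory filings require far more cleanup.

```python
# Hedged sketch, assuming hypothetical columns "formation" and
# "filing_date" in a table of completion filings. Needs at least
# four quarters of history to compare recent vs. prior activity.
import pandas as pd

def flag_emerging_formations(filings: pd.DataFrame,
                             min_growth: float = 2.0) -> pd.Series:
    """Flag formations whose filings over the latest two quarters
    grew at least `min_growth`-fold versus the prior two quarters."""
    quarters = pd.to_datetime(filings["filing_date"]).dt.to_period("Q")
    counts = (filings.assign(quarter=quarters)
              .groupby(["formation", "quarter"])
              .size().unstack(fill_value=0)
              .sort_index(axis=1))
    recent = counts.iloc[:, -2:].sum(axis=1)
    prior = counts.iloc[:, -4:-2].sum(axis=1)
    growth = recent / prior.replace(0, 1)  # avoid division by zero
    return growth[growth >= min_growth].sort_values(ascending=False)
```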
As Birdzell said, “Being able to make more decisions more quickly and accelerate beyond the primary process, as part of getting our heads around optimized infill well or wellbore density programs, is among the many areas where we are working and contributing to help assets add more value more quickly.”