Well, the running joke in analytics is that if you torture data long enough, it will confess to anything – and the data generated by the Oil and Gas industry is so vast that almost anything is possible. For an industry that has decisively shifted towards eking out every last ounce of production from its existing investments, making sense of this data is of paramount importance. The sustained low oil prices have only further emphasised the need to get more out of what you have.
Beyond the buzz, there are some significant challenges to achieving the all-important actionable insights and foresights from data. To start with, the different teams and assets within the enterprise generate many times the data that is currently aggregated. A McKinsey report[1] suggests that of the 40,000-odd data tags found on a single rig, few are connected or used. Apart from sensor data, there is seismic data, equipment maintenance information, failure reports, contracts data, and a plethora of other information that can help improve production efficiency. The report also highlights how inconsistencies lead to situations where data is not available, available data is not analysed, analysed data is not communicated, and communicated data is not used in decision-making, for the simple reason that decision-makers do not trust it.
Enter Master Data Management
When data is captured, cleansed, stored and shared across the enterprise methodically and systematically, it is always more trustworthy than a random aggregate of data from diverse sources. Master Data is all the data from rigs, contracts, supply chain, inventory, assets and production processes, aggregated in the Enterprise Data Warehouse using a clearly defined data acquisition, management and access strategy. This offers a single version of truth, with users across the board accessing the same data, thus eliminating the inadequacies and anomalies that arise from maintaining multiple versions of the data in different departmental silos. This data, however, needs to be maintained and managed efficiently so that users get actionable insights that translate into tangible efficiencies. It is no surprise, then, that data management is forecast to grow at 28.4% into a $21 billion industry by 2020[2].
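To make the "single version of truth" idea concrete, here is a minimal, illustrative sketch in Python of how duplicate records for the same well, held in different departmental silos, might be reconciled into one master record. The field names, sample values and the "most recently updated value wins" survivorship rule are assumptions chosen for illustration, not a prescribed MDM implementation.

```python
# Illustrative "golden record" merge: reconciling silo copies of one entity.
# All names and the survivorship rule here are hypothetical examples.
from datetime import date

# The same well as described in three departmental systems (sample data).
silo_records = [
    {"source": "production",  "well_id": "W-1042", "operator": "Acme O&G",
     "status": "producing", "updated": date(2016, 9, 1)},
    {"source": "maintenance", "well_id": "W-1042", "operator": "ACME Oil & Gas",
     "status": "shut-in",   "updated": date(2016, 11, 15)},
    {"source": "contracts",   "well_id": "W-1042", "operator": "Acme O&G",
     "status": "producing", "updated": date(2016, 6, 30)},
]

def merge_golden_record(records):
    """Collapse duplicate silo records into one master record.

    Survivorship rule (assumed): for each attribute, keep the value from the
    most recently updated source, and record which source it came from so the
    lineage can be audited - a key ingredient of trust in the data.
    """
    golden, lineage = {}, {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field in ("source", "updated"):
                continue
            golden[field] = value            # newer records overwrite older ones
            lineage[field] = rec["source"]   # provenance of the surviving value
    return golden, lineage

master, provenance = merge_golden_record(silo_records)
print(master)      # {'well_id': 'W-1042', 'operator': 'ACME Oil & Gas', 'status': 'shut-in'}
print(provenance)  # every surviving value here happens to come from 'maintenance'
```

In a real Master Data Management platform, the survivorship rules, matching logic and lineage tracking would be far richer, but the principle is the same: one governed record per entity, shared by every consumer.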
What to watch out for
While the statistics point to significant benefits, the road to an analytics-driven enterprise is not without obstacles. One significant challenge in implementing analytics is asking the right questions to get the desired answers – a known challenge that is typically addressed by the mantra of thinking big and starting small. An even bigger challenge, however, is acquiring the right data for analytics. Some of the potential pain areas are:
How to get there
Aggregating Master Data starts with an open-minded approach to first understanding the problems at hand – the questions that the organisation would like answered, in other words, the use cases. A simple approach to analytics would be:
With current crude prices trading very close to their five-year low and no consensus on production caps in sight, it is imperative that O&G enterprises improve their production and distribution efficiency. Against this backdrop, the value of collating and managing master data is undeniable, and investment trends in Master Data Management have only underscored the market sentiment. It is therefore a safe forecast that concerted efforts in Master Data Management will yield consistent cost savings.
[1]: http://www.mckinsey.com/industries/oil-and-gas/our-insights/digitizing-oil-and-gas-production#0
[2]: http://www.marketsandmarkets.com/Market-Reports/oil-gas-data-management-market-85567816.html
[3]: http://blogs.gartner.com/svetlana-sicular/data-scientist-mystified/