Finding a Big Data "Safe Space" for the Energy Sector

The oil industry was one of the original "smart" industries, but its tradition of leadership in using data to improve operations, enhance security and serve markets has been diminished over time by the counterintuitive incentives imposed by regulators.

As the sector transforms in response to a supply-side revolution and prepares to engage with global markets more directly through exports, the industry should be freed to build the next generation of infrastructure with efficient markets, security and consumer needs in mind.

One of the satisfying things about working in the oil business is how real it is: an actual physical substance is extracted, chemically upgraded and put to use every day in the real world by real people. That means it can be easy to forget how much the oil sector has relied on data collection and analysis to drive efficiencies that have served consumers well, and how a change in methods of collection and analysis can alter the way efficiencies are identified, implemented and achieved.

Business historians have recently come to appreciate the role that data-led analysis played in the success of the oil business in America -- starting with Rockefeller's Standard Oil, the company that built much of the groundwork and infrastructure on which not just the energy business but the broader North American economy still relies. Relentless standardization, grounded in constant data collection across then-balkanized production and transport methods, enabled the development of the contemporary oil sector, and the importance of data collection has been enshrined at both energy companies and energy regulators -- the existence of an entire Energy Information Administration within the U.S. Department of Energy is testament to the accepted value of data in energy sector decision-making.

Yet despite a long history of collecting, handling and leveraging large amounts of data, oil companies have been perceived as laggards in taking advantage of the monitoring innovations of the past decade, innovations that have prompted huge shifts in industries from media and retailing to health care.

"Analytics have quickly transformed how internal corporate functions operate and how entire industries capture value," Dian Grueneich and David Jacot note in a recent article in the Electricity Journal discussing broader analytics and efficiency trends in the energy sector. But current estimates for unmanaged, or unstructured, data in the oil and gas industry run up to 80 percent of the total collected, and a recent Microsoft study noted that a third of oil and gas executives in a May 2013 survey expected their data storage needs to double in the two years to May 2015. The Microsoft survey identifies huge outstanding needs in managing data growth, integrating disparate business intelligence tools and then in using those tools to analyze data for insights.

Falling behind on data can have real and potentially terrifying implications. Oil majors were awakened to the risks posed by cybersecurity breaches in August 2012 when hackers damaged an estimated 30,000 computers at the offices of Saudi Aramco, one of the world's largest oil companies and a major exporter to global markets. As an investigation of the incident from the International Institute for Strategic Studies noted, cyberattacks are a common tool for Iran-sponsored threats to global businesses and states, with the oil and gas sector an obvious target.

Those companies with the most exposure to multiple parts of the industry and to the broader economy have now begun to implement the kinds of big data collection, archiving and management programs that became widespread in other sectors long ago. As the Massachusetts Institute of Technology has noted, early attempts at growing the "digital oilfield" have proliferated at international oil companies, and oilfield services giant Halliburton garnered headlines last year with its purchase of data management firm Petris. But the subject remains on the industry sidelines, secondary to the traditional political considerations perceived as central to the permitting of newly proposed projects.

If the forecasts are for big data to proliferate, and the consensus is that oil companies have done too little to date to prepare for it, the question is why a sector with a strong technological bent and a tradition of data leadership has not embraced the Big Data Revolution.

The Dark Side of Data: Regulation and Lawsuits

The industry's regulated nature is rooted in good intentions, but is starting to show signs of confounding the original legislative intent -- that of prompting companies to compete and serve markets in efficient and fair ways.

When you can't measure, you can't manage -- but what isn't measured also can't become a source of new, invasive and potentially disruptive compliance mechanisms.

"Energy commodities organizations are subject to significant regulatory retention and data compliance concerns," noted software company Tarmin CEO Shahbaz Ali in a Pipeline & Gas Journal paper last year. As the subject of electronic discovery requests, the unstructured data already generated but often left unmanaged by oil and gas companies forms "a veritable cornucopia of risk," Ali noted.

The industry needs to find ways to make and measure small errors without fear of massive regulatory consequences (and the lawsuits that accompany them), or it will continue to founder in the face of shifting market needs, waiting for large and potentially preventable failures to occur before acting.

The North American oil and gas industry is on the verge of a major step-change in its engagement with world markets, and it needs the freedom to innovate in serving them. The imminent expansion of natural gas exports, through both existing liquefied natural gas terminals and the roughly 35 billion cubic feet per day of proposed export capacity currently at various stages of permitting, poses huge new data challenges and opportunities for operators. The ongoing debate over removing the ban on crude oil exports, which could finally open global markets to U.S. crude supply, would also challenge existing data collection and analysis practices, all at a time when access to energy is a key component of global security and U.S. geopolitical power. Building a next-generation industry that is appropriately resilient requires moving beyond the "digital oil and gas field" to a fully digitalized oil and gas sector.

"Resilience is fundamentally the certainty of small cheap continual failures against the potential for a massive failure at the core," Egon Zehnder consultant Christoph Leuneberger said in a recent discussion on resiliency practices for extractive industries. Without expanding data collection and analysis practices, oil companies cannot discover what is going wrong, and without discovering what's going wrong they cannot improve on safety or market efficiency metrics.

While no one would sensibly argue for sweeping deregulation of the oil industry, a more constructive dialogue is needed among the major players in the sector, government representatives and broader stakeholders, conducted in an atmosphere of learning rather than accusation. Oil companies should be able to go about that process of discovery and learning without fear that any new problem implies a matching new regulatory process.

At first glance, the "big data revolution" may seem too trendy to apply to oil companies, and the question of how it relates to daily practices and compliance efforts may seem esoteric. But if we want to build a resilient, efficient and powerful energy sector in the U.S., we need a safe space to explore the implications of the big data revolution in the heavily regulated oil sector.

Peter Gardett is Adjunct Fellow at the Center for a New American Security and Entrepreneur in Residence for the New York State Energy Research and Development Authority. As Founding Board Member of New York Energy Week and chair of the Commodities event committee, he will welcome industry leaders to a panel on the subject of data and commodities on June 17, 2014. Find out more about New York Energy Week here.
