Fundamentals in the Oil Field Matter Even More with Big Data

Today, everyone talks about data as if it were a new idea born of the digital age, come to save time and money in the oil fields. But the fact that we now use “data” to refer to information in digital format changes very little about the fundamentals: seeking patterns and associating trends to prioritize high-value problems for production efficiency in the field. With the proliferation of Supervisory Control And Data Acquisition (SCADA) systems and field automation, what highly skilled lease operators already do efficiently is now attainable for all field personnel.

Data becomes valuable when it leads to intelligent, repeatable patterns that allow the field to make better decisions. When field personnel examine the vastness of their area, they can look past an existing route and follow a more useful task pattern, one that compares and lines up large volumes of data to navigate to the most production-efficient next task based on complex variables such as high-volume wells, upcoming well inspections, categories of issues, and more.

The objective for success is straightforward for oil and gas operations: bring the usefulness of data out of the office and into the field. Harness data quality, define clear objectives, and make data actionable for the personnel who operate the core of the business.

Data Quality

One of the first objectives to tackle is data quality. To gain the acceptance of the field, lease operators must trust what is in front of them. Historically, patterns have been considered intuitive, something lease operators pick up over years of experience. Data is worthless unless it drives intelligent patterns that operators trust from the beginning; otherwise, all the automation and data in the world will not result in action in the field.

On the flip side, using bad data to drive intelligence can create more churn in operations and be counterproductive. It takes only a few inaccurate GPS navigations, miscalculated lease operator priorities, or misinformed dynamic pumper routes for the field to return to pen and paper. This means conclusions must go through thorough investigation, and field-reported data issues must be addressed immediately, especially during implementation.

Think of quality data as the compass guiding oil and gas firms. Given the slim margins for error in the oil and gas sector, poor data quality is an expensive oversight. Oil and gas firms need to ensure their data is trustworthy, and to do so, ownership of the data must shift to the field, because field personnel are in the best position to make data more usable, trusted, and actionable.

Often, incorrect data is fed from SCADA systems to the field. Stuck, missing, and out-of-range records quickly erode the trust of lease operators. Capturing and limiting these issues is critical if the field is not to undermine investments made in the Internet of Things (IoT).
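As a minimal sketch of what screening for these three failure modes can look like, the snippet below flags missing, out-of-range, and stuck readings in a stream of sensor values. The tag name, expected range, and stuck-value window are illustrative assumptions, not drawn from any specific SCADA product.

```python
# Illustrative sanity checks for a stream of SCADA readings.
# The tag name, expected range, and stuck-value window below are
# assumptions for demonstration only.

EXPECTED_RANGE = {"tubing_pressure_psi": (0.0, 5000.0)}
STUCK_WINDOW = 5  # consecutive identical readings treated as a stuck sensor

def flag_issues(tag, values):
    """Return a list of (index, issue) pairs for one tag's readings."""
    lo, hi = EXPECTED_RANGE[tag]
    issues = []
    run = 1  # length of the current run of identical values
    for i, v in enumerate(values):
        if v is None:
            issues.append((i, "missing"))
            run = 1
            continue
        if not lo <= v <= hi:
            issues.append((i, "out_of_range"))
        if i > 0 and v == values[i - 1]:
            run += 1
            if run == STUCK_WINDOW:
                issues.append((i, "stuck"))
        else:
            run = 1
    return issues

readings = [1450.0, 1452.1, None, 9999.0, 300.0, 300.0, 300.0, 300.0, 300.0]
print(flag_issues("tubing_pressure_psi", readings))
# → [(2, 'missing'), (3, 'out_of_range'), (8, 'stuck')]
```

Catching these records before they reach an operator's screen is far cheaper than rebuilding trust after a bad route or priority has been acted on.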

Define Clear Objectives to Truly Leverage Data

Using big data generated throughout operations is the cornerstone of digital improvement in oil and gas, as in other industries. The ubiquity of mobile and IoT devices across departments means that we are now generating data on everything, which poses a new problem: clearly defining what we want to achieve with all that information.

Only by clearly defining objectives can we utilize big data. In the same way that Uber and Lyft do not have scheduled routes, but instead, react to where they’re needed, oil and gas companies can use trustworthy data to define where their crews are most needed. This means lease operators can see a not-so-distant future without defined route maps, just data made available to them through a mobile device highlighting where they’re needed next based on clearly defined objectives. In the routeless oil field, the sequence doesn’t matter. Work is no longer linear. Crews go where and when they’re needed based purely on automated data analysis.

For example, oilfield managers need to reduce well downtime. By drawing on real-time, trustworthy data from all wells, they can prioritize the wells that need the most attention. Then, by integrating data on lease operators and foremen, they can adjust schedules to reach the priority wells in the most efficient way possible.
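One simple way to express this kind of prioritization is to rank wells by the production they are deferring, so that a high-volume well with a short outage can outrank a low-volume well that has been down longer. The well records and the scoring rule below are invented for illustration; a real system would pull downtime and rates from SCADA and production-accounting feeds.

```python
# Hypothetical sketch of downtime-driven well prioritization.
# Well records and the deferred-production scoring rule are
# illustrative assumptions, not field data.

wells = [
    {"name": "Well A", "hours_down": 12, "daily_bbl": 150},
    {"name": "Well B", "hours_down": 4,  "daily_bbl": 900},
    {"name": "Well C", "hours_down": 30, "daily_bbl": 40},
]

def deferred_production(well):
    """Estimate barrels deferred so far: downtime scaled by the well's daily rate."""
    return well["hours_down"] / 24 * well["daily_bbl"]

# Visit first the wells where the most production is being lost.
priority = sorted(wells, key=deferred_production, reverse=True)
print([w["name"] for w in priority])
# → ['Well B', 'Well A', 'Well C']
```

Note that Well B, down only four hours, outranks Well C, down thirty, because its production rate makes each hour of downtime far more expensive. That is the kind of counterintuitive ordering a fixed route map never surfaces.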

By using data in this holistic way, other benefits also arise, such as minimizing unnecessary maintenance worker visits by integrating lease operator route data into maintenance schedules. This is just one of many data-driven efficiency scenarios for oil and gas operations.

Making Data Actionable

To realize ROI on IoT investments, oil and gas firms must first make sure the data is coming from trustworthy machines. Only when trust is in place can data be used to create actionable intelligence, the true purpose of collecting data in the first place. Thanks to mobility technology, this real-time data can feed into field operation systems in an intelligent way, raising alerts, generating reports, calling out exceptions, and providing operators with the information they need to make better decisions.

Big data solutions are increasingly being used in the oil and gas sector because they support improvements across the entire lifecycle, maximizing production and reducing operating costs. Market research has identified the growing need to improve productivity; in fact, it has consistently been listed as one of the primary growth drivers for the global big data market in the oil and gas sector through 2022.

The future of the oil and gas industry is intelligent, dynamic production efficiency patterns readily available in the hands of the field. To get there, companies must understand that the fundamentals matter now more than ever — they must create trust in the data quality, define clear business objectives, and make that data actionable. Oil and gas producers must demand more from their technology vendors to instill trust in their field teams, streamline operations, and gain the efficiency needed to succeed in the rapidly emerging data age.

About the Author

Shiva Rajagopalan is the President and CEO of Seven Lakes Technologies, a niche analytics and technology solutions firm, driven to improve business drivers and enhance execution of customer business strategies for the Upstream Oil and Gas Sector. Rajagopalan founded Seven Lakes Technologies in 2009 and has overseen the development of numerous innovative products. Rajagopalan holds a bachelor’s degree in Mechanical Engineering from the Indian Institute of Technology, Bombay. He received four Chevron Contractor Recognition Awards for professional and technical excellence.
