Global Energy Leader Preempts Downtime with Predictive Maintenance

A global oil and gas company, headquartered in the U.S., has delivered results for decades by staying abreast of technology. It recently set out to harness predictive analytics to reduce equipment service interruptions, which had been causing lost production and higher maintenance and replacement costs.

While the energy exploration and production specialist knew what it wanted to accomplish, it needed a roadmap to get there. The organization enlisted BlueGranite’s help to implement a predictive analytics solution that helps prevent equipment failures.

One area of focus was the company’s Electrical Submersible Pumps (ESPs). As with most machinery, the performance of ESPs declines over time. Fluctuations in certain indicators likely contributed to the timing of those declines and, ultimately, to machine failures. Pinpointing and tracking those indicators was key to cutting maintenance and replacement costs.

The energy giant tasked its data science team with analyzing sensor data coming off the machines (gas volume, temperature, corrosion, etc.) to identify leading indicators of performance decline and failure among ESPs. The job required statistical analysis of time series data and building machine learning models in Python.
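The company’s actual models and sensor schemas aren’t public, but the kind of leading-indicator analysis described here can be sketched in miniature: flag a sensor whose rolling mean drifts well outside its baseline band. All names, values, and thresholds below are illustrative.

```python
from collections import deque
from statistics import mean, stdev

def drift_alerts(readings, window=5, z=2.0):
    """Flag points where the rolling mean drifts more than z baseline
    standard deviations from the baseline mean -- a crude stand-in for
    a leading-indicator check on a single ESP sensor series."""
    baseline = readings[:window]
    base_mean, base_std = mean(baseline), stdev(baseline)
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        recent.append(value)
        if len(recent) == window:
            rolling = mean(recent)
            if abs(rolling - base_mean) > z * base_std:
                alerts.append((i, rolling))
    return alerts

# Temperature-like series that drifts upward toward the end
series = [70, 71, 69, 70, 71, 70, 72, 78, 85, 91, 96]
print(drift_alerts(series))  # → [(7, 72.2), (8, 75.2), (9, 79.2), (10, 84.4)]
```

In practice a team would look at many sensors jointly and use proper time series or survival models, but the shape of the task is the same: establish a baseline, then watch for sustained deviation.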

Handling the sheer volume of data coming from thousands of sensors across the ESPs created technical issues. The sensors collected data points every few seconds, and the organization struggled to find a centralized, easily accessible place to store it all. As a result, the team could only perform basic analysis, examining a handful of sensors at a time over narrow date or time ranges before its computers hit memory and CPU limits. As the analytics grew more sophisticated, the team often had to leave processes running over the weekend. The arrangement wasn’t conducive to iterative model development.

Ultimately, the team needed to scale compute and storage resources beyond a single machine to gain the ability to build more complex models across all data.

The Analytics & Data Science Platform

To solve these technical challenges, BlueGranite helped the company’s data science team assess Azure Databricks, backed by Azure Data Lake Store, as its analytics and storage platform.

This architecture gives the company a single storage repository and the ability to process and build models across all of its data at once. The benefits of the solution include:

The company can now store all its data in a single place. Azure Data Lake Store provides unlimited storage for all sensor data across all pumps and all years of service. The company no longer needs to delete old data to make room for new data.

The data science team can now process all its data at once. Apache Spark on Azure Databricks, a distributed compute platform, processes the data in parallel across the nodes of a cluster, cutting processing time.
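As a rough local analogy for the partition-and-aggregate pattern Spark applies across cluster nodes (this is plain Python on a thread pool, not the client’s Databricks code), the work splits into independent per-partition aggregations whose partial results are then combined:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_stats(partition):
    """Per-partition aggregation, analogous to a Spark map-side task."""
    return (sum(partition), len(partition), max(partition))

def parallel_mean_max(values, partitions=4):
    """Split the data into partitions, aggregate each concurrently,
    then combine the partial results -- the same map/reduce shape
    Spark applies across cluster nodes rather than local workers."""
    chunk = max(1, len(values) // partitions)
    parts = [values[i:i + chunk] for i in range(0, len(values), chunk)]
    with ThreadPoolExecutor(max_workers=partitions) as pool:
        results = list(pool.map(partial_stats, parts))
    total = sum(s for s, _, _ in results)
    count = sum(n for _, n, _ in results)
    peak = max(m for _, _, m in results)
    return total / count, peak

readings = list(range(1, 101))  # stand-in for one sensor's values
print(parallel_mean_max(readings))  # → (50.5, 100)
```

The key property is that each partition can be processed without seeing the others, so adding nodes (or workers) scales the computation — which is why moving from a single machine to a Spark cluster removed the memory and CPU ceilings the team had been hitting.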

The team is now able to perform predictive maintenance at scale. Many of the predictive use cases the company sought to execute were complex time series and regression analyses with factors such as trending and seasonality. This was challenging to perform on smaller, fragmented datasets on single machines. Azure Databricks provides a scalable framework allowing the execution of these complex workloads over entire datasets in a single run.
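To make the trend-and-seasonality idea concrete, here is a toy sketch (a simplified classical decomposition in plain Python, not the client’s actual models): fit a linear trend by least squares, then average the detrended residuals by seasonal position.

```python
def decompose(series, period):
    """Toy classical decomposition: least-squares linear trend,
    then the mean residual at each seasonal position."""
    n = len(series)
    xs = range(n)
    x_mean = (n - 1) / 2
    y_mean = sum(series) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, series)) \
        / sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    residuals = [y - (intercept + slope * x) for x, y in zip(xs, series)]
    seasonal = [
        sum(residuals[i::period]) / len(residuals[i::period])
        for i in range(period)
    ]
    return slope, intercept, seasonal

# Rising trend (+1 per step) with a period-2 oscillation (+2 / -2)
series = [2, -1, 4, 1, 6, 3, 8, 5]
slope, intercept, seasonal = decompose(series, period=2)
print(round(slope, 2), [round(s, 2) for s in seasonal])  # → 0.81 [1.9, -1.9]
```

Real workloads add many regressors and much longer histories, which is exactly what made them impractical on fragmented datasets and single machines — but tractable once the full dataset could be processed in one distributed run.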

The solution allows the team to continue using familiar languages, like Python and SQL.

Client Success

The assessment demonstrated a unified analytics platform of great value to the energy giant. The solution dramatically reduced development and processing time: what previously took days to develop now takes hours, and what previously took hours now takes just minutes. The company is realizing real savings in both labor and compute costs, along with better predictive maintenance outcomes.

Wondering how data can drive change throughout your organization? We’d love to help you explore the possibilities. Contact us today.

