Making Big Data Part of the Routine

Nexen Energy has put big data on the desktop of everyone in the company managing oil and gas assets. It takes three clicks to see any piece of data and a couple more to build it into a chart, part of a program based on the premise that closely observed operations are likely to be better managed.

“All the technical staffers are trained,” said Mark Derry, team lead, technical analytics for Nexen Energy. The goal is, “We want them to do it. They have the need and expertise.”

He described this change at a panel on data management at the SPE Heavy Oil and Unconventional Resources Conference in Calgary, during a discussion about what it takes to make data use a bigger part of the lives of staffers, and how to persuade them to apply what they see on their screens in the field.

Bigger data use is increasingly seen as a survival skill in western Canada, where operators are trying to grind down the price of finding and producing oil and gas to profitable levels. While quality reservoir rock is hard to find, the data is plentiful.

“I’ve talked to engineers who are overwhelmed by data,” said Doug Crawford, executive lead for oil and gas at SAS Canada. He pointed out that wells where steam-assisted gravity drainage (SAGD) is used to extract ultra-heavy oil are “famous for having more data coming in than a human being can comprehend.”

One company told Crawford that it stopped saving the data because it could not keep up with it. Panelists played up machines' ability to rapidly sift through a flood of data and help find meaning in it, but the value of those tools still depends on the humans making the decisions.

Bertrand Groulx, president of Verdazo Analytics, said his consulting business, which was recently bought by Pason Systems, is bringing these tools to smaller operators. Successful adoptions require long-term support from management, clear objectives, and a project leader who understands what is required to persuade skeptical users to change their routines, he said, noting, “It is more about change management.”

While Derry sees machines someday automatically adjusting the settings on steam-injection systems used for heavy oil production, at this stage people are still making the call. And work teams and field operators are expected to make sure that a smart move at one level does not cause havoc for someone else.

An early lesson for Derry, an engineer turned analytics expert, came when he had a big idea he thought would be an easy sell. He took his analysis to the people managing an oil sands operation and showed how it could increase production by 7-12%.

“We went to management and we were very, very excited,” said Derry, recalling the pitch for a “fantastic idea” that would just require a couple of wells for a pilot test to prove that it worked. The management response was “this is a black box. We do not care. We do not think it will work.”

The story shows the challenges facing an outside expert with a new idea, as well as the resistance to the notion that a machine could see something an engineer in charge of a field cannot.

While business in Canada is coming off the bottom, a lot of ideas are required to move it back to good times. Data-driven engineers are needed to adapt the business to oil prices at current levels, said Groulx, adding, "we cannot hire and fire our way out of business cycles."

While there were stories told during the panel of users spotting easy fixes when they first saw good data, significantly improving the performance of systems usually entails a long-term commitment. Well-programmed data search and display can cut the time spent by engineers wrangling data and preparing routine reports, giving them more time to look for value in the numbers.

One example of the payoff was a study cited by Groulx in which a company that used water flooding to sustain production in a field reduced its pumping rate, saving USD 12 million a year by cutting rod-pump breakdowns.

Even the most advanced software will not hide the fact that the “data quality and consistency has been and will remain a challenge,” said Marcel Preteau, principal for Ramp Ventures. But the consultant said it is a mistake to wait “until you get the data 100% right because you will never get there.”

While users see how the software and data management can deliver insights, he said the value realized when data leads to action "was the direct result of cultural change that the software enabled."

Woodside Energy has moved closest to the image of artificial intelligence seen in the movies by building a system with IBM. The process included teaching the machine the language of oil and feeding it massive amounts of data and records, from which it learns how documents relate to queries, said Jessica Zambrano, upstream lead at Woodside Energy.

Now it is working with IBM’s next-generation system, Willow, which Zambrano said “can interpret and understand the accents of a multicultural staff with many strong accents.”

Stephen Rassenfoss, JPT Emerging Technology Senior Writer

21 February 2017
