Planning and Scheduling BP's Oil Refineries

This article is based on a talk given by Dr Roger Main of BP's Technology Centre to the Mathematical Programming Study Group.

Introduction

Planning and scheduling the activities of oil refineries is one of the best-established applications of Mathematical Programming (MP). Despite this, the practice of applying MP models to oil refineries is far from straightforward.

Oil refineries consist of a number of process units which turn crude oil into intermediate components and then blend these components together to make finished products such as petrol or heating oil.

The resulting components have a wide range of physical properties: density, viscosity, octane, sulphur content, etc. On their own these components would not be suitable for commercial use, but blended together in various ways they form the products which we know as petrol, diesel, heating oil, etc.

Traditional Model of a Refinery

The traditional way of representing the activities in a refinery is as follows. For each process unit, a number of distinct "modes of operation" are selected; for a crude distillation unit, for example, these might be the distillation cut points. Many different combinations of cut points are possible: for a single crude at one refinery, 62 modes of operation were identified. Multiply this by the number of crudes processed, add in the representation of further processing and blending, and the result is a very large model.

Blending can be represented either by:

recipe blends, where the intermediate components are blended together according to some fixed recipe; or

specification blends, where the model itself determines the proportions in which the components are blended.

In most models, blending is done to a quality specification.
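As a concrete sketch, a specification blend can be posed as a small linear programme. The figures below are purely illustrative (hypothetical components, costs and specifications, not BP data), using scipy's linprog:

```python
from scipy.optimize import linprog

# Hypothetical two-component petrol blend (illustrative figures only):
cost    = [28.0, 25.0]   # $/barrel of each component
octane  = [99.0, 88.0]   # assumed to blend linearly for this sketch
sulphur = [0.05, 0.01]   # wt%

# Choose fractions x to minimise the cost of one barrel of product,
# subject to octane >= 95, sulphur <= 0.04, and fractions summing to 1.
A_ub = [[-octane[0], -octane[1]],    # -(sum oct_i * x_i) <= -95
        [sulphur[0], sulphur[1]]]    #   sum sul_i * x_i  <= 0.04
b_ub = [-95.0, 0.04]
res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
              A_eq=[[1.0, 1.0]], b_eq=[1.0])
print(res.x)  # the blending recipe the model has chosen
```

A recipe blend, by contrast, would simply fix the fractions in advance rather than leave them to the optimizer.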

Problems with the Traditional Approach

Although it has been used for many years, this traditional approach does not represent reality particularly well.

At any time, process units convert a single feed stream (which may be a blend of input materials, e.g. crudes) into a small number of output materials. Typically you can do one or two operations a week, i.e. change the feed or change the processing. By contrast, the traditional model assumes that there is no limit to the number of distinct feeds or modes of operation which can be processed.

The traditional model produces myriad distinct intermediate components and is able to use these separately in blending. This is unrealistic: in practice there is a small number of tanks which can be used for each intermediate component, and the model's individual "streams" must therefore be "pooled" and the pooled components used for blending. Unfortunately, the quality of such pooled components cannot be represented in a linear model.
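The difficulty is easy to see: the quality of a pooled component is the flow-weighted average of the qualities of the streams entering the tank, which is a ratio of decision variables. A tiny illustration with hypothetical streams and sulphur levels:

```python
def pooled_quality(flows, qualities):
    """Flow-weighted average quality of streams mixed in one pool tank."""
    return sum(f * q for f, q in zip(flows, qualities)) / sum(flows)

# Two hypothetical streams sharing a tank (illustrative figures):
q1 = pooled_quality([60.0, 40.0], [0.10, 0.50])    # sulphur wt%
q2 = pooled_quality([120.0, 40.0], [0.10, 0.50])   # double the first flow

print(q1, q2)
# The pooled quality is a ratio of flow variables, so any constraint on
# the quality of blends drawn from the pool contains products of
# variables: a bilinear relationship which an LP cannot express.
```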

These two types of extra freedom, the multiplicity of process operations and the failure to pool intermediate components, give rise to over-optimization: the model has greater flexibility than exists in practice and exploits this to produce solutions which are "too good".

As well as these fundamental weaknesses there are other problems. If you run a process twice as hard you don't get twice as much material. You can go some way towards representing this by having different modes of operation for different severities or throughputs, but this is not a complete solution.

However, one thing which might be expected to cause severe problems does not in practice do so. While only a few qualities blend linearly, it is possible to control a very large range of qualities with linear constraints by means of some standard "tricks of the trade". The most important of these are quality blending indices: functions (e.g. logarithms) of the measured quality of components which themselves blend linearly. Even this approach falters when faced with novel qualities such as those being used in the USA to control vehicle emissions: the definition of the quality "mg of benzene emitted per mile" is itself a highly nonlinear function of many different qualities which are measured directly.
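One published example of such an index (illustrative here, and not specific to BP's models) is the Refutas viscosity blending number: a double-logarithmic transform of kinematic viscosity which blends approximately linearly by mass fraction, even though viscosity itself does not:

```python
import math

def refutas_vbn(visc_cst):
    """Refutas viscosity blending number for a kinematic viscosity in cSt."""
    return 14.534 * math.log(math.log(visc_cst + 0.8)) + 10.975

def vbn_to_visc(vbn):
    """Invert the transform to recover a viscosity from a blended index."""
    return math.exp(math.exp((vbn - 10.975) / 14.534)) - 0.8

# Hypothetical 50/50 blend by mass of a light and a heavy component:
blend = [(0.5, 3.0), (0.5, 300.0)]                      # (mass fraction, cSt)
blend_vbn = sum(w * refutas_vbn(v) for w, v in blend)   # linear in w
print(vbn_to_visc(blend_vbn))   # far below the arithmetic mean of 151.5
```

In the LP only the index appears in the blending constraints; the viscosity specification is converted to an index limit once, off-line.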

The Nonlinear Approach

The limitations of the traditional model have led to the development of alternative nonlinear models. These use Successive Linear Programming (SLP) to represent the refinery more directly. In the traditional model one has many streams for each intermediate component but assumes that the qualities of each of these are fixed. In the nonlinear model one has a single stream, but its qualities can vary.

This means that the model uses the average quality of each intermediate component throughout the period; similarly, average process conditions and an average mix of crudes are used.
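A minimal sketch of the SLP idea, on a toy problem rather than a refinery model: minimise the feed t to a process with an assumed nonlinear yield y = 10*sqrt(t) (echoing the point that running a process twice as hard does not give twice the material), subject to making at least 30 units. Each pass linearises the constraint at the current point and solves the resulting LP:

```python
import math
from scipy.optimize import linprog

# Toy SLP loop: minimise feed t subject to 10*sqrt(t) >= 30.
# The true optimum is t = 9.
t = 25.0                       # starting guess
for _ in range(30):
    # Linearise the yield about the current point t_k:
    #   10*sqrt(t) ~= 10*sqrt(t_k) + grad * (t - t_k),  grad = 5/sqrt(t_k)
    grad = 5.0 / math.sqrt(t)
    # Rearranged as A_ub x <= b_ub for linprog:
    res = linprog(c=[1.0],
                  A_ub=[[-grad]],
                  b_ub=[10.0 * math.sqrt(t) - grad * t - 30.0],
                  bounds=[(0.0, None)])
    if abs(res.x[0] - t) < 1e-9:   # successive solutions agree: converged
        break
    t = res.x[0]
print(t)   # converges to 9.0
```

Practical SLP implementations also bound the step between successive linearisations (a trust region) so that the local approximation remains valid.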

Although the nonlinear approach has addressed one of the limitations of the traditional model, a fundamental problem remains: a structure of time periods must be imposed on the problem, and within each time period in the model everything happens continuously. In practice, operations in refineries occur in batches, but the batches are asynchronous across different process units and blenders. This makes it impossible to represent them directly in such a model.

The result is that the nonlinear approach leads to under-optimization: refinery schedulers have more freedom than the model. They use the small number of tanks for each intermediate component to produce variants of the component with slightly different qualities, and then use these separately in blending.

Hierarchy of Models

Planning and scheduling the activities of a refinery is not a single problem but many. There is a hierarchy of problems, from long-term planning (1 - 5 years) through monthly planning to scheduling the day's activities. Different levels of detail are used, and what is fixed in one model is a decision variable in another. Some of these aspects are shown in the table below.

Model                | Duration      | Time periods | Detail                                         | Fixed data                             | Main decision variables
---------------------|---------------|--------------|------------------------------------------------|----------------------------------------|------------------------
Long-term            | 1 - 5 years   | 1 year       | Many refineries; simple representation of each |                                        | Capital investments at plants; which refineries make which products for which markets
Medium-term          | 3 - 12 months | 3 months     | Single refinery; modest complexity             | Plant equipment                        | Crudes, plant processes and modes
Short-term planning  | 1 month       | 7 - 30 days  | Increasing complexity                          | Ditto + plant processes                | Crudes, modes
Near-term scheduling | 10 days       | 1 - 10 days  | Ditto                                          | Ditto + available crudes               | Crude cocktails; modes
Daily scheduling     | 1 - 2 days    | Hours        | Ditto                                          | Ditto + crude cocktails; process modes | Blends; allocation of streams to tanks

Horses for Courses

We can now see that both the traditional model and the nonlinear model have their uses at different places in the hierarchy of models.

The traditional approach is over-optimistic, being very selective and producing unrealistic schedules. But, being linear, it always produces the guaranteed optimum and provides useful sensitivity analysis. This makes it valuable in the longer-term models, where the data are anyway uncertain and the model's aim is to provide guidance to decision makers.

The nonlinear approach tends towards under-optimization, averaging everything. Its recommendations can be implemented, which makes them attractive for short-term scheduling. But things do change with time, and so the refinery schedulers still have work to do to disaggregate them and improve on the recommended solutions.

Concluding Remarks

Real refineries are highly complex. Building a model, whether traditional or nonlinear, is a large undertaking involving man-years of effort, even when using a dedicated refinery modelling package. The problem lies in obtaining the data and deciding how to use them. Models must reflect the available data and be designed to assist the planners and schedulers in doing their jobs. For medium- and long-term models, forecast data must inevitably be used, for instance for prices and demand. Such forecasts are by definition unreliable. Modellers need to keep this in mind and remember that there is a limit to what they can achieve.