Abstract

The main goal of core analysis is to reduce uncertainty in reservoir evaluation by providing data representative of the reservoir at in situ conditions. The basic core analysis measurements themselves are unchanged, but advances in core analysis techniques now make it possible to measure the required petrophysical properties at reservoir conditions and to acquire simultaneous measurements of reservoir-dependent parameters. Core analysis must be integrated with field and production data to minimize reservoir uncertainties that cannot be addressed by other data sources, such as well logging, well testing, or seismic surveys. These requirements define the coring objectives, the core handling procedures, and the core analysis schedule. Because these objectives cannot be achieved by coring a single well, the coring program is an integral part of the reservoir history cycle.
The quality and reliability of core analysis data have become more important under the ever-increasing pressure to optimize field development. The post-1980s economics of the petroleum industry, expressed as a need for ever more cost-effective technology, together with the need to evaluate thin-bed and unconventional reservoirs by means of vertical and horizontal wells, have been both the controlling factor and the driving force behind the development of new coring and core analysis techniques.
Existing techniques are constantly being improved and new ones introduced. Within core analysis proper, automated geological core description is growing in use, supported by the mini-permeameter and by the proliferation of sophisticated analysis methods such as SEM/EDX, X-ray CT, NMR, and PIA. These high-tech methods provide a wealth of microstructural and microscopic information that was previously unobtainable.
This paper provides an overview of recent and emerging developments and trends in coring technology and core analysis that enhance the reservoir evaluation process. Questions of quality control and quality assurance are discussed, with the aim of early detection of systematic problems in data products, particularly those that risk degrading data quality.