Much of what has been discussed in Chapter 3 could be said to have little significance to most
of the world. Primary and secondary data collection is, in many senses, an expensive luxury - something
which is only realistically applied to those small portions of the earth's surface
which are characterized by being comparatively densely populated and which have a high
G.N.P. per capita. The vast majority of the world is not like this. There are no structures or
means to plan and carry out questionnaire surveys, to equip and organize time series
evaluations for several parameters of water quality, temperature or quantity, or to differentiate
between the relative accessibility of differing production inputs. Life is not organized at this
scale. There has never been the need to gather and collate the type of information which would
be of use to intending fish producers, or indeed to almost any entrepreneur.

However, this is changing. In Chapter 1 we outlined the crucial need to select and reserve
land or water areas which would be suitable for aquaculture or inland fisheries, i.e. in order to
increase food supplies, employment opportunities and the wealth of an area.
Whilst it is beyond the means of most developing countries to intensively survey their
territories for a large number of varied parameters using ground based techniques, this is quite
within the means of remote sensing (RS) technologies. In this chapter we aim to first look at
RS by examining its definition, its development and its methods, and then to examine how RS
can be of value to aquaculture and inland fisheries. We shall also need to acknowledge that,
although RS has brought many benefits to the spatial analyst, there are still many limitations
with regard to its use. Because there are several recent FAO publications which provide
background information on RS (Lantieri, 1988; FAO-RSC Series 47, 1988; FAO-RSC
Series 49, 1989), we shall try to concentrate on its applications to the search for optimizing
locations. We shall consider the integration of RS into GIS in Chapter 6, and a number of our
case studies (in Chapter 7) specifically consider applications of RS to aquaculture, inland
fisheries or related fields.

Remote sensing is “…concerned with the collection of data by a sensing device not in contact
with the object being sensed, and the evaluation of the collected data, which is then termed
information and is presented in map form or as statistics” (Howard, 1985). Clearly, the
concept of RS covers a huge field - a field within science and technology which encompasses a
vast “applications domain”, i.e. in the sense of inputs of applied science, applications in the
processing field and in the sense of the ways in which RS outputs can be applied.

Since Butler et al (1988) have already outlined the historical growth of RS (for the FAO),
we will confine our resumé of its development to a few key advances. The human eye is a
remote sensor and, although we can capture images which may be stored in the brain and later
retrieved, we can only reproduce them in a subjective sense. The eye can only capture visible
radiation which occupies a very small part of the complete range of radiation (which is known
as the electromagnetic spectrum). To overcome these deficiencies various instruments or
systems have been invented and developed. We will be concerned here with those instruments
or systems which capture data from an aerial perspective, i.e. allowing maps to be easily
created.

As a means of capturing images, the camera was first developed in France in the 1830s, but
it was not until 1858 that the first aerial photograph was taken from a captive balloon near
Paris. During the rest of the 19th century advances were made in cameras and additional
camera platforms were experimented with. The first photograph from an aeroplane was taken
in 1909 over Centocelle in Italy. During the First World War aerial photography was utilized on
a large and systematic scale, both in Europe and the Near East, with specially designed cameras
and film processing techniques being developed. It was during this period that photo
interpretation became a recognized field of expertise.

Civilian use of vertical aerial photography greatly improved during the 1920s and 1930s
because of advances in both photographic methods and in the aeroplane as a platform. Aerial
photography was used in the compilation of topographic maps, by geologists, foresters and
planners, mainly in North America and Europe but occasionally to acquire information from
more remote areas which might otherwise be unobtainable. During World War II further
developments occurred, e.g.:

The water penetration capability of aerial film was recognized which meant that
bathymetric data could be acquired.

Colour infrared film was developed for camouflage detection.

Advances in radar technology permitted the development of smaller transmitting
and receiving equipment, appropriate for airborne use.

A large area of the Pacific war zone was photographically mapped.

During the 1940s and 1950s large-scale, complete country coverages were undertaken using
black and white panchromatic aerial photography, i.e. for many of the colonial countries,
especially in Africa. By the 1960s aerial photography had been operative long enough to allow
for the study of spatial/temporal variations in the environment.

The period from the late 1950s has been extremely active for RS, with developments in the
whole applications field occurring at an exponential rate. With satellite launches occurring
regularly, following SPUTNIK 1 in 1957, the interest in RS concentrated on the use of this
new and unique platform. In 1959 the first earth images were transmitted from EXPLORER 6
and the first meteorological satellite, TIROS-1, was launched in 1960. The next major advance
for RS occurred with the launch of the Earth Resources Technology Satellite (ERTS-1, later
renamed Landsat 1) in 1972. This was the first satellite designed to provide long-term,
uniform global coverage, having the ability to transmit data gathered on a variety of
instruments, for eventual mapping at scales of 1:250 000 to 1:1 000 000. Since Landsat 1 there
has been a succession of earth monitoring satellites launched, first by the U.S.A. and the
U.S.S.R., but more recently by other countries. Their equipment has become progressively
more sophisticated allowing a greater range of imaged data to be interpreted at a more detailed
spatial resolution (Travaglia, 1989).

As both Howard (1985) and Butler et al (1988) make clear, we should not let the
blossoming satellite technologies mask the fact that airborne RS is still thriving and vital.
Aircraft have a number of distinct advantages over satellites, mainly in terms of their flexibility
of altitude, scheduling and payload. They do not have the same cloud cover problems that
satellites have and they can provide low cost, excellent images of smaller areas. Howard
(1985) estimates that airborne techniques, using colour infrared technology at high altitude, can
allow more than 20 000 km2 to be photographed, and be thematically mapped at a scale of
1:25 000 to 1:100 000, each day.

It is the rapid surge in the electronics industry which has permitted the post 1960s boom in
RS, i.e. because as well as providing for actual developments in data capture, data
transmission, image processing, etc., it has spawned the computer technology advances which
have been vital to all aspects of space science and have allowed the huge data streams to be
efficiently handled. This surge in RS has also been aided by the influx of new ideas from a
variety of related disciplines, by the availability of funding for space related activities and by the
access to an increasing range of software and hardware. RS has a number of positive
advantages over other sensing systems:

It allows change to be monitored in a systematic and orderly way.

It is efficient and very cost-effective in per km2 terms.

It overcomes many data collection problems, e.g. in isolated areas and the fact
that normal data collection may terminate at political boundaries.

It can provide for instantaneous updating of information.

Jackson and Mason (1986) report that modern RS has now successfully overcome the
problems which it had in the 1960s and 1970s of being “technology pushed”. As useful
applications of RS imagery have been developed, especially in the fields of food production
and environmental awareness, RS is now also being “user pulled”. This recent trend has been
greatly helped by the growing ability to successfully integrate remotely sensed data into GIS - in
fact, we would suggest that if it was not for the functionality offered by GIS, then the future
for RS might be rather uncertain (see also Ehlers et al, 1989).

“The aim of environmental remote sensing is to utilize sensors, which are mounted on aerial
platforms, to identify and/or measure parameters of an object according to variations in the
electromagnetic radiation (EMR) emitted by, or reflected from the object.” Contained within
this statement are a number of concepts which will receive individual attention in the next three
sections, in an attempt to clarify necessary RS principles. The interested reader should consult
the FAO sources quoted in section 4.1 for further details.

Although we cannot see light (or sound) travelling, we know that it does so. This travel
involves the transfer of energy through space or matter in the form of wave motions. The
waves that make up EMR travel at a constant speed of 300 million meters per second. All
objects reflect and radiate EMR. The amount of electromagnetic energy emitted is a function of
the object's temperature - as temperatures increase, the intensity of the radiation emitted
increases. There is a whole family of waves which collectively are called electromagnetic
vibrations and which vary in their wavelength, and hence (since their speed is constant in
space) their frequency. This family may be displayed as a spectrum of energies, hierarchically
arranged by wave frequency or wave length (Figure 4.1). The spectrum covers a vast
continuum of wavelengths as indicated, and it is usual to rather arbitrarily differentiate between
the major wave bands based on certain properties such as their source, method of generation,
means of detection, selected applications, etc. Only specific wavelength bands are of interest to
RS, i.e. those forming the “windows of transmission”, because at these wavelengths filtering
out of EMR by the atmosphere is at a minimum. These bands can be shown as in Table 4.1.
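Since the speed of EMR in space is constant, wavelength and frequency are tied together by the relation frequency = speed / wavelength. A minimal sketch of this relation (the example wavelengths are the visible-band limits quoted in Table 4.1):

```python
# Sketch (standard physics, not from the source): relating EMR wavelength
# and frequency, given the constant speed of ~300 million metres per second.
C = 3.0e8  # speed of EMR in space, metres per second

def frequency_hz(wavelength_m):
    """Frequency = speed / wavelength (c = lambda * nu)."""
    return C / wavelength_m

# Visible band limits (0.4 and 0.7 micrometres):
for wl in (0.4e-6, 0.7e-6):
    print(f"{wl * 1e6:.1f} um -> {frequency_hz(wl):.2e} Hz")
```

Shorter wavelengths thus correspond to higher frequencies, which is why the spectrum in Figure 4.1 can be arranged equivalently by either quantity.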

The energy which is sensed by the different RS systems is a function of various parameters
which might affect the energy before it is received by the sensors. This is shown in Figure 4.2
which indicates that EMR can be natural, either reflected light and other radiations from the sun
(Source 1) or emitted heat from the earth (Source 2), or it can be man-made such as from a
power station or a radar system. The amount and type of radiation emitted or reflected depends
upon incident energy (mainly from incoming solar radiation), the nature of the earth's surface
and on the interaction with the earth's atmosphere.

Figure 4.1 The Electromagnetic Spectrum

The full electromagnetic spectrum is shown below, with the wavelength range marked
in. The sources and detectors for each part are also shown.

Table 4.1 Wave Bands of the EMR Spectrum of Interest to Remote Sensing

Wave Band and Wave Length:  Visible, 0.4–0.7 um

Detectors:  Black & white plus colour photography. T.V. camera. Optical scanner.

Some Characteristics:  High atmospheric scattering effect. Most EMR is reflected solar radiation, therefore only used in daylight. Penetrates water.

The incident energy comes mainly from the sun and, in the range of the visible and near infrared part of the
spectrum, the energy sensed is the proportion of the incident energy reflected by the “object” on the ground.
When the energy sensed is in the range of thermal radiation it comes mainly from the emission
of the “object” on the ground, which is itself a function of the sun's incident energy which has
been absorbed by that object and then re-emitted as thermal radiation. Incident energy from the
sun will vary with season or latitude (affecting the angle of the sun), with the length of time the
sun has been shining and with the angle of the object on the ground. When analyzing remotely
sensed data it is important to consider dates, time of acquisition and relief.

The atmosphere can affect the amount of radiation received by the sensor because the
atmosphere itself is heterogeneous, being made up of many gases as well as having dust
particles and other pollutants. The atmosphere may scatter light in the visible band and absorb it
in the ultraviolet and infrared bands. About 18% of the incident radiation in the atmosphere is
absorbed or scattered and about 35% of the incoming solar energy is reflected by the earth and
the atmosphere, including clouds. Scattering is caused by particles in the atmosphere reflecting
the energy, and the intensity of the scattered EMR depends on the ratio of the wavelength to the
size of the particles. Scattering caused by small particles is selective relative to the wavelength,
affecting shorter wavelengths more; scattering related to large particles is non-selective,
affecting all wavelengths. Because of scattering, the energy received by the sensor includes
reflections from the atmosphere as well as from the target (object). Complex algorithms are
needed to correct this effect. Atmospheric absorption reduces the amount of EMR reaching the
sensor in some wavelength bands. Figure 4.3 shows the percentage of EMR which can pass
through the atmosphere as a function of wavelength (revealing the atmospheric windows), and
the gases responsible for absorption are noted. Microwave radiations are unaffected by
atmospheric conditions, which makes them very useful, especially in cloudy areas such as the
tropics.
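The wavelength selectivity of small-particle scattering noted above follows the Rayleigh relation, in which scattered intensity varies inversely with the fourth power of wavelength. A minimal sketch (the inverse fourth-power law is standard physics; it is not stated numerically in the text):

```python
# Sketch of Rayleigh (small-particle) scattering selectivity: scattered
# intensity scales as wavelength^-4, so shorter wavelengths scatter more.
def relative_rayleigh(wavelength_um, reference_um=0.7):
    """Scattering intensity relative to a reference wavelength (both in um)."""
    return (reference_um / wavelength_um) ** 4

# Blue light (0.4 um) versus red light (0.7 um):
print(relative_rayleigh(0.4))  # blue scatters roughly 9x more than red
```

This is why short (blue) wavelengths suffer the strongest atmospheric scattering, while the much longer microwave wavelengths are essentially unaffected.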

Figure 4.3 Percentage of EMR Able to Pass Through the Atmosphere
as a Function of Wavelength (after Sabins, 1978)

The element of the earth's surface which is in the field of view of the sensor (the target or
object) will produce, by reflection or emission, the EMR measured by the RS sensor. The
amount of energy transmitted or reflected depends on what the target consists of and the
thickness of it. A target also absorbs radiation and this can affect the temperature of the target
and thus the amount of energy radiated per second. So all targets (or objects) in the
environment emit and reflect different intensities and types of EMR in different portions of the
spectrum, i.e. they have a so-called spectral signature which is predictable and repeatable.
Figure 4.4 gives the spectral signature of various natural features. These signature curves are
dependent on a number of interactions between incoming radiation and the micro and macro-structure
of the matter irradiated. Spectral signatures may vary temporally, e.g. as plants grow,
or spatially with different types of vegetation, different soil conditions, water availability, effect
of topography, etc.

Sensors are the devices used to gather EMR. They will typically consist of four components,
i.e. collectors, detectors, signal processors and recording units. There are several ways of
classifying sensors - we will describe them under the headings:

Framing systems. These include various types of camera which record
instantaneously an entire image.

Scanning Systems. These employ a detector (electronic sensor) which sweeps
across a scene in a series of parallel lines collecting data in order to record an image. They may
employ passive sensors, which record reflected or emitted EMR from natural sources, or active
sensors which illuminate an object with their own radiation source, and then record the “echo”.

In this section we will be concerned with the sensors theoretically - in Section 4.5 we will
consider the actual sensors carried by operational satellites.

Cameras may be used from various platforms. We will concentrate here on points significant
to satellite photography since airborne photography is well documented in sister FAO
publications (Butler et al, 1987 and Dainelli, 1988).

The still photography camera is the best known, simplest and cheapest of all sensors, and
still photography produces images having a better resolution than those produced by electronic
sensors. Cameras may produce simple, single images in one spectral band which are suitable
for many referencing purposes, or they can produce overlapping pairs of aerial photographs
which, when viewed using a stereoscope, give a three dimensional perspective of the
landscape. But their chief use is in multi-spectral photography. Here a number of cameras
may take simultaneous images of an object, using several band-pass filters, which each allow
EMR information relative to particular wave bands to be recorded. The number, position and
width of suitable colour filters can be optimized in a problem oriented manner so that the
controller is able to discriminate between a wide range of features (within the visible and
infrared bands - in the wavelengths between 0.4 and 1.3 um). The quality of the photographic
image will depend on the inter-related factors of: focal length; the angle of view; scale of the
photograph; the contrast ratio; the picture resolution and the film speed (Butler et al, 1988).

There are a variety of cameras suitable for satellite and/or aerial RS, and their selection
depends upon the nature of the application. In principle mapping frame aerial cameras are
similar to normal cameras except that they have:

Calibrated lenses.

High geometric accuracy.

A medium to large format.

A more complex mechanical and electrical configuration.

Though cameras have the advantages already noted, they do have some disadvantages, i.e.
there is a loss of resolution during photo-chemical processing or copying, or in analogue-to-digital
conversion for subsequent computer processing, and they have a limited spectral sensitivity.
They can also only function in favourable weather conditions.

Cameras will require different types of film for different purposes. The main types, according
to their range of spectral sensitivity, are:

Orthochromatic. These have a very good discrimination in the green bands and are
used mainly for cartographic reproduction.

Panchromatic. These cover the whole of the visible spectrum, with good sensitivity,
except for green bands (0.5 um) which can be compensated for with a filter. This film is
inexpensive, is easy to process, has a high spatial resolution and special filters can be used to
enhance selected objects (targets).

Black and White Infrared. This is similar to panchromatic except that its greater
spectral sensitivity means that near infrared wavelengths can be recorded in addition to visible
light. Usually a dark red filter is used to screen out the visible portion of the spectrum, so that
only the near infrared portion is recorded, which results in a greater penetration of the
atmosphere. This film is used mainly for detecting different vegetation stages and types, plus
the existence of water.

Natural Colour. The spectral range of this film is similar to that of panchromatic. It
is composed of three layers, each sensitive respectively to the three primary colours - blue,
green and red. Colour images offer a range of about 20 000 natural shades whilst black and
white is limited to only 200 grey tone shades, i.e. colour film allows the distinguishing of
many more features, because of its greater sensitivity to tints and shades. Colour film has a
number of applications, e.g. its sensitivity to sub-surface water makes it especially useful for
coastline definition and the estimation of water depth and sediment content. However, colour
photographs are more expensive, they have a poorer image definition and they cannot be
taken from a high altitude.

False Colour. These films are formed by having three layers sensitive to green, red
and near infrared radiations, respectively modulated into blue, green and red. The film has
been moderated to achieve several advantages, e.g. a high penetration of the atmosphere, sharp
resolution and definition of water bodies and a good response to the infrared reflectivity of
healthy vegetation. However, these films have a limited exposure tolerance and the film
requires refrigerated storage.

This is the main alternative to photographic systems for detecting and recording EMR. Several
bands of the EMR spectrum, either from the ultraviolet to infrared regions (multispectral
scanners) or the microwave bands (radiometers), may be scanned simultaneously by optically
splitting the collected radiation and diverting each part to a separate detector element. Final
image products can be photographs or computer compatible tapes containing digital data.
Scanning sensors can be either passive or active. Passive sensors detect natural incoming EMR
and active systems detect system-generated EMR (so called “echo”).

These sensors are called radiometers and they can detect EMR within the ultraviolet to
microwave wavelengths. Two important spatial characteristics of passive sensors are:

Their “instantaneous field of view” (IFOV) - this is the angle over which the detector
is sensitive to radiation. This will control the picture element (pixel) size which gives the
ground (spatial) resolution of the ultimate image (Figure 4.5), i.e. the spatial resolution is a
function of the detector angle and the height of the sensor above the ground. For more details
on spatial, spectral, radiometric and temporal resolutions see Lechi (1988).

Figure 4.5 The Concept of IFOV and AFOV
(after Avery and Berlin, 1985)

The “swath width” - this is the linear ground distance over which the scanner is
tracking (at right angles to the line of flight). It is determined by the angular field of view
(AFOV - or scanning angle) of the scanner. The greater the scanning angle, the greater the
swath width (Figure 4.5).
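The two spatial characteristics above reduce to simple geometry: ground pixel size is approximately sensor height multiplied by the IFOV (small-angle approximation), and swath width follows from the AFOV. A minimal sketch, using hypothetical sensor parameters not drawn from the source:

```python
import math

# Sketch (assumed small-angle geometry): ground resolution from IFOV and
# sensor height, and swath width from the angular field of view (AFOV).
def ground_resolution_m(ifov_mrad, height_m):
    """Pixel size ~= height * IFOV, with the IFOV in milliradians."""
    return height_m * ifov_mrad * 1e-3

def swath_width_m(afov_deg, height_m):
    """Swath = 2 * height * tan(AFOV / 2) for a nadir-pointing scanner."""
    return 2.0 * height_m * math.tan(math.radians(afov_deg) / 2.0)

# Hypothetical airborne scanner: 2.5 mrad IFOV, 90-degree AFOV, 3 000 m altitude
print(ground_resolution_m(2.5, 3000))    # roughly 7.5 m pixels
print(round(swath_width_m(90.0, 3000)))  # roughly 6 000 m swath
```

The same relations show why the identical sensor flown higher gives a wider swath but a coarser spatial resolution.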

There are two main categories of passive sensor:

A mechanical scanning radiometer. This is an electro-optical imaging system on
which an oscillating or rotating mirror directs the incoming radiation onto a detector as a series
of scan-lines perpendicular to the line of flight (Figure 4.6). The collected energy on the
detector is converted into an electrical signal. This signal is then recorded in a suitably coded
digital format, together with additional data for radiometric and geometric calibration and
correction, directly on magnetic tape on board the sensor platform.

Figure 4.6 Optical Mechanical Scanning System

A push broom radiometer. This uses a wide angle optical system in which all the
scenes across the AFOV are imaged on a detector array at one time, i.e. there is no mechanical
movement (Figure 4.7). As the sensor moves along the flight line, successive lines are imaged
by the sensor and sampled by a multiplexer for transmission. The push broom system is
generally better than the mechanical scanner since there is less noise in the signal, there are no
moving parts and it has a high geometrical accuracy.

All active sensors illuminate objects with their own source of radiation. The illumination will
either induce an object to emit radiation or cause it to reflect the sensor produced radiation.
This ability means that no sunlight is required so imagery can be recorded by day or night or
through clouds and light rain. Some active sensor systems are surface-based, e.g. sonar,
others could be carried in aircraft, e.g. Side Looking Airborne Radar (SLAR) whilst others can
be mounted in satellites, e.g. Synthetic Aperture Radar (SAR). We will review briefly
airborne and satellite active systems, which are commonly called Radar, and which are
generally classified as either imaging or non-imaging:

Imaging Radars. These display the radar backscatter characteristics of the earth's
surface in the form of a strip map or a picture of a selected area. A type used in aircraft is the
SLAR whose sensor scans an area not directly below the aircraft, but at an angle to the vertical,
i.e. it looks sideways to record the relative intensity of the reflections so as to produce an image
of a narrow strip of terrain. Sequential strips are recorded as the aircraft moves forward
allowing a complete image to be built up (Figure 4.8). The SLAR is unsuitable for satellites
since, to achieve a useful spatial resolution, it would require a very large antenna. A variant
used in satellites is the SAR whose short antenna gives the effect of being several hundred
times longer by recording and processing modified data.

Non-imaging Radars. These are also called scatterometers since they measure the
scattering properties of the region or object being observed, i.e. the roughness of the surface
over a wide swath on either side of the spacecraft. A type of scatterometer is the radar altimeter
which can provide an accurate height assessment for satellites, and these measurements can
yield valuable topographic or sea surface roughness variations.

A further type of active sensor is the laser radar (LIDAR). LIDARs use lasers to generate
short, high power light pulses. These can be used to measure the intensity of light back-scattered
by the target as a function of the distance from the sensor. Because of their size
LIDARs are presently limited to airborne craft.

In this section there will be no need to detail all the various platforms and their sensors since
this has been exhaustively studied elsewhere, e.g. with regard to fisheries applications see
Cheney and Rabanal (1984), Butler et al (1988) and Petterson (1989). Here it will be
appropriate to mention some of the platforms commonly used, to explain the two general types
of environmental satellite system, to summarize the operational systems presently in use and to
exemplify the main sensor types being carried. Since it has received little attention elsewhere
we will briefly mention the scope and availability of RS imagery from the U.S.S.R.

Sensors can be carried on space, air, terrestrial or water-borne platforms but it is beyond our
remit to examine the latter two types. There are various major airborne or space platforms as
follows:

Balloons. These may be free floating or anchored, and the former can be gas filled,
hot air or propelled. They are now infrequently used, because they are slow, although there
has been some discussion on bringing back into service the dirigible balloon.

Helicopters. They may be useful for the essential “ground truthing” work, i.e.
collecting statistical data in more remote areas to verify images obtained from higher and faster
platforms. They have rarely carried sensors directly.

Space Shuttles or Laboratories. These are essentially manned space missions. They
frequently carry experimental payloads which may require human testing or adjustment, or
interactive participation with ground researchers. Any of the previously mentioned sensors can
be utilized, and some of them, e.g. photography, with a great deal of flexibility.

Airborne. There are several sub-categories of this platform, according to flight
altitude:

High altitude aircraft - these usually operate at over 8 000 meters allowing for
photography at about a 1:100 000 scale, or the use of multi-spectral scanners
and Radar systems.

Medium altitude aircraft - which operate at 3 000 to 8 000 meters and can take
photographs at a 1:20 000 to 1:80 000 scale or carry multi-spectral scanners.

Light aircraft - which fly below 3 000 meters and can do aerial
reconnaissance, take large-scale photographs and can supplement missing or
uncertain satellite imagery.

Satellites. Since these are the platforms which provide the bulk of the remotely
sensed data, and they are likely to be of increasing importance in the future, we shall examine
them in more detail in the next section. There have now been many hundreds of satellite
launches, mostly by the U.S.A. and U.S.S.R., and the majority have been for military
purposes. Recently several other countries have launched their own satellites, either
independently or in joint ventures. Initially, satellites were largely experimental, but they are
now increasingly research or operational platforms, with most of them carrying a varied sensor
payload. Their primary capability is to carry sensors which can monitor the entire earth surface
on a periodic basis, sensing a large area during each revolution. They all operate at an altitude
which is sufficient to escape from the earth's atmospheric drag but still remain within the
dominant gravitational field, i.e. between 150 kms and 40 000 kms. Most satellites have been
launched from rockets, whilst some have been launched aboard a shuttle spacecraft, from
where they are unloaded into space. It has been recently demonstrated that the in-flight repair
of satellite systems is now viable, though this is unlikely to be cost-effective for unmanned
craft.

The advantages of satellites include the repetitive coverage of the earth's surface at various
scales and at varying resolutions, with data being acquired on a routine and cost-effective basis.
Often satellite-sensed data is the only information available for large tracts of ocean, mountain,
desert or tropical forest areas. The disadvantages include their large capital costs, which
include permanent monitoring and receiving stations, their relatively poor resolution for many
environmental purposes and the fact that cloud cover remains a problem for many sensing
devices.

These are satellite sensing systems which are boosted into a high geosynchronous orbit at
approximately 35 900 kms above the equator, i.e. at this altitude the speed of the satellite can
exactly match the speed of the earth's rotation. Because of this height, they have a limited
number of uses, e.g. to transmit telecommunication signals or to get a broad view of the
weather. The fact of remaining stationary means that they can achieve a high temporal
resolution, but the great height means that spatial resolution is normally only in the range of
2 to 5 kms, according to wavelength. This type of satellite was first launched in 1966 and there
are currently five geostationary satellites which each cover a different portion of the earth
(Figure 4.9). They can image the earth's surface between latitudes 80°N and 80°S and they are
able to image and transmit data on their whole viewable area every 30 minutes.
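The geosynchronous altitude of approximately 35 900 kms quoted above can be recovered from Kepler's third law, using the sidereal-day period and standard Earth constants (none of which are given in the text):

```python
import math

# Sketch (standard orbital mechanics, not from the source): at what altitude
# does an orbital period exactly match one rotation of the earth?
MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
SIDEREAL_DAY_S = 86164.1    # one rotation of the earth, in seconds
EARTH_RADIUS_M = 6.378e6    # equatorial radius, metres

def geostationary_altitude_km():
    # Kepler's third law: semi-major axis a = (mu * T^2 / (4 * pi^2))^(1/3)
    a = (MU_EARTH * SIDEREAL_DAY_S ** 2 / (4 * math.pi ** 2)) ** (1.0 / 3.0)
    return (a - EARTH_RADIUS_M) / 1000.0

print(round(geostationary_altitude_km()))  # close to the ~35 900 kms quoted
```

Only one such altitude exists, which is why all geostationary satellites share the same equatorial ring and must be spaced along it.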

These satellites orbit the earth, with an inclination relative to the equator of nearly 90 degrees,
i.e. so that their orbit nearly crosses the north and south poles (Figure 4.10). Their orbit height
varies between about 270 kms and 1 600 kms and it is usually sun-synchronous - meaning that
it crosses the equator at the same sun time each day. By having this type of orbit, the satellite
visits any particular point above the earth at the same time, which is useful for the comparative
analysis of multi-temporal data. One complete revolution of the earth takes about 95 to 115
minutes (depending on altitude), meaning that 12 to 16 revolutions are achieved each day. The
exact inclination of the flight path will determine the time period between re-visits to any
specific location, but it is commonly once every 16 to 20 days (Figure 4.11). These satellites
have a working life expectancy of about four years.
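The revolutions-per-day figures quoted above follow directly from the orbital period. A minimal sketch of the arithmetic, using the 95 to 115 minute range given in the text:

```python
# Sketch: revolutions per day for a polar orbiter, from its orbital period.
MINUTES_PER_DAY = 24 * 60

def revolutions_per_day(period_min):
    return MINUTES_PER_DAY / period_min

# The 95-115 minute range quoted gives roughly 12 to 16 revolutions per day:
for period in (95, 115):
    print(f"{period} min orbit -> {revolutions_per_day(period):.1f} rev/day")
```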

A number of environmental RS satellite systems and/or programmes have been in operation
since the mid-1960s. Some of these programmes are ongoing, others have ceased. Table 4.2
attempts to list the main programmes. We mention many programmes briefly since a large
amount of data has been acquired from them, much of which may still be valid and available.
We shall describe here the main characteristics of five satellite systems, i.e. Landsat 4 and 5,
SPOT 1, ERS-1, MOS-1 and Kosmos - their sensors will be described in the next section.
These are selected because they are either currently providing data or because they are the most
recent of the orbiting satellite series. It is difficult to select satellite systems for study which are
particularly relevant to aquaculture and inland fisheries, because much of the environmental
data gained from any of the systems is potentially useful. Section 4.7 will be concerned with
the potential applications of satellite data to location analysis for fish production sites.

Landsat 4 and 5 were launched in July, 1982 and March, 1984 respectively. They both have
an angle of inclination of 98.3 degrees and an orbital time of 98.5 minutes. The satellites make
14 to 15 revolutions per day and it takes 16 days before a revisit track is made. The Landsat
swathing pattern is illustrated in Figure 4.12, which also shows the width of swath and the
distance between successive orbits. These satellites are a continuation of the original Earth
Resources Technology Satellites (ERTS) programme, initiated in 1972, which later developed into
the Landsat series. Landsat 4 and 5 differ from the earlier Landsats by the introduction of the
Thematic Mapper sensor and the exclusion of the Return-Beam Vidicon (Figure 4.13).

There are some 18 receiving stations around the world from where Landsat data is
transmitted to NASA's Goddard Spaceflight Center. It is then passed to a commercial company
(EOSAT) for processing and distribution. Both Landsat 4 and 5 have well outlived their life
expectancy, and they continue to operate only because some of the on-board systems have been closed
down to save power. There is likely to be a “data gap” when these two satellites finally cease
to transmit, since lack of funding and technical problems will prevent Landsat 6 from being
launched until the second quarter of 1992.

SPOT-1 (Système pour l'Observation de la Terre) was launched in February, 1986. It was
constructed by the French in cooperation with Belgium and Sweden. It has a sun-synchronous
orbit at an inclination of 98.7 degrees, and a revolution period of 101.4 minutes. It makes 14
or 15 revolutions per day and revisits the same track every 26 days. Its altitude varies from
820 to 840 kms and it can acquire images between 84 degrees North and South. A key feature
of SPOT is the provision for off-nadir viewing, i.e. it can “look” sideways by up to 27 degrees
from the vertical in either direction, extending the field of view by 475 kms each way. This
allows for a much reduced revisit time, although images would then necessarily be from an
oblique angle. This off-nadir facility is steerable from ground control (Figure 4.14). Off-nadir
viewing also allows for stereoscopic viewing - pairs of images of a given scene can be recorded
at different viewing angles during successive satellite passes in the vicinity of the scene
concerned (Figure 4.15).

European data from SPOT is received at stations in Toulouse (France) and Kiruna
(Sweden). On board recording of data is possible in areas beyond the range of ground
receiving stations, for later transmission to Toulouse. Data can be received at stations over a
maximum distance of 2 600 kms, and this means that transmissions can last for up to 800
seconds whilst passing a station. Stations are capable of receiving about 250 000 scenes a
year. Each day, an observation sequence is loaded into the on-board computer from the
Toulouse ground control station. Dissemination of SPOT data is via a commercial company
“SPOT Image”. SPOT-1 will be decommissioned in September, 1990. SPOT-2, with similar
credentials to SPOT-1, was launched in January 1990 and two further satellites in the series are
planned before 1998.

MOS-1 (Marine Observation Satellite) was the first Japanese earth observation satellite - it was launched in February, 1987 and
has a three-year scheduled life (Figure 4.16). It orbits at an altitude of 908.7 kms, having an
inclination of 99.1 degrees and it makes 14 orbits per day. There is repeat coverage every 17
days and it takes 237 orbits to gain total worldwide coverage. MOS-1 does not carry tape
recorders, and thus additional ground stations are needed to acquire data from areas out of range of the
Japanese Earth Observation Centre. The European Space Agency has an agreement with the
Japanese Space Development Agency to acquire, process and distribute MOS-1 products in
Europe. MOS-1b, with identical credentials to MOS-1, was launched in February, 1990 and a
further four satellites in the series are planned.

ERS-1 is to be the first of a series of satellites planned by the consortium of countries making
up the European Space Agency (ESA), in a programme to be operational in the 1990s.
Following its launch in 1991, ERS-1 will be placed in a sun-synchronous orbit and will
give global coverage, including the polar regions. It will have an altitude of 777 kms, an
inclination of 98.5 degrees and a revolution time of 100 minutes (Figure 4.17). ERS-2 is
planned for launch in 1994. Real-time data will be relayed to stations at West Freugh
(Scotland), Kiruna (Sweden), Fucino (Italy), Maspalomas (Canary Islands) and Prince Albert
(Canada).

The Kosmos series is a continuation of an older Soviet satellite series, and it includes satellites
launched for a variety of purposes. It is managed by the “PRIRODA” State Remote Sensing
Centre. Soviet officials note that launches in the series occur once every two or three months.
The satellites are placed in low (270 kms) orbits and the repeat time for full coverage is 22
days. Their life expectancy varies but it is very short. There is an absence of a network of
ground receiving stations.

Here we will look first at the main sensors carried by Landsats 4 and 5, SPOT 1, ERS-1,
MOS-1 and Kosmos, including their usual applications. We will then describe two other
sensors which have been particularly useful for marine or water-based purposes.

Multi-Spectral Scanner. This system is a line-scanning device which records in
four bands, two in the visible and two in the near infrared parts of the spectrum. The system is
comprised of a telescope, a mirror which reflects ground radiation onto a bank of 24 electro-optical
sensors, band filters and a sampling system, an internal calibration system and various
devices which ensure an orderly stream of digital data for each pixel and spectral band. It
images six scan lines in each of the four spectral bands simultaneously giving a 24 scan-line
total. Resolution, as delimited by pixel size, is 83 m and a six-bit quantization gives a possible
range of 64 intensity values. The individually scanned scenes of one MSS image cover
approximately 185 x 185 kms and each overlaps its neighbour by about 10%. The original
images have a scale of 1:3 369 000 and one frame encompasses some 34 000 km2. Data is recorded
on magnetic tape for later transmission to receiving stations and data is available in digital or
analogue (photographic) form. Table 4.3 shows details of the bands and possible applications
of the imagery.
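The figures quoted for the MSS follow directly from its specification. The sketch below derives the grey-level range from the 6-bit quantization and an approximate pixel count from the 83 m pixel size; the data-volume estimate is illustrative only and ignores the actual frame format:

```python
def mss_stats(bits=6, pixel_m=83, scene_km=185, bands=4):
    """Rough figures implied by the MSS specification: grey levels from
    the quantization depth, pixels per scene from the pixel size."""
    grey_levels = 2 ** bits
    pixels_per_line = round(scene_km * 1000 / pixel_m)
    pixels_per_scene = pixels_per_line ** 2
    raw_megabytes = pixels_per_scene * bands * bits / 8 / 1e6
    return grey_levels, pixels_per_line, raw_megabytes

levels, per_line, mb = mss_stats()
print(levels)     # 64 intensity values, as stated in the text
print(per_line)   # roughly 2 229 pixels across a 185 km scene
```

The same arithmetic, with 8 bits, yields the 256 grey levels of the Thematic Mapper described below.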

Table 4.3 Landsat MSS Bands and Applications

Band   Spectral Range        Features/Applications
4      500–600 nm (green)    Imagery from this band emphasizes movement of
                             sediment-laden water and shallow water bodies,
                             shoals and reefs, etc.

The near infrared imagery provides the best penetration of atmospheric haze, and it emphasizes vegetation and land-water boundaries.

Thematic Mapper. This sensor collects, filters and detects radiation in a similar manner to the MSS, over the same
185 km swath. It records in seven spectral bands which include medium and thermal infrared.
It provides a spatial resolution of 30 meters, except on the thermal infrared band, where it is
120 meters. The high spectral resolution is achieved by sensitive detectors and an 8-bit
quantization in the analog-to-digital conversion process gives 256 grey levels. The Enhanced
Thematic Mapper (ETM), to be launched on Landsat 6, will have an additional 15 meter
resolution panchromatic band. Table 4.4 shows the potential application of this sensor by band
and wavelength.

SPOT carries two identical high resolution visible (HRV) scanners, each of which can function
independently. They can each scan a strip 60 to 80 kms wide along the flight line, the
width varying with the viewing angle. Every 60 kms the SPOT data is cut to form a scene.
The two sensors are designed to operate in either of two modes - panchromatic (black and
white) or multi-spectral (colour) in the visible and near infrared spectral bands. The sensors are
of the push-broom type. Each consists of a series of fixed linear arrays made up of electronic
detectors known as CCDs (charged-coupled devices). Image data are collected by successively
measuring the current generated by each detector along the array. In panchromatic mode each
individual detector corresponds to one pixel and measures the reflectance of a 10 meter ground
resolution cell. In multispectral mode the detectors are paired and thus measure a 20 meter
pixel across the track. By doubling the time taken to obtain each sample, the along track
measurement of the cell also becomes 20 meters.
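The detector pairing can be illustrated with a toy model: averaging adjacent 10 m samples yields 20 m pixels. This is only a sketch of the sampling arithmetic, with hypothetical radiance readings, not a description of the instrument's actual electronics:

```python
def pair_detectors(samples_10m):
    """Combine adjacent 10 m detector samples into 20 m multispectral
    pixels by averaging (an illustrative model of the pairing)."""
    assert len(samples_10m) % 2 == 0, "detectors are paired"
    return [(samples_10m[i] + samples_10m[i + 1]) / 2
            for i in range(0, len(samples_10m), 2)]

line = [10, 12, 20, 22, 30, 28]       # hypothetical 10 m readings
print(pair_detectors(line))           # three 20 m pixels: [11.0, 21.0, 29.0]
```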

We have already mentioned the nadir, off-nadir and stereoscopic viewing capabilities of the
system. The main applications of the stereoscopic imagery are in photogrammetry, for
cartographic purposes and photo-interpretation for geological, geomorphological and
hydrological studies. SPOT's other applications are largely for land-use studies, the
assessment of renewable resources and aiding in mineral and oil exploration. The high
resolution makes possible the compilation of topographic maps (at scales of 1:100 000)
with contour intervals as little as 20 meters, thematic mapping at scales between 1:25 000 and
1:50 000, plus the direct compilation of digital terrain models. Table 4.5 gives the main
characteristics of the HRV.

Active Microwave Instrument (AMI), combining the functions of a Synthetic Aperture
Radar (SAR), a Wave Scatterometer and a Wind Scatterometer. The AMI will measure wind
fields and wave spectra and obtain all-weather images. It will provide for a spatial resolution of
30 meters and have a swath of 99 kms.

The details of the three sensors on board MOS-1a and 1b are given in Table 4.6. MOS is
intended to establish fundamental technologies for earth observation satellites, primarily by
observing oceanic phenomena such as ocean colour and temperature. The satellite observations
are also expected to be of value to agriculture, forestry, fishery and environmental preservation.

Satellites in the series have carried different sensors and combinations of sensors. Of most
interest for environmental observations are their cameras, details of which are shown in
Table 4.7. The KFA-1000 provides for a ground resolution as small as 5 meters, giving it a
great advantage over SPOT or Landsat. Another advantage is the frequency of cover. 98% of
all PRIRODA's RS imagery is obtained from a combination of the three cameras carried on
Kosmos satellites. After imaging in space, the exposed film is soft-landed to earth for
processing.

There have been a number of other satellite sensors launched in the past two decades which
have been of potential value to aquaculture and inland fisheries. We will briefly describe two
of these:

The Heat Capacity Mapping Radiometer (HCMR). This sensor was launched as part
of the Heat Capacity Mapping Mission in April 1978 and operated until September 1980. The
sensor was a two channel scanning radiometer operating in the visible and near infrared band
and the thermal infrared band. The main objectives of interest were:

The mapping of natural and man-made thermal effluents.

The detection of thermal gradients in water bodies.

The mapping and monitoring of snow fields for water run-off
prediction.

The monitoring of marine oil pollution.

Many of the products obtained from this mission are still available.

The Coastal Zone Colour Scanner (CZCS). This was launched aboard Nimbus-7 in
October 1978 and was operational until late 1984. The CZCS was a multi-spectral line
scanner, optimized for use over water. It collected quantitative information on ocean colour,
suspended sediments, chlorophyll concentrations, pollutants and temperature from the upper
few meters of water. Both photographic and digital data are still available. Table 4.8 sets out
the bands and measurements of CZCS.

The electronic images which have been captured by RS devices are either transmitted directly to
earth or are stored on on-board recorders for later transmission. This represents the initial stage
in a complex information flow which is depicted in Estes (1985) (Figure 4.18). The data
scanned are retained in the form of pixel values, with each value representing the amount of
radiation (the spectral reflectance) within a given band-width received by the scanner from the
area of the earth's surface covered by the pixel. Pixel values are digitally coded by a certain
number of bits, e.g. Landsat and SPOT use 8-bit codings which give a range of 256 possible
values, and the values for any one pixel will change according to the particular spectral band being
recorded. The area covered by a pixel (the resolution) is a function of the height of the sensor,
the focal length of the lens or focusing system, the wavelength of the radiation and other
inherent characteristics of the sensor itself. Each pixel will be allocated a co-ordinate in a grid
referencing system.
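The coding of a radiance measurement into a pixel value can be sketched as a simple linear quantization. The radiance limits below are hypothetical; only the 8-bit range is taken from the text:

```python
def quantize(radiance, r_min=0.0, r_max=1.0, bits=8):
    """Map a radiance value onto the sensor's integer grey-level range;
    8 bits gives the 0-255 range used by Landsat and SPOT."""
    levels = 2 ** bits
    r = min(max(radiance, r_min), r_max)   # clip to the sensor's range
    return round((r - r_min) / (r_max - r_min) * (levels - 1))

print(quantize(0.0), quantize(1.0))   # the extremes of the 256 possible values
```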

Figure 4.18 The Flow of Information in the Remote Sensing System

The pixel values are transmitted to earth (downlinked) as a stream of binary numbers. To
reconstitute the images, ground based computers decode the binary data, allocating the
appropriate colour tone to each pixel value. The images can then be displayed on a monitor or
in some print-out form. At the initial stage they will be monochrome and in a pre-processed
state. To perform image analysis processes there is a vast selection of computer hardware and
software systems, for micro, mini or mainframe computers, which we cannot explore here but
which Jensen (1986) reviews in some detail. These software systems should be capable of
executing all, or a number of, specific processing functions, as shown in Table 4.9. Not all of
the functions shown are essential, as this will depend on the type of output required. We will
describe here only the essential functions plus those more commonly used. Images from
Landsat and SPOT can be purchased at various “levels” of processing.
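The decoding step described above can be sketched as follows: the downlinked stream of 8-bit values is split back into scan lines for display. The scene below is hypothetical and tiny; real scenes run to millions of pixels per band:

```python
def reconstitute(raw, width):
    """Split a downlinked stream of 8-bit pixel values back into scan
    lines, ready to be displayed as a monochrome image."""
    return [list(raw[i:i + width]) for i in range(0, len(raw), width)]

stream = bytes([0, 64, 128, 255, 192, 96])   # a hypothetical 2 x 3 scene
print(reconstitute(stream, width=3))         # [[0, 64, 128], [255, 192, 96]]
```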

The two functions involved here, radiometric and geometric corrections, will be essential if
meaningful output is to be obtained. They are essential because there are a number of factors
inherent in the RS system which contribute to the images being distorted in some way. These
factors include:

Changes in the attitude, velocity and altitude of the sensing platform.

The forward motion of the platform causes scan skew.

The scanners (in Landsat) do not have a constant scan velocity.

The area covered by one pixel will have its shape distorted when viewing at an
oblique angle.

The geometry of the images is affected by the earth's rotation, its curvature and
atmospheric refraction.

Radiometry is affected by the sensor, e.g. sensor “noise” and poor calibration
between detectors, by the atmosphere, e.g. presence of aerosols and scattering
effect, and by the scene itself, e.g. effect of relief on reflection and type of
reflection of the object.

Radiometric Correction. Detector sensitivity will slowly change over time, making
some detectors more or less sensitive to radiance than their neighbours. This results in images
which have a “banding” or striped pattern which needs correcting. This effect can occur in
both mechanical and push-broom scanners, as can “pixel drop”, when an individual pixel's radiance is
not recorded. Atmospheric attenuation is also a problem, as radiance is
altered by the atmosphere through which it passes. This is especially a problem over water
when there is a lot of atmospheric water vapour, e.g. radiance reaching the detector may be 20%
from the water and 80% from the atmosphere. There are various correction methods, some of
which are described in Butler et al. (1988).
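A very simple destriping approach is to shift each scan line so its mean matches that of the whole scene. This is only an illustrative sketch on hypothetical data; operational methods model each detector's gain and offset explicitly:

```python
import statistics

def destripe(image):
    """Shift each scan line so its mean matches the whole-scene mean
    (a crude, offset-only model of destriping)."""
    flat = [v for row in image for v in row]
    scene_mean = statistics.mean(flat)
    return [[v - statistics.mean(row) + scene_mean for v in row]
            for row in image]

# hypothetical striped data: the second line reads systematically high
striped = [[10.0, 12.0, 14.0], [20.0, 22.0, 24.0]]
print(destripe(striped))   # both lines now centred on the scene mean
```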

Geometric Correction. This will involve several levels of pre-processing. The data
has first to be corrected for earth curvature, earth rotation and satellite attitude errors. After this
the image may still contain geometric distortions, with the center of the scene located to an
accuracy of only a few kilometers. To improve this, a sufficient number of ground control
points, which are readily identifiable on the image and on a map, are selected for calculations of
a least-square fit, and the results are then used to adjust the image to the map co-ordinates.
Maps of different projections, e.g. Mercator, Peters, Lambert Conformal, etc., can be used.
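The ground-control-point adjustment can be illustrated by a least-squares fit of an affine transform from image (column, row) coordinates to map coordinates. The GCPs below are hypothetical, and this is only a sketch of the principle; operational corrections often use higher-order polynomials followed by resampling:

```python
def solve3(M, v):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    A = [list(M[i]) + [v[i]] for i in range(3)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            A[r] = [A[r][k] - f * A[i][k] for k in range(4)]
    x = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        x[i] = (A[i][3] - sum(A[i][k] * x[k] for k in range(i + 1, 3))) / A[i][i]
    return x

def fit_affine(gcps):
    """Least-squares fit of map = affine(image) from ground control points,
    via the normal equations.  gcps: ((col, row), (easting, northing))."""
    rows = [[c, r, 1.0] for (c, r), _ in gcps]
    def lsq(target):
        ata = [[sum(a[i] * a[j] for a in rows) for j in range(3)]
               for i in range(3)]
        atb = [sum(a[i] * t for a, t in zip(rows, target)) for i in range(3)]
        return solve3(ata, atb)
    east = lsq([m[0] for _, m in gcps])
    north = lsq([m[1] for _, m in gcps])
    return east, north

# hypothetical GCPs generated by easting = 2*col + 100, northing = -2*row + 500
gcps = [((0, 0), (100, 500)), ((10, 0), (120, 500)),
        ((0, 10), (100, 480)), ((5, 5), (110, 490))]
east, north = fit_affine(gcps)
```

The fitted coefficients recover the generating transform; with more (and noisier) control points the least-squares solution gives the best overall adjustment of the image to the map co-ordinates.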