Andy Philpott, Department of Engineering Science, University
of Auckland

The time has sadly come to pass when we must bid farewell to
Hans
Daellenbach as Editor of the ORSNZ Newsletter. The Council is in the
process of
arranging a new editorial team, the members of which will be announced
at the
APORS 97 conference in Melbourne at the end of this year. I think that
we would
all agree that the members of this team have a hard act to follow. When
I
offered to write an editorial, Hans modestly asked that there be no
eulogies
lamenting his departure, so I shall comply with his request. However, I
would
like to dwell here on some parts of the philosophy of OR which Hans has
espoused
in his editorials over the years, and in particular talk about
implementation.

I think that it would be fair to say that Hans and his
disciples advocate
that OR/MS should make a difference in terms of the practical
improvement of
some situation faced by an individual or organization. This is what I
would call
implementation. Sometimes implementation refers to
the fact that a
client has paid for some OR/MS software. But if the software is not used, and its answers do not influence the behaviour of the agents, then OR/MS
cannot be
said to have made a difference, or to have been implemented in the
proper sense
of the word.

Since I am currently on sabbatical leave I have decided to
clean up my files
on one of the departmental disk drives. There are optimization problems
on this
drive of every imaginable flavour, but only a handful of them have been
used by
an actual decision maker apart from me and my students. (Of course we
have
gained insights from the models, which may have an indirect influence
on other
decision makers through consulting advice or conference presentations,
but the
direct coupling to solving a problem is missing.) I bet that similar
graveyards
of unimplemented optimization models exist on disk drives all over the
world. In
some cases these models might have been inappropriate or made invalid
assumptions, but in many others they were realistic representations of
at least
some aspect of the decision problem faced. These models failed to make
a
difference because there was not sufficient support in the organization
for the
methodology, nor sufficient time and energy on the part of the author
to try and
create this support. The approach was "here's the optimal solution,
take it
or leave it", and they left it.

There are at least two ways in which one can improve the
chances that OR/MS
will make a difference. The first is to ensure that the models that the
OR/MS
practitioner develops are used to solve the real decision problem, and
not a
similar but ineffectual model which has some elegant mathematical
properties. Of
course the techniques used must be guaranteed to get the correct
solution to a
given model. But if the model is solely an exercise in mathematics,
with only a
passing relevance to a real problem, then even the `correct' solutions
will be
of little use. This is an issue which Hans has discussed at some length
in
previous issues of this publication.

The second condition necessary for the successful
implementation of OR/MS is
that the user of the model buys in to the solution. By this I mean that
the user
feels some affinity or ownership of the model. This might be because
they
contribute to its development, or have learned about what makes it
tick.
Ensuring this transfer of expertise is as challenging a task as developing the right model and methodology to tackle the decision problem.

So what can be done to help the user buy in? Though ultimately
it depends on
the circumstances and personalities involved, there are some obvious
strategies
which, I am sure, most OR/MS practitioners will recognize. The first
rule is to
start with a simple model before a complicated one. If the end user can
interpret, or even guess in advance, the solution that the model
delivers, then
it gains credibility, and makes the user feel confident with the
approach. If
the model that the OR/MS practitioner seeks to implement for a complex
problem
can be used with slight modifications on a small subproblem, then
applying it to
this is a good way to get the support of the owners of that subproblem.
In
mathematical programming the modelling languages like GAMS and AMPL are
ideal
for this sort of prototyping. Some would advocate simple spreadsheet
models,
which have many advantages for proving a modelling concept to
management. (A
discussion of the merits and drawbacks of these for mathematical
programming
should be left for another editorial.)
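The prototyping step described above can be sketched in a few lines of code. The two-product data below are purely hypothetical, and scipy's general-purpose `linprog' solver stands in for a modelling language such as GAMS or AMPL:

```python
# A toy production-planning LP of the kind one might prototype before
# scaling up to the full model (all data hypothetical).
from scipy.optimize import linprog

# Maximise profit 3*x1 + 5*x2; linprog minimises, so negate it.
c = [-3.0, -5.0]
A_ub = [[1.0, 2.0],   # machine A hours: x1 + 2*x2 <= 14
        [3.0, 1.0]]   # machine B hours: 3*x1 + x2 <= 18
b_ub = [14.0, 18.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)      # optimal production quantities (x1, x2)
print(-res.fun)   # optimal profit
```

Because the whole model is a handful of lines, the end-user can verify the answer by hand (the two binding machine-hour constraints pin down the solution), which is exactly the credibility-building step described above.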

The second rule is to proceed gently and patiently. The people
who need to
accept and use the new model probably have methodologies of their own
that they
are unwilling to abandon or modify. If one can encourage them to
contribute to
the development of the model by incorporating some of the features in
which they
have some expertise, then they will treat the model as one of their
own. (A
useful way of presenting computer models is to sell the model as a
solving
engine, where the user interface is the domain of the end-user, who can
design
it to look and feel how they like. As an aside, it is my observation
that when
buying a computer model, users are typically much more impressed by a
slick user
interface delivering an approximate answer than a complicated,
cumbersome tool
delivering the exact answer. This is not to say that we should provide
them with
the former, but it is a useful first step on the way to a comprehensive
computer
model.)

The third rule is "do not blind with science". Most end-users
of
OR/MS models are smart people, but they are usually not mathematicians,
so they
are not going to be impressed by the mathematical sophistication of the
techniques that the OR/MS practitioner uses unless they can either
understand
the mathematics or understand the solutions that the techniques
generate. Either the methodology one uses must be simple, or there is
an education and
empowerment function that must be performed by the OR/MS practitioner
to help
the user accept the techniques.

One of the difficulties with following the rules above is that
it takes a
lot of time and energy. If it is done on a commercial basis then this
makes it
expensive. The corollary is that such an approach is unlikely to be
adopted by a
(profitable) commercial operations research consultancy. It might be
argued that
(at least in the short term) the economically rational position for
such a
company is to sell just a solution, perhaps in a software package or a
weighty
report to the board of directors. This strategy ensures that the
expertise
remains with the consultant, and not the client. A dependent client who
is
likely to come back to the consultant for more solutions is a useful
source of
revenue. However, it is not a strategy that is likely to result in a
deep
acceptance of OR/MS by the client. This comes with education and
ownership.

The issue of implementation is thus one involving education
and empowerment
of the client. Many academics who claim to work in OR/MS do not spend
large
amounts of their time performing this role. It is a shame that
empowerment of
the OR/MS user is not a common academic pursuit, because as practised
educators,
academics are ideally suited to educate users of OR/MS, leading to the
successful implementation of OR/MS models and methods.

The following is the second installment of James Corner's discussion regarding
the teaching of decision analysis at the University of Waikato. As
promised, he
keeps this installment short! Started in the last newsletter, these
snippets
eventually will culminate in a paper for Interfaces. He
thanks all
those readers who responded to last time's discussion.

James L. Corner, Department of Management Systems, School of
Management
Studies, University of Waikato, Hamilton, NZ

Teaching Philosophy

Consistent with the general teaching philosophy of my
department, elements
of experiential, or autonomous, learning are always used when I teach
decision
analysis (see Scott and Buchanan [1992] for a deeper discussion). This
is in
part due to a recognition of differing learning styles across students,
but also
because I see decision analysis as a topic which must be applied in
order to be
mastered properly. No doubt this is a common feeling among educators.

Three elements of autonomous learning typically are used:
learning
contracts, buzz groups, and outside applications. Learning contracts
are
contracts negotiated between students and the lecturer to establish
what the
student is to learn, how it will be learned, what is to be produced as
a result
of the learning, and how and by whom the work is to be assessed
[Knowles 1986]. These documents are a wonderful aid in building
motivation to learn and they
help shift the responsibility for learning back to the student. They
are used
during those portions of the course which are not lecture-oriented, as
students
typically will be working in groups and on topics different from one
another. They especially are useful for getting students motivated to
begin their applied
projects as discussed below.

The buzz group is a tool for use during classroom sessions.
The idea is to
periodically (say, every 15-20 minutes) break up the class into groups
of 3-5
students to discuss and reflect on what has been said during lecture or
other
discussion. The lecturer can walk around to help seed the buzz group
discussions or usually just gather feedback from the groups. This
feedback is
invaluable in that it allows the more timid students a chance to
interact with
the lecturer and others in the class. It also is a wonderful way for
the
lecturer to focus in on what the class is actually learning, so it aids
in
mid-class clarification and direction-setting.

Buzz groups are examples of what are more commonly referred to
as
cooperative learning teams (CLT's). Levasseur [1996] provides a useful
review
of how to launch such teams. Just like CLT's, buzz groups can be
extended for
use outside of class. That is, if students are asked to learn in the
group
setting and outside of class, classroom sessions can be spent on
building on
this knowledge, rather than on the initial material to be learned.

Finally, as mentioned earlier, I feel that much of the actual
learning of
decision analysis occurs when students are asked to apply it. I
therefore have
mandatory semester-long projects where students help real organizations
tackle
some hard decision problem currently faced by them. At the
undergraduate level,
example projects include: how to get more hydraulic head on a local
hydro-electric dam, whether to change a nearby town's sewage effluent
from an
ocean dumping to a land-based scheme, and options for choice of a
mixing vat in
a local laboratory. In our shorter MBA classes, we have students break
into
pairs, one to act as analyst and one as decision maker, to deal with a
real-world decision faced by the student decision maker, such as which
job to
take, what car to buy, etc. The experiences of the students make for a
wonderful late course discussion which can easily tie back into the
descriptive/prescriptive discussions which began the course (as
discussed in the
last newsletter). That is, students typically will be able to discuss
their
prescriptive experiences with the projects, but also will have a word
or two to
say about the more descriptive issues they saw during their projects.

This is another one of those OR Newsletters
that threatened to be
rather thin. In desperation, I dug out a paper I presented at the TIMS
Anchorage Meeting in 1994. Andy Philpott's choice of topic for his
editorial is
a rather interesting coincidence. So you get a double dose of it!

The aim of undertaking an operations research (OR) analysis of
a decision
situation is to improve the quality of decision making. This is
achieved by
providing new insights into the problem situation that could not be
derived by
other means. Viewing OR in this light, its aim thus goes beyond simple
optimization. Furthermore, OR will only achieve this aim, if its
findings are
used as important inputs into the decision making process. Such use I
refer to
as `implementation'. This is not equivalent to adopting the recommended solution. It is conceivable that the insights gained will, in fact, lead to a decision that follows the recommendations only partially or not at all. The
following example demonstrates this dramatically. A study done by D. C.
McNickle [1994, pp 423-43] for a large wood processing company showed
that the
installation of a second service facility would save close to one
million
dollars net annually in waiting costs alone for an initial investment
of
$400,000. However, sensitivity analysis revealed that a reduction of as
little
as 5% of the rate of arrivals would make waiting times almost
disappear. The
insight gained from that led to the discovery of ways to divert about
5% of the
arrivals elsewhere at practically no additional cost. This was the
solution
implemented.
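The arithmetic behind this kind of sensitivity insight is easy to reproduce for a single-server (M/M/1) queue; the rates below are illustrative only, not the figures from the McNickle study:

```python
# Illustrative M/M/1 sensitivity sketch (hypothetical rates). Mean
# waiting time in queue is Wq = rho / (mu - lam), rho = lam / mu,
# which blows up as lam approaches mu, so a small cut in arrivals
# near saturation slashes waiting dramatically.

def mm1_wait(lam: float, mu: float) -> float:
    """Mean time in queue for an M/M/1 system (requires lam < mu)."""
    if lam >= mu:
        raise ValueError("queue is unstable: lam must be < mu")
    rho = lam / mu
    return rho / (mu - lam)

mu = 10.0   # service rate (jobs/hour)
lam = 9.8   # arrival rate close to saturation
print(mm1_wait(lam, mu))          # 4.9 hours of queueing
print(mm1_wait(0.95 * lam, mu))   # a 5% cut in arrivals: ~1.35 hours
```

It is the shape of this curve near saturation, rather than any single number, that handed the problem owner the cheap solution.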

It is tacitly accepted that implementation is facilitated if
the model or
suite of models and the modeling process satisfy certain desirable
properties.
(For convenience, I will assume the term `model' also to include `multiple models'.) Every student of OR should thus be fully aware of what these properties are, and even experienced operations researchers will do well to
remind themselves occasionally. Therefore, it is rather surprising to
discover
the real paucity in the OR literature on what makes a good OR model.
The
best-known paper is the one by J.D.C. Little [1970].

Little's List of Desirable Properties

Little lists five desirable properties for a mathematical
model to meet the
needs of the user of such models. I briefly list them with some
comments:

(1) Simple: Simple models are more easily
understood by the problem
owner or decision maker, who is often mathematically untrained. The
decision
maker will more easily follow the logic of a spreadsheet than of a
complicated
set of equations, which may do little more than the computations
performed in a
spreadsheet, admittedly more elegantly. To get simple models, the
analyst
may have to make suitable approximations to the real situation or even
delete
certain significant aspects, which may later have to be taken into
account in
different ways.

(2) Complete. The model should include all
significant aspects of
the problem situation that affect the measure of effectiveness. The
problem
here is to know whether an aspect is likely to affect the optimal
solution in a
significant way before the model is built. Using a systems approach,
i.e.,
exploring all systemic relationships within the context of the total
problem
situation — not simply the narrow definition of the problem formulation
—
will go a long way toward establishing which aspects are likely to be
significant and which ones may have only negligible effects. Obviously,
extensive experience in modeling will help. However, in many situations
only
by building several models, one without these aspects, the others with
various
combinations of them included, and then comparing their answers can a
confident
judgment be made as to the significance of particular aspects. Few
operations
researchers ever do this.

(3) Easy to manipulate. It should be
possible to obtain answers
from the model, such as the best solution, with a reasonable amount of
computational effort. I am reminded here of the situation faced by the
U.S.
meteorological services in the 70s and 80s, where they could produce
accurate
7-day weather forecasts only by having very fast mainframe computers
churn away
for 5 days!

(4) Adaptive. A model should not be parameter dependent, i.e., invalidated by sufficiently wide changes in the input parameters. Moreover, even reasonable changes in the structure of the problem situation should be accommodated by alternative options in the model. If changes
invalidate the
model, it should be possible to adapt it to the new situation with
relatively
minor model modifications only. This is more likely, if the model
consists of a
sequence of small modules that each perform a reasonably separable task
or set
of computations. Any structural changes in the problem situation may
then only
require modifications to or additions of one or a few modules of the
model. An
adaptive model is often referred to as a robust
model.

(5) Easy to communicate with. It should be
easy for the analyst
and/or the user to prepare, update, and change the inputs and get
answers
quickly. In today's world of interactive user-friendly computer
programs and
software, such as spreadsheets and the new generation
of
mathematical programming interfaces, e.g., GAMS and AMPL, this property
has
become one of the standard selling points.
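The modularity suggested under property (4) can be sketched in code; the module names and data below are purely illustrative, not taken from any particular study:

```python
# A sketch (hypothetical names and data) of a model built from small,
# separable modules: a structural change in the problem situation
# touches one module only, leaving the rest of the model intact.

def forecast_demand(base: float, growth: float, periods: int) -> list:
    """Module 1: demand forecast, here a simple growth model."""
    return [base * (1 + growth) ** t for t in range(periods)]

def production_cost(units: float, unit_cost: float = 12.0) -> float:
    """Module 2: cost of meeting demand; a richer cost structure can
    be swapped in here without touching the other modules."""
    return units * unit_cost

def total_cost(base: float, growth: float, periods: int) -> float:
    """Top level: composes the modules into the full model."""
    return sum(production_cost(d)
               for d in forecast_demand(base, growth, periods))

print(total_cost(100.0, 0.05, 3))   # 3783.0 over a 3-period horizon
```

Replacing the growth-curve forecast with, say, a seasonal one changes a single function, which is precisely the adaptivity the property calls for.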

Little also states that the model user should become the
`model owner'. His paper then demonstrates how a model meeting these
properties was built for
dealing with a marketing mix decision problem.

Note that some of these properties put conflicting demands on
the modeling
process. In particular, a simple model may be unable to capture all
significant
aspects of the problem situation. In such instances, the analyst may
have no
choice but to build a complicated mathematical model. In such cases,
the
decision maker will gain confidence in the model if he or she has the
opportunity to experiment with it, e.g., by exploring if changes in the
input
parameters produce intuitively reasonable changes in the best solution
and, if
not, whether counter-intuitive results can be explained convincingly. A
robust
model may not be simple. A model that includes all significant aspects
may not
be easy to manipulate. The model builder will have to balance these
conflicting
demands and come up with a suitable compromise. This compromise will by
necessity reflect not only the training of the analyst, but also the
amount of
resources in terms of time and funds available for the analysis. It
should also
take into account the likely benefits that can be achieved. It may be
economically more advantageous to use simple quick-and-dirty rules that
only
capture 50 to 80% of the potential benefits, rather than develop a
sophisticated
and expensive model that may capture 95%. The cost of developing a
mathematical model, collecting the required input data, computing the
best
solution, implementing the model, and finally operating and maintaining
it all
increase much more than proportionately as the sophistication of the
model
increases, while the additional benefits go up less than proportionately — another case of increasing marginal cost and decreasing marginal returns. All
mathematical models are thus to varying degrees approximations to the
real
situation as perceived by the analyst.

Literature on Validation

Few other papers explicitly address this question. The ORSA
guidelines for
the practice of OR [Caywood et al., 1971] cover some aspects,
particularly those
dealing with the professional and ethical accountability of the
modeler, and S.
Gass [1981] introduces the notion of model credibility. However, it is
the
literature on simulation, and model validation and verification that
extensively
contributes to the topic of desirable model properties in an indirect
way, by
studying what facilitates validation of models.

Fishman and Kiviat [1968] define validation as the process of
assessing the
agreement between the behaviour of the model and the real world system
being
modeled. The difficulty with this definition is the meaning of `real
world
system'. Since the `real world' cannot really be observed, all the
modeler can
do is to compare the model with his or her perception of the `real
world'. Furthermore, that perception may be different from the one of
the decision maker
or problem owner. Such differences in perception need to be resolved
during the
problem formulation stage. The basic premise must be that the problem
owner is
the expert on the working of the system, and it would be rather
presumptuous to
assume that the modeller's perception is the correct one. So the onus
is on the
modeller to show beyond a reasonable doubt that her or his
interpretation is the
more useful one for modelling purposes. With this difficulty in mind,
let us now
see what we can infer from the literature on validation.

The April 1993 issue on Model Validation of the European
Journal of
Operational Research contains a number of highly relevant
papers that
contribute significantly to the topic of desirable properties of
models [Landry
and Oral, 1993; Déry et al., 1993; Oral and Kettani, 1993; Gass, 1993].
These papers echo the theme appearing in the simulation literature
already in
the mid-eighties [Carson, 1986], namely that whether or not a model
will be used
rests largely on the credibility and confidence
the problem
owner and user have in its ability to produce useful information.
Unfortunately,
neither these papers, nor the simulation literature, offer much
practical
guidance on how to foster credibility and confidence. For example, Law
and
Kelton [1991] imply that `establishing credibility' is equivalent to
`selling
results to management' (Fig. 5.1, p. 299, op. cit.). But, as Gass
[1993]
points out, credibility and confidence are not attributes of the model,
but of
its user. Little's paper lists desirable properties of the form of the model. They may help, but are only incidental to the problem owner's confidence in, and the credibility attributed to, the model. Satisfying
Little's
criteria for a good model may thus be far from sufficient for a model
to be
implemented and used.

Confidence and credibility add a new dimension to the notion
of desirable
properties of models. In fact, it may not be useful to talk about
desirable
properties of models, but rather about desirable properties of the
modeling process,
since user credibility and confidence are more related to the form of
that
process and the interactions with the modeler, than to the model
itself. Naturally, Little already hints at this when he states that the
decision maker
should `own' the model. This is confirmed by the research on which
factors
enhance the likelihood of implementation. The literature on
implementation
discusses such aspects [e.g., see A. Reisman and C. A. de Kluyver,
1975]. Fostering the feeling of `ownership' of the model and modeling
process may not
be easy. It requires appropriate people skills of the modeler and a
substantial commitment from the problem owner. The former may not have
the
necessary skills training, while the latter may not be willing or able
to devote
both the required effort and time. Furthermore, in many instances the
model's
sophistication precludes active involvement of the problem owner in the
modeling
process. Hence `ownership' has to be brought about by other means.

It is also clear that with this we enter the murky area of the
psychology of
modeling and its process — another indication that the practice of OR
cannot be disassociated from the people and the modeling environment,
including
the social, cultural, and political facets in which the process of
modeling
occurs. Denying or ignoring these aspects is the major cause for
implementation
failure. Unfortunately, with minor exceptions, the literature on OR
concentrates almost exclusively on the mathematics of OR and the form
of the
models. Even the reports of successful implementations, such as the
accounts of
the Franz Edelman Award winners, published in the January issue of Interfaces,
or the Application Papers, regularly featured in each issue of Operations
Research, are mostly silent on this aspect.

It is beyond the scope of this short paper to go into the
extensive
literature on implementation and address the modeling process. Instead
I will
limit myself to drawing some conclusions from the points raised in the
validation
literature in the form of two additional desirable properties for
models that
will facilitate adoption and use of models.

Appropriateness of Model and Relevance of Output

First, I will take for granted that the model actually deals
with the
problem situation as perceived by the decision maker. Looking beyond
the
process of modeling, what are some of the crucial aspects that help
build up
credibility and confidence of the problem owner that the model is
useful? [Daellenbach, 1994]

(6) The model is appropriate for the situation
studied. By this is
meant that the model has the appropriate level of resolution
for
producing the relevant outputs at the lowest possible cost
and in
the time frame required for effective decision making. For
example, a
simple financial spreadsheet may well be the appropriate choice of
model if our
objective is to provide a sufficiently accurate estimate of the
company's
profits for the next quarter quickly and with minimal effort, whereas a
simulation study which models the movement of every single widget along
the
production line will not. Even if it also produces suitable financial
variables, its level of detail and resolution will be excessive and
take too
much time. Hence it will not be appropriate for the situation studied.
On the
other hand, if our objective is to estimate the maximum possible rate
of
production, the location of bottlenecks, or the size of buffer needed
between
consecutive production stages, the simulation model will be able to
mimic the
appropriate level of resolution, whereas the financial spreadsheet will
not.

To derive the outputs relevant for decision making, a `good'
OR/MS model may
not necessarily show details of or resemble the physical system we are
studying. What is important is that the model enables the analyst or
problem owner to
measure how well the stated objectives have been achieved by the
proposed
solution, and that this information is provided cheaply and in a timely
fashion. Such output can often be achieved more efficiently by a
black box approach,
rather than a detailed representation of the system's transformation
process of
inputs to outputs.

(7) The model produces information that is relevant
and appropriate for
decision making. This means that the output of the model has
to bear
directly on the decision process, has to be useful for decision making,
and has
to be in a form that can be used directly as input for decision
making,
without the need for further extensive translation or manipulation.
This does
not imply that the problem owner may not have to use judgment in
interpreting
the information provided. But the information should lead to new
insights
into the problem situation that the problem owner could not
easily obtain by
other means. Such insight is often obtained through sensitivity
analysis in
terms of the effect of deviations from the proposed solution and of
changes in
various inputs on the optimal solution, as well as the effect of errors
in
various inputs, particularly those that are subject to uncertainty or
are costly
to ascertain at a sufficient degree of accuracy.

For example, referring back to the waiting line study
mentioned earlier, the
model output provided not only an estimate of the potential net savings
due to
the addition of a second service facility, but also information about
waiting
times as a function of the traffic intensity. It was the latter
information that
gave the problem owner the insight necessary to find a much cheaper
solution to
the original problem.

If the model satisfies these two properties and the analyst
can demonstrate
this to the problem owner and the intended user of the model, then this
will
considerably increase the likelihood that the latter will judge the
model as
usable and useful. This will enhance the
problem owner's confidence
in the model and her or his willingness to implement its findings.

As mentioned earlier, confidence in and credibility of the
model do not
necessarily require a thorough understanding of how the model works. It may be
It may be
largely intuitive, based on a demonstration that the model gives
usable,
sensible, expected, and explainable answers in a timely fashion and
with a
reasonable expense. Furthermore, it will be strongly influenced by the
working
relationship between the modeler and the problem owner/user and the
latter's
involvement in the modeling process itself. This is particularly true
and
important if the working relationship between the modeler and problem
owner is
new. Once a sufficient degree of trust has been established between the
two,
then it will be much easier in subsequent projects to gain the problem
owner and
user's confidence in the model and its findings.

One Sunday a few months ago, I made myself a large pot of tea
in preparation
for grading my third-year Operations Research students' papers. As part
of
their assessment, I require each student to write a three to four page
review of
a journal article (a practice I brought over from my teaching in the
States). Halfway through the first article, I was dismayed at its weak
content and
improper construction of ideas. I was further surprised when I realized
that
the paper was written by one of my better students. Disappointed, I
abandoned
the first paper and randomly selected another, but I was presented with
the same
inadequate level of quality. In total, I read eleven papers that day
with only
two that I would marginally label as well written technical papers. The
other
nine were filled with circular arguments, illogical conclusions, and
poor
sentence construction that conveyed the authors' poor analytical
skills, as well
as lack of breadth in their educational background.

The Monday after, I visited a few colleagues in other
departments to
complain (a grand ol' academic tradition) and I was mildly surprised to
hear
similar complaints from those colleagues (even those in the Humanities
and
Social Sciences). I am convinced that we need to consider introducing
"Breadth
Requirements" for our undergraduate students. In our current program we
are producing students who have a narrowly focused education based
on
fulfilling a single major (or, for some, two related majors) without
any
structured general education. I firmly believe that all students would
benefit
from an undergraduate programme that requires them to take courses from
other
departments or fields. A general education will give students the
opportunity
to become acquainted with intellectual, social and aesthetic perspectives that can form the basis for an expanded plan for lifelong learning and enjoyment, as well as assist students with their programme at the university.

I propose a programme where a semester of mathematics (the
language of
sciences) and English composition would be required in the first year
of each
student's career. Both these courses will help students to develop
their
communication and analytical skills which will facilitate and enrich
further
studies. I would further require an additional four courses outside the field of the student's major. These courses would be taken one from each of the
Humanities, Social Sciences, Physical Sciences, and Biological
Sciences. This
requirement will only take up about 1/6 of a typical student's
programme and
will not hinder double majoring, but at the same time give a great
opportunity
for students to learn some basic ideas from fields outside their own
major. Departments can also benefit from such a requirement by developing
courses that
will cater to a general audience.

The Ministry of Research, Science and Technology (MoRST) has
contracted me,
through my role as Chairman of the Royal Society of New Zealand
Standing
Committee on Mathematical and Information Sciences, to organize a
Review of
Mathematical Sciences in New Zealand.

The first phase was to prepare a report on the underpinning
requirements of
mathematics and its associated disciplines in relation to the
socio-economic
framework used by Government Science in the PGSF and the associated
disciplines
and technology requirements. This was subcontracted to Malcolm Menzies
of
Victoria Link Limited.

The second phase consisted of identifying a timeline and
action plan for the exercise; developing survey instruments for assessing
research and user requirements to meet gaps and opportunities for the
sectors identified; and developing a discussion paper to consider issues
identified in the Terms of Reference. This paper was presented at a session
during the
Australasian
Mathematics Convention and NZ Statistical Association Annual Conference
held
recently in Auckland. (Copies of the paper are available from Jeff
Hunter). The
results of this exercise will feed into the final report.

The final phase centres around the following Terms of
Reference:

General Task

Through a foresight exercise, prepare a report on future
likely developments
in mathematical sciences in New Zealand and internationally, and assess
their
impacts on 1) other science disciplines, 2) socio-economically driven
science
(i.e. PGSF outputs), and 3) the socio-economic sectors of the New
Zealand
economy and society, including supporting mathematical services for
these
sectors over the next 25 years. The report should use the Knowledge
Base and
other reports on mathematical sciences and be able to contribute to any
future
priority-setting for science, particularly for the PGSF.

Knowledge Foresight

For the field of mathematics, identify: knowledge trends and
likely
developments (where is the strength, where is it developing
internationally and
in New Zealand); performance outside and inside New Zealand in
identified "gaps";
"breakthrough" areas of mathematics and its applications; the
implications of new technologies in computing, information and
communications in
the use of mathematics; the opportunities for socio-economic sectors
within New
Zealand in mathematical developments and the supporting needs for
mathematical
services; implications for PGSF priorities including any shift(s) in
socio-economic science priorities; the enabling science capability
required to
meet identified opportunities including infrastructure, human and other
resources.

Issues such as links with interdisciplinary science and
international
linkages should be considered. Development of the report should involve
consultation with key providers, users and funders. The study should
also fully
assess currently available information and analysis relating to
mathematics
foresight in order to avoid duplicating effort. Bibliometric and other
quantitative analysis should be used where appropriate.

The review will be modelled along the lines of the very
extensive review
carried out in Australia which culminated in the widely publicized
document "Mathematical
Sciences: Adding to Australia". A small Secretariat has been
established
to coordinate the review and subject area coordinators are being
appointed to
facilitate input into the review. Towards the end of the year, survey
results
will be presented in a series of regional meetings (Auckland, Hamilton,
Palmerston North, Wellington, Christchurch and Dunedin). From these
meetings a
set of findings and a series of recommendations will be compiled and
presented
to a Workshop, to be held in Wellington late in the year. This will
then be
followed by the writing of the final report which is due for submission
to MoRST
in April 1998.

Although the review has been commissioned by MoRST, the
mathematical
community needs to take full advantage of this window of opportunity to
inform
government of the problems being faced not only by researchers but also
the
difficulties being experienced across all sectors of the mathematical
and
statistical disciplines.

Your interest and involvement in this exercise will be
welcome. Submissions
on any items of the Terms of Reference can be made at any time to
Professor
Jeffrey Hunter, by mail (Massey University, Private Bag 11-222,
Palmerston
North), by email, or by fax (06 350 2258).

On Friday, 27 June 1997, Professor James Ho presented a
seminar through the
Centre for Continuing Education (and in association with the ORSNZ) at
the
University of Auckland. The seminar was titled "Internet Strategies:
Beyond Web Sites and Home Pages" and was a full 1-day program starting
at
9am.

James Ho, professor of information and decision sciences at the
University of
Illinois at Chicago, visited NZ earlier this year as an Erskine Fellow
at the
University of Canterbury. He is author of "Prosperity in the
information
age: Creating value with technology - from mail rooms to boardrooms"
(1994). His latest work on evaluation of the World Wide Web has gained
international recognition and is featured as a business resource by
numerous
business organizations and international media.

Business on the Internet holds tremendous promise of opening
up global
markets and streamlining transactions. Professor Ho's seminar was aimed
at
examining the critical issue of value creation over the Internet,
assessing
current practice worldwide and offering practical ideas and suggestions
for
those involved in creating strategies for their business on the
Internet.

The seminar was advertised as being aimed at senior executives
and managers. Regrettably, it was not well attended: approximately
eight people came from outside the University, along with four or five
interested ORSNZ and university staff.

It was interesting to see the results of the application of
the
purpose-value framework that Professor Ho had developed. It analyses
Web sites
from the customer's perspective of value-added. He applied it to a
sample of
1000 North American web sites (see "Evaluating
the World Wide Web:
A Global Study of Commercial Sites", J. of Computer-Mediated
Communication, 3/1, 1997; http://207.201.161.120/jcmc/vol3/issue1/ho.html)
and then followed this up with comparative studies (with smaller
samples in 20
selected industries) in 8 other countries. It was clear from these
studies that
the use of the Web to process business transactions is largely
undeveloped (as at the time of the study, 1996) and this is a key area where Professor
Ho felt
competitive advantages would arise.

This study and the issues that he highlighted would be useful
in defining a
strategic approach to creating a presence on the Web for businesses.
However, I
felt the majority of the seminar was pitched at a level below the
participants' knowledge base. I believe most participants were aware of the
Web's
presence
(and hence did not need a long introduction) and were more focussed on
how they
could use the Internet to gain a competitive advantage and whether it
was worth
investing the time and money into having a presence. Not enough time
was spent
discussing how businesses could develop strategies to make use of the
information gleaned from the study. Later in the day, Professor Ho
dived into
the Web and presented some examples of innovative Web sites. This was
fun, but
I think that participants would have liked to have seen more examples
besides
monster.com, and those with links to Professor Ho's own site.

The afternoon session was aimed at giving the seminar
participants time to
use the insights they had gained from the course to build a "live
case". Unfortunately, many of the participants had decided to leave
before this
session, and so the session amounted to a tutorial led by Professor Ho
based on
how one might design a Web site for a school or university. The
workshop
concluded with an enjoyable social hour.

Massey Mutterings

Massey University, like everyone else in this universe, is
going through
some changes, and it will not be an exaggeration if I report to you
that the
former faculties now constituting the College of Sciences (The
Mathematics
Department's future home) are going through rather extraordinary
metamorphoses. As of the first day of January 1998, all departments will
cease to exist and
faculties will be replaced by "Institutes". The Department of
Mathematics has "decided" to join the Institute of Fundamental
Sciences (i.e., to be with Physics and Chemistry). The Operations
Research
Group seems to have more choices for its future home, but due to the
lack of
information we have yet to arrive at an optimal solution. Presently we
have
four options:

OR within the Institute of Industrial Innovation
(a.k.a., Technology+)

OR within the Information Sciences and Technology
Institute (Modified
FIMS)

OR within the College of Business (A radical move)

OR within the Institute of Fundamental Sciences

If we want to keep the Stochastic and the Deterministic parts
of OR together, our choices will be limited to options 2 and 3, with the
latter being a rather risky proposition. Will keep you posted . . .

Of course, I cannot write this column without mentioning the
Journal of
Applied Mathematics and Decision Sciences. The first issue of
JAMDS is out,
and I have made it available, free of charge, on our Web page
(http://fims-www.massey.ac.nz/~maths/jamds/). This is a limited time
offer and
I hope that it will give you an opportunity to learn more about JAMDS.

By the way, John Giffin is still in hibernation and will not
come out 'till
Lois & Clark is back on first-run TV.

Mainland News

The big news is obviously that we have Professor Fred Glover
of the
University of Colorado, Boulder, as a Distinguished Erskine Visitor
until early
December. Fred Glover needs no introduction. He is Mr Tabu-Search (a
nice
hyphenated name) in person. He will offer a number of seminars and do
some
joint research with John George and Ross James. He and John have
organized a
Tutorial on 'Tabu Search in Heuristic and Exact Methods for Integer
Programming'
at the Melbourne APORS Conference. So those of you who will not make a
pilgrimage to Christchurch, you will have your chance to meet him in
Melbourne.

After almost 7 years with us, Bruce Lamar has decided to
return to the US
and take up a research position with the Mitre Corporation in Boston ([email protected]).
He made a valuable contribution to the department during this time
beyond his
single-minded determination to improve the world of networks and talk
to us
through NETSPEAK. True to his traits, he packed the marked assignments
of our
graduate class and shipped them to the US. He promised to airmail them
back to
us!

James W. Bryant from Sheffield Hallam University ([email protected])
will visit Canterbury for a few days on his way to the APORS
Conference. He
will give a workshop on Drama Theory, which will also be the focus of
his three
papers in the Problem Structuring Stream at APORS.

The Management Science group in our department has repackaged
our current
second- and third-year courses into half-year courses (worth 3 points).
For
some courses this simply meant splitting the existing course into two
parts; for others,
the material covered was repackaged, with a slight shift towards a
fuller
coverage of production and operations management by the introduction of
courses
in supply chain management and in project management. On the OR side,
we have a
new case course in modelling at the second-year level with plans to
introduce
another case course at the third-year level.

No Wellington Branch News

They have all taken cover for the imminent explosion of 'NZ
First', or
should they be renamed 'NZ Once'?

No Research at Waikato

MoRST review: Andy Philpott has been asked
to assume responsibility
for the Operations Research component of the MoRST review of
mathematical
sciences in NZ, being conducted by Professor Jeff Hunter on behalf of
the Royal
Society. Selected members of ORSNZ may be contacted by Andy in due
course with
respect to this exercise.

Financial position: Andrew Mason reported
that we are in a good
financial position, with an unaudited net profit before adjustments of
approximately
$4700. Our current membership stands at approximately 160.

Election of new editor for OR Newsletter:
Hans Daellenbach
has tendered his resignation as Editor of the OR Newsletter.
Council
would like to thank him for his huge contribution. If anyone could
propose a
new editor, please contact Andy Philpott.

ORSNZ Visiting Lecturer: One application
was received for the ORSNZ
Visiting Lecturer award, from John Ranyard (see note below).

Representative for Manila conference: The
next IFORS conference is
in Manila. A representative from the ORSNZ has been invited to speak to
them
about OR in NZ. No funding has been set aside for this, but Council will
give a grant-in-aid of $200.

Other business: Bruce Lamar has resigned
from his position as
Senior Lecturer in OR at Canterbury University and will be moving back
to the
USA. Council wishes Bruce well. Steve Butt has resigned from the
University of
Auckland and has taken up a position at Michigan. Council would like
to thank
Steve for his invaluable assistance on affairs of the ORSNZ and wishes
him well
in his new position.

Membership on Internet: Do not forget that,
if you agreed to
publication of your name, it is now available on the web (http://www.esc.auckland.ac.nz/Organizations/ORSNZ/).

Vicky Mabin, School of Business and Public Management, Victoria University
Victoria University
of Wellington, Wellington, N.Z.

This will be my last pre-APORS item, as by the time the next
newsletter
comes out, APORS will already be over! With this newsletter, you will
be
receiving an APORS'97 Brochure, containing program information and a
registration/accommodation booking form. As you will see, there is an
exciting
program arranged. There are around 470 invited and contributed papers,
19
invited streams, 10 tutorials, and 2 keynote speakers, including our
very own Dr
Grant Read, plus 2 post-conference workshops. There are also two
further
post-conference workshops, to be run by the Avraham Y. Goldratt
Institute to
complement the Theory of Constraints stream at APORS, for which you
should also
find a flyer enclosed.

In addition to the fine conference program, the APORS team
have also
assembled an array of enticing social events for participants and
partners,
including a welcome cocktail reception, dinners, shopping, and
Aboriginal arts. Additionally, there are no papers on the Wednesday
afternoon: instead there is a
choice of two trips (either to the penguins, or to the sanctuary and
winery) or
golf! I gather the fairy penguin parade is a "must" for tourists from
many parts of the world, though for me the wildlife sanctuary and
winery tour
will no doubt win the day, and I know many of my colleagues will choose
the golf
at RMGC.

If you have not already done so, just take a look now at the
brochure. And
if you needed an added incentive, our one-and-only ORSNZ Annual General
Meeting
will be held on Monday 1st December from 6 - 7:30pm. There'll be a
chance to
unwind after the day's papers, catch up with old friends and make new
ones,
before the serious business of the AGM, and dinner at Melbourne's
Southbank
complex. Sounds too good to miss?

The organisers estimate that around 400 people will attend.
Many leading
researchers from all over the world will be attending, including some
high-profile people. For example, I have just heard that: the current
and
immediate past presidents of IFORS will be there, the inaugural
president of
APORS will also attend, and there are also quite a few high profile
people from
Europe. And the nice thing about APORS is that you will be on exotic
yet
familiar territory and it will be easy for you to meet these people.

You may like to consult the database to check on abstracts,
streams,
attendees, etc. The web site address is:

http://www.maths.mu.oz.au/~worms/apors/apors.html

We are also likely to have many delegates come to NZ en route,
and we have
arranged for a local travel agent to help overseas visitors plan their
visits to
NZ. You may wish to expand your networks, and invite some of them to
visit you. If you are interested in hosting visitors, let me know and
I'll see what I can
do.

If you have any other questions or comments, please feel free to contact
me, and I look
forward to seeing as many as possible in Melbourne in November.

Some key dates:

Early bird Registrations close on 30 September.

Accommodation can be booked through PRCC until 14 November,
but they
advise booking early to avoid missing out on your preferred choice.

Conference dates: 30 November - 4 December

P.S. The annual conference of the Australian and New
Zealand Academy of
Management is also on in Melbourne, during the latter part of the same
week, and
you may wish to take the opportunity to attend parts of both. I can provide
details for
anyone interested.

Vicky Mabin, School of Business and Public Management, Victoria University
Victoria University
of Wellington, Wellington, NZ

The ORSNZ Council has awarded the newly-created ORSNZ Visiting
Lecturer
award to Dr John Ranyard for 1997. John will be visiting us in
December,
following APORS'97. He will be giving talks in Wellington and Auckland,
on 8
December and 12 December respectively.

John is the Editor of the prestigious Journal of the
Operational Research
Society, the official journal of the OR Society in the UK. He will be
talking
on trends in OR practice in the UK. John is well qualified to speak on
this
subject, having been a practitioner in industry for nearly 30 years, a
manager
of a 25-person OR group at British Coal for 20 years, and the co-author
of a
recent study, sponsored by the OR Society. That study looked at the
success and
survival of OR groups in the UK, and noted the changing pattern of OR
practice,
including the dispersal of some central OR groups and the growth of
external OR
consultancy. The study also came up with some lessons for the continued
success
of OR practitioners and OR groups, which he will share with us.

John also chairs the OR Society's membership committee, and he
would be keen
to discuss the results of a recent survey of overseas members, and talk
to ORSNZ
members about future developments of the JORS and OR Society's services
to
remote members.

We hope as many members as possible will be able to attend the
meetings
which, given the time of year, are likely to incorporate a bit of
festive cheer,
as well as an opportunity to discuss the state of OR here and in the UK
(…
and elsewhere if you fancy).

Any enquiries or suggestions about Dr Ranyard's visit can be
sent to Andy
Philpott, or directly to Vicky Mabin, who is coordinating his itinerary.