Seamless Object-Oriented Software Architecture

Authors:
Kim Waldén and Jean-Marc Nerson

Publisher: Prentice Hall

ISBN: 0-13-031303-3

Reviewer: Dan Wilder

If you have some experience with object-oriented programming,
and you are looking for a compatible design method, read this book.
The authors describe a method (they avoid the term “methodology,”
and in Chapter 6 tell us why) based entirely on object-oriented
concepts. In use since 1990, Business Object Notation (BON) avoids
starting with the familiar data flow, entity-relationship, or state
transition diagrams. System use scenarios, prominent in other
methods, are found here, but not in a fundamental role. What you
will find are classes, featuring inheritance and client relations;
clusters, flexible groupings of classes; and objects, that is, the
run-time instances of classes. Original and quite sensible
graphical and textual notations are described, suitable for garage
floor, white board, or CASE tool. The book spends roughly equal
time on notation, process, and case studies. Several appendices
present condensed information. A nice glossary and a fine
bibliography provide the icing on this cake.

Seamless Object-Oriented Software
Architecture is the first widely available full-length
discussion of BON. For a fresh look at issues of object-oriented
software development, the book is worth reading even if you're
happy with another method. If not, consider this one. The book is
readable; it doesn't get bogged down in minutiae, but it covers a
lot of material. Be warned: these authors hit the ground running.
If you are not already familiar with object-oriented concepts,
start with a more introductory book.

Among BON's key ideas are two I will discuss briefly. First,
reduce the conceptual gap between design and implementation.
Second, provide means to selectively abstract from the welter of
low-level details. The two ideas synthesize well. The resulting
model is of a single piece, even while a view of it may range over
many different abstraction levels. Hence the use of “seamless”.
Take a detailed look at a small piece of the model, in a context of
the most abstract view of the rest, and it fits into place
perfectly.

The conceptual gap between design and implementation is
reduced by eliminating difficult, clumsy, or irreversible
transformations from the picture. Data flow diagrams, state
transition models, entity-relationship diagrams, and so on, while
considered useful for specialized problems, are here dismissed as
foundations for a general-purpose method. Rather, the effort is to
explore the application of class, object, inheritance,
polymorphism, and the software contract, to the higher-level
representation of systems.

Abstraction is facilitated by the easy transition between
levels of detail in the BON models, and also by the rich semantic
content lent to the class interface description by the software
contract. This contract is a part of the class interface, spelling
out the class requirements and obligations, independent of the
program code, which often won't exist when the interface is first
described. This use of contract provides real substance in the
design, in a way that bubbles and arrows just can't do. It does so
in a way that is understandable in a context of the more abstract
bubbles and arrows. Zoom out for perspective. Zoom in for detail.
And the detail always makes sense in the context of the larger
picture. Or else it doesn't, and this tells you either the detail
or the picture must be changed! Better to find this out early,
before the system is nearly implemented, and changes become much
more expensive. A good design method should help you find this sort
of thing.
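
The book writes its contracts in Eiffel; as a rough sketch of the same
idea (the class and feature names here are hypothetical, my own
invention rather than the book's), a Python class can approximate a
contract with assertions standing in for preconditions,
postconditions, and the class invariant:

```python
class Account:
    """A hypothetical PATIENT_ACCOUNT-style class whose interface
    carries a contract: preconditions state what callers must
    guarantee, postconditions state what the class promises back."""

    def __init__(self, balance=0):
        assert balance >= 0               # invariant: never negative
        self.balance = balance

    def deposit(self, amount):
        assert amount > 0                 # precondition
        old = self.balance
        self.balance += amount
        assert self.balance == old + amount   # postcondition

    def withdraw(self, amount):
        assert 0 < amount <= self.balance     # precondition
        self.balance -= amount
        assert self.balance >= 0              # invariant preserved

a = Account()
a.deposit(100)
a.withdraw(40)
print(a.balance)  # 60
```

The point is that the contract is part of the interface and means
something before any implementation exists: a caller reading only the
assertions knows what it owes the class and what the class owes it.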

The focus is always on the design of coherent, well thought
out classes which embody what you know about some concept or idea.
These furnish the basis for software re-use. In the short term,
within the scope of their originating project, they are fastened
together perhaps more than once, as the definition of the project
changes, using relatively transient “glue” classes that give a
particular system its shape and particulars. Thus re-use begins at
home, and the system is not hedged in by premature rigid definition
of what is in many cases the most volatile aspect of a system: its
external interface.

The notion of coherent re-usable classes bears some kinship
to the traditional Unix “small sharp tools” philosophy, where
programs that do one thing well may be combined in unanticipated
ways to perform work not contemplated when the tools were written.
However, the flexibility of the object-oriented framework is much
greater. The key is having well-focused tools: in the Unix case,
binaries like ls and find; for object-oriented programming, classes
like LINEAR_ITERATOR or BINARY_TREE. Or perhaps PATIENT_ACCOUNT or
STEPPER_MOTOR.
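
By way of illustration (the class names echo the review's examples,
but the code is my own Python sketch, not taken from the book), a
small, single-purpose class can be combined with classes it has never
met, much as Unix tools are piped together:

```python
class LinearIterator:
    """A deliberately small class: walk a sequence and apply a
    function to each element. It does one thing well."""

    def __init__(self, items):
        self.items = list(items)

    def apply(self, fn):
        return [fn(x) for x in self.items]

class StepperMotor:
    """Hypothetical hardware-facing class; here it merely records
    the step commands it receives."""

    def __init__(self):
        self.log = []

    def step(self, n):
        self.log.append(n)
        return n

# Unanticipated combination: drive a motor from a generic iterator
# written with no knowledge of motors.
motor = StepperMotor()
LinearIterator([1, 2, 4]).apply(motor.step)
print(motor.log)  # [1, 2, 4]
```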

The invention of such classes and their combination with
pre-existing classes to form a working system is an incremental
process requiring many trips back and forth from high level design
through implementation. As in many other methods, you start at a
high level, produce a rough cut at a design, then immediately begin
implementation. Selected subsystems are targeted, usually not the
easiest ones. This provides a reality check on the design. Then,
back to the high level to revise in light of what you have learned,
return to implementation, and so on. Implementation throws the cold,
clear light of day on design; design guides implementation.

From time to time you make a side trip into system use
scenarios. These do not direct the organization of the system, but
rather test the evolving design. The typical situation: here is
something it would be reasonable to do; does this set of classes
support the reasonable behavior? Sometimes it doesn't, so you go
back and figure out what additional useful ideas might be wrapped
in classes. The use scenarios are accompanied by object scenarios,
showing the interplay of objects to accomplish the use scenario. A
novel graphical notation is used, which allows easy depiction of
interactions between many more objects than the conventional ladder
or lattice-like interaction diagrams often used elsewhere.

The middle of the book, chapters 6 through 8, discusses the
process of system development under BON. Some readers may want to
begin reading here, as this part of the book talks a lot more about
the “how” and “why” of the method. Nine standard tasks are
completed, not necessarily in order, each by some mix of nine
standard activities. The tasks, the subject of chapter 7,
are:

1. Delineate system borderline
2. List candidate classes
3. Select classes and group into clusters
4. Further define classes
5. Sketch system behaviors
6. Define public features
7. Refine system
8. Generalize
9. Complete and review system

The activities, the subject of chapter 8, are:

1. Finding classes
2. Classifying
3. Clustering
4. Defining class features
5. Selecting and describing object scenarios
6. Working out contracting conditions
7. Assessing reuse
8. Indexing and documenting
9. Evolving the system architecture

Each task and activity is discussed at some length. These
authors don't just dump a notation on you and leave you adrift;
some care has gone into describing just how you might proceed.
While emphasizing over and again that satisfactory performance is
not subject to pat answers, but rather requires talent, experience,
and insight, Waldén and Nerson nonetheless manage to provide
what sounds to me like good advice about each of the tasks and
activities. In a literature where solutions that are too simple
abound (“Model the physical objects,” “Don't use multiple
inheritance,” “Encapsulate interface, data, and process in
separate classes”) the thoughtful advice in these chapters is
welcome.

I'll be bringing you a further report in a few months. With the
help of the Linux port of EiffelCase, the BON tool from Interactive
Software Engineering of Santa Barbara, California, I will attempt a
small freeware project using the advice in this book. My success or
failure, and the delights or frustrations encountered, will furnish
the topic of my next article.

Dan Wilder
(dan@gasboy.com)
writes programs and prose in Seattle, Washington.
A buildmeister by day, Linux fanatic and newsgroup surfer by night,
he also finds time to get outdoors, play with his two darling
children, and pick apples.
