B.S. from Duke University, 1977; Magna Cum Laude, four
years on the Dean's list with class honors. Majors were Physics and
Philosophy, with a minor interest (eight course credits, four at the
graduate level) in Mathematics.

``Ferromagnetism in two dimensions'', presented at the
1991 Southeastern Sectional meeting of the American Physical Society
in Durham, NC.

``Multipolar Expansions for Multiple Scattering Theory'',
presented at the 1991 Materials Research Society Fall Symposium
(session V) in Boston, MA; published in the symposium
proceedings.

``The 2D/3D classical Heisenberg ferromagnet'', presented at
the March, 1992 Simulation Methods Workshop at the Center for
Simulational Studies in Athens, GA; published in the workshop
proceedings (Springer-Verlag).

Monthly column in Cluster World Magazine, December 2003 to the
present. A list of column titles and topics follows. The column was
initially called ``Cluster Kickstart'', which explains the early focus
on topics for the beginner. In January 2005 the column's name was
changed to ``Cluster Edge'', and its topics are more free-ranging.

December 2003

The beauty of cluster computing is that it requires little
more than a generic workstation LAN. We begin to explore cluster
computing with just that: a ``Network of Workstations'' (NOW) that
you may well already have!

January 2004

Doing Work in Parallel

Last month we started out by learning how to use pretty much
an arbitrary Linux LAN as the simplest sort of parallel compute cluster.
This month we continue our hands-on approach to learning about clusters
and play with our archetypal parallel task on our starter cluster to
learn when it runs efficiently and, just as importantly, when it runs
inefficiently.

February 2004

Amdahl's Law

Clustering seems almost too good to be true. If you have work that
needs to be done in a hurry, buy ten systems and get done in a tenth of
the time. If only it worked with kids and the dishes. Alas, whether it
is kids and dishes or cluster nodes and tasks, linear speedup on a
divvied-up task is too good to be true, according to Amdahl's Law,
which strictly limits the speedup your cluster can hope to achieve.
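The limit the column describes is easy to see numerically. Here is a
minimal sketch of Amdahl's Law in Python; the 90% parallel fraction is
an illustrative value, not a figure from the column.

```python
def amdahl_speedup(n_nodes, parallel_fraction):
    """Amdahl's Law: the speedup from n_nodes when only
    parallel_fraction of the work can actually be divided up."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_nodes)

# Even with 90% of the work parallelizable, ten nodes give
# only about a 5x speedup, and no number of nodes beats 10x.
for n in (1, 10, 100, 1000):
    print(n, round(amdahl_speedup(n, 0.90), 2))
```

The serial fraction dominates as the node count grows: the speedup
saturates at 1/(1 - p), which is why divvying the dishes up among ten
kids never gets them done in a tenth of the time.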

March 2004

Serious Parallel Computing: PVM

The idea of a homemade parallel supercomputer predates the actual
Beowulf project by years if not decades. In this column (and the next),
we explore ``the'' message-passing library that began it all and learn
some important lessons that extend our knowledge of parallelism and
scaling.

April 2004

PVM, Part II

In this column we continue our exploration of PVM, the parallel
computing subroutine library that more or less enabled the current
explosion of high-performance parallel compute clusters.

May 2004

PVM, Part III

In this column we write and run a very simple PVM application to ``get
started'' with PVM.

When the only work being ranked is driving nails, the only tool that is
valued is the hammer. Too bad if your work involves driving screws...

March 2005

Benchmarking and Benchmarketing

Competition is good, but a single measure of performance in one
dimension is not terribly useful for optimizing in a multidimensional
space. We can do better.

April 2005

Newbie Cluster Tasks

So you've built that new cluster, for fun or eventual profit, but had no
specific task in mind. You want to test it out. But how?

May 2005

A Modest Proposal

What if we made a benchmark daemon a built-in component of standard
Linux? Tools with a library interface could optimize in many
useful ways, and automagic, resource-aware cluster schedulers would
finally become possible...

This website gets over 6 million hits a year from users downloading 66
gigabytes of online content authored by Brown ranging from free physics
lecture notes and online textbooks to computing information and poetry.

http://www.phy.duke.edu/rgb/Class/Class.php
Contains online lecture note-style textbooks on introductory physics
along with learning support materials, as well as an online
textbook on Classical Electrodynamics. (3,216,854 hits over 12 months.)

http://www.phy.duke.edu/rgb/General/general.php
Contains a number of software packages authored by Brown and released
under the GNU General Public License (GPL), as well as project
templates, tool documentation, and other objects that are useful to the
general internet community. The GPL packages are further enumerated
below. (808,300 hits over 12 months.)

The following are Linux-based GPL programs written by R. G. Brown and
made available on the web:

Dieharder

Dieharder is a fully GPL random number generator tester, under
development by Brown. It currently incorporates all of the tests from
George Marsaglia's Diehard tester, several tests from the NIST
Statistical Test Suite (with more on the way), and a number of
tests devised by Brown.

Dieharder is in active use by an increasing number of research
groups because it subjects random number generators to far more
strenuous tests (with user-adjustable parameters that let the user
control the power of each test) than previous test suites. A community
is developing that contributes ideas and code and helps to debug
the tool. Dieharder is available as a linkable library and has
been incorporated directly into the R statistical suite by Dirk
Eddelbuettel. By virtue of its power, Dieharder is serving as a
test of its own code: it has revealed possible weaknesses in two of
the original Diehard routines.
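As a toy illustration of the kind of statistic such a suite applies
(this is not Dieharder's actual code; the bin count and sample size are
arbitrary choices for the example):

```python
import random

def chisq_uniform(samples, bins=10):
    """Bin samples drawn from [0, 1) and compare the bin counts
    to the uniform expectation with a chi-square statistic."""
    counts = [0] * bins
    for x in samples:
        counts[min(int(x * bins), bins - 1)] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

rng = random.Random(42)
stat = chisq_uniform([rng.random() for _ in range(10000)])
# A good generator keeps the statistic near its degrees of
# freedom (bins - 1); a badly skewed one sends it soaring.
```

Real test suites like Dieharder go much further, converting many such
statistics into p-values and testing the distribution of the p-values
themselves, but the underlying comparison is of this form.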

Wulfware

Wulfware is a collection of several tools (xmlsysd, libwulf, wulfstat,
wulflogger) designed to support the monitoring of clusters and grids.
xmlsysd is a lightweight daemon that provides xml-wrapped system
statistics and other information extracted from /proc and various
system calls. wulfstat and wulflogger are ncurses and plain-ASCII
(respectively) tools for connecting to the xmlsysd daemons running on
an entire cluster and either presenting the collected data with a
user-selectable refresh delay in a tty (xterm) window or printing it in
a simple column format to standard output, where it can easily be fed
to a log file for eventual plotting or to other tools (e.g. a builder
of a web view of the data). This is of obvious and immediate use for
monitoring cluster status, tracking particular jobs, and determining
resource utilization for gridware schedulers or policy engines.
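The flavor of the daemon side can be sketched in a few lines. This is
an illustration of the idea only, not xmlsysd's actual wire format: the
tag names and the choice of /proc/loadavg are invented for the example.

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def loadavg_xml(path="/proc/loadavg"):
    """Read the 1/5/15-minute load averages and wrap them in XML,
    in the spirit of xmlsysd's /proc-derived statistics."""
    with open(path) as f:
        one, five, fifteen = f.read().split()[:3]
    node = Element("node")
    for tag, value in (("load1", one), ("load5", five),
                       ("load15", fifteen)):
        SubElement(node, tag).text = value
    return tostring(node, encoding="unicode")
```

A client like wulfstat then only needs a socket and an XML parser to
turn the replies from every node into one refreshing table.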

benchmaster

benchmaster is a microbenchmark program designed to time and test
system performance at a low level. It will eventually be added to the
wulfware suite as a component of xmlbenchd, a new project that provides
a daemon interface to xml-wrapped drop-in benchmark programs, so that
applications can be built that automatically tune their algorithms to
the particular hardware they run on, and so that grid tools can be
built that dynamically determine the resources available on an
anonymous grid node.
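benchmaster itself is a compiled tool, but the core microbenchmark idea
it embodies (repeat a small operation until the elapsed time is long
enough to trust the clock) can be sketched as follows; the timing
threshold and the sample workload are arbitrary choices for the
example.

```python
import time

def microbench(fn, min_seconds=0.05):
    """Time fn by doubling the iteration count until a run is
    long enough to measure reliably; report nanoseconds per call."""
    n = 1
    while True:
        start = time.perf_counter()
        for _ in range(n):
            fn()
        elapsed = time.perf_counter() - start
        if elapsed >= min_seconds:
            return elapsed / n * 1e9
        n *= 2

ns_per_call = microbench(lambda: sum(range(100)))
```

A daemon like xmlbenchd would run such timings once per node and serve
the results, so applications could query them instead of re-measuring.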

flashcard

flashcard is a program for presenting simple flashcards to students in
a standard terminal (e.g. xterm) window. Special features include an
xml encoding of flashcard problems and the ability to present auditory
cues (e.g. spelling words out loud) from compressed sound files.

Axioms
This is a draft book on the fundamental basis
of all human knowledge. It covers the assumptions and problems of
knowledge from the Greeks through Hume and the Enlightenment, noting
that most attempts to ground knowledge were ill-founded from a
strictly logical and mathematical point of view, foundering on the lack
of a sound basis for inference. However, it points out that work by
the physicists Richard Cox and E. T. Jaynes, as well as Claude Shannon,
provides an axiomatic basis for the algebra of inference and therefore
puts empirically supported human knowledge on the soundest basis it
can have. It also examines the spoken and unspoken axiomatic
assumptions underlying most of the world's great philosophies and
religions, exposing the numerous fallacies therein.

Dieharder
This tool has been developed to the point where it is raising some very
hard questions about random number generators and/or some of the tests
that have been developed to verify the randomness of the sequences they
produce. A draft manuscript is being prepared that is targeted at the
Journal of Computational and Graphical Statistics, both to
announce Dieharder's existence to the broader statistics
community and to bring to that community's attention the need for a
``gold standard'' random number generator to enable the tests
themselves to be tested.

Other Accomplishments:

Systems engineer who designed (circa 1995) and has subsequently
been building, upgrading, and redesigning the beowulf-class distributed
parallel supercomputer cluster Brahma in the Duke University
physics department. Parts of this system have been funded by the
University, by the Army Research Office, by the Department of Energy, by
the National Science Foundation, and by an Intel equipment grant, and R.
G. Brown gratefully acknowledges this support. Details of the system
can be obtained from:

Linux-SMP and beowulf contributor. As a natural extension of
the work on distributed parallel systems, R. G. Brown has actively
participated in the development and debugging of the network and
Adaptec (SCSI disk) drivers in the Linux kernel distributions. In
addition, Brahma provides a home for mirrors of the Linux-SMP FAQ and
the beowulf website. On various Linux mailing lists, he has helped
countless persons get over various humps in developing their own
resources and hence has contributed to international productivity in
science and industry. This help has extended to remotely managing
honors projects and topical dissertations for students all over the
world.

R. G. Brown, together with Dave Rahul of the University of Pennsylvania,
organized the Extreme Linux section of the 1999 Linux Expo, which
focused considerable attention on the beowulf effort and the
possibilities of COTS parallel supercomputing. R. G. Brown was
selected to be on the organizing committee of the ``IEEE International
Symposium on Cluster Computing and the Grid'' (CCGrid'2000), held in
Brisbane, Australia, in May 2001, and was on the program committee of
``The 2005 International Conference on Parallel Processing (ICPP-05)'',
held at the University of Oslo, Norway June 14-17, 2005.

Primary author of Discover, a neural network engine that can
be used to do predictive modeling in both scientific and business
contexts. The engine incorporates a number of proprietary improvements
to standard neural algorithms to achieve high predictivity with minimal
training times. It is currently being adapted by independent-study
students to attempt to solve the many-electron problem in quantum
theory using an unsupervised-learning (variational) methodology. If
successful, this will represent one of the few truly new methods
for solving the Schrödinger equation developed in the last sixty years.
In addition to its sheer novelty, the approach has particular promise
because neural networks are in principle capable of precisely
representing the so-called ``correlation hole'' in multielectron
wavefunctions.

Chairperson and primary volunteer of the technology committee of
Immaculata Catholic School in Durham, NC. Headed a project that wired
the school and installed a proper client/server network. Brown
continues to serve as a technical advisor to the school.

Board member of Copperfield's Books, a small independent California
bookstore. Currently guiding Copperfield's through a difficult
transition from ``homemade'' IT to a proper client/server network
spanning multiple stores.