Have you ever wondered why convention prioritizes, or aggregates,
logical operators in parenthesis-free expressions the way it does,
and what the consequences would be if the ordering were different
and had an empirical foundation? Questioning such an apparently
mundane ground rule can lead to an upheaval in the way people have
been thinking about a system. What if you learned that logic and
its operators had more significance than representing the structure
of arguments, that, indeed, they might represent the structure of
the cosmos itself? In this essay, I argue that the complexity of
relations between or among entities should determine operational
prioritization, and that this complexity is the essence of binary
logic as the language of innate order in the universe.

Our logic has assumed paramount importance as the foundation
of modern computer science and much of artificial intelligence.
Binary logic usually is the first and most common logic students
encounter, but they rarely encounter any meaning beyond its being a
mechanical convenience for analyzing mathematical relationships and
attempting to analyze ordinary language arguments. Yet exciting
mathematical, neurophysiological, and psychological work recently
done in binary logic suggests a connection between our consciousness
and the cosmos.

This essay is conjectural in some parts and cross-disciplinary.
It does not purport to be a deep analysis of competing ideas,
but I wanted to tie together some crucial observations to create
enough of a focal point for questioning present conventions,
re-directing pedagogy, and proffering groundwork for a philosophy
of binary logic and consciousness.

The first section describes logical aggregation and its
importance. Section Two briefly examines three examples
suggesting that the ease of logical thinking depends upon
ordering of operators. Cases found in human learning theory
and Boolean neural networks suggest that each operator has a
unique level of complexity. In Section Three, I propose a
method for finding a natural order of operators that more
closely fits the way in which humans think, and Section Four
advances a procedure for analyzing a seemingly unordered phenomenon.
The fifth section describes the philosophy upon which this
proposed research scheme is predicated. Binary logic's syntax
displays a semantics of order in the universe, as biophysical
and cosmological research indicate. The syntax, itself, may be
a semantics expressed by a deeper structure. Section Six suggests
a direction in which research should proceed to understand how
the source of our being may be communicating to us.

Prioritization and its importance

In parenthesis-free expressions, such as p & q v r, the truth value
of (p & q) v r differs from that of p & (q v r). Using commonly
accepted notation, the conventional priority of operators is
=, =>, v, &, and ~, in descending order of scope, or precedence.
(These symbols are not the proper ones, because they must be
rendered as ASCII characters; "=" is equivalence, "=>" the
conditional, and "v" or.) That is, ~ affects only the adjacent
variable, & affects only the variables on either side of it, v
affects the & expression and the adjacent variable, and so forth.
So p = q => r v s & ~t would be grouped p = (q => (r v (s & ~t))),
and p v ~q = r => s would be (p v ~q) = (r => s), with = affecting
every variable and ~ affecting only one (e.g., Stoll, p. 60; Copi,
p. 219; Massey, p. 34-62; Rosser, p. 19-23). The same occurs with
arithmetic operators: 9 + 5 x 4 + 3 would be [9 + (5 x 4)] + 3.
It is generally recognized that the prioritization of these
relational operators in logic is patterned after mathematical
ordering, conjunction being analogous to multiplication, disjunction
resembling addition, and so forth. Which operator has greater scope
than another is determined merely by convention (Church 1992,
p. 79-80; Exner 1959, p. 38-40; Rosser 1978, p. 19; Margaris 1967,
p. 26; Copi 1979, p. 219).

As to the values that the variables may assume, there are 16
relationships generated from the four ways (00, 01, 10, 11) the two
elements in a basic linear order may be paired. Each of these
relationships may be seen as a way of describing how we know that
the first element is related to the second. For example, the
value of p as 0011 (the "0" traditionally regarded as "false,"
and 1 as "true") can be related to the q value of 0101 as 0111
(0 or 0 = 0, 0 or 1 = 1, etc.), because the relationship is "or."
In standard truth table form:

p   q   p or q

0   0   0
0   1   1
1   0   1
1   1   1
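The sixteen relationships, and the bitwise reading of "or" relating p as 0011 to q as 0101 to yield 0111, can be reproduced in a few lines (a sketch using the same row order 00, 01, 10, 11):

```python
# Reproduce the 16 relationships over the row order (p,q) = 00, 01, 10, 11.
from itertools import product

rows = list(product([0, 1], repeat=2))          # (0,0), (0,1), (1,0), (1,1)

def column(op):
    """The truth-table column of a binary operation, as a bit string."""
    return ''.join(str(int(op(p, q))) for p, q in rows)

assert column(lambda p, q: p or q) == '0111'    # p = 0011 "or" q = 0101
assert column(lambda p, q: p and q) == '0001'

# Every 4-bit column arises from exactly one of the 16 operations.
all_columns = {column(lambda p, q, n=n: (n >> (2 * p + q)) & 1)
               for n in range(16)}
assert len(all_columns) == 16
```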

Prioritizing enters as a problem in parsing a sequence of
variables connected by operators in a parenthesis-free expression.
There are two cases. In the first case, the components of the idea
are already known and determine the grouping. The second case,
significant in this paper, is where groupings of ideas are not
known, and we group according to the convention described above.
Logic texts, such as those by Copi and Rosser, discuss the
convention, but the reader is left wondering how the problem of
ungrouped expressions arises in the first place.

The convention serves convenient purposes, not the least of which
is to preserve consistency in logical computation. For mathematics,
it is easier from a visual perspective (because of the more closely
spaced x and y) to multiply x times y first in xy + p, rather than
to separate the x from the y and add y to p. Even while maintaining
the visual preference for doing the xy calculation first, the xy
just as easily could have meant "x plus y" and the plus "x times y,"
the order now being addition first and multiplication second. Signs
have not always meant the same thing throughout the ages, nor have
they always been used. The Bakhshali (Indian) system used "+" for
"negative." In European mathematics, the plus and minus signs
appeared at the end of the 15th century (Cajori 1993, p. 77).

Aggregation did not assume much importance until the end of
the 15th century, when Pacioli in his
Summa found a need to compute roots in polynomial expressions
(Cajori 1993, p. 385). Subsequent
treatments of roots depended upon appropriate aggregation
techniques. However, notation usually arose
after the concepts were formulated and had the purpose of
punctuation, or separating the symbols that
represented concepts. Roman numerals ultimately were replaced
by the Arabic system we see today. Long
division by Roman numerals demonstrates the superiority of the
present method.

A literature search has failed to yield a convincing philosophical
rationale for the current prioritization
convention. Quite to the contrary, as indicated above, standard
logic texts refer to the ordering as a
convention. Notations represent concepts, and the question at
hand is why one operation should precede
another. What we are looking for, of course, is an ordering of
operators based on concepts rather than
simple arbitrary agreement. Truth value or computational results
obviously depend upon how operators are
prioritized, but the importance of prioritization is greater than
obtaining consistent calculation. Logical
operators do not have the same degree of complexity, and the
operators may be hierarchically ordered
according to what constitutes complexity, as the following three
briefly discussed examples indicate. The
essence of the complexity may carry through to the result of the
computation.

Consequences of different aggregations

Piaget wrote that the building block of a child's ideas of movement
and speed is an awareness of serial order.
A simple order is linear and "requires only a simple perceptual
situation." (Piaget 1958, p. 36) Through
adulthood, people's cognitive abilities depend upon apprehension of
operational complexity. Piaget and
Inhelder demonstrated that children learn in logical stages,
i.e., "... memory is a function of operational
developments." (Piaget and Inhelder 1973, p. 160) For example,
a randomly selected five year old child
will do conjunctive operations before disjunctive ones when given
a task depending upon the conservation of
length. The meaning of the addition (or) function is apprehended
more easily than the multiplication (and)
one, suggesting that each operator represents a level of
intellectual complexity that affects the ability to
memorize. More recent investigations confirm the same type of
phenomenon in adults (Taylor 1987, passim). While Piaget's and Inhelder's research methodology
may be criticized, the general thrust of
a decade or more of their work points to differences in operational
complexity.

Research indicates that learning in Boolean neural nets depends
upon the arrangement of operators. A network is designed to search
for its own structure to solve a problem by accepting data and
attempting to discover the rule governing the relationships among
items in the data. The rule must be recognized with the least
number of errors. Operators, as logic gates, are serial, with no
feedback or back propagation. Researchers seek to discover the
ordering of operators based upon the ascending energy levels
required for a successful discovery to occur. Energy is defined as
"the discrepancy between the correct result of the operation and
the one obtained from the circuit averaged over the number of
examples Ne shown to the level required by that gate." Only when
the result is zero can the correct gating be identified by the system.
It is apparent that if the network is configured differently
with the ordering of operations changed, the
energy levels will differ as well. While the methods and schema
for Boolean neural network computation
are quite complex, the results indicate that there is an
optimum configuration of operators the net uses to
learn a task.
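The kind of search described above can be caricatured in a few lines: fix a serial chain of candidate gates, score each ordering by its averaged discrepancy from the target rule (the "energy"), and accept the ordering whose energy is zero. The gates, examples, and hidden rule below are illustrative assumptions of mine, not details of the cited networks.

```python
# Toy energy-minimizing search over serial orderings of logic gates.
from itertools import permutations, product

GATES = {
    'and': lambda a, b: a & b,
    'or':  lambda a, b: a | b,
    'xor': lambda a, b: a ^ b,
}

def run_chain(order, bits):
    """Feed the input bits through the gates serially, left to right."""
    acc = bits[0]
    for gate, b in zip(order, bits[1:]):
        acc = GATES[gate](acc, b)
    return acc

target = lambda a, b, c: (a & b) | c           # hidden rule to discover
examples = list(product([0, 1], repeat=3))     # Ne = 8 examples shown

def energy(order):
    """Averaged discrepancy between chain output and the correct result."""
    errors = sum(run_chain(order, ex) != target(*ex) for ex in examples)
    return errors / len(examples)

# Only the ordering ('and', 'or') reaches zero energy on this rule;
# configure the chain differently and the energy differs as well.
best = min(permutations(GATES, 2), key=energy)
assert best == ('and', 'or') and energy(best) == 0.0
```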

Patarnello's work on performance energy of neural nets
corresponding to configurations of input bits as
a way of ordering operators is supported by Martland's work
classifying network behavior corresponding to
truth table elements. The complexity of operators in Boolean
neural nets has not been studied extensively,
but it seems that the density of truth outputs varies with the
type of connective involved, thus suggesting a
basis for classification (Martland 1989, p. 222-234). In the
same fashion, Kauffman has shown that when
specific functions are forward fed into an element (propositional
schema) specific patterns of function orders
emerge. There are attraction points found within autonomous
random Boolean network (BN) state-space
(Kauffman 1993, Chapter 5). With respect to ordering in Boolean
complexity, it can be demonstrated that
numerous random couplings of operations result in patterns
resembling those of cellular automata
(Wuensche 1993, passim). If logical operators are randomly coupled
to produce patterns, it would be
interesting to see what patterns emerge if the operators were
coupled according to an empirically determined
scheme.

Discovering a natural aggregation

A method exists for determining how the concepts of operations are
prioritized according to the way we
think. Piaget and Inhelder built a foundation to show that
operators have differing degrees of intellectual
complexity. While it is beyond the scope of this paper to explain
the details, suffice it to say that an
approach exists to constrain the operator to a specific learning
parameter (color, size, speed, and so forth)
and make their experiments mainly phenomenological, rather than
having the testing rely upon words. One
form of such an experiment I have created builds upon a Miller's
Analogies style of presenting information
but pictorially represents the meaning of each operator, with
the subject being asked to identify the meaning.
Complexity is registered as a function of rapidity and accuracy of
response. Each parameter may result in a
different order or priority. For example, recognizing an operational
concept (such as implication) may be
more difficult with weights than with sizes. The priority given
the operator would depend upon the time
taken to recognize the operation and the accuracy of recognition.
Several orderings may emerge from these
experiments. For an artificially intelligent device, a task
involving so many parameters may have that
number of processors operating in parallel, each one configured
to an ordering.
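A scoring rule of the kind such an experiment implies might look as follows; the response-time and accuracy figures are invented placeholders, and the complexity measure is one plausible choice among many:

```python
# Hypothetical derivation of an operator ordering from response data.
# All figures are invented placeholders for experimental results.
trials = {
    'and':     {'mean_rt_ms': 820,  'accuracy': 0.94},
    'or':      {'mean_rt_ms': 760,  'accuracy': 0.97},
    'implies': {'mean_rt_ms': 1310, 'accuracy': 0.81},
    'equiv':   {'mean_rt_ms': 1150, 'accuracy': 0.85},
}

def complexity(stats):
    """Slower, less accurate recognition counts as greater complexity."""
    return stats['mean_rt_ms'] / stats['accuracy']

# Operators ranked from least to most complex; how complexity should
# map onto scope is precisely the research question raised here.
ordering = sorted(trials, key=lambda op: complexity(trials[op]))
assert ordering == ['or', 'and', 'equiv', 'implies']
```

Each learning parameter (color, size, speed, and so forth) would yield its own table of trials and, possibly, its own ordering.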

Applying empirically derived ordering

How can one use an empirically derived ordering? The significance
of ordering may be seen by comparing the computational outcomes of
several different orderings of a bit stream. For example, take a
sequence like 0 v 0 => 0 & 1 v 0. Under the convention, this would
be (0 v 0) => ((0 & 1) v 0), the final result being 1. If the
new grouping were ((0 v (0 => 0)) v 1) & 0, the result would
be 0. Without knowing how the ideas are structured, what are we to
do? What if the source meant (0 v 0) => (0 v (1 & 0))? For
ordinary language, the problem is somewhat simplified when the
person starts the sentence with "if," for we know this often is
the main operator. However, what do we do with the second part?
Does the source want us to use & or v as the main operator? This
does not mean that there is a universal way of parsing ordinary
language utterances expressed in a parenthesis-free manner, but it
does illustrate the problem of parsing a bit stream from what could
be an intelligent source. As mentioned above, logic texts often
raise the issue of unaggregated expressions but do not say when,
where, or why we would encounter them. However, one can think of
the binary digitizing of any phenomenon without pattern: seemingly
non-repeating decimals, like sqrt(2) and pi, cellular automata,
electroencephalograms, or even data from the former Search for
Extraterrestrial Intelligence (SETI) project.
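The divergence among groupings can be checked mechanically. The sketch below encodes three aggregations of the stream 0 v 0 => 0 & 1 v 0: the conventional one (with & grouping most tightly, per the stated convention), the regrouping, and the grouping a source might have meant, with Python's operators standing in for the logical ones:

```python
# Three aggregations of the unparenthesized stream 0 v 0 => 0 & 1 v 0.
def implies(a, b):
    return int((not a) or b)

conventional = implies(0 or 0, (0 and 1) or 0)  # (0 v 0) => ((0 & 1) v 0)
regrouped    = int(((0 or implies(0, 0)) or 1) and 0)
                                                # ((0 v (0 => 0)) v 1) & 0
source_meant = implies(0 or 0, 0 or (1 and 0))  # (0 v 0) => (0 v (1 & 0))

assert conventional == 1   # the final result of 1
assert regrouped == 0      # the regrouping's result of 0
assert source_meant == 1
```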

How could an empirically derived prioritization of operators
be used to derive meaning? Many ways exist for using different
aggregation schemes to extract patterns from an ungrouped bit
stream. The following example procedure is vastly oversimplified,
possibly flawed technically, and quite abstract; it merely suggests
a general direction in which research might proceed in observing
order emerge from such an ungrouped bit stream.

Taking 0s and 1s to represent values (as opposed to operators):

1. Identify an experimentally derived prioritization scheme, such
as v, =, => (using the commonly accepted notation), where v has the
greatest scope of operation, = the next, and => the least. (Other
analyses may use different operators and orderings.) The present
example might be the ordering found for persons doing logical
operations involving sizes; that is, persons might find that it is
easier to do logical operations involving size with = than with v
and =>.

2. Divide the bit stream into n + 1 bit segments, where n is the
number of operators in the prioritization scheme. With three
operators, each segment would be four bits long. For
01100001110011001010..., this would look like
0110-0001-1100-1100-1010... .

3. Aggregate each segment according to the prioritization scheme
by inserting the operators between the bits. In the above example,
the first segment of 0110 would be grouped (0 = 1) v (1 => 0). The
next would be (0 = 0) v (0 => 1), and so forth.

4. Evaluate the segment. Evaluate (1 => 0) first, to get 0. Next
is (0 = 1), with 0 also being the result. Finally, 0 v 0 is 0.

5. Do the rest of the four-bit segments the same way, the next
being 0001, the aggregation being (0 = 0) v (0 => 1), and the
value being 1.

6. Concatenate the results, 01..., up to a length to be determined
experimentally.

7. Observe the patterns and compare them to those generated by
cellular automata or electroencephalograms, as suggested by
Wuensche (Wuensche 1993, p. 11). These patterns, however, would be
generated from logical operations based upon an empirically derived
ordering of operators.
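The procedure above can be sketched end to end in a few lines (a simplification of mine that hard-codes the v, =, => scheme and its four-bit segments):

```python
# End-to-end sketch of the procedure: segment the stream into 4-bit
# pieces, aggregate each as (b1 = b2) v (b3 => b4), and concatenate.
def equiv(a, b):
    return int(a == b)

def implies(a, b):
    return int((not a) or b)

def evaluate_segment(bits):
    """Aggregate a 4-bit segment under the scheme v, =, => (descending scope)."""
    b1, b2, b3, b4 = bits
    return int(equiv(b1, b2) or implies(b3, b4))

def transform(stream, width=4):
    """Return the concatenated results for each full segment of the stream."""
    segments = (stream[i:i + width] for i in range(0, len(stream), width))
    return ''.join(str(evaluate_segment([int(c) for c in seg]))
                   for seg in segments if len(seg) == width)

# The example stream: 0110 -> 0, 0001 -> 1, 1100 -> 1, 1100 -> 1, 1010 -> 0.
assert transform('01100001110011001010') == '01110'
```

The resulting string would then be extended and compared against cellular automaton or EEG patterns, as the final step suggests.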

Philosophy of aggregation

Why would a natural ordering scheme tell us more about how we
think? Why should this research tell us about innate order? Part
of the answer lies in re-thinking how we view logic. Wuensche,
Kauffman, and others have demonstrated that patterns emerge from
various random couplings of operators, each operator seeming to
have a different degree of complexity. Two arguments may be
advanced for regularity in the random coupling, each resting upon
a view of causality. Either
the patterns originate from something within
the entity itself, or something outside the entity imparts that
which is needed for the entity to generate the
pattern. Modern philosophers refer to the first view as
autopoiesis, or the theory of self-organization. How
do systems self-organize? What "propels" organization?
Artificial life and automatons are two types of
apparent self-organization that have current interest.
Adherents of the second view of causality would argue
that an external agent is responsible for the entity; entities
don't arrange themselves. Patterns don't simply
"happen." The first assumes independence, the latter
interconnectedness. Not denying the first for now, I
will focus on making a case for the second, which, in turn may
help clarify the discussion of autonomy and
the nature of the complexity.

Operators and their ordering(s) are a reflection of complexity,
as illustrated by the three examples above, and it is this
complexity, structuring human thinking and the universe itself,
that logic represents. According to Piaget,

There exist outline structures which are precursors of logical
structures,... It is not inconceivable that a
general theory of structures will...be worked out, which will
permit the comparative analysis of structures
characterizing the outline structures to the logical structures
characteristic of the higher stages of
development. The use of the logical calculus in the description
of neural networks on the one hand, and in
cybernetic models on the other, shows that such a programme is
not out of the question. (emphasis in original)
(Piaget 1958, p. 48)

Other researchers
hold that propositional logic reflects an order innate in the universe and
human
thinking. The arrangement in the universe is according to
"pregeometry as the calculus of propositions," such that "...a
machinery for the combination of yes-no or true-false elements
does not have to be invented. It already exists" (Misner et al.
1973, p. 1209). Everything is literally reducible to the
primordial, or first, ordering. The binary structure may be a very natural
expression of the way the universe exists in a
fundamental and profound way. That is, the logic is a discovery
more than a creation.

Our
universe began from a singularity, or an undifferentiated phenomenon. Hesiod
in the
Theogeny and
Lucretius in The Nature of Things spoke of a chaos, or unordered
condition, prior to the beginning of our
universe. It may be compared to Peirce's state of doubt, a feeling
of undifferentiated or uniform energy.
Bound up with the singularity was process; potential changed to
kinetic, manifested by movement. Out of
this "condensed chaos" came what we have in our dimension. We
see this emergence of being into our
dimension today at both the infinitesimal and the infinite ends
of the spectrum of our discernible world. At
the infinitesimal end of existence, it has been found that there
is a pressure exerted by elements within a
space deemed to be a vacuum. In this space of "zero point energy"
are particle density fluctuations as
photons enter and exit this discernible vacuum space with no known
reason (Science "The Subtle Pull..."
1997, p.58 ). At the infinite end of the spectrum, Stephen
Hawking's latest research indicates that
microscopic black holes "..eat one kind of particle and emit
another" (Science "Visions" 1997, p. 476).
While particles entering the event horizon may be "flattened,"
with their primordial constituents scattered over the boundary
layer, and the lost information ultimately may be ejected from the
black hole, this does not say anything of the force inside the
black hole (Susskind 1997, p. 52).

The dialectic between the discernible (what we see) and the
unformed (bound up within the singularity)
is the first and most basic of processes. Out of this process
came, and still does come, that which exists in terms of what is
not apparent, or what we do know in terms of
what we don't. Order was born as the
object of this dialectic process, allowing us to discern existence
through existents (our world around us
through the things in it). This order is expressed by the
language of logic.

What is the nature of this binary logic? Minimally required for
order is a set of two elements, and each operator establishes a
relationship between these two elements. In binary logic, the two
elements normally are semantically regarded as "false" and "true"
(often symbolized by 0 and 1, respectively). What we say is true
or false depends upon our knowledge. Logic displays a structure of
existents that comprises a "natural" semantics, a structure
standing as an ontology of knowing. (An example of such an
ontological system may be found in James K. Feibleman's Ontology.)
A thing must exist in order for us to know it, knowing being a way
of accounting for an assertion. Setting aside criteria of how we
determine whether something is true or not, the primary existents
for what I call an "epistemological logic" are two: that which is
known, or measured, and that which is not known. This is not the
same as equating the unknown with "false." Actually, "false" would
fall into the category of "known," for one knows in order to state
something to be false. Likewise, "true" also is in the category of
known. For the other existent, unknown, something can be true but
unknown and still exist, such as the actual number of stars in the
universe. (While asserting a truth presumes its existence, the
existence of something does not mean that one has knowledge of its
nature.) Hence, my use of the binary logic is not as a
truth-functional calculus of propositions, but as a structure
displaying epistemological relationships.

Symbolically, we can say 0 represents the unknown
(undifferentiated, etc.), and 1 represents known; 0 is
prior and 1 subsequent. The unknown becomes the known. Another
way of expressing this is that 0 contains 1, for the universe of
the unknown is larger than, and precedes, the known. Containment is
the subject of deductive logic.

As a quantum semantics, 0 means a wave, and 1 the collapse of
the wave function, or unity, where the
observer apprehends the particle density fluctuation as information
bounded by space-time. As a maximal
expression of information, this would be regarded as an entropy,
that which has emerged from chaos, or
energy that has been expended (dissipated) to reveal information.
Once the information has been expressed,
it isn't expressed again from the same source in that space-time.
In particle physics, the one is an absolute
unit: in natural units, the speed of light, Planck's constant
divided by 2 pi, and the gravitational constant each are set to
one (Hameroff and Penrose 1996, p. 520).
Wave function collapse apparently is a
constituent of our consciousness, i.e., those 0s and 1s may very well
represent our thought processes.

This wave function collapse to the value one is seen in the
cytoskeleton, or microtubules, of neurons.
Tubulin subunits make up the microtubule and are dimers (a bipolar
entity that can assume either a positive
or negative state), and these act as binary computational structures
(Rasmussen et al. 1990, p. 428-449).
When polarization occurs in the gigahertz range (10^9 to 10^11 Hz)
(Frohlich 1975, p.1412) among groups
of these dimers, the neuron assumes a shape that seems to modulate
the neural pulse (Hameroff and Penrose
1994, p. 517-518). However, the phenomenon seems to have
cosmological correlates.

About 100 GHz (10^11 Hz) to 1000 GHz most clearly shows the
uniformity of the cosmic background radiation (CBR, at 2.73 K
+/- .01 with a 95% degree of confidence) (Smoot 1995, p. 5), the
same as black body radiation and about the same value as the base
of the natural logarithm, e (2.718). Frohlich's upper boundary of
10^11 Hz is the lower boundary of CBR, or the unit measure of
1 mm. More than being simply "true" in a semantics table, 1 very
well may signify a resonance with CBR and what gave rise to it.
As Penrose said: "...there should be vibrational effects within
active cells which would resonate with microwave electromagnetic
radiation, at 10^11 Hz, as a result of biological quantum coherence
phenomenon" (Penrose 1994, p. 352).
If the 10^11 frequency is what "activates" consciousness, more
support is given the view that the universe
is, itself, conscious. Binary logic is the language describing this
consciousness. What is the mechanism of
the language, and, more importantly, its meaning?

Sixteen operators with four sets of relationships between
placeholders for two entities (p and q) spatio-temporally relate
the unknown to the known and the wave function (symbolized as 0)
to its collapse
(symbolized as 1). The conditional, so often the focal point of
"paradoxes of material implication," consistently and faithfully
describes the spatio-temporal nature of deduction, or the structure
and processes of closed systems. One should note that the
often-used proper subset symbol (p < q, with p =/= q) for the
traditional material implication "horseshoe" is incorrect, since
it is the improper subset symbol, >=, denoting deduction, that says
that the first set can contain the second either totally or
partially. With this correct use, material implication makes
intuitive as well as logical sense. An element contains itself
(0 >= 0, 1 >= 1). In popular logical parlance, the relationship is
true, or 1. Obviously, 0 >= 1 is the case, and that leaves 1 >= 0
being false, or 0, hence completing the truth table for >=
(described before as "=>").
What is known consists of a smaller universe than the universe of
the unknown. Quantitatively, the universe
of the unknown contains what is known. A particle (evidence of
an instance, or "collapse") is a constituent
of the wave. (Or, in Kant's view, the appearance, or instance,
is bounded by the reality of the whole. Kant
1963, p. 185-186 and passim.)
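The completed table can be verified mechanically; the sketch below encodes the containment reading just described (an element contains itself, and 0, the unknown, contains 1, the known) and checks it against the standard material conditional:

```python
# The containment reading of >= versus the material conditional =>.
def contains(a, b):
    """a >= b in the containment sense: 0, the unknown, contains 1,
    the known, and every element contains itself."""
    return int(a == b or (a == 0 and b == 1))

def material_conditional(a, b):
    return int((not a) or b)

# The four rows 0>=0, 0>=1, 1>=0, 1>=1 coincide with the => table.
for a in (0, 1):
    for b in (0, 1):
        assert contains(a, b) == material_conditional(a, b)
```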

Where do we go from here?

To bring the philosophical speculation and theoretical system
constructs into the tangible domain, it would
be useful to demonstrate empirically that the links exist among
the 10^11 Hz frequency, the binary logic, and
consciousness. While the technology currently may not be available,
the following offers one possible route
of exploration for such a test.

Technology is
such that simultaneous Positron Emission Tomography (PET), functional Magnetic
Resonance Imaging (fMRI), and electroencephalogram (EEG)
measurements can be taken (and mapped
onto each other) of an individual doing a mental task, such as
learning the meaning of a logical operator
(Science "New Dynamic Duo" 1997, p.1423). That is, the EEG can
measure various structures exhibiting
mental activity. With each logical operator, there should be two
frequency ranges: that of the neural pulse
(1-40 Hz) and the high GHz frequency that causes the microtubule to
assume the shape that modulates the
wave to produce the EEG matching the brain activity associated
with processing a particular logical
function. A confirmation that this approach has merit would be
to re-introduce the measured electrical
signals back into the brain structures to induce the subject to
perform the mental task and possibly to report
other thoughts that may be embedded in that code. While on the
surface it may be that only the thought of
an operator would emerge, there may be associated thoughts "grabbed"
from other areas of the brain to
create a more complete idea of what thoughts are associated with
the random bit stream. For example, see Newman for how stimulating
one area of the brain induces activity in other areas (Newman
1993, p. 267, 270-271). In principle, the object would be to
correlate the EEG with the logical operator or series of
with the logical operator or series of
logical operations done by the subject. Would it be farfetched
to suggest that an extended "truth table" of 0s
and 1s might pictorialize the EEG wave form or that the 0s and
1s could be mapped to the EEG?

A similar approach of correlating 0 and 1 patterns to EEGs may
exist for the random concatenations of operations done by Kauffman
and Wuensche. Wuensche suggests that his basins-of-attraction
diagrams, resulting from a random concatenation of logical
operations, may even indicate an "...electroencephalogram (EEG)
measure of the mean excitatory states of a path of neurons in the
brain" (Wuensche 1993, p. 11).

A third area of investigation would be to correlate the patterns
of conformational collapse on the surface
of the microtubule to EEGs and the patterns exhibited in the
work by Wuensche. If the display of the
conformational collapse does correspond to the 1 in binary
operations, and the 1 is symbolic of quantum
collapse, then, this might bring us closer to showing that binary
logic is the language of at least one form of
consciousness.

Summary

I have presented the issue of aggregating logical operators in
parenthesis-free expressions and discussed the
importance of finding a method based on how we think. Three
studies suggest that it is the complexity of
the operators that determines the priority of operations in a
parenthesis-free expression. If a string of 0s and
1s representing absences or pieces of information is generated by
the complexity represented by operators, it
would not be unreasonable to analyze such a bit stream using a
prioritization that more closely resembles
human thinking. Discovering such a natural order or orders is
predicated upon a philosophy that is being
borne out by emerging research in biophysics and cosmology.
It gives new reason to an old logic.

Our logical thought processes, as expressed by 16 operators,
are ordered according to a type of
intellectual complexity. These processes are mappable to brain
structures, and the frequency against which
cosmic background radiation is measured drives these brain
structures, thus making logic a language of innate order in
consciousness. Consciousness as we know it is
immanent in the universe.