Chamberlin convinced me that in science and in life we must challenge ourselves to imagine a comprehensive array of possible explanations (hypotheses). Only in this way can we get sufficient perspective to clearly see the kind of questions, observations and experiments that might tease out Truth from the inherent complexity of Universe. Wow, isn’t that the essence of Buckminster Fuller’s Synergetics: comprehensive thinking?

[In practicing the method of multiple working hypotheses] the mind appears to become possessed of the power of simultaneous vision from different standpoints. Phenomena appear to become capable of being viewed analytically and synthetically at once. — T. C. Chamberlin, 1890

Timelines give helpful context, but they must be seen as a gloss, a superficial story drawn from a deeper history. Why do the roughly century-long periods 1590-1687 and 1865-1964 contain almost all of the key dates? Which omitted events should be added to the timeline?

Kelly argues that self-reference is the impetus of scientific advancement. He finds support for this in Thomas Kuhn’s meta-scientific analysis of the so-called Copernican revolution. But Stearns is a critic of Kuhn’s theory of revolutions. He notes that Kuhn’s view of science as a social activity suggests that its development may simply involve the consecutive overthrow of yesterday’s mass hysteria by today’s new and better delusions (Kuhn’s “paradigm shift”). Stearns objects that there is a ratchet on science: we make progress in our understanding of the world. This debate piqued my curiosity. How has science really developed? Can we look back into its history to glimpse the trendlines of the ways of knowing? What might such a study suggest about future ways of knowing?

I do not know enough about the history of science to pursue this idea too far. But it seems to me there are important ways of knowing that are not often considered. For example, dance is a way of knowing. Moreover, our bodies store capabilities that are latent until we are in action doing things. Turning a doorknob is something we do effortlessly until we try to describe the process accurately (Do I move my hand along the 3D vector that points directly at the knob, or is trial and error involved?). All people and all animals have tuned their bodies to do complex things (work assembly lines, drive a car, climb trees, etc.). These can be extremely challenging to specify. Shouldn’t such ineffable knowledge be included in a theory of science? Does knowledge have to be mathematized to be recursively developed? Can all knowledge be specified mathematically? If the method of multiple working hypotheses is considered, should we expect many different mathematizations vying to explain any given knowledge?

The more we learn the more we realize how little we know. — R. Buckminster Fuller

Another important way of knowing is what we absorb from others: book learning, explicit and implicit transmission of our cultural heritage, spiritual and religious learning. This learning comes almost exclusively from other people. Tacit learning is embedded in the social structure. Shouldn’t we try to accommodate such ineffable knowledge in our theories of knowing? Even most scientific knowledge comes to us from our teachers, books, journals, and, increasingly, online videos. How much of our understanding is subject to the influence of others? Of language? I hope this gives some sense that the ways of knowing are vast, ranging from the corporeal to the poetic to the mathematical to the products of the experimental and descriptive sciences and of the social and theological sciences.

Is it possible that the integrity, the whole, incorporating and accommodating each of these ways of knowing yields a greater wisdom than any subset considered separately?

It seems to me the common thread, the essence, of all these ways of knowing is interrelationship. It seems fundamental: in every way of knowing there are relations among the units of knowledge, among the facts, among the raw experiences. Any given basis for knowledge would lead to an infinite regress, since the skeptic could always ask the incisive question “How do we know that that basis is fundamental?” Any answer would define an interrelationship between the “basis” and other experience. Knowledge seems to be built upon interrelationships, which form a fabric integrating all experience.

Effectively, science recursively integrates more and more experience data including the growing set of scientifically generalized principles into more and more comprehensive knowledge. This corroborates Kevin Kelly’s thesis that recursion drives science.

These data and the principles interrelating them can be structured to form models or paradigms. Science is now deluged in such models. Some, such as the models of physics and chemistry, are thought to apply to everything in the Universe. But even there, physicists often use (and teach!) several of their great but distinct models: Newtonian mechanics, relativity, and quantum mechanics. Some models have become increasingly comprehensive, while others have become increasingly incisive (e.g., modern weather-prediction models); some provide perspective on a situation, as with gross estimates or other glosses; and others depict fine detail in some aspects with distortion elsewhere.

Moreover, old models like old technology (according to Kevin Kelly) never seem to vanish. I often wondered why such “obsolete” models were still taught. Now I understand: not only do they give us perspective on historical developments, but often the old models have strengths of explanation despite their weaknesses. For example, we can see that the Ptolemaic system with its epicycles is still valid when we view planetary motion from Earth’s perspective! It is a kind of relativity. The Earth-centered perspective is incisive when you want to see a movie of the motions of the heavens from your local roof telescope. Even models that philosophers like Thomas Kuhn say were overthrown by “revolutions” are still useful!

My favorite model to illustrate this point is the persistent utility of Euclidean, flat-plane geometry in carpentry even though we’ve known since at least Aristotle’s time (he died in 322 BCE) that the Earth is spherical. Using spherical trigonometry to build a closet may be technically more accurate, but it would be silly because the error involved in the Euclidean model is negligible and its calculations are simpler. The “best” model is the one that solves your problem within a specified error tolerance and timeframe.
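The negligibility claim is easy to check: the flat-plane error for a span on a sphere is just the gap between the great-circle arc and the straight chord joining its endpoints. A minimal Python sketch, assuming a spherical Earth with a mean radius of 6,371 km (the radius value and function name are my own, not from the essay):

```python
import math

R = 6_371_000.0  # assumed mean Earth radius, in meters

def flat_vs_spherical_error(arc_length_m: float) -> float:
    """Difference between a great-circle arc of the given length and
    the straight chord joining its endpoints -- the error a carpenter
    accepts by treating the Earth's surface as a flat plane."""
    theta = arc_length_m / R             # central angle subtended by the arc
    chord = 2 * R * math.sin(theta / 2)  # Euclidean straight-line length
    return arc_length_m - chord

print(f"2 m closet wall: error = {flat_vs_spherical_error(2.0):.1e} m")
print(f"1 km wall:       error = {flat_vs_spherical_error(1000.0):.1e} m")
```

Even over a full kilometer the flat-plane model is off by only about a micron, and over a closet-sized span the discrepancy is far below anything a tape measure could detect.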

I now realize it is a virtue to be fluent in as many models as possible. It is impossible to know beforehand which ones will give insight into any given problem. There may even be multiple solutions and different models may be needed to find each: the method of multiple working hypotheses illuminates the situation again!

We need to think carefully about the factors that help us choose a relevant model for a particular situation from the inventory that Humanity has built over the past several millennia. One way to find an appropriate model is to draw strengths from several partially correct models into a synthesis. Often, we find that the most general model is too cumbersome for daily computations. That is why carpenters do not use QED even though physicists tell us it is one of the most accurate models ever devised. Synthesizing several less general models may give better predictions.
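The selection rule implicit here, that the relevant model is the cheapest one whose error stays within tolerance, can be sketched in a few lines of Python. The inventory, its error and cost figures, and the `choose` helper are all hypothetical illustrations of the idea, not anything proposed in the essay:

```python
from typing import NamedTuple, Optional

class Model(NamedTuple):
    name: str
    error: float  # assumed worst-case relative error for the task at hand
    cost: float   # assumed computational/effort cost, arbitrary units

# A toy inventory with illustrative (made-up) numbers.
inventory = [
    Model("flat-plane geometry",     error=1e-9,  cost=1),
    Model("spherical trigonometry",  error=1e-12, cost=10),
    Model("general relativity",      error=1e-15, cost=1000),
]

def choose(models: list, tolerance: float) -> Optional[Model]:
    """Return the cheapest model whose error is within tolerance,
    or None if no model in the inventory is accurate enough."""
    adequate = [m for m in models if m.error <= tolerance]
    return min(adequate, key=lambda m: m.cost) if adequate else None

print(choose(inventory, 1e-6).name)  # flat-plane geometry suffices
```

Tighten the tolerance and the rule automatically reaches for the more accurate, more expensive model, which is one way to read the carpenter-versus-QED point above.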

With that in mind, I see a future trend in science to define, clarify and specify the geography of the relative strengths and weaknesses of our inventory of models, not just the maximally comprehensive ones. Perhaps such a development will help reduce the tendencies leading people to adopt “favorite models” that blind them to the relative merits of alternatives?

How do you think the unfolding development of ways of knowing a.k.a. science will change in the future?

Where does Buckminster Fuller’s Synergetics fit into this schema? Could it be a “grand central station” model that provides a conceptual bridge to connect them all? I think that is what Bucky was trying to do, but I do not think he finished the job. Synergetics provides some tantalizing pointers toward such a “promised land”. We can probably build a Design Science that provides the kind of modelability that Bucky imagined. I hope the 2011 Design Science Symposium at RISD will help identify some of the elements needed to build Synergetics and Design Science into a linchpin for science. Do you think the challenge is feasible? How should we begin?

2 Responses to “An Enquiry Concerning Scientific Understanding”

Beautiful essay!
Note also how the evolution of music fits the timeline you presented here. It’s written up in some of Rudolf Steiner’s work. In the 1960s, just as he predicted, we started seeing more octave resonance, which is the key to understanding scientific concepts such as chemical periodicity, catalysis, and gravity waves.
Also during that period we started seeing repetitive interpositions of tone structures and rhythms moving massively against each other — what I’ve called the “field-ground” perception. Works by Philip Glass, Constance Demby, and Terry Riley especially demonstrate this characteristic. On a more basic level, you see it all the time in techno music now.
One way to facilitate the understanding we seek is to cultivate further exploration in music, especially in live contexts. Already it’s becoming possible to create musical illustrations of molecular structures.

Fuller’s strength was being a generalist and learning from nature. He also had good intuition. His advantage was that most people have been schooled to be specialists, and thus have a hard time looking at the big picture or crossing the divides between the different fields of science.
Synergetics and Design Science could provide the tools to discover the Grand Central Station: nature’s coordinate system, or how nature builds and evolves. Nature’s geometry? The thread that connects everything?