Analogue Computation

A different approach to biomimicry — using digital and analogue methods to better understand natural phenomena.

Digital computation makes use of some of the most basic laws of logic: those of binary logic. Computers employ this at scale, boiling abstract operations down into ever-simpler representations that are eventually enacted as a series of electronic on/off states. But other ways to calculate exist. Analogue computation, for example, employs real-world phenomena to ‘solve’ complex calculations. Wind tunnels, for instance, could be considered ‘analogue wind computers’ because they allow us to test a particular natural phenomenon by reproducing it (albeit at a smaller scale) in a controlled environment.

The use of physical models as test-beds for landscapes has a long history.1 The US Army Corps of Engineers built many hydraulic models to investigate how best to implement flood control measures in a manner that accounted for the holistic operation of the chosen area.2 By using physical models as test-beds, engineers could simulate landscape conditions at smaller scales that still accurately reflected complex behaviours.

This form of modelling — where real landscape materials are used to test real landscape phenomena — is greatly improved when combined with digital techniques for easily gathering data from physical models. For example, laser-based scanning allows topographic data to be recorded continuously, while cheap sensors and actuators precisely control the release of water or light. For many phenomena that depend on the behaviour of fluid flows, this hybridised form of modelling may be one of the few accurate options available, given that the required degree of accuracy3 precludes purely digital simulation.4
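The sensor-and-actuator side of such a hybrid model can be pictured as a simple feedback loop. The sketch below is purely illustrative — the sensor, valve, target depth, and gain are all invented for the example, not taken from any of the projects discussed — but it shows the general shape of coupling a depth reading to the controlled release of water:

```python
import random

# Hypothetical sketch of a sensing/actuation loop for a hydraulic model.
# All names and thresholds are illustrative assumptions.

TARGET_DEPTH_MM = 40.0
GAIN = 0.5  # proportional gain: valve change per unit of relative depth error

def read_depth_sensor():
    """Stand-in for a laser or ultrasonic depth reading, in millimetres."""
    return 40.0 + random.uniform(-5, 5)

def control_step(valve_open, depth_mm):
    """Proportional control: open the valve further when the tank runs shallow,
    close it when the water is too deep."""
    error = TARGET_DEPTH_MM - depth_mm
    # Clamp the valve position to [0, 1] (fully closed .. fully open).
    return max(0.0, min(1.0, valve_open + GAIN * error / TARGET_DEPTH_MM))

valve = 0.5
log = []
for step in range(10):
    depth = read_depth_sensor()
    valve = control_step(valve, depth)
    log.append((step, round(depth, 1), round(valve, 2)))
```

In a real rig the two hypothetical functions would wrap hardware drivers; the point is only that closing the loop in software is what lets the physical simulation run unattended and repeatably.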

Enriqueta Llabres and Eduardo Rico’s work at the Bartlett identifies a lineage of material computation in design practices, with Frei Otto’s work as a key “project of extracting logics of distribution and form derived from the careful study of material behaviour.”5 In particular, they look at his experiments at a territorial level, where he identified large-scale organisational patterns in existing landscapes and then reproduced these through physical tests that employed self-organising phenomena. One of these investigations used floating magnets and needles to identify a minimal-energy state of equilibrium while maintaining connectivity between key nodes,6 creating a self-organising pattern that could apply to planning tasks such as road networks. While most of Otto’s experiments were highly dynamic — the magnets and needles would push and pull before settling into a steady state — the final equilibrium is still a single endpoint; a ‘frozen blueprint’ that represents an optimised outcome.7 Although dynamic in their ‘solving’ process, such static results become problematic when modelling time-based phenomena where a steady-state outcome is rare.

Taking Otto’s experiments as a point of departure, Llabres and Rico look to new forms of ‘proxi modelling’ that better approximate the means by which landscapes transform in response to natural and designed events.8 One project looks at a Canadian site and the hydrological and geomorphological effects of mineral extraction. Here the mining process has dammed and diverted existing rivers to capture water for industrial use, which in turn creates new ‘tailings ponds’ and new patterns of sedimentation. The new and disrupted hydrological flows become a starting point for imagining interventions that better re-naturalise these industrial outputs; a process complicated by the dynamic formation of both the new and existing water courses.9

The particular topography and substrates present on site inform simulations that examine new potential water flows over time.

Physical models, working in conjunction with digital sensing systems, explored these dynamics by simulating the process of delta formation that results when sediment infiltrates slower-moving water bodies:

“In the physical model generated for the project, water and sediments are dropped from the corner of a tank (assumed upstream) which allows water to flow from the other corner. As the process goes on, layers of sediment will expand away from the corner (Figure 7), generating sedimentation fans and channels that accumulate in layers. … Experiments are then carried out to evaluate as well as instigate potential interventions and modifications to the main river branches, in this case through the introduction of obstacles to the flow at the point where the first channels begin to emerge.”10

The results of these tests were recorded using laser capture and chromatic filtering to create a 3D model that can identify particular patterns in the direction and distance of water flows.11 Designers can then intervene in the form of the model to test their understanding of the sedimentary dynamics against both the existing landscape state and alternative states that introduce new physical formations.12 However, there are difficulties in ‘miniaturising’ such a simulation: ensuring the dynamics are a correct reflection of the larger-scale system13 and ensuring the sensing techniques are of sufficient resolution to capture small-scale changes.14 Nevertheless, the hybrid analogue/digital system allows the design process to become more intuitive, as seeing and modifying a physical model creates causative relationships between complex non-linear phenomena that can be examined and tested with a specificity that exceeds the designer’s immediate understanding.15
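One simple way to derive flow direction from scanned topographic data — offered here only as an illustrative sketch, not as the method the project actually used — is to treat each captured height map as a grid and take the steepest-descent neighbour of each cell as a proxy for where water would move:

```python
# Illustrative sketch: estimate local flow direction from a scanned height
# map by steepest descent. The grid and heights below are invented.

def steepest_descent(heights):
    """For each interior cell, return the (dy, dx) step toward the lowest
    of its eight neighbours — a crude proxy for water flow direction."""
    rows, cols = len(heights), len(heights[0])
    flow = {}
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            best, best_h = (0, 0), heights[y][x]
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    h = heights[y + dy][x + dx]
                    if h < best_h:
                        best_h, best = h, (dy, dx)
            flow[(y, x)] = best
    return flow

# A tiny ramp sloping down toward increasing x:
ramp = [[3, 2, 1, 0] for _ in range(4)]
directions = steepest_descent(ramp)
```

Run on successive scans, the changes in such a direction field would hint at how an introduced obstacle redirects the emerging channels; real analyses would of course need far higher resolution and more robust hydrological routing.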

The model tests hydrological flows against a variety of morphological interventions (left), while a digital capture of the model’s water flows over time depicts the water’s trajectory and volume (right).

This method stands in contrast to traditional methods of simulation. The advantage of the ‘proxi’ or ‘hybrid’ model is that it is “constantly in flux and shifting, with sand and water changing the overall configuration of the landscape and the urban environment”16 creating a method that is not “just a projective tool purely emanating from the designer.”17 To an extent this characterisation derives from the nature of the phenomena investigated and the simulative methods — parametric models are also capable of rapidly changing their configuration, in terms of both their initial state and the simulated outcomes. What obstructs this in many hydrological- or climate-driven systems is that the computationally taxing complexity of fluid phenomena renders simulations too cumbersome to integrate into rapid feedback loops. Without the ability to quickly test intuitive design decisions, either digital or physical modelling limits the designer’s ability to build up an understanding of the phenomena that consciously informs (rather than merely validates) design intent.
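A back-of-envelope estimate suggests why fluid simulation resists rapid digital feedback at landscape scales. Every figure below is an assumed order-of-magnitude illustration, not a measured value:

```python
# Rough, illustrative arithmetic (all figures assumed): the cost of
# gridding a 1 km x 1 km site at 10 cm resolution with 1 m of water depth.
cells = (10_000 * 10_000) * 10     # 10^9 cells in the fluid volume
steps_per_second = 100             # small timesteps for numerical stability
flops_per_cell_step = 1_000        # order-of-magnitude solver cost per cell

flops_for_one_simulated_minute = (
    cells * steps_per_second * flops_per_cell_step * 60
)  # on the order of 10^15-10^16 operations per simulated minute
```

Even granting generous hardware assumptions, numbers of this order put interactive, trial-and-error use out of reach — whereas the physical tank computes the same minute of flow in a minute, for free.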

This approach is particularly effective for many classes of hydrological phenomena because it physically enacts fluid dynamics; a process that is difficult to reproduce digitally, particularly in the context of simulating landscape form at landscape scales. While there may be some changes in behaviour between the model and real scale, the basic behaviour at least reflects the general dynamics of the chosen phenomena. There are, however, limits to what phenomena can be, or need to be, simulated using physical modelling. Biological phenomena can be measured and tested for design purposes, such as, say, the phytoremediative potential of a given species, but for these results to drive design iterations (rather than merely validate a design proposal) there needs to be an iterative dialogue where new results can be easily generated to gauge new design moves. For biological phenomena, such as a species’ growth rate under a particular shading or soil condition, the time required to run the simulation cannot be greatly accelerated, and so is generally too slow to assist in rapid design iteration.

Similarly, the setup costs are high, both for the initial model and for each simulated iteration. Creating the sensor networks and linking them to digital models requires a high degree of expertise, with sensor and software setups likely tied explicitly to the conditions investigated. Physically creating a model — particularly if it is to represent a real-world location — is difficult, and where tests have physical effects — such as with simulations of erosion — the model must be reset to this base form after each test.

As a result, the use of analogue computation is to an extent niche. It remains one of the best methods — if not the only one — for modelling many classes of phenomena where digital methods are unworkably slow or inaccurate. Moreover, even if digital methods for modelling such phenomena become more advanced, physical models would likely remain useful for validating results. For many other landscape phenomena, the simulated action may be unsuitable for physical modelling, due to scalar or temporal dependencies that cannot be compressed so as to enable easy testing. However, because sensing technologies tie these physical simulations to digital simulations, ‘hybridised’ models can merge the methods and results of both in order to create a more cohesive understanding of landscape performance.