Adam rambles

孟子

Duke Ching of Ch’i asked Confucius about government. Confucius answered, ‘Let the ruler be a ruler, the subject a subject, the father a father, the son a son.’ The Duke said, ‘Splendid! Truly, if the ruler be not a ruler, the subject not a subject, the father not a father, the son not a son, then even if there be grain, would I get to eat it?’ — Analects XII.11 (Lau)

This is one of The Analects' clearest statements of the feudal and patriarchal social order that would later get the name Confucianism. Detach it, for a moment, from that overwhelming cultural context, and it is also a statement about separating design concerns. The two readings are not so far apart: every political pundit is a social engineer. They either advocate improvements to the design of the state, or argue that a change will break the existing system.

Mencius (孟子) expanded on this sentiment in one of the earliest recorded defences of the division of labour (Book 3, part 1, chapter 4, 3-4). Labour specialization works because humans have limits on the complexity of a task they can undertake, and are neither cloneable nor particularly fungible.

Software, by contrast, is highly specialized, but also cloneable at near zero cost. Software complexity has different boundaries. There are physical limits inherent to what Harrison Ainsworth calls engineering in a computational material. These are physical characteristics of algorithmic complexity or computability – limits on how fast a particular problem can be solved, if it can be solved at all.
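As a concrete sketch of such a limit (the example is mine, not the post's): a brute-force solver for subset sum must examine every subset of its input, so its work doubles with each extra item, however cleanly the code is written.

```python
# Illustrative sketch: brute-force subset sum examines every subset,
# so its running time grows as 2^n. This is a property of the problem
# and the algorithm, not of the code style.
from itertools import combinations

def subset_sum_bruteforce(nums, target):
    """Return True if some subset of nums sums to target."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return True
    return False

print(subset_sum_bruteforce([3, 9, 8, 4], 12))  # True (3 + 9)
print(2 ** 20)  # subsets to check for a mere 20 items: 1048576
```

No amount of refactoring makes this fast; only a different algorithm changes the bound. That is a physical limit of the computational material.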

There are, by contrast, few physical limits to the conceptual complexity of a software component. Measures like cyclomatic complexity – the number of subtasks, variables and choices in a method – can reach high values while still falling orders of magnitude short of the physical limits imposed by compilers and interpreters. (I once worked on a system where other team members had, in their wisdom, exceeded the limit on the size of a single Java method with a long list of simple business transformation rules. Pushed by the very essence of the language to refactor, they proceeded to – what else? – push the remaining rules into longMethod2().)

The limits which measures like cyclomatic complexity indicate are human limits. They mark the soft edges of a space within which humans can effectively create, manage, or even understand software. Coding conventions are described in many different ways, but they all seek to mark a limit beyond which code becomes illegible.
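To make the measure concrete (a sketch of mine, not from the post): cyclomatic complexity is, roughly, the number of decision points in a function plus one – the number of independent paths a reader must hold in mind.

```python
# Hypothetical example: three decision points give a cyclomatic
# complexity of 4 -- four independent paths through the function.

def shipping_cost(weight_kg, express, international):
    cost = 5.0
    if weight_kg > 10:    # decision 1
        cost += 2.0
    if express:           # decision 2
        cost *= 2
    if international:     # decision 3
        cost += 15.0
    return cost

print(shipping_cost(12, True, True))  # (5 + 2) * 2 + 15 = 29.0
```

An interpreter will happily run a function with a complexity of 400; it is the human reader who cannot follow it.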

Legibility is the term James C. Scott uses to describe the social engineering needs of a nascent or established state (Seeing Like A State). The mechanics of a working state require internal legibility: those working for it must be able to measure and understand their environment in mutually compatible terms which also promote the success of the government. This is why feudal states have such a profusion of titles which become the name of the person (not Bob – The Duke of Marlborough). It is also why courtly dress has such systematic rules. This is particularly visible in bureaucratic feudal states, historically in East Asia – feudal Korea, for example – but also in the Vatican, or the badges at the postmodern World Economic Forum. These codes serve the dual purpose of defining the interfaces of the state and of making the role of the person instantly legible to anyone familiar with the system, all while tempting people with the markings of social status.

Marking lexemes by colour and shape according to their role is exactly what IDE pretty printing achieves. This is also the intent behind decoupling, encapsulation, and well-named entities (name oriented software). It makes the role of a component – at the lexical, method, class and class pattern levels – readily legible to the humans who must maintain and extend the system.
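A tiny sketch of the same idea in code (the names are invented for illustration):

```python
# Illegible: the role must be reverse-engineered from the body.
def f(x, y):
    return x * (1 - y)

# Legible: the names declare the role before the body is read,
# much as a title or uniform declares a courtier's place at court.
def net_price(gross_price, discount_rate):
    """Price after applying a fractional discount."""
    return gross_price * (1 - discount_rate)

print(net_price(200.0, 0.5))  # 100.0
```

The two functions compute the same thing; only one announces its role in the system.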

This strictness of role works well for machines made of non-sentient digital components. For systems where components are sentient meat, there are inevitable side effects. This is, perhaps, the core ethical dilemma Confucius concerns himself with: the demands of The State and The Way (道).

FUNCTIONS SHOULD DO ONE THING. THEY SHOULD DO IT WELL. THEY SHOULD DO IT ONLY. — Robert Martin, Clean Code

Amy Harmon at the NYT has a good overview of a generation of robots specifically designed to trigger emotional cues in their easily manipulated meatbag slaves, er, that is, people. The key example is Paro, a therapeutic robot baby seal.

Jamais Cascio has suggested this empathic reaction should have moral weight. Like Mencius’ heart of compassion, he plausibly argues it is a marker of the complexity of the synthetic creature and our responsibility to it. Don’t Kick The Robot, Cascio advises. Developing the idea, he proposed it as one of five laws of robot ethics.

While SF stories focus on robots' existence as analogues of humans, for the foreseeable future the link with animals will be far more relevant. Most people, and even philosophers like Peter Singer, suggest an animal has a different moral character to a person due to its lack of awareness of, or plans for, the future. It's also worth remembering what most people's ethical codes allow towards animals: empathy and affection, of course, but also humane husbandry for profit, and slaughter for the dinner table. We treat many animals much like we treat these robots: as tools.