On Meaning and the Culture of AI

The tendency to weave stories where evidence is missing is the human brain’s
sustaining feature, precipitating heroic actions, senseless love, and
mindless hate.

Over the last decade, Heller explains, the corpus of Enron emails released to
the public has become the data for a number of analyses of business
etiquette and communication styles. Heller concludes that while these
scientists and researchers may use these textual data to uncover something
fixed and predictable (e.g., who uses “Dear” to open a message), the
humanities—and, I would add here, most humanists—are more interested in
meaning and how the body of Enron emails takes on a particular meaning within
their broader context. For instance, an email letting employees know about the
company’s policies on proper corporate conduct sent ten weeks prior to the
CEO’s resignation takes on a sense of irony for us. Heller suggests no model a
data scientist creates would be able to “read” this sense of irony in such an
email.

I’m not interested here in whether machine learning algorithms could be
developed to ascertain these broader meanings that apparently only the human
mind can weave stories around. Rather, I imagine such a model could
be developed through the use of neural networks that would make meaning out of
the data. The model could weave a story meaningful to the “black box” of the
neural network, which is not able to be completely deciphered by the designers
of the model.1 The inner
meaning—the intermediate stages of information being processed by the system
and the mechanisms that manipulate and gauge that information for
usefulness—is like the brain to B. F. Skinner and other behaviorist
psychologists: largely responsible for the response to a stimulus, though
exactly how it processes that stimulus cannot be fully known.
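The opacity of these intermediate stages can be made concrete with a minimal sketch. In the toy feedforward network below (arbitrary layer sizes and random weights, chosen purely for illustration), the hidden-layer activations are fully inspectable as numbers, yet those numbers carry no obvious human-readable meaning, even though they entirely determine the output:

```python
import numpy as np

# A minimal feedforward network. The hidden activations are the
# "intermediate stages" of processing: we can print them, but the
# values themselves are not directly interpretable by the designer.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # input -> hidden weights (arbitrary sizes)
W2 = rng.normal(size=(8, 1))   # hidden -> output weights

def forward(x):
    hidden = np.tanh(x @ W1)   # the opaque intermediate representation
    output = hidden @ W2
    return hidden, output

x = rng.normal(size=(1, 4))    # a stand-in "stimulus"
hidden, output = forward(x)
print(hidden.round(2))         # inspectable numbers, opaque meaning
print(output.round(2))         # the "response" those numbers produce
```

Everything here is visible, yet nothing in the hidden vector announces what it represents; that gap between inspectability and interpretability is the behaviorist analogy in miniature.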

As a psychological anthropologist who is interested in how people make sense of
themselves and the world around them, I am especially interested in narrative.
Narrative is one way in which individuals take the cultural material around
them (e.g., beliefs, customs, myths, tropes, stereotypes, etc.) and use it to
construct stories about themselves and their societies. As such, narratives and
narrative styles vary as cultures vary. By studying individual narratives, we
can glean something about the cultural milieu in which one lives. Conversely,
given a particular cultural context, we can often predict what kinds of tropes
and narrative devices individuals will use to talk about their own life
histories.2

An underlying presumption in the study of narrative is that narrative
style—like culture for anthropologists—is relative, not normative. That is,
it is not the job of the researcher to determine how well or how poorly a given
person’s narrative of himself or herself makes sense or fits the way it’s
supposed to be structured based on the researcher’s notion of a well-structured
narrative. Rather, it is the job of the researcher to understand how the
narrator is structuring his or her life history or narrative. With several
samples of narrators from similar backgrounds, we can perhaps begin to see
patterns emerge: common plot devices or turns of phrase or story arcs that are
used by the narrators. These patterns, then, can be used to understand how a
particular culture or subculture makes sense of its world, such as what it
means to be a person or to have agency, a sense of being able to act and
affect the world through those actions.

If an AI system is able to construct, out of the data it is fed, meaning that
is significant to it but that may hold little meaning or explanatory power for
the system’s designer, how is that similar to people’s personal narratives? If
the system is drawing on heuristics only it fully “understands,” can we say
that this AI system has created its own culture? And what would an ethnography
of such an AI culture look like? I do not have answers to any of these
questions. Instead, I’m simply raising them here in the hope that I or
others will address them further in the future.