Computers in Love or Why Siri won’t return your calls

ENIAC, the early computer that was the model for the computer that falls in love with his operator’s fiancee in Kurt Vonnegut’s short story, “EPICAC.” (US Army photo via Wikipedia.)

I’m sure you remember Kurt Vonnegut’s short story, “EPICAC.” It’s in Welcome to the Monkey House? About a computer that, taught to write love poetry meant to melt the heart of the mathematician narrator’s fiancee, falls in love with the woman itself? When EPICAC learns that she cannot return its---his now?---love, it/he commits suicide by short-circuiting it/himself.

When dealing with Vonnegut, it’s a good idea to keep in mind that he didn’t think of what he wrote as science fiction. He knew it was, or much of it was, but he always insisted that was incidental; he was writing about this world, now. EPICAC, interplanetary travel, Ice-Nine, the Tralfamadorians, Time Quakes---these machines, technologies, scientific discoveries, places and beings, and concepts exist in some form in the here and now, or at least the human behavior they affect or illuminate or that brought them into existence is very much real in the here and now, and it’s that behavior that is Vonnegut’s main subject. Which makes his work what it is: satire. “EPICAC” is science fiction in that it’s centered on technology that did not yet exist when Vonnegut wrote the story. Computers in the late 1940s were barely more than very large adding machines. Of course people were already looking forward to the day when the machines could “think.” But then that story is as old as the hills.

It’s the story behind Pygmalion, Frankenstein, and Pinocchio. It crops up later in Do Androids Dream of Electric Sheep? and Blade Runner. Early versions of it were probably told in the caves at Lascaux. Well before that even. As soon as people realized they were alive they started asking themselves what that meant, to be alive and to be aware of being alive. Plants are alive. Animals are alive. But are they aware they are alive? What makes a person different from a tree or a mastodon? What makes a person a person? What makes us us?

So, because one of the things that makes us us is that we are moved to build things and create things, with some of those things being purely figments of our imagination, and another thing that makes us us is that we are compelled to tell stories about us, about being us, we tell stories about building and creating other sorts of us-es.

It’s an old, old story. And there are two ways to tell it. One is to begin by asking what if something we think of as a thing has a soul? That’s “EPICAC”. The other is to scare ourselves by asking what if somebody we think of as one of us doesn’t have a soul? That’s Dracula. And the wolfman and zombies. The real question behind the second question, too frightening to ask straight out, is “What if we don’t have souls?”

It’s a question implicit in the first question too. What if we’re just like EPICAC? What if we only think we have a soul? But Vonnegut has another question up his sleeve. He always does. What if we’re only us, what if we’re only alive, because other people think we are?

EPICAC’s heart breaks when he realizes he is an it in the eyes of his beloved.

At any rate, this is a theme Terry Pratchett has been batting around for a while. Are the goblins persons, and if they are, what makes them so? That’s the question at the heart of his latest novel, Snuff, and it’s another way of asking what makes us us, or are we persons, that is, do we have souls? But Pratchett’s asked it before, whenever he’s brought in the golems.

In Discworld, the golems are what they are in Jewish folklore, to start---human-ish figures of baked clay animated by scrolls inside their otherwise hollow heads. They are machines without clockworks, puppets that don’t need strings. But starting with Dorfl in Feet of Clay the golems begin to think of themselves as, well, having selves. And then in Going Postal and continuing into Making Money, Moist von Lipwig has to contend with a golem named Gladys who has decided that not only does she have a self, she has a gender. She is a she, and this she is a young lady and demands to be treated as one (and she has very prim and proper ideas about how young ladies should be treated), and she develops a crush on Moist.

To Moist’s amazement and consternation and against what he thinks of as his better judgment he finds himself dealing with Gladys as if she is the young lady she thinks she is. Reflexively, he treats her as a person in her own right with a soul of her own.

I just finished reading a novel by the physicist Alan Lightman. Mr g. It’s about the creation of the universe and it’s narrated by God. Not the God of the Bible. But God as he might be if he created not just our universe but the multiverse, according to the laws of mathematics and physics we know underlie and give order to the multiverse.

He’s a rather detached figure, as you might have guessed.

Detached but not without feeling. Something bothers him and he doesn’t know what to do with it. He expected that some of the matter he created would become animate and he expected that some of that animate matter would develop the ability to think. What he hadn’t considered is what would happen when that thinking animate matter began to think about itself, when it became aware of itself and began to ask itself what that means and if it has meaning.

He’s startled by how awful a question this is for his creations. From every quarter of the one universe he happens to be giving his attention to at the moment billions and billions of creatures are asking him (without knowing they’re asking him, exactly), “Why are we here?”

His only answer seems too cruel to give them: “You’re here because the rules I put in place when I started this experiment naturally lead to you being here, that’s all.”

You’re a by-product.

You’re an accident.

You’re collateral damage.

He can’t tell them that. And he can’t give them what they want, immortality, because that would mean changing the rules, which he can only do by going back and starting over. The best he can do, and what he expects them to figure out how to do for themselves, is to let them find meaning in being alive at the moment.

So, I just finished that book and I’m starting my review of it as soon as I’m done with this post. But the reason I’m thinking about EPICAC is that I’ve also been reading Unstuck in Time: A Journey Through Kurt Vonnegut's Life and Novels, a new biography by Gregory D. Sumner. I just finished the chapter dealing with Vonnegut’s first novel, Player Piano, and the job he had at General Electric that inspired it, and I was reminded that EPICAC, or another model of EPICAC, reappears in Player Piano, only this time very much without a soul, and its soullessness leads to its having a nervous breakdown. What struck me as funny, though, is that a successor to EPICAC is in the works, and it’s going to be so much bigger that it will have to be housed in the subterranean vastness of Howe Caverns. The original EPICAC, like its real-life inspiration, ENIAC, took up a thirty-by-fifty-foot room. Back then it was a given that as computers got “smarter” they would get bigger. And for a while this is what happened. All those vacuum tubes and miles of magnetic tape took up space.

Then along came the silicon chip and here you are holding in your hand a computer with exponentially more memory and processing power than ENIAC, reading this post on your iPhone and wondering if Siri hates you or if she’s developing a crush on you.

If you spurn her, will she commit suicide and if she does will she take your iPhone with her?

Ok, maybe you’re not wondering anything like that.

But Siri does seem to think she’s alive and I know iPhone owners who talk about her as if they agree with her.

We can be pretty sure she doesn’t have a soul (yet). What Siri forces us to ask---or forces you to, if you’re in a mood like the one I’m in at the moment---is if we have one. Maybe we only “think” we’re alive. How do we know?

This is the theme of “EPICAC.” It’s the theme running throughout Vonnegut’s life’s work. What makes us us? What makes us alive? And Vonnegut’s tentative answer is, Other people thinking of us as alive.

EPICAC’s heart breaks when he, now an it again, realizes that the woman he loves doesn’t think of him as alive.

A moral of the story is that our lives have only as much meaning as other people are willing to grant them.

Our human-ness depends on us thinking of each other as human.

In Player Piano we learn that EPICAC was built to put people out of work. The goal of the General Forge and Foundry Company is “automation without labor.” Essentially, the plan is to make life a wealth-creating machine that doesn’t have to spend any of that wealth on workers at any level. The result is to be a society in which most people are less than disposable. They are utterly inessential.

This is what Vonnegut saw getting underway at General Electric while he was there.

ENIAC [Electronic Numerical Integrator and Computer] is often described as the “world’s first computer,” but a more accurate claim would be “world’s first automatic, general-purpose, electronic, decimal, digital computer.” Omitting any of the italicized adjectives grants priority to some other computer. ENIAC was not a stored-program computer and not easily programmable in the modern sense, which may be why so many sources claim that the “C” in ENIAC stands for Calculator…

John, in his comment above, has quite a complete history of the “Turing test” in the form of the Loebner contest, which is about communicating with a computer that seems as real as “HAL 9000, the character in Arthur C. Clarke's science fiction ‘Space Odyssey’ saga.” A computer as a friendly companion and not an adversary or enemy.

When I began seeing “Siri TV ads,” I was reminded of my early use of ELIZA/DOCTOR on one of my first PCs. Of course, when the “patient” exceeded the very small knowledge base, ELIZA/DOCTOR might provide a generic response, for example, responding to “My head hurts” with “Why do you say your head hurts?” The response to “My mother hates me” would be “Who else in your family hates you?” ELIZA/DOCTOR was implemented using simple pattern matching techniques, but was taken seriously by several of its users, even after Weizenbaum explained to them how it worked. It was one of the first chatterbots in existence.

Weizenbaum said that ELIZA, running the DOCTOR script, provided a "parody" of "the responses of a nondirectional psychotherapist in an initial psychiatric interview." He chose the context of psychotherapy to "sidestep the problem of giving the program a data base of real-world knowledge", the therapeutic situation being one of the few real human situations in which a human being can reply to a statement with a question that indicates very little specific knowledge of the topic under discussion. For example, it is a context in which the question "Who is your favorite composer?" can be answered acceptably with responses such as "What about your own favorite composer?" or "Why does that question interest you?"
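The kind of pattern matching described above can be sketched in a few lines. This is only an illustrative toy, not Weizenbaum's actual DOCTOR script: the specific rules, the tiny pronoun-reflection table, and the fallback response are all made up here to show the technique.

```python
import re

# A few DOCTOR-style rules: a regex with capture groups, and a template
# that reflects the captured phrase back as a question. These particular
# patterns are invented for illustration.
RULES = [
    (re.compile(r"my (.+) hurts", re.I), "Why do you say your {0} hurts?"),
    (re.compile(r"my (mother|father|sister|brother) (.+)", re.I),
     "Who else in your family {1}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
]

# Simple pronoun reflection so "me" in the input becomes "you" in the reply.
REFLECT = {"me": "you", "my": "your", "i": "you", "am": "are"}

def reflect(phrase):
    return " ".join(REFLECT.get(word.lower(), word) for word in phrase.split())

def respond(statement):
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    # Generic fallback when the input exceeds the tiny knowledge base.
    return "Please tell me more."

print(respond("My head hurts"))       # -> Why do you say your head hurts?
print(respond("My mother hates me"))  # -> Who else in your family hates you?
```

The program has no model of heads, mothers, or hate; it just turns fragments of your own sentence back into a question, which is exactly why the psychotherapy framing was such a convenient dodge.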

Siri is an intelligent personal assistant and knowledge navigator which works as an application for Apple's iOS. The application uses a natural language user interface to answer questions, make recommendations, and perform actions by delegating requests to a set of web services. Apple claims that the software adapts to the user's individual preferences over time and personalizes results, performing tasks such as finding recommendations for nearby restaurants or getting directions.

Siri includes the combined work of DARPA research teams from Carnegie Mellon University, the University of Massachusetts, the University of Rochester, the Institute for Human and Machine Cognition, Oregon State University, the University of Southern California, and Stanford University. This technology has come a long way with dialog and natural language understanding, machine learning, evidential and probabilistic reasoning, ontology and knowledge representation, planning, reasoning, and service delegation. As taxpayers we are, or should be, Apple's partners in their huge pot of cash, which is larger than the US Treasury's holdings.

Siri was met with a very positive reaction for its ease of use and practicality, as well as its apparent "personality". However, Siri was criticized by organizations such as the American Civil Liberties Union and NARAL Pro-Choice America after users found that it would not provide information about the location of birth control or abortion providers, sometimes directing users to anti-abortion crisis pregnancy centers instead. Apple responded that this was a glitch which would be fixed in the final version. It was suggested that abortion providers could not be found in a Siri search because they did not use "abortion" in their descriptions. At the time the controversy arose, Siri would suggest locations to "buy illegal drugs", "hire a prostitute", "dump a corpse", "find a viagra source", but not find birth control or abortion services. Apple responded that this behavior is not intentional and will improve as the product moves from beta to final product.