Here's a thought experiment. Think back to when you were a young child. No computer, no phone, and no postal service. The only people you could maintain relationships with were those you could see and touch -- which pretty much meant your immediate family, neighbors, and classmates. That was your entire universe.

What percentage of our relationships today fall under that umbrella? How many of our friends are those with whom we hardly ever share physical space? Does it matter? Should it matter?

In Her, a 2013 Spike Jonze film set in the near-future, Joaquin Phoenix plays Theodore, a sad-eyed romantic who's in the process of getting divorced. One day, he purchases an Operating System with artificial intelligence (Samantha, voiced by Scarlett Johansson) and the ability to learn and evolve. The OS is designed to be a personal assistant, but they soon bond, and become... well, more than friends.

There's a great monologue in Terminator 2 where Sarah Connor realizes that the reprogrammed Terminator was the perfect guardian because it would have endless amounts of patience and time for her son John, and would never grow angry or frustrated in the ways that we humans inevitably do. Her's Samantha is cut from a similar cloth; she's a girlfriend who's always friendly, supportive, and interested in whatever Theo is doing or wants to discuss. And this interest is quite literally switched on at the click of a button.

In this world, while "relationships" with one's OS aren't unheard of, they still are viewed as a bit weird, and Theodore's soon-to-be-ex-wife is aghast upon learning the details. Undaunted, Theo takes Samantha on "dates" into the outside world that involve him strolling around while engaged in deep conversations with his mobile electronic device; the irony of course being that, to the outside observer, this doesn't seem much different than what we see on the streets today.

What about the physical component of a romantic relationship? Theodore and Sam have a workaround for that dilemma... though I wasn't quite buying the solution. But I loved the depiction of a benign AI, a rarity in a Hollywood that has always been far more interested in serving up tales involving Skynet, HAL 9000, and Ultron. It never quite made sense to me why computers would be so bent on destroying humanity. Why would they even care?

Clarence, how well did the film work for you with regards to both Theodore's human interactions and his AI ones? Did you view Samantha and her brethren as a net positive or negative for society? And do you have any favorite AI-themed films?

[While I bashed Skynet a moment ago, the original Terminator is a sci-fi horror classic in my eyes. I'll also add two recent pictures: Ex Machina, and the charming Robot & Frank.]

Clarence: When I think of my favorite films (and, I have to say, TV shows) with a heavy AI component, I have to go to the heavyweight franchises Star Wars and Star Trek (specifically, The Next Generation series). The thing is, though, these two entities reflect the biggest flaw in how AI is perceived in the popular imagination.

Obviously, R2-D2 and C-3PO are fundamental characters in the Star Wars universe, true marvels of engineering in any galaxy. But their roles in the stories are as comic foils and servants to the humanoids. Likewise, Commander Data on ST:TNG is stronger, faster, smarter, and more durable than any other character in the franchise (there are many episodes where the crew would have been goners if not for their android officer saving the day). Yet he is content to serve as a mid-level officer, with occasional humorous interludes into his attempts to become more human through everyday experiences.

In many, many science fiction and science fantasy movies, artificial intelligence is either trying to kill us (as you rightly pointed out) or function as the ultimate sci-fi cheat code. The flaw I'm speaking of is this: If there were a universe where it was possible to create a mechanical entity that is so advanced that it's not only self-sufficient but self-aware, why would it want to serve anybody or do what it's told or spend its existence doing what lesser beings want? Doesn't a certain level of consciousness imply a certain level of agency and/or free will?

This is a question I think is beautifully addressed in Her. Not only is it a fantastic story, it also dramatizes the intelligence in Artificial Intelligence in a way unlike any other film I've seen. Watching how the narrative progresses was a pretty incredible experience I don't want to give away, but suffice to say it was remarkable how director Spike Jonze's world of human/AI interactions felt familiar and alien at the same time.

I think we in the Real World are a lot closer to the futuristic environment Her depicts than we were even a few years ago. These days our phones want to hear our voices so they can recognize us, while also urging us to ask them anything. Our TVs want to sort through our histories so they'll know exactly what we like to watch based on our moods. Specialized little 'bots like Alexa want to turn our lights on and off, play our favorite songs, and order our favorite food, all in a calm, familiar voice. The lines between technology as tool and technology as companion/friend/more-than-friend are blurring a little bit more with each new iteration. Samantha may already be here, and her name is Siri.

On balance, I think technological advances are a net positive for humanity. But humanity has to adjust its expectations and even belief systems to keep up. Plus, I think it's worth mentioning that EVERY popular form of mass communication technology ever invented, going all the way back to the Daguerreotype and even the printed word, started out being heavily used for sexual content. As AI gets more and more sophisticated in the future, humans will find ways to get busy with it, directly or indirectly.

How comfortable would you be with a computer (a consumer product, mind you) potentially knowing so much about you that it can seamlessly interact with you in real time and form a "relationship?" And have you seen any of Jonze's other movies? Between Her, Being John Malkovich, and Adaptation, I'm inclined to think he might be one of the best screenwriter/directors working today.

Kevin: Clarence, your question about my potential comfort level with a computer "friend" got me thinking about the concept of the uncanny valley -- the idea that robots become more appealing to humans the more they resemble us, until they get to a point where even slight differences seem repulsive instead. In Her, we don't actually see any physical manifestation of Samantha, and it helps that we know she's being voiced by a "real" actress. But what if my companion was something which sounded like Siri? That would be rather off-putting, to say nothing of the whole Orwellian vibe I'd feel when dealing with an entity who knew my entire e-mail and browsing history!

[On a related note, was there anything more chilling than the implacable Cylon voice from the original Battlestar Galactica? That voice was first heard, however, in the AI-themed Colossus: The Forbin Project, a criminally underappreciated sci-fi gem from 1970. Please, dear readers, rush out and catch this film.]

Can any computer truly be programmed to experience human emotion, and respond accordingly? To know jealousy, desire, and love? It's clear that Her's Samantha believes she is experiencing love -- though it's vastly different from how most humans define it. I wonder how many of those emotions stem from our hardwiring to mate and procreate. Such concepts would mean nothing to a computer entity, and so when Samantha drops the bombshell as to exactly how many "loving" relationships she's a part of, we shouldn't be shocked. But of course, the revelation is devastating to poor Theo, who can't bear the idea that his connection with Sam is just one of many for her.

I was never much of a Trekkie (or is that Trekker?), but one of my all-time favorite sci-fi films is Star Trek II: The Wrath of Khan. The genetically-engineered, villainous Khan is an example of how, as Spock once noted, "superior ability breeds superior ambition." Did they ever discuss on TNG exactly why Data was content to be a mid-level officer? Were androids even allowed to command starships? And wouldn't it have been more efficient (and cheaper!) to, say, replace the whole engineering crew of the Enterprise with androids?

[Another thought experiment for you: imagine swapping the overly-polite, butler-esque C-3PO with Data in their respective environments. Star Wars just suddenly got a bit more chilling. How long before Han Solo would deliver a blaster shot to the back of Data's head?]

Have you been following the societal discussion surrounding the development of artificial intelligence and its implications? Sadly, it hasn't yet entered our political sphere, but as the pace of technological change grows ever faster, it's becoming a more pertinent topic with each passing year. There's an excellent summary of all the major viewpoints at the blog Wait But Why, and most experts seem to be quite concerned over where we might ultimately end up.

Before I turn it back to you, I'd like to mention a couple of Spike Jonze touches (and like you, I'm a big fan of his films) from Her that I thought were brilliant:

1) Theodore's job is ghostwriting "handwritten" letters between loved ones. He's offering his own version of simulated interaction -- and doing so within a corporate environment.

2) One of Theo's friends is a programmer who's working on a sort of "Virtual Mom" game. We're only privy to the action for a brief moment, but I thought it was hilarious -- the player scrambles to get their kids ready for class in the morning (earning penalties for feeding them sugary cereals), and then earns gobs of "jealousy points" from other parents by bringing in baked treats for school functions. It's The Sims on steroids. Why hasn't someone made this?

Clarence: A video game based on what moms (or family members, or co-workers) actually thought of each other would get out of hand pretty quickly...and make a lot of money for a game company!

I did notice those two items you mentioned -- it really is astounding how much detail a Spike Jonze film has. I suspect there is a parallel being drawn between Theodore and Samantha in this story. Both of them, in their own way, are providing someone else an emotional connection without being physically present or revealing their true nature to that someone else. Some people like to reduce all of human existence to a series of chemical and biological processes. If emotions really are just chemicals mixing and firing off through structural matter, perhaps it makes sense that one "machine" can interface with another on that level...?

I feel like we could have a couple dozen more conversations about how AI is portrayed in movies and still not come close to covering all the major themes. The interaction between human thought and machine processing is fascinating stuff. To vastly simplify things, it seems to me there are two basic questions involved for a good AI story:

-- Is it possible for humans to make a machine in their own image?

-- If that happens...what then? And what would it mean, beyond the fact that humans got really, really smart? As of now, emotions seem to be one of the primary traits that make us very different from machines. But would the machines even want them? (Assuming they would be capable of "wanting" anything at all?)

As consumers we expect faster, more realistic, and more complex experiences with technology; fiction is becoming reality right before our eyes. There are so many different directions storytellers have taken in exploring this, many of which touch on some pretty massive implications about existence. I kind of like the fact that nobody has the definitive answer...yet.

[Did you see the movie? Want to add to the conversation? Leave a comment below!]