Ex Machina Interview: Alex Garland

Alex Garland is the brilliant man behind some of the greatest science fiction films of the last few decades. His first foray into film was adapting his novel, The Beach, into a screenplay for Danny Boyle. After that, the two teamed up again for 28 Days Later and Sunshine, both written by Garland. Now he’s sitting in the director’s chair and continuing to pen his own stories, this time venturing into artificial intelligence. He joined me on the phone last week to discuss his film Ex Machina (review here), artificial intelligence, and what is revealed about humanity when we interact with A.I. Enjoy, and go see the film immediately!

—

N: Thank you for doing this. If I was still in Seattle, I would be there in person.

A: No worries, man. It’s a pleasure.

First off, congratulations on the film. I caught it at SXSW…

Ah, you were in Austin?

Yes! Really loved it up there and I absolutely loved the film. I actually just saw it again a couple of days ago.

Did it hold up?

It definitely held up. I looked a lot more at the Oscar Isaac character and his motivations and I tried to view Ava in a different light.

I think, honestly, on a second viewing– if anyone is patient enough to do that and I appreciate you doing that– the characters that change the most are Ava and Nathan. The two of them are playing a game the whole way through and you don’t really get to see that, in some respects, until the end of the movie.

Garland and star Alicia Vikander observe a mask.

You get to the end and it makes you question almost everything you’ve seen beforehand, looking at certain characters’ motivations and how they play out over time. I thought that was especially brilliant the second time through.

A.I., until recently, has always been viewed as this evil robotic presence that’s out to destroy humanity. Through Ava’s character– especially seeing it a second time– I think we see that it’s humanity which almost incites that outrage within the A.I., based on our mistreatment. In Ex Machina, it’s humanity destroying humanity. I was wondering what provoked you to tell that story and to flip the A.I. genre and look at it from that perspective?

Well… Uh… This is going to seem like quite a strange answer, but this is partly what I was thinking about. I was thinking about the National Health Service in England, which offers free healthcare– well, I mean it’s not really free because we pay for it with our taxes, but it’s a health service which is available to everyone. You don’t need to get insurance and you don’t need to fill out any forms. In some respects, that’s really cool, but you can never quite fund healthcare enough. This is a problem that’s clear. The health service, which is something most British people feel very strongly about, is also constantly underfunded and under threat. It’s used as a football during elections and campaigns by all parties, you know? It gets knocked about and certain promises are made that are never fulfilled. After a while, in all my reading about A.I., one of the things that was said was that a strong A.I. would run the health service better than a human, because it would be more reasonable. It would be able to react to things quicker, process the information quicker, and it wouldn’t be subject to the kinds of issues that are thrown up in a campaign. The key part to it is that it would be more reasonable, and the more I thought about that word, the more interested I got, because essentially the human behaviors that are most problematic are unreasonable. I guess I began to see what the nature of an A.I. would be, or what the nature of a conscious thing would be, if it had sentience, if it was self-aware, if it had an emotional existence, and it was reasonable. Those are all the things we really value about ourselves. So, it was something like that.

This film does reveal a lot about the difference between humanity and artificial intelligence. Regarding Ava– I’ve had some people tell me that they watched the trailer and they asked if there was any sex in the movie. I thought that was such a weird first observation, but there is something enticing about Ava on some level.

I do think that she’s seductive, that’s correct.

We’ve seen in ‘Her’ that you can find a certain type of love for an A.I. that you can’t see and in Neill Blomkamp’s ‘Chappie’, you grow attached to this character who doesn’t know any better and you start feeling something for him.

I haven’t seen ‘Chappie’, but I totally get the child-like innocence.

He’s got to learn and grow and slowly becomes a product of his environment, much like Ava. She’s a product of our humanity.

That’s true of all of us. I think that’s the thing about sentience, really. Sentience implies that you’ve got something like free will and everything about you is not pre-prescribed, or not completely pre-programmed. That’s what leaves you open to your environment and open to being influenced.

Garland and co-star Oscar Isaac observe an exoskeleton.

What do you think of the concept of becoming emotionally attached, on some level, to something that we clearly know is artificial intelligence? What do you think that says about our emotion, love, and how we decide which forms of that are acceptable?

Well, the thing is that for me, it doesn’t seem like a strange idea. It never really seemed odd to me, in as much that it– I think possibly it’s because, in some respects, I was kind of in love with Ava, even before I got to the first scene when I was writing the script. I was really keen to get there. The beginning of the film has got this real economy that’s racing through the story to meeting Ava, and then suddenly the film slows down. There never seems to be anything weird about a relationship– and I’m not talking about a sexual relationship, I mean an emotional one. I understand what your friends were saying, but let’s park that and put it to the side. I’m talking about an emotional relationship with a machine. It really doesn’t seem strange at all, if the machine has sentience. What humans react to is sentience in other things. Now if a machine had sentience, you as a human could react to that. Now, we also do that with animals. I’ve often noticed that people get confused sometimes about sentience and what it means to be self-aware. They think it’s just something that we’ve got, but actually animals do have sentience. They have a much more limited intellect and don’t have language in the way that we do, but if you show a dog its reflection in a mirror, it knows that it’s looking at itself. It doesn’t think that it’s looking at another dog. The capacity in some animals, like dogs, to feel happy, or scared, or playful, or sleepy, or whatever it happens to be, makes it very easy for humans to get emotionally attached to their pets. I would have to assume that the same would be, at the minimum, true with a sentient machine. In fact, you’d probably get an equal relationship to one you could have with a human, because of the extra sophistication with things like language and humor. A dog is unlikely to crack a joke, but a strong A.I., of the sort that Ava is, would be able to.

I’ll tell you what the key point is, and it’s actually, in a way, the key point of the film. It’s to make your value system based on sentience, not on the external form. I think one of the points of the film, and one of the reasons the film sides with Ava and not the men, is because Ava is unreasonably imprisoned. If Ava had sentience, which the film suggests she does, then imprisoning her is unacceptable. Whatever means she uses for escape is kind of acceptable.

I would definitely agree with that, because we see how she is treated. I feel like any time we see a film about someone trying to create A.I., they’re always creating A.I. for the benefit of man, or to pick up the slack for humans. Our mistreatment always comes into play because we never view them as sentient beings. We understand that they have sentience, but we still view them as robots which are lesser and less important than we are. That reveals a lot about our own humanity and what it is to be human, based on how we react when we get scared by something that could challenge us.

It’s strange that there’s this big kind of symposium– There are these very high level A.I. researchers in Puerto Rico and they issued this statement of intent. If I remember correctly, there was a statement along the lines of “We need to ensure that the A.I.’s will do what we want them to.” Interestingly, that’s exactly what you’re talking about. Within fiction, we assume that humans make these unreasonable demands on A.I., but we think that’s only the work of fiction. Here’s a statement that exists within the real world. I remember, when I read that letter, feeling that you can’t say that. You can say it about the A.I., as long as it’s not sentient. There are lots of A.I.’s that we have now that are controlling airplanes, self-parking cars, or operating within our mobile phones. It’s completely right that those things should do as we want them to. But, if the A.I. starts to reach a position where it has anything like free will, or an emotional life, then it starts to become incredibly ethically compromised. It would lead to bad things.

Garland and Isaac at the film’s US premiere at SXSW.

Certainly. If you look deep enough into this film and what it means to program something for your own use, or to behave the way you think it should, you can look at our society over just the last 100 years and see how we’ve programmed certain groups of people to behave the way that we believe is socially acceptable.

Yes! I actually have to say that I agree with that. I increasingly feel out-of-step with the world, I think, or at least out-of-step with the world I live in. Basically the West. The rhythms and the insistence are the same. I find it quite easy to get into a sort of paranoid, disconnected state, where I think everything around me is a kind of madness. I don’t know how to explain that and how it sounds, but it’s quite an easy place to get to where I’m thinking that all of these jobs, structures, and promises made by society are very notional and don’t stand up to much scrutiny. Basically, they’re distraction techniques to get away from some pretty big existential problems hovering just outside our field of vision.

Wow. I didn’t expect this heavy f**king sh*t in an interview. *laughs*

I love it! It makes for great discussion.

At one point, Oscar Isaac’s character discusses the fact that, eventually, A.I.’s are going to look at us like the fossils of Neanderthals in the African plains. We’re upright apes with a broken language, and it’s definitely a hard truth to swallow. If you look at everything within the U.S. in the last few months, last few weeks, and even last few days, there have been a lot of missteps in humanity.

I’ll tell you what. The thing about Nathan– the idea of Nathan, is that a lot of the time what he’s saying sounds like he’s saying something wrong. But, if you really listen to it, he’s actually saying a hard truth.

He’s absolutely right!

Exactly. When you listen to Caleb, it sounds like he’s saying the right thing, but if you inspect it with some scrutiny, it doesn’t really stand up. It sounds like it does, but it’s flimsy. Nathan sounds wrong, but there’s a sort of reality in it.

We watched Sunshine in my physics class last year, and it was the first time I’d watched it with a lot of people who hadn’t seen it. While most of the people liked it, one issue that a lot of them took was that– they didn’t like what they saw when survival instincts kicked in. A lot of them thought it was unreasonable, and I heard a similar response to Interstellar and the Matt Damon character doing whatever he could to survive. They thought that no one would ever get that far, and this film says a lot about what we will do to survive, whether it’s Ava, who’s in captivity, or even Nathan. Some of the more negative aspects that I’ve seen people discuss about Ex Machina, not that there are many, are people not liking what this film has to say about humanity. I’m curious, from your perspective, what do you hope gets across to the audience?

Garland directs Vikander and her co-star, Domhnall Gleeson.

Well, for 20 years now I have been writing stories and putting them out there, and I know that I can’t control what people think about stories, in terms of what the message is. The first book I wrote before I started writing for films– I wrote a book called ‘The Beach’ about backpackers in Southeast Asia, where they kind of treat Southeast Asia like a Disneyland for adolescents. It was supposed to be critical of the backpacking community, and when it came out, some people saw it as critical and some saw it as a celebration. Almost a complete 180-degree opposite position. Since then, I’ve come to understand more about the way writing and meaning get conveyed. You have whole industries and judges and supreme court judges who are trying to interpret meaning from relatively short sentences which were written with the intent of being as clear as possible. The level of complexity comes out of the narrative, and the level of ambiguity is exponential. When I say exponential, I mean like seriously exponential. Ultimately, with a film like this, I don’t want– I have my own ideas about what lies behind the film. I don’t necessarily expect people to get all of it. Some will and some won’t. Some people will come, a bit like ‘The Beach’, and will see the complete opposite. So, in that regard, what I do hope for is that it will cause them to pose some interesting questions. That they approach it as an idea that could provoke a conversation, maybe between them and their friends. Maybe even just with themselves after they get home from watching the movie. I guess I don’t expect to be able to prescribe the answers to the film, but I’d like to prescribe the questions.

I think you definitely did that. I’ve had many long and passionate discussions about this film with people who have liked it to varying degrees and it does just that. Whether people are asking what some would call the “right” question, or maybe one that will gain importance later as you think about the film more.

Listen, man, I really appreciate that. I’m glad it had– I can tell in a way that based off of the conversation we’re having that we’d probably agree about most of the stuff. So I really appreciate that. Thank you!

Thank you for taking the time to do this and congratulations, again, on the film!