Is Artificial Intelligence the biggest threat to mankind?

The existential threat from AI has been a preoccupation of mainstream Hollywood since HAL and Colossus, and possibly before then. The genre gained popularity with The Terminator and continues with films about the singularity such as this year's The Machine. Now Elon Musk has waded into the debate.

Mass surveillance of citizens and the automated buying and selling of shares are carried out by computers analysing human behaviour. The resulting decisions are made without any human intervention or control. Any thoughts on this and what it might mean for the future?

Hundreds of scientists and technologists have signed an open letter calling for research into the problems of artificial intelligence in an attempt to combat the dangers of the technology.

Signatories to the letter, created by the Future of Life Institute, include Elon Musk and Stephen Hawking, who has warned that AI could be the end of humanity. Anyone can sign the letter, which now carries hundreds of signatures.

I'm concerned that we have a legal system incapable of addressing our quick technological development. The problem with rapid tech development is that it leaves little time for reflection.

I'm not panicky about AI, which will likely give us neither the Singularity nor Skynet. But I am concerned about the careless automation of society, and about technologies we don't fully understand being embedded in our daily lives (e.g. the automated trading systems that briefly crashed the stock market).


Quote :

I'm not panicky about AI, which will likely give us neither the Singularity nor Skynet.

Why do you think the likelihood is not there? I'm thinking it's 15-20 years away.

I'm with Kim Stanley Robinson: "I'm fine with AI, because I don't believe in it in the usual way it is interpreted, as machine consciousness. I don't think that will happen, because brains and machines are very different things, and will end up always doing different things. The tendency to regard the brain as a machine should be easy to dodge by considering how we have successively considered it as a clock, a steam engine, a hologram, and now a computer; none of them are good analogies, not even our current favorites.

So 'artificial intelligence' really will come down to machine algorithms designed for human uses, and when we understand AI as that, we can begin to think about the algorithms and the uses, without getting into anything more metaphysical or fantastical. We will certainly project personalities onto machines, we already do that, but it is a projection and we have to keep that in mind."

But the star is undoubtedly Vikander, whose surgically calibrated performance as Ava locates the character squarely in the tradition of sci-fi’s great artificial women: Maria in Metropolis, Motoko Kusanagi in Ghost in the Shell, Pris in Blade Runner and, more recently, Scarlett Johansson in Under the Skin. Vikander’s training as a ballerina (she danced with the Royal Swedish Ballet for nine years) is there in every glance and gesture. This is bewitchingly smart science fiction of a type that’s all too rare. Its intelligence is anything but artificial.

Seems to be getting very good reviews. The interesting thing is that it has divided feminists straight down the middle over whether it is a "feminist" picture.

Interesting interview with writer/director Alex Garland on Den of Geek. He claims to be making observations about the sexes rather than statements, an important distinction:

Quote :

[Ex Machina is] partly an argument about the objectification of women in a particular way. In this sense, it’s a literal objectification.

Ava’s not actually a woman. She’s a machine that does not have a gender. So the question is, why is she presented as a girl in her early 20s? It’s because we fetishise girls in their early 20s. In a particular kind of way. Sometimes you read about that being shunted onto the media: advertising does it, film does it. It’s bullshit. It’s passing the buck. We all do it. Men do it and women do it. Right?

The reasons we do that are complicated, and I could make guesses as to why it is. But what seems to be beyond debate is that it does actually happen. And there are various tricks - games being played within the film. One of them is that the process that is happening to the young man who appears to be the protagonist of the film is also happening to the audience. And if he goes down that path and we follow him down that path, which is the intention, what should happen - it’s hard to talk about this without blowing the movie - but the means by which that happens necessitates that Ava be presented as a girl in her early 20s.

In the same interview, Garland talks about GamerGate and whether it was really about the way women are represented in games, or about the fact that most games are male-centric. Again, an interesting distinction.

Ex Machina is smart, swanky, a little inhuman and preternaturally well crafted. Garland debut-directs as smoothly as a supercomputer. The young tech-firm nerd (Gleeson) who believes he has “won” a week with his CEO is choppered to the CEO’s estate. The gleaming mountains look Scottish, are Norwegian and play the Rockies. (Even the settings in this tale of myriad designer misdirection are multiple-choice.) Here Isaac’s tycoon steers him into a “Turing test”. Gleeson will interact for seven days, under observation, with a new android (Vikander). Object: to determine if she has artificial intelligence, that holy grail of robot creation. The movie goes dapperly, teasingly mad from there.

James Cameron's thoughts on the singularity. This is his throwaway line:

Quote :

People ask me: "Will the machines ever win against humanity?" I say: "Look around in any airport or restaurant and see how many people are on their phones. The machines have already won."

And this is his more serious, considered point:

Quote :

“[AI] will reflect our best and worst nature because we make them and we program them. But it's going to take a lot of money. So who's got the money to do it and the will to do it? It could be business, so the Googles and the other big tech companies. And if you're doing it for business, you're doing it to improve your market share or whatever your business goals are. So you're essentially taking a machine smarter than a human and teaching it greed. Or it's for defense, in which case you're taking a machine smarter than a human and teaching it to kill. Neither one of those has a good outcome in my mind.”

Quite fascinating to think that in the 200,000 or so years that modern Homo sapiens has existed, it's only in the last 72 years - a mere sliver in that stretch of time - that we've developed the means to eradicate our species entirely.