Get Out Of My Face, Get Out of My Home: The Authoritarian Tipping Point

I write about benefits, wrongs and hype of robots, AI, ML and new tech


As I struggled for my first breath, Orwell was busily writing his vivid dystopian novel, 1984. That was 1948 and he switched the last two digits to get the title. I didn’t read it until 1971 when it was essential reading on the youth revolution syllabus. We worried that the lust for power could create an authoritarian all-knowing state. Perhaps it is now time for the younger generation to take this more seriously before it gently creeps up and bites them.

One niggle I had with Orwell was the idea that a telescreen (TV) could watch us and force us to pay attention to political speeches. It seemed far-fetched then and continued to puzzle me for decades.

As the clock chimed to bring in 1984, we paused from dancing to pass around Orwell calendars. We thought with relief that there was still no sign of Orwell’s fateful predictions. But things can change very rapidly, and none of us could guess what was to come.

There was strong public resistance to surveillance in the UK. Orwell’s book had been adapted into a terrifying BBC TV play in 1954, and everyone was buzzing about Big Brother. I was too little to watch but listened from behind my bedroom door. Although I didn’t understand it, I knew from the way the adults talked that something important and scary was going on, and that they were not going to let it happen.

Then almost 10 years after 1984, the horrific and brutal murder of a 2-year-old boy shifted public opinion. James Bulger disappeared from a shopping mall and his tortured body was later found on a railway track. Security camera footage from two stores in the mall showed James being abducted by his murderers, two 10-year-old boys.

Little James Bulger led away by his killers

CCTV recording - public domain

The footage was played repeatedly on TV, and the culprits were caught and imprisoned. This was a tipping point: those blurry images were burned into the public mind, and the value of CCTV cameras was made clear. Combined with subsequent crime-reduction statistics, there was little complaint about CCTV cameras spreading without limit. No one can even estimate how many millions of cameras are in use in the UK today.

This 'tipping point' is the danger so brilliantly illustrated in Cory Doctorow’s 2008 novel Little Brother. An emergency situation, like a terror attack, can quickly push us over the edge into a waiting security precipice. The public then accepts the removal of another sliver of their civil liberties, in a trade-off for their perceived safety and security.

It's all in your face

The next step after covering our cities with security cameras is to add face recognition capabilities. There are good arguments that criminals could be pinpointed and their movements tracked with no danger to law enforcers on the streets. After all, the high accuracy of face recognition has been shown in laboratories and it has been developed and sold by the major tech companies like Amazon, Google, and Microsoft. And now some police forces in the US and UK are starting down this path. But there are problems.

The NGO Big Brother Watch used freedom of information requests to obtain data on UK police forces’ use of face recognition software to spot criminal faces in crowds. The results of their ensuing report were shocking. The average error of recognition was 95%. Yes, that means that only 5% of those identified as criminals actually were criminals. The worst results came from the Metropolitan Police’s use of the technology at the big Afro-Caribbean Notting Hill Carnival, with only 2% correct recognition over a weekend. Innocent people were pinpointed, searched and questioned with no just cause.
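Part of what drives such dismal figures is the base-rate problem: when almost everyone scanned is innocent, even a matcher that is quite accurate per face will produce mostly false alarms. The numbers below are purely illustrative assumptions for a back-of-the-envelope sketch, not the actual parameters of any police deployment:

```python
# Illustrative base-rate arithmetic for face recognition in crowds.
# All figures are hypothetical assumptions chosen for the sketch.
crowd_size = 100_000          # faces scanned over a weekend event
watchlist_present = 20        # watchlist individuals actually in the crowd
true_positive_rate = 0.90     # the matcher finds 90% of watchlist faces
false_positive_rate = 0.001   # 0.1% of innocent faces are wrongly flagged

true_alerts = watchlist_present * true_positive_rate                   # 18
false_alerts = (crowd_size - watchlist_present) * false_positive_rate  # ~100

precision = true_alerts / (true_alerts + false_alerts)
print(f"Share of alerts that are correct: {precision:.1%}")  # about 15%
```

Even with 90% detection and only one false alarm per thousand innocent faces, roughly five out of six alerts point at innocent people - the same shape of result, if not the same magnitude, as the figures Big Brother Watch reported.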

Then the American Civil Liberties Union (ACLU) conducted a range of tests with Amazon’s Rekognition system that is becoming popular among US police departments. One of their tests matched photographs of members of the US Congress against a database of 25,000 publicly available ‘mug shots’ of criminals. The system incorrectly matched 28 members of Congress with people who had been arrested.

This has led U.S. lawmakers to raise questions about police use of the technology, and both Amazon and Microsoft (which makes its own facial recognition software) have called for new regulations. From the perspective of bias, it is important to note that the ACLU test misidentified a disproportionate number of African-American and Latino members of Congress as criminals.

The best results reported for face recognition are for white males, and that has been the conclusion of a number of academic studies. Gender, age and shade of skin really do matter to automated face recognition. Joy Buolamwini from MIT tells the story of how her research on using a computer avatar was hampered because the face recognition software could not even find her face, never mind recognize it - the missing face problem. She had to wear a white mask to be seen by a machine.

Joy Buolamwini with the author

Noel Sharkey

And what if face recognition was more accurate?

There is an even more serious question than the massive inaccuracy of face recognition technology outside the lab. It is even more serious than the racial and gender prejudice of the technology. The question is: why the hell are we allowing law enforcement to scan our faces and use them as data?

Inaccurate face recognition creates grave injustices and sooner or later the wrong people will die because of it. But better accuracy may be even worse for the direction of our society. I fully understand how useful it would be for the police to catch dangerous wanted criminals and safely follow potential terrorists wherever they go. But at what cost to our lives?

Imagine if the mass of security cameras were all equipped with reasonably accurate face recognition - and this is not totally unrealistic - there would be no place to hide. The more it is used, the cheaper it will get, and the more AI will be used to act on the data. How long will it be before people are tracked for trivial offenses by face recognition software and told to wait where they are until they are picked up? This technology would put enormous power in the hands of the authorities.

This is not the society that I wish to live in. Yet huge numbers of us are helping the quest by allowing apps like Facebook to collect data about our faces. When we post pictures of our friends on Facebook and tag them, we are providing data for face recognition algorithms to link those faces with their personal data. Some phones now acquire your face data so that it can be used to recognize you and open your phone.

It is all great fun and novel or we wouldn't do it, but just spend a moment thinking about the implications of how your data could be used when the crunch comes. It is not hard to imagine that security cameras could instantly use your face to access your personal files in the way that car license plate recognition can be used to access your driving record.

Inviting spies into our homes

Millions of people are inviting spies into their homes in the form of digital assistants like Amazon Echo and Google Home. They allow us to call up any song at will just by saying its title out loud, and they let us turn on devices in our homes. It is an addictive technology and people love it.

Alexa and the parrot

Petra Grey

But the real nature of the devices was revealed when a family found that a recording of their conversation had been sent to a random person on their contact list. These are listening and recording devices with the commercial purpose of collecting more of your personal data.

When we sign the agreement that comes with such a device, we are signing away a large chunk of our privacy. Our conversations are stored on a cloud server. At present this is allegedly secure from the prying ears of security services and police. But in 2015, Arkansas police asked Amazon to hand over the recording of a conversation to help in a murder inquiry.

Amazon’s lawyers pushed back, contending that the free-speech provision of the First Amendment protected the conversation. This is not absolute protection if the police can show a compelling need for the information. Before that was tested, the suspect gave permission to hand over the data, and the case was subsequently dismissed. The legality of maintaining the privacy of Echo recordings therefore remains untested - and would it hold if there were a national emergency like the hunt for a mass murderer?

We can add assistants/social robots with cameras, like Jibo, to the mix. Jibo sits statically on a tabletop and can operate just like the Echo, but with the added facility of built-in face recognition. One of its selling points is allowing remote access so that others can join in and converse with the family. It can tilt, pan and zoom in on particular people.

Jibo has not been successful, for a number of reasons, and it has struggled to fulfill its early promise. Despite these initial problems, it is clear that devices like Jibo are coming. No doubt millions will embrace them, and this is not the end of the road. We are moving towards more and more integration of our home systems, from water meters to locks to lighting, all interacting with camera-equipped assistants that know far too much about us.

We can go much further than this now. Think about the interviewing apps that several large companies use. The interviews are conducted on a smartphone, and AI is used to analyze the minutiae of your facial expressions in response to questions. This tells the companies a lot about you. Just imagine how this could be used in conjunction with your visual digital assistant by the wrong people.

Orwell was writing in the year that the first universal computer was developed. He had no idea about the power and speed of future computing, machine learning, big data or AI analytics. The characters in his novel never knew if they were being watched all of the time or just some of the time, or how it all worked. In the near future, we could have AI systems watching and listening all of the time - an authoritarian dream.

The tipping point

I can no longer kid myself that Orwell's idea of a telescreen that could watch us and force us to pay attention is far-fetched. Once we have these devices everywhere we are providing the opportunity for total control of our every move. Like the telescreen in Orwell's novel, it could become compulsory to have them in our homes.

You may think that I am being paranoid but you will have to agree that the potential is there. We just need that tipping point and I have no idea what that might be. Nor can we predict which country it will happen in first. I have been able only to touch the tip of the iceberg here and have not discussed the enormous progress that China has been making in AI and machine learning technology.

In Doctorow’s Little Brother, the tipping point is the terrorist bombing of the San Francisco Bay Bridge, which enabled Homeland Security to move in and carry out reprehensible acts.

The warning signs are all out there and as Orwell reputedly said from his deathbed, “The moral to be drawn from this nightmare situation is a simple one: Don’t let it happen. It depends on you.”

George Orwell

Cassowary Colorizations

I've been researching AI, robotics, machine learning, cognitive science and related areas for four decades, and I believe that it is time for some plain speaking about the reality, without the hype and BS.