Wylie became a household name when he blew the whistle on the Cambridge Analytica data scandal. We talk to him about the effect it has had on his life, his fears for the future of creativity and the need for regulation of social media.

In the early hours of 17 March 2018, 28-year-old Christopher Wylie tweeted: "Here we go…."

Later that day, The Observer and The New York Times published the story of Cambridge Analytica’s misuse of Facebook data, which sent shockwaves around the world, caused millions to #DeleteFacebook, and led the UK Information Commissioner’s Office to hand Facebook the maximum fine for failing to protect users’ information. Six weeks after the story broke, Cambridge Analytica closed.

Wylie was the key source in the year-long investigation. In the months since publication, he has been variously described as "the millennials’ first great whistleblower", a "fantasist charlatan" and, as he calls himself, the "gay, Canadian vegan" who was responsible for creating a "psychological warfare mindfuck tool".

Now, as attention has shifted to this month’s US midterm elections as a test of meaningful change at social-media companies, the bright-orange-haired Wylie is sitting under Campaign’s lens. He talks about his Facebook ban, the need for regulation and his love of the John Lewis ads: "The creative is just brilliant. Any time I see those ads I think John Lewis should run the nation!"

He is articulate, passionate, style-conscious and, perhaps surprisingly for a data scientist, a huge advocate for human creativity. "I don’t believe in data-driven anything, it’s the most stupid phrase. Data should always serve people, people should never serve data," he says.

He believes that poor use of data is killing good ideas. And that, unless effective regulation is enacted, society’s worship of algorithms, unchecked data capture and use, and the likely spread of AI to all parts of our lives will cause us to sleepwalk into a bleak future.

Not only are such circumstances a threat to adland – why do you need an ad to tell you about a product if an algorithm is choosing it for you? – they are a threat to human free will. "Currently, the only morality of the algorithm is to optimise you as a consumer and, in many cases, you become the product. There are very few examples in human history of industries where people themselves become products and those are scary industries – slavery and the sex trade. And now, we have social media," Wylie says.

There are very few examples in human history of industries where people themselves become products and those are scary industries – slavery and the sex trade. And now, we have social media

"The problem with that, and what makes it inherently different to selling, say, toothpaste, is that you’re selling parts of people or access to people. People have an innate moral worth. If we don’t respect that, we can create industries that do terrible things to people. We are [heading] blindly and quickly into an environment where this mentality is going to be amplified through AI everywhere. We’re humans, we should be thinking about people first."

His words carry weight, because he’s been on the dark side. He has seen what can happen when data is used to spread misinformation, create insurgencies and prey on the worst of people’s characters.

The political battlefield

A quick refresher on the scandal, in Wylie’s words: Cambridge Analytica was a company spun out of SCL Group, a British military contractor that worked in information operations for armed forces around the world. It was conducting research on how to scale and digitise information warfare – the use of information to confuse or degrade the efficacy of an enemy.

Wylie was a 24-year-old fashion-trend-forecasting student who also worked with the Liberal Democrats on their voter targeting. A contact introduced him to SCL.

As director of research, Wylie’s original role was to map out how the company would take traditional information operations tactics into the online space – in particular, by profiling people who would be susceptible to certain messaging.

This morphed into the political arena. After Wylie left, the company worked on Donald Trump’s US presidential campaign and – possibly – the UK’s European Union referendum. In February 2016, Cambridge Analytica’s former chief executive, Alexander Nix, wrote in Campaign that his company had "already helped supercharge Leave.EU’s social-media campaign". Nix has strenuously denied this since, including to MPs.

It was this shift from the battlefield to politics that made Wylie uncomfortable. "When you are working in information operations projects, where your target is a combatant, the autonomy or agency of your targets is not your primary consideration. It is fair game to deny and manipulate information, coerce and exploit any mental vulnerabilities a person has, and to bring out the very worst characteristics in that person because they are an enemy," he says.

"But if you port that over to a democratic system, if you run campaigns designed to undermine people’s ability to make free choices and to understand what is real and not real, you are undermining democracy and treating voters in the same way as you are treating terrorists."

One of the reasons these techniques are so insidious is that being a target of a disinformation campaign is "usually a pleasurable experience", because you are being fed content with which you are likely to agree. "You are being guided through something that you want to be true," Wylie says.

To build an insurgency, he explains, you first target people who are more prone to having erratic traits, paranoia or conspiratorial thinking, and get them to "like" a group on social media. They start engaging with the content, which may or may not be true; either way "it feels good to see that information".

When the group reaches 1,000 or 2,000 members, an event is set up in the local area. Even if only 5% show up, "that’s 50 to 100 people flooding a local coffee shop", Wylie says. This, he adds, validates their opinion because other people there are also talking about "all these things that you’ve been seeing online in the depths of your den and getting angry about".

People then start to believe the reason it’s not shown on mainstream news channels is because "they don’t want you to know what the truth is". As Wylie sums it up: "What started out as a fantasy online gets ported into the temporal world and becomes real to you because you see all these people around you."

Some conservatives have argued that the Trump campaign has been unfairly criticised for its use of data, while former President Barack Obama and his digital agency Blue State Digital were lauded for their use of social-media data in his successful 2008 election campaign.

But Wylie, who has worked with Obama’s former national director of targeting, claims the two campaigns took different approaches. For example, the Obama campaign used data to identify people who were eligible to vote but had not registered.

"When the Obama campaign put out information, it was clear it was a campaign ad, and the messaging, within the realm of politics, was honest and genuine. The Obama campaign did not use coercive, manipulative disinformation as the basis of its campaign, full stop. So, it’s a false equivalency and people who say that [it is equivalent] don’t really understand what they’re talking about."

There’s a difference between persuasion on the one hand and manipulation and coercion on the other, he adds – and between an opinion and provable disinformation. "Data is morally neutral, in the same way that I can take a knife and hand it to a Michelin-starred chef to make the most amazing meal of your life, or I can murder someone with it. The tool is morally neutral, it’s the application that matters," he says.

It's the application… I can take a knife and hand it to a Michelin-starred chef to make the most amazing meal of your life, or I can murder someone with it. The tool is morally neutral

Psychographic potential

One such application was Cambridge Analytica’s use of psychographic profiling, a form of segmentation that will be familiar to marketers, although not in common use.

The company used the OCEAN model, which scores people on the Big Five personality traits: openness to experience, conscientiousness, extraversion, agreeableness and neuroticism.

Wylie believes the method could be useful in the commercial space. For example, a fashion brand that creates bold, colourful, patterned clothes might want to segment wealthy women by extraversion because they will be more likely to buy bold items, he says.
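As a sketch only – the customer records, trait scores and thresholds below are invented for illustration, not drawn from Cambridge Analytica or any real dataset – the kind of segmentation Wylie describes might look like this:

```python
# Hypothetical sketch of OCEAN-style segmentation, as Wylie describes it.
# The Big Five traits are openness, conscientiousness, extraversion,
# agreeableness and neuroticism; here we use only an extraversion score.
# All records, scores and thresholds below are invented.

customers = [
    {"name": "A", "income": 95_000,  "extraversion": 0.81},
    {"name": "B", "income": 120_000, "extraversion": 0.35},
    {"name": "C", "income": 30_000,  "extraversion": 0.90},
    {"name": "D", "income": 110_000, "extraversion": 0.72},
]

def segment(people, min_income=80_000, min_extraversion=0.7):
    """Return the names of customers who are both wealthy and highly
    extraverted - the imagined audience for bold, colourful clothes."""
    return [p["name"] for p in people
            if p["income"] >= min_income
            and p["extraversion"] >= min_extraversion]

print(segment(customers))  # ['A', 'D']
```

The point of the sketch is that the segmentation itself is ordinary filtering; what made Cambridge Analytica’s version contentious was how the trait scores were obtained.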

Sceptics say Cambridge Analytica’s approach may not be the dark magic that Wylie claims. Indeed, when speaking to Campaign in June 2017, Nix uncharacteristically played down the method, claiming the company used "pretty bland data in a pretty enterprising way".

But Wylie argues that people underestimate what algorithms allow you to do in profiling. "I can take pieces of information about you that seem innocuous, but what I’m able to do with an algorithm is find patterns that correlate to underlying psychological profiles," he explains.

"I can ask whether you listen to Justin Bieber, and you won’t feel like I’m invading your privacy. You aren’t necessarily aware that when you tell me what music you listen to or what TV shows you watch, you are telling me some of your deepest and most personal attributes."
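A toy sketch of the pattern he describes – every weight and "like" here is invented for illustration, not taken from any real model – might look like:

```python
# Toy illustration of inferring a trait from innocuous signals.
# All weights below are made up; real systems fit such weights from
# large labelled datasets rather than hand-picking them.

LIKE_WEIGHTS = {
    "justin bieber": 0.4,       # fictional positive signal for extraversion
    "nightclubs": 0.5,
    "crossword puzzles": -0.2,  # fictional negative signal
    "documentaries": -0.1,
}

def predicted_extraversion(likes, baseline=0.5):
    """Add the weights of known likes to a population baseline,
    clamping the result to the 0-1 range."""
    score = baseline + sum(LIKE_WEIGHTS.get(like, 0.0) for like in likes)
    return max(0.0, min(1.0, score))

print(predicted_extraversion(["justin bieber", "nightclubs"]))  # 1.0 (clamped)
print(predicted_extraversion(["crossword puzzles"]))            # 0.3
```

No single "like" reveals much on its own; it is the accumulation of many weak, correlated signals that lets an algorithm reach a confident guess about something the user never disclosed.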

This is where matters stray into the question of ethics. Wylie believes that as long as the communication you are sending out is clear, not coercive or manipulative, it’s fine, but it all depends on context. "If you are a beauty company and you use facets of neuroticism – which Cambridge Analytica did – and you find a segment of young women or men who are more prone to body dysmorphia, and one of the proactive actions they take is to buy more skin cream, you are exploiting something which is unhealthy for that person and doing damage," he says. "The ethics of using psychometric data really depend on whether it is proportional to the benefit and utility that the customer is getting."

Creativity trumps data

This also means using caution over how much data is being amassed. Adland must take responsibility for its insatiable desire for data, and the pressure it applies to social-media companies to provide it. Its usual defence is that the more data it has, the better, because consumers don’t object to personalised ads.

Wylie disagrees. If, he says, a crossword app uses data to personalise your experience, but as well as basic information also harvests your religion, sexual orientation, text messages and photos, this is disproportionate to the value it provides.

"Why does it need to know my sexual orientation to create a crossword puzzle? Is it going to make it glittery and rainbowy? Is that relevance? No. You can create something that’s relevant without that amount of information, and you can have a lot of information and create something that’s not relevant."

Wylie argues that obsession with data can strangle creativity. "Data informs you, it doesn’t tell you what to do. You [as a human] should always understand what you should do," he says.

"If you work in a creative team, you shouldn’t have to do something because an algorithm said so. What makes us different from animals? It’s the fact that we’ve got culture. We paint things, we listen to music, we watch TV, we wear cool clothes. These are all of the things that literally make us human and make life worth living. So, the idea of eroding that because some database or neural net said so is just bullshit. The computer can’t imagine a situation that is different from what it has observed."

If you work in a creative team, you shouldn’t have to do something because an algorithm said so. What makes us different from animals? It’s the fact that we’ve got culture

Wylie contends that data should be used to help creatives by finding niche audiences, for example. He also believes brand-building should sit alongside targeting for best effect. "There is a role for creating universal narratives for everyone to understand even if they’re not in your market. Your consumer interacts with society. They’re not these narrow wedges. There must be some sort of shared meaning between us as to what a brand means."

The son of a doctor and a psychiatrist, Wylie grew up in British Columbia. At school he was bullied and diagnosed with ADHD and dyslexia. He left at 16 without qualifications. But by the age of 20 he had worked for the leader of the opposition in Canada, taught himself to code and moved to London to study law at LSE. He was working on a PhD in fashion-trend forecasting when he encountered psychographic profiling research. But his curiosity has turned sour.

A bleak future?

Wylie is concerned that tech developments – such as the rise of AI – could fundamentally damage society. Google Home has recently launched an option to give only good news to its users, for instance.

"What they are actually starting to do is warp that person’s perspective from the very beginning of their day," Wylie points out. Once AI is in every part of your life, it will be everywhere, making decisions about you and for you.

Wylie believes we could get to the point where AI replaces creatives in 20 years. "If your definition of creativity is the generation of novel outputs, then you can have ‘creative algorithms’," he says. "This is why as a community we need to come up with principles of how to engage with technology. Just because we can do something, doesn’t mean we should."

He adds: "If we replace everyone with robots, what’s the point of humanity, then? Shall we all just sit in those floating chairs they have in the film WALL-E and be fed through a tube and entertained through AI-generated TV shows which are hyper-personalised to my profile? What a shit future that would be, right? We shouldn’t be endeavouring to replace human creativity with artificial creativity."

Regulation – a glimmer of hope

Aside from reprioritising creativity over data, Wylie is adamant that regulation is the answer to end immoral practices on the internet.

"As a society we regulate things we come into contact with that could cause us harm, such as air travel, doctors or electricity. Currently, software technology, social media and online advertising are the Wild West," he says.

"You eat food four or five times a day, you check your phone on average 150 times a day. People sleep with their phones more than they sleep with people," he adds. "The fact that people are engaging so much more now with advertising and online content warrants a discussion on whether there should be statutory rules that are enforceable as to the conduct and behaviour both of social-media and tech platforms and the advertisers that use them."

Wylie argues that regulation is not a bad thing for commercial viability. After all, seat belts and air bags haven’t stopped people buying cars. It will, he says, also help create consumer trust and confidence in the long run and prevent a backlash.

It will also create a level playing field, where those that behave ethically are not at a disadvantage if competitors do not adhere to the same principles.

He adds: "A lot of tech companies have their backs up. They’re like a dog in the corner. They’re going through these existential conversations like ‘OMG what’s happening?’" He goes on to argue that the sector relies on everyone behaving well to maintain itself. "If I were them, I’d be talking about how we can help each other do better."

The barriers to regulation include the international nature of tech companies, the concern that governments are too far behind the technology, and the belief that consumers don’t really care about privacy.

Wylie rebuts each of these. Other international industries already operate under common rules – governing airport codes, aircraft taking off and landing in different countries, and post sent around the world. He describes the suggestion that MPs don’t understand the industry well enough to act meaningfully as "a bullshit argument".

He continues: "Tell me what congressman or MP understands how aeroplanes fly or cancer medicines work, and what is safe and not safe? Or what is the appropriate level of pesticides to use on farms? They don’t. These are all highly technical, highly complicated, ever-moving industries, and before they were regulated they were using the same arguments."

Clashes with Facebook

Wylie is opposed to self-regulation, because industries won’t become consumer champions – they are, he says, too conflicted.

"Facebook has known about what Cambridge Analytica was up to from the very beginning of those projects," Wylie claims. "They were notified, they authorised the applications, they were given the terms and conditions of the app that said explicitly what it was doing. They hired people who worked on building the app. I had legal correspondence with their lawyers where they acknowledged it happened as far back as 2016."

He wants to create a set of enduring principles that are handed over to a technically competent regulator to enforce. "Currently, the industry is not responding to some pretty fundamental things that have happened on their watch. So I think it is the right place for government to step in," he adds.

Facebook in particular, he argues, is "the most obstinate and belligerent in recognising the harm that has been done and actually doing something about it".

In words that might resonate with marketers burned by Facebook’s measurement issues, he says the company needs to be more proactive about fixing issues rather than "requiring a hell of a lot of public pressure before it does anything".

He adds: "Facebook needs to acknowledge it has an institutional cultural problem it needs to address. I really hope it can get to a place where it will actively fix itself."

Since our interview with Wylie, Facebook has hired the former UK deputy prime minister Nick Clegg as its communications and global affairs head. Wylie has accused Clegg of "selling out".

But what responsibility do consumers have in all of this? Absolutely none, according to Wylie.

"What responsibility does somebody have walking into a dangerous building, or when prescribed medicine by their doctor? Do they inspect the engineering when they step on to a plane? They don’t, because they shouldn’t. It’s not the role of the consumer to make sure that they’re safe, it’s the role of the industry that’s profiting from them.

"I don’t want to just attack Facebook. There’s a real problem within Silicon Valley," he says. He explains that tech companies reward friendly "white hat" hackers with money for bringing system vulnerabilities to their attention. "But when a journalist, whistleblower or civil society does it, and they do it in public, there are threats, legal threats."

At this point in the interview Wylie becomes angry: "[Facebook] sent me threatening letters. Then they demanded all of my personal devices because they think they are the police for themselves. I said ‘no, I can’t because I’ve handed over the evidence to the police, who are the lawful and rightful authority to investigate Facebook, not you.’

"Because I refused to give in to their legal threats and hand over my devices and information that would interfere with a police investigation, that’s why they banned me."

He says an additional ban by Instagram "shows the disproportionate market power that [Facebook] can exert. The fact that a whole different company can ban someone with no due process for something that doesn’t involve it at all."

Although Facebook declined to comment for this piece, it referred us to previous statements it has made on the issue. In these, it says it banned Wylie because, like Cambridge Analytica, he received a copy of the quiz’s Facebook data, which was a breach of Facebook’s terms and conditions.

Despite this, Wylie insists he is not anti-social media. "I don’t believe that people should have to delete Facebook. I’m not a supporter of #DeleteFacebook because it’s like saying if you don’t want to get electrocuted, get rid of electricity. It’s stupid. No, demand better standards for your electricity so you don’t get electrocuted," he says.

"Social media is now an essential part of most people’s lives. You can’t apply for most jobs now without LinkedIn. You can’t communicate practically with friends if you don’t have a form of social media. What job can you get if you say to an employer: ‘I’m really great but because I want to enforce my privacy standards and maintain my mental health, I refuse to use anything that touches Google’s services’?

"So the solution is not to delete these platforms, or attack them and make them the enemy, it’s to make sure they are doing their job to make a safe environment for people."

By coming forward, Wylie has put his own safety at risk. "I’ve had quite a few threats, there have been some incidents, but I’ve had a lot of support from my lawyers and the police," he says.

"You can’t live your life running away from something. I actually have this tendency to run at things. You can’t dwell on stuff, otherwise you’ll just paralyse yourself."

The story shows no signs of slowing down. Ofcom, the ICO and the DCMS Committee, as well as US authorities, are running ongoing investigations. Meanwhile, Wylie is flying across the world to give speeches and interviews.

When asked what key message he would like the industry to remember, he offers up this: "Don’t be dicks. Understand that we should all be serving and helping people in anything that we do. You don’t need to be coercive, manipulative or creepy. If you can’t win over voters or consumers through being honest, then you shouldn’t work in advertising or marketing because you’re clearly not good enough."

The story in brief

In 2014, a Cambridge University academic created "This Is Your Digital Life", a personality quiz app on Facebook.

About 270,000 people took the test. But, like many apps at the time, it was also able to access the data – including the "likes" – of their friend networks. Facebook has since tightened its third-party rules to stop this happening. Data from about 87 million people’s profiles were taken without their explicit consent.

This data was passed to Cambridge Analytica. Wylie claims the company used it to build psychographic profiles of people and target them with pro-Trump content.

The passing of data from the academic to Cambridge Analytica was a breach of Facebook’s data policy.

Facebook says when it learned the rules had been broken in 2015, it removed the app and asked for assurances the information had been deleted.

Cambridge Analytica and the Trump campaign claim they did not use this data.

Cambridge Analytica closed down six weeks after the story was published. In a statement, it said: "[Cambridge Analytica has] been vilified for activities that are not only legal, but also widely accepted as a standard component of online advertising."

The company remains under investigation by the UK Information Commissioner’s Office. The Digital, Culture, Media & Sport Committee inquiry into fake news, to which Wylie gave evidence, is ongoing.

Since the scandal, Facebook has made a series of changes. It will no longer send employees to work at the offices of political campaigns during elections. It now requires political campaigns in the UK to verify their identities. It will remove developers’ access to data if an app hasn’t been used in three months. Apps can no longer ask for data about a person’s friends unless their friends also authorise the app.