‘Our minds can be hijacked’: the tech insiders who fear a smartphone dystopia

The Google, Apple and Facebook workers who helped make technology so addictive are disconnecting themselves from the internet. Paul Lewis reports on the Silicon Valley refuseniks who worry the race for human attention has created a world of perpetual distraction that could ultimately end in disaster

Justin Rosenstein had tweaked his laptop's operating system to block Reddit, banned himself from Snapchat, which he compares to heroin, and imposed limits on his use of Facebook. But even that wasn't enough. In August, the 34-year-old tech executive took a more radical step to restrict his use of social media and other addictive technologies.

Rosenstein purchased a new iPhone and instructed his assistant to set up a parental-control feature to prevent him from downloading any apps.

He was particularly aware of the allure of Facebook "likes", which he describes as "bright dings of pseudo-pleasure" that can be as hollow as they are seductive. And Rosenstein should know: he was the Facebook engineer who created the "like" button in the first place.

A decade after he stayed up all night coding a prototype of what was then called an "awesome" button, Rosenstein belongs to a small but growing band of Silicon Valley heretics who complain about the rise of the so-called "attention economy": an internet shaped around the demands of an advertising economy.

These refuseniks are rarely founders or chief executives, who have little incentive to deviate from the mantra that their companies are making the world a better place. Instead, they tend to have worked a rung or two down the corporate ladder: designers, engineers and product managers who, like Rosenstein, several years ago put in place the building blocks of a digital world from which they are now trying to disentangle themselves. "It is very common," Rosenstein says, "for humans to develop things with the best of intentions and for them to have unintended, negative consequences."

Rosenstein, who also helped create Gchat during a stint at Google, and now leads a San Francisco-based company that improves office productivity, appears most concerned about the psychological effects on people who, research shows, touch, swipe or tap their phone 2,617 times a day.

There is growing concern that as well as addicting users, technology is contributing toward so-called "continuous partial attention", severely limiting people's ability to focus, and possibly lowering IQ. One recent study showed that the mere presence of smartphones damages cognitive capacity, even when the device is turned off. "Everyone is distracted," Rosenstein says. "All of the time."

But those concerns are trivial compared with the devastating impact upon the political system that some of Rosenstein's peers believe can be attributed to the rise of social media and the attention-based market that drives it.

Drawing a straight line between addiction to social media and political earthquakes like Brexit and the rise of Donald Trump, they contend that digital forces have completely upended the political system and, left unchecked, could even render democracy as we know it obsolete.

In 2007, Rosenstein was one of a small group of Facebook employees who decided to create a path of least resistance: a single click to send little bits of positivity across the platform. Facebook's "like" feature was, Rosenstein says, "wildly successful": engagement soared as people enjoyed the short-term boost they got from giving or receiving social affirmation, while Facebook harvested valuable data about the preferences of users that could be sold to advertisers. The idea was soon copied by Twitter, with its heart-shaped "likes" (previously star-shaped "favourites"), Instagram, and countless other apps and websites.

It was Rosenstein's colleague, Leah Pearlman, then a product manager at Facebook and on the team that created the Facebook "like", who announced the feature in a 2009 blogpost. Now 35 and an illustrator, Pearlman confirmed via email that she, too, has grown disaffected with Facebook "likes" and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn't have to.

One morning in April this year, designers, programmers and tech entrepreneurs from across the world gathered at a conference centre on the shore of the San Francisco Bay. They had each paid up to $1,700 to learn how to manipulate people into habitual use of their products, on a course curated by conference organiser Nir Eyal.

Eyal, 39, the author of Hooked: How to Build Habit-Forming Products, has spent several years consulting for the tech industry, teaching techniques he developed by closely studying how the Silicon Valley giants operate.

"The technologies we use have turned into compulsions, if not full-fledged addictions," Eyal writes. "It's the impulse to check a message notification. It's the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later." None of this is an accident, he writes. It is all "just as their designers intended".

He explains the subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create a craving, or exploiting negative emotions that can act as triggers. "Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation," Eyal writes.

Attendees of the 2017 Habit Summit might have been surprised when Eyal walked on stage to announce that this year's keynote speech was about something a little different. He wanted to address the growing concern that technological manipulation was somehow harmful or immoral. He told his audience that they should be careful not to abuse persuasive design, and wary of crossing a line into coercion.

But he was defensive of the techniques he teaches, and dismissive of those who compare tech addiction to drugs. "We're not freebasing Facebook and injecting Instagram here," he said. He flashed up a slide of a shelf filled with sugary baked goods. "Just as we shouldn't blame the baker for making such delicious treats, we can't blame tech makers for making their products so good we want to use them," he said. "Of course that's what tech companies will do. And frankly: do we want it any other way?"

Without irony, Eyal finished his talk with some personal tips for resisting the lure of technology. He told his audience he uses a Chrome extension, called DF YouTube, "which scrubs out a lot of those external triggers" he writes about in his book, and recommended an app called Pocket Points that "rewards you for staying off your phone when you need to focus".

Finally, Eyal confided the lengths he goes to in order to protect his own family. He has installed in his house an outlet timer connected to a router that cuts off access to the internet at a set time every day. "The idea is to remember that we are not powerless," he said. "We are in control."

But are we? If the people who built these technologies are taking such radical steps to wean themselves free, can the rest of us reasonably be expected to exercise our free will?

Not according to Tristan Harris, a 33-year-old former Google employee turned vocal critic of the tech industry. "All of us are jacked into this system," he says. "All of our minds can be hijacked. Our choices are not as free as we think they are."

Harris, who has been branded "the closest thing Silicon Valley has to a conscience", insists that billions of people have little choice over whether they use these now ubiquitous technologies, and are largely unaware of the invisible ways in which a small number of people in Silicon Valley are shaping their lives.

A graduate of Stanford University, Harris studied under BJ Fogg, a behavioural psychologist revered in tech circles for mastering the ways technological design can be used to persuade people. Many of his students, including Eyal, have gone on to prosperous careers in Silicon Valley.

"I don't know a more urgent problem than this," Harris says. "It's changing our democracy, and it's changing our ability to have the conversations and relationships that we want with each other." Harris went public (giving talks, writing papers, meeting lawmakers and campaigning for reform) after three years struggling to effect change inside Google's Mountain View headquarters.

It all began in 2013, when he was working as a product manager at Google, and circulated a thought-provoking memo, A Call To Minimise Distraction & Respect Users' Attention, to 10 close colleagues. It struck a chord, spreading to some 5,000 Google employees, including senior executives who rewarded Harris with an impressive-sounding new job: he was to be Google's in-house design ethicist and product philosopher.

Looking back, Harris sees that he was promoted into a marginal role. "I didn't have a social support structure at all," he says. Still, he adds: "I got to sit in a corner and think and read and understand."

He explored how LinkedIn exploits a need for social reciprocity to widen its network; how YouTube and Netflix autoplay videos and next episodes, depriving users of a choice about whether or not they want to keep watching; how Snapchat created its addictive Snapstreaks feature, encouraging near-constant communication between its mostly teenage users.

The techniques these companies use are not always generic: they can be algorithmically tailored to each person. An internal Facebook report leaked this year, for example, revealed that the company can identify when teens feel "insecure", "worthless" and "need a confidence boost". Such granular information, Harris adds, is "a perfect model of what buttons you can push in a particular person".

Tech companies can exploit such vulnerabilities to keep people hooked; manipulating, for example, when people receive "likes" for their posts, ensuring they arrive when an individual is likely to feel vulnerable, or in need of approval, or maybe just bored. And the very same techniques can be sold to the highest bidder. "There's no ethics," he says. A company paying Facebook to use its levers of persuasion could be a car business targeting tailored advertisements to different types of users who want a new vehicle. Or it could be a Moscow-based troll farm seeking to turn voters in a swing county in Wisconsin.

Harris believes that tech companies never deliberately set out to make their products addictive. They were responding to the incentives of an advertising economy, experimenting with techniques that might capture peoples attention, even stumbling across highly effective design by accident.

A friend at Facebook told Harris that designers initially decided the notification icon, which alerts people to new activity such as friend requests or likes, should be blue. It fit Facebook's style and, the thinking went, would appear "subtle and innocuous". "But no one used it," Harris says. "Then they switched it to red and of course everyone used it."

Rosenstein, the Facebook "like" co-creator, believes there may be a case for state regulation of "psychologically manipulative advertising", saying the moral impetus is comparable to taking action against fossil fuel or tobacco companies. "If we only care about profit maximisation," he says, "we will go rapidly into dystopia."

James Williams does not believe talk of dystopia is far-fetched. The ex-Google strategist who built the metrics system for the company's global search advertising business, he has had a front-row view of an industry he describes as the "largest, most standardised and most centralised form of attentional control in human history".

Williams, 35, left Google last year, and is on the cusp of completing a PhD at Oxford University exploring the ethics of persuasive design. It is a journey that has led him to question whether democracy can survive the new technological age.

He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. "It was that kind of individual, existential realisation: what's going on?" he says. "Isn't technology supposed to be doing the complete opposite of this?"

That discomfort was compounded during a moment at work, when he glanced at one of Google's dashboards, a multicoloured display showing how much of people's attention the company had commandeered for advertisers. "I realised: this is literally a million people that we've sort of nudged or persuaded to do this thing that they weren't going to otherwise do," he recalls.

He embarked on several years of independent research, much of it conducted while working part-time at Google. About 18 months in, he saw the Google memo circulated by Harris and the pair became allies, struggling to bring about change from within.

Williams and Harris left Google around the same time, and co-founded an advocacy group, Time Well Spent, that seeks to build public momentum for a change in the way big tech companies think about design. Williams finds it hard to comprehend why this issue is not on the front page of every newspaper every day.

"Eighty-seven percent of people wake up and go to sleep with their smartphones," he says. The entire world now has a new prism through which to understand politics, and Williams worries the consequences are profound.

The same forces that led tech firms to hook users with design tricks, he says, also encourage those companies to depict the world in a way that makes for compulsive, irresistible viewing. "The attention economy incentivises the design of technologies that grab our attention," he says. "In so doing, it privileges our impulses over our intentions."

That means privileging what is sensational over what is nuanced, appealing to emotion, anger and outrage. The news media is increasingly working in service to tech companies, Williams adds, and must play by the rules of the attention economy to "sensationalise, bait and entertain in order to survive".

Williams saw a similar dynamic unfold months before the US election, during the Brexit campaign, when the attention economy appeared to him biased in favour of the emotional, identity-based case for the UK leaving the European Union. He stresses these dynamics are by no means isolated to the political right: they also play a role, he believes, in the unexpected popularity of leftwing politicians such as Bernie Sanders and Jeremy Corbyn, and the frequent outbreaks of internet outrage over issues that ignite fury among progressives.

All of which, Williams says, is not only distorting the way we view politics but, over time, may be changing the way we think, making us less rational and more impulsive. "We've habituated ourselves into a perpetual cognitive style of outrage, by internalising the dynamics of the medium," he says.

It is against this political backdrop that Williams argues the fixation in recent years with the surveillance state fictionalised by George Orwell may have been misplaced. It was another English science fiction writer, Aldous Huxley, who provided the more prescient observation when he warned that Orwellian-style coercion was less of a threat to democracy than the more subtle power of psychological manipulation, and "man's almost infinite appetite for distractions".

Since the US election, Williams has explored another dimension to today's brave new world. If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves (faculties that are essential to self-governance), what hope is there for democracy itself?

"The dynamics of the attention economy are structurally set up to undermine the human will," he says. "If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on." If Apple, Facebook, Google, Twitter, Instagram and Snapchat are gradually chipping away at our ability to control our own minds, could there come a point, I ask, at which democracy no longer functions?

"Will we be able to recognise it, if and when it happens?" Williams replies. "And if we can't, then how do we know it hasn't happened already?"