Silicon Valley's 'gut-wrenching' year confronting its dark side

If Twitter has a soul, it probably looks something like Claire Diaz-Ortiz.

She was one of Twitter's first employees. Somewhere around number 50. Over five years at the company, she got the Pope to join Twitter, live-tweeted the birth of her daughter, and wrote at length on how organizations can use Twitter to spark positive social change. She wrote a book called "Twitter For Good."

"Everything about the last year has made me stop and think about that [book] title, and how it would be received now," Diaz-Ortiz said. "In 2017, those of us embedded in the world of social feel the need to defend the medium. Meanwhile, a small blue bird is chattering away on our shoulders, asking, 'Is this Twitter for bad?'"


It's not simply the role social media has played in providing a platform for Trump, a president many leaders in the industry openly oppose. In the year since the election, there's been a steady drumbeat of damaging headlines about fake news, polarizing filter bubbles and Russian propaganda campaigns across social media.

The latest innovation in Silicon Valley may be doubt. After years of maintaining an unwavering optimism about technology, some who've worked at Twitter, Facebook and Google have spent much of 2017 reckoning with the unintended consequences of the products they built.

It reached a fever pitch last week as executives from the three tech companies were grilled by Congress over Russian meddling in the 2016 election. As Senator Dianne Feinstein, a Democrat from the companies' home state of California, put it in one hearing: "You've created these platforms, and now they are being misused. And you have to be the ones to do something about it, or we will."

Chamath Palihapitiya, a former Facebook executive and influential venture capitalist, says it's been "gut-wrenching" for tech employees to see tools intended to connect people instead be abused by bad actors to tear at the very fabric of American society.

"Do I feel guilty? Absolutely I feel guilt," Palihapitiya told CNN.

The tech industry fails to predict the future

For Silicon Valley, part of the sting comes from being blindsided. Suddenly, an industry that prides itself on predicting the future has been forced to admit its failure to foresee the darker side of its products.

The same social media tools that helped propel the candidacy of the first African American president, whom the industry widely supported, could also help the campaign of a populist celebrity real estate developer. The same algorithms that surfaced the most relevant content for users could create seemingly impenetrable ideological echo chambers. And the same advanced ad targeting tools powering dominant billion-dollar businesses could be weaponized to sow discord among voters.

Wesley Chan, an early product manager at Google who left the company in 2014, says he was always more concerned about people abusing Google for economic reasons, not "political sabotage" from foreign actors. "This political stuff never crossed my mind once," he says. "You only deal with things when you get burned. It's a terrible lesson to learn."

Palihapitiya says the sense of shock is evident among tech leaders. "Nobody ever thought that you could have such a massive manipulation of the system," he says. "You can see the reaction of the people who run these companies. They never thought it was possible."


In the first days after the election, Facebook CEO and cofounder Mark Zuckerberg dismissed as "crazy" the idea that fake news on the social network influenced the election. Nearly a year later, he walked back the comment. And in a remarkably candid post for Yom Kippur, he apologized for "the ways my work was used to divide people rather than bring us together."

The same can be said of Twitter. For years, Twitter saw itself as a haven for free expression. A popular slogan internally described Twitter as "the free speech wing of the free speech party." Employees were particularly proud their service helped fuel the pro-democracy protests that swept across the Middle East in 2011, dubbed the Arab Spring.

"That fact has at times prevented the product from progressing for fear of upsetting or impeding the next Arab Spring," says Nathan Hubbard, a former VP at Twitter who left in mid-2016. "Of course, it's the same features that gave rise to the Arab Spring that gave rise to the fringes of Trumpism," he adds, referring to the activity of white supremacists. "It's just that the good came before the 'bad' organized."

The American awakening

Yet, the potential for the internet to corrode civil society was apparent years ago in some of the same Middle East countries where the Arab Spring movements previously took root.

Wael Ghonim, an Egyptian activist who became a leading voice in the Arab Spring, began noticing the polarizing impact of social media on politics in Egypt as far back as 2013. Ghonim, a former Google executive, raised his concerns in public speeches, but found it "wasn't hitting home" in the tech community.

"No one is incentivized to believe that their work has negative contributions to society," Ghonim said. The industry's thinking, he added, was that "it could be attributed to so many things," and may be "just something that has to do with Egypt."

It's only this past year, with social media being accused of warping political discourse in Silicon Valley's backyard, that the industry is taking these concerns more seriously.

Facebook, Twitter and Google have each worked to crack down on fake news and continue investigating Russia's activity. Twitter is finally issuing new rules to curb harassment. Facebook has taken its first modest steps to ease the filter bubble and pledged during the hearings to double the number of people working on safety and security issues to 20,000 next year (which would still average out to just one safety worker for every 100,000-plus Facebook users).

"We are moving from the age of denial into the age of realization," Ghonim says, "but it's not full realization yet."

Case in point: One early Facebook employee agreed that fake news, filter bubbles and disinformation campaigns are "very real problems," but stressed that "these issues ... are as old as the concept of media itself."

"It is completely misunderstanding the nature and threat of 'fake news' to suggest that Facebook and the people behind it, past and present, bear any personal responsibility for what happened in last year's election," the former employee said, "and the general state of political discourse in America today."