We are writing to urge you to VOTE “FOR” PROPOSAL 10 on the proxy card, which asks our Company to provide a report on Content Governance. The proposal makes the following request:

RESOLVED: Shareholders request Alphabet Inc. issue a report to shareholders at reasonable cost, omitting proprietary or legally privileged information, reviewing the efficacy of its enforcement of Google’s terms of service related to content policies and assessing the risks posed by content management controversies, including election interference, to the company’s finances, operations, and reputation.

SUPPORTING STATEMENT: Proponents recommend the report include assessment of the scope of platform abuses and address related ethical concerns.

Google, together with its subsidiary YouTube, is facing mounting risks from content shared on its platform that has run the gamut from hate speech to violence to political subterfuge. The Company’s ability to assess its content governance, reporting mechanisms, and enforcement capabilities will determine how successfully it navigates this complex landscape.

The challenges are substantial. In June 2017, for example, Google General Counsel Kent Walker acknowledged that “more needs to be done” immediately to fight terrorism online and promised that the Company would take a tougher stance to identify and remove content violating Google policies.1 In December 2017, YouTube CEO Susan Wojcicki acknowledged in a blog post that “some bad actors are exploiting [YouTube’s] openness to mislead, manipulate, harass or even harm.” She promised to continue the work to “stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube.”2

It is clear, however, that Google’s response to mounting controversies and enforcement of its terms of service continues to be problematic. In April 2018, a CNN investigation found that YouTube channels carried ads from over 300 companies and organizations promoting white nationalists, Nazis, pedophilia, conspiracy theories and North Korean propaganda.3 CNN reported: “Companies such as Adidas, Amazon, Cisco, Facebook, Hershey, Hilton, LinkedIn, Mozilla, Netflix, Nordstrom and Under Armour may have unknowingly helped finance some of these channels via the advertisements.”

Implementing the Proposal would represent a proactive step toward greater accountability in addressing the global controversies surrounding Google. We believe the Company would benefit from transparent reporting mechanisms and a comprehensive, forward-looking approach to the problems identified by the press, legislators, regulators, advocacy groups and Google’s users.

We believe shareholders should vote “FOR” the proposal for the following reasons:

1. Google’s controversies have a direct impact on the Company’s market value.

a) Ad Revenue: In March 2017, securities analysts at Nomura Instinet estimated that Google stood to lose $750 million in YouTube revenue because of boycotts by advertisers concerned about the placement of their ads within or adjacent to hateful content.4

Recent 2018 incidents involving questionable YouTube content pose a continued threat of advertiser boycotts or significant reductions in advertiser spending.5 For example, Under Armour said it would pause its advertising buy after its ads appeared on a white nationalist YouTube channel.

Procter & Gamble, one of the world’s largest advertisers, said that while it would resume advertising on YouTube, it would do so only on videos P&G has reviewed and approved: fewer than 10,000 YouTube channels, compared with about 3 million channels previously.6

Following YouTube’s April disclosures, the online publication Business Insider asked: “Does YouTube have the ability to police the service? At minimum, can YouTube's managers create a safe haven for advertisers? If not, the situation could have big implications for Google's parent company Alphabet Inc., and its relationship with Wall Street. Google is considered a growth company but investors want to know where that growth will come from in the future.”7

b) Data Privacy: Concerns regarding social media platforms have been reflected in industry market valuations. As Facebook Inc. has attracted widespread public attention related to its Cambridge Analytica data breach scandal, and a subsequent dramatic decline in market value, a report by ISS concluded that “Facebook is far from the only company in the Technology Sector to have issues concerning data privacy” and many companies “face the same risks as Facebook concerning data security concerns, as these problems are inherent to the business.”

c) Regulatory Risk: The Company must confront new challenges presented by the European Union’s General Data Protection Regulation (GDPR), which becomes effective today. The GDPR permits users to opt out of Google’s targeted advertising, which could reduce advertising revenue for the Company. In addition, violating GDPR mandates could subject the Company to fines of up to 4 percent of annual revenues.

In the United States, a majority of Americans say tougher government regulations are needed to rein in the power of Google and other social media companies, according to a CBS News/YouGov poll released April 10, 2018.8

2. The controversies surrounding Google constitute a significant public policy issue:

a) Scope of Impact: It is estimated that about 400 hours of video content are uploaded to YouTube every minute and over one billion hours of YouTube content are watched every day.9 Much of that video, however, contains content that may violate YouTube’s terms of service.10

b) Russian Propaganda: Despite investor requests at Alphabet’s AGM in June 2017 for information on “fake news” propagated over the platform, it took Congressional testimony nearly six months later for shareholders to learn the extent of Russian propaganda on Google’s platforms in the lead-up to the 2016 U.S. presidential election. During a Congressional hearing in October 2017, the Company admitted that abuse of the YouTube platform to enable Russia’s influence on the 2016 U.S. presidential campaign was greater than previously acknowledged; Google said agents of Russia’s Internet Research Agency uploaded more than 1,000 videos to YouTube.11 This raises critical concerns about the robustness of Google’s content governance approach.

c) Terrorist Content: Google has been under fire in recent years for inadequate responses to terrorist-related content. In 2017, the Company was sued by relatives of the victims of the 2015 San Bernardino attack for helping fund ISIS through advertisements on YouTube.12

In August 2016, the UK Parliament cited YouTube, together with several other social media giants, for “consciously failing” to tackle terrorist content. Parliament said these platforms had become “the vehicle of choice in spreading propaganda and the recruiting platforms for terrorism.”13

3. Google continues to have a mixed track record of protecting users from election interference, fake news, hate speech, sexual harassment, and violence.

a) Inadequate Reporting: The Ranking Digital Rights 2018 Corporate Accountability Index recommended that Google “Be transparent about policing of content. The company should disclose comprehensive data on content and account removals due to violations of the company’s terms of service.”

And while YouTube has started to respond to these ongoing controversies, there is more work to be done to shift from a defensive approach to a proactive one. In a December 2017 blog post, YouTube CEO Wojcicki listed several key measures YouTube is adopting to address abuse of the platform, including recruiting more content reviewers and deploying advanced machine learning technology to “tackl[e] issues at scale.” Wojcicki noted that the challenges would evolve and change, and that enforcement methods should evolve in response.14

In April 2018, YouTube released its first-ever Community Guidelines Enforcement disclosures,15 an analysis of how the company is managing content that falls outside its terms of service. According to the report, in the three-month period between October and December 2017, 8.3 million videos were removed from YouTube; 80 percent of those removed were “flagged” by software, 13 percent by “trusted flaggers,” and only 4 percent by regular users. Given the review system’s heavy reliance on software to identify problematic content, multiple analysts noted that the YouTube report did not describe the standards for the review process or reveal how frequently human reviewers rejected the software-driven initial “flag.” And despite the number of videos removed, YouTube reportedly continues to feature videos depicting bestiality, for example. This prompted a columnist for Mashable to write: “Because YouTube relies on a combination of imperfect algorithmic monitoring, and human complaints, these videos often receive millions of views before they're taken down. This automatically puts YouTube in a losing and ineffective defensive position when it comes to ridding its platform of prohibited content.”16

b) Emerging Issues: YouTube’s new service “Super Chat” was criticized in an April 2018 BuzzFeed analysis for generating extreme pay-to-play content: “Prominent far-right and white nationalist figures have for months been helping YouTube channels earn thousands of dollars thanks to frequently racist commenters who pay for the opportunity to make their voices heard…The use of Super Chats to spread and monetize racism and hate speech is the latest content moderation and product headache for YouTube, the internet's biggest video platform. It has also drawn fire for hosting videos that exploit children, spread extremism, feature bestiality, and spread conspiracy theories.”17

In the era of #MeToo, Google has not adequately addressed the unfortunate role of social media in perpetuating sexual harassment. According to the Pew Research Center, one in five women ages 18 to 29 has been sexually harassed online, and 83% of those women believe it is a major problem. The severity is also reflected in YouTube’s quarterly report, which shows sexual content is the No. 1 reason for flagging, at 30.1%.18

Conclusion:

Now is the time for Alphabet to evaluate the risks that arise when users violate its terms of service. We believe the potential harm to the Company’s reputation, finances, and operations warrants a closer look at the fallout of recent controversies, the material risks to shareholder value, and the efficacy of current policies, procedures, and corrective strategies.

“A vote FOR this proposal is warranted, because a report on assessing the effectiveness of enforcement of content policies could help provide shareholders with valuable information on how well the company is assessing and mitigating content-related controversies.”

“…the company does seem to be responding to each content management-related controversy in a reactive way. Shareholders would benefit from additional disclosure reviewing and compiling in one report the efficacy of its enforcement of its terms of service related to content policies, and assessing the risks posed by content management controversies, with statistics on the percentage of content that is flagged as offensive and how that may change over time, how quickly content is removed if it is offensive, or other appropriate quantitative metrics. Therefore this proposal merits shareholder support.”