Information

Organizer 1: Civil Society, Western European and Others Group (WEOG)
Organizer 2: Private Sector, Eastern European Group

Speaker 1: Marilia Maciel, Civil Society, Latin American and Caribbean Group (GRULAC)
Speaker 2: John FRANK, Private Sector, Western European and Others Group (WEOG)
Speaker 3: Felix Kartte, Intergovernmental Organization, Western European and Others Group (WEOG)

Additional Speakers:

Goetz Frommholz, Open Society Foundation

Format:

Round Table - Circle - 60 Min

Description: Cyber-enabled threats to democratic processes continue to be a concern around the world. In 2018, half of all advanced democracies holding national elections had their democratic processes targeted by cyber threat activity, a three-fold increase since 2015 and a trend we expect to continue in the coming year. Recognizing this threat and its increasing trend, many governments, civil society groups and industry actors have sought to take action by underscoring the need for action in diplomatic dialogue and intergovernmental fora, such as through the Paris Call for Trust and Security in Cyberspace and the 2018 G7 Charlevoix Commitment on Defending Democracy from Foreign Threats. But making progress on defending democracy and protecting election integrity requires a better understanding of the core definitions around these issues.

Foreign intervention in democratic elections, whether to promote democratic values or to achieve the opposite, can be seen as one of the foreign policy tools – ranging from diplomacy through negotiations to the provision of foreign aid or the imposition of economic sanctions – that countries have at their disposal. Beyond great powers, regional and international organizations have a well-documented history of influencing third countries’ governments in order to promote democratic values – namely, greater peace, prosperity, and pluralism. On the other hand, malicious interference in the political climate or elections of another country for nefarious purposes is also increasing in scale and impact, given the development of technological tools.
For both malicious actors and legitimate actors seeking to promote democracy or a particular political agenda, many of the tactics used can look similar: disseminating rumors (false or true) to damage rival candidates’ credibility, public threats or promises, public statements in support of candidates, provision of campaign funds, or increases in foreign aid or other types of assistance. This panel (full title: “Realpolitik foreign policy or manipulation of sovereign democratic processes: where's the red line?”) is intended to facilitate a discussion around foreign interference and the use of disinformation. The Paris Call for Trust and Security in Cyberspace, which has been signed by over 500 entities (governments, civil society and industry organizations) worldwide, calls on the world to work together to “prevent interference in electoral processes.” While foreign election interference is not a new phenomenon, traditional tactics can now be deployed at a much greater scale with the help of new technologies. This is why it is critical to make progress in understanding the norms and rules of the road in this space.

Throughout the course of the roundtable, experts and roundtable participants will answer the following questions: How do we define foreign interference? Where are the red lines between public diplomacy and election interference? Are there situations in which foreign election interference and the corresponding use of cyber-enabled tactics such as disinformation can be deemed acceptable? How can disinformation be used for nefarious and legitimate purposes? What international legal frameworks govern election interference and disinformation? Finally, what can we do to mitigate these threats in order to prevent interference in electoral processes?

Format:

- Overview of election interference trends and threats (5 minutes)
- Context setting on public diplomacy tools in the information space (5 minutes)
- Case study deep dive (e.g. Ukraine, South America) (10 minutes)
- Moderated discussion of roundtable questions (25 minutes)
- Open mic session (10 minutes)
- Conclusion and next steps (5 minutes)

Expected Outcomes: Very few, if any, discussions around disinformation and election interference have focused on defining norms of behavior around what is acceptable activity in this space. In order to make progress on the commitments of multistakeholder agreements such as the G7 Charlevoix Commitment on Defending Democracy from Foreign Threats or the Paris Call for Trust and Security in Cyberspace, we must be able to accurately define the issue and then agree as an international multistakeholder community on what is permissible behavior. This panel is intended to be a starting point for making progress on this pillar, and the findings of the discussion will be used in follow-up roundtables at future gatherings of the “Friends of the Paris Call” or other initiatives designed to make progress on cybersecurity norms against interference in elections and democratic processes.

Policy Question(s):

Norm on preventing interference in electoral processes: How do we define foreign interference?

Where are the red lines between public diplomacy and election interference?

Are there situations in which foreign election interference and the corresponding use of cyber-enabled tactics such as disinformation can be deemed acceptable?

How can disinformation be used for both nefarious and legitimate purposes?

What international legal frameworks govern election interference and disinformation?

Finally, what can we do to mitigate these threats in order to prevent interference in electoral processes?

Relevance to Theme: Security and safety are prerequisites to economic growth and a healthy digital environment beneficial to all. Achieving security and safety requires a set of commonly understood and followed rules of behavior in the digital space that guide actions and punish deviations from those rules. The Paris Call for Trust and Security in Cyberspace, which has been signed by over 500 entities (governments, civil society and industry organizations) worldwide, calls on the world to work together to “prevent interference in electoral processes.” However, to make progress on norms for security and safety, including noninterference in elections, we must first define the scope and terminology of election interference.

Relevance to Internet Governance: Norms of behavior in cyberspace are a critical component of governing our actions online. The increasing trend of interference in elections over the past decade, aided by new technologies, has led governments, industry and civil society actors to call for new norms against interference in elections. This panel seeks to examine in depth one specific aspect of election interference: the legitimate and illegitimate use and manipulation of information to influence an election.

Discussion Facilitation:

An open mic session follows the main session to enable the audience and remote participants to join the conversation and present their experiences, opinions, suggestions, etc., on how to move the debate forward. Audience discussants will either queue at their stakeholder-assigned mics, or the panel rapporteurs will bring the mics to discussants and rotate, with online participants having their own, equal queue.

Online Participation:

We will have two online moderators to assist with the online conversation. To broaden participation, social media (Twitter and Facebook) will also be employed by the online moderators, who will be in charge of monitoring social media using a dedicated hashtag.

Proposed Additional Tools: In order to broaden the conversation before, during and after this roundtable we would like to set up a dedicated Microsoft Teams channel in which interested participants can contribute to the discussion by adding questions, sending news articles and following up with experts. During the session Teams can be leveraged for its accessibility features (such as Translator, screen viewer, dictation) to enable those with disabilities to contribute to the conversation.

Very few, if any, discussions around disinformation and election interference have focused on defining norms of behavior around what is acceptable activity in this space. In order to make progress on the commitments of multistakeholder agreements such as the G7 Charlevoix Commitment or the Paris Call for Trust and Security in Cyberspace, we must be able to accurately define the issue and then agree as an international multistakeholder community on what is permissible behavior. This panel is intended to be a starting point for making progress on this pillar, and the findings of the discussion will be used in follow-up roundtables at future gatherings of the “Paris Call Communities” or other initiatives designed to make progress on cybersecurity norms against interference in elections and democratic processes.

2. Discussion Areas:

There was consensus that clear red lines include interference in election infrastructure and voter disenfranchisement. Hence, discussion on norms that protect electoral processes and infrastructure is needed. Furthermore, given the lack of clarity of international law in this space, there was agreement that a set of criteria to measure disinformation would help move discussions forward. The criteria brought forward were transparency, extent of deception, purpose, scale and effect. In particular, participants agreed that more transparency is required to understand what is happening on social media platforms and to be able to analyse its consequences and effects. The scale of the operations we witness also constitutes a crucial factor for the panelists, as the phenomenon as such is not new but the extent of its use is. However, it was also highlighted that the playbook of threat actors is much broader than just spreading false content. The simple manipulation of divisive domestic debates in ways that would not be debunked by fact-checkers poses another significant problem. The panelists agreed that we are living in a post-fake-news era and now have to talk about false narratives. Lastly, participants stressed that there must be a balance between openness online and combating disinformation: human rights such as freedom of speech must be safeguarded.

3. Policy Recommendations or Suggestions for the Way Forward:

Participants agreed that a good way forward would be the further development of the above-mentioned criteria to better measure harmful content online. Before these could be translated into international rules, however, customary declarations at the political level have to be made. In general, creating transparency on social media platforms was seen as another crucial starting point for governments and civil society to base their analyses and policy decisions on. This would also bring more legal certainty to the liability regime of internet platforms for third-party content. Once a more straightforward problem definition is agreed on, online platforms should be encouraged to work with governments on that basis. Lastly, understanding the actual impact online information has on offline behavior was also considered essential to better address the issue at hand.

4. Other Initiatives Addressing the Session Issues:

Industry participants referenced the multistakeholder initiative 'Paris Call for Trust and Security in Cyberspace' as an example of defining norms and rules of behaviour in cyberspace. It constitutes the largest multistakeholder initiative related to cyberspace in history and the most broadly accepted political statement the international community has formulated so far. Among the signatories are 77 states and over 900 private sector and civil society organizations worldwide. Furthermore, the G7 Charlevoix Commitment was named as another important political statement from the world's leading industrial states. It recognizes the threat of foreign actors seeking to undermine democratic institutions and electoral processes. Other participants referenced academic work that reflected on how to address the issue from a normative perspective. Please find the relevant work listed below.

5. Making Progress for Tackled Issues:

The panelists agreed to further explore the common elements of an influence operation based on five criteria: transparency, extent of deception, purpose, scale, and effects. Using these criteria to assess disinformation campaigns constitutes a first step towards developing a better basis of knowledge and helps to verify whether, for example, electoral infrastructure has actually been compromised. Once there is a more straightforward problem definition, stakeholders should clearly encourage online platforms to further work on the issue.

6. Estimated Participation:

There were approximately 80 participants onsite, of whom over 50% were women. Online, there were 8 participants following the discussion, but no questions were asked through the platform.

7. Reflection to Gender Issues:

Gender issues were not discussed per se; however, other human rights-related issues with regard to minority inclusivity were touched upon. Traditionally underrepresented parties actively contributed to the panel.