Content of the Session: Surveillance through the collection of population data has historically functioned as an oppressive tool to control the bodies of women and other marginalised groups, and is closely related to and informed by colonial modes of managing populations underpinned by white supremacist, capitalist and heteropatriarchal ideologies. Today, “big data,” metadata and the technologies used to collect, store and analyse them are similarly by no means neutral, and come with their own biases and resultant exclusions.

The “informatisation” of the body in the digital age is increasingly redefining how we understand “embodiment” and bodily experience. At the level of ICTs and their relationship with public-policy development, both State and powerful non-State actors have come to view the body as data in order to provide services and/or segment and target markets, employing new ways to monitor, assess, analyse, categorise and ultimately manage and control the body. The term “dataveillance,” which combines “data” and “surveillance,” has been used to describe these systematic data-based surveillance practices that involve sorting and aggregating large quantities of data to monitor, track and regulate people and populations.

In this session, participants will discuss and highlight the connection between “big data,” surveillance and sexuality in the gathering and exploitation of data relating to internet users’ online identities and behaviours. The session will explore the evolution and normalisation of surveillance through “big data” and its relationship with the growing reliance on algorithmic decision-making, particularly in the development and implementation of public policy. Particular emphasis will be placed on how this impacts “at-risk” and marginalised groups, such as women and people marginalised on the basis of their sexual orientation and/or gender identity. Participants will present key issues the internet governance community must consider about the relationship between, and impact of, power, agency and consent when developing and applying standards and guidelines for the collection and use of internet user data by both State and non-State actors.

Relevance of the Session: Algorithmic decision-making and data surveillance are often seen as neutral technological tools. However, questions of privilege extend to data and the politics of algorithms in several ways: this is a space in which multiple forms of discrimination on the basis of race, religion, class, caste, sexuality, gender and more intersect to exclude, discriminate against and further marginalise through lack of inclusion, distortion or hypervisibility in data practices. Being counted in the data is often mandatory for populations at greater risk of discrimination on account of gender, class or race, since data collection is enmeshed in the platforms through which they make their voices heard and in the mechanisms through which they access welfare. But concerns around privacy and regulation in data collection and governance also pose a dilemma.

Without adequate and responsive norms and guidelines governing the collection and use of their data, citizens whose gendered, raced and classed bodies occupy identities and/or presentations outside of the mainstream paradigm (that is: white, privileged, male and heterosexual) are necessarily exposed, and become the subjects of discrimination through technologies otherwise deemed “neutral.”

Tag 1: Big Data
Tag 2: Surveillance
Tag 3: Public Policy

Interventions: Inputs will be made by the following participants, covering the following issues:
Bishakha Datta - the legal construction of obscenity in the digital realm, and how sexual surveillance applies or is practised, particularly in respect of online content characterised as “obscene.”
Horacio Sívori - the impact of dataveillance and algorithmic decision-making on LGBTI struggles in Latin America, with a focus on Brazil.
Jeanette Hoffman - informed consent in data protection.
Ralph Bendrath - perspectives on data protection in the EU.
Katarzyna Szymielewicz - the human rights implications of the digitalisation of social policy in Poland.

Diversity: Speakers have been selected on a range of criteria in order to promote maximum diversity in regional representation and expertise, stakeholder groups, level of profile in the internet governance and human rights communities, and perspectives on an issue of common interest.

In addition to their area of expertise, perspectives and profile in policy-making, further speakers will be selected in a manner that brings the voices of women and sexually marginalised people together with the voices of regulators, researchers from varied geographical regions, and activists working on issues concerning “big data” and social policy.

Online Participation: Throughout the session, the hashtags #IGF2017 and #genderit will be used to curate and facilitate online discussion and participation from off-site participants through Twitter. APC will also solicit questions ahead of time from those who cannot attend in person by publicising the workshop on Twitter, and through our Exploratory Research on Sexuality and ICTs (EroTICs) project - a global network of 50 activists, academics, and organisations working at the intersection of sexual and digital rights. The network works on sexuality issues including LGBT rights, sex work, sex education, SRHR, and gender-based violence, and includes internet freedom advocates, policy experts, and technologists.

A dedicated communications person will be available to facilitate online participation and to increase the visibility of the session and IGF among the networks of the co-organisers. The online moderator will keep the online participation channel open and will maintain communication with the onsite moderator to ensure online participants are able to make interventions and raise questions. This person will also maintain a live visual aid throughout the session, building a chart that identifies the key issues raised.

Discussion facilitation: The session will start with a five-minute briefing by the moderator which outlines the background and objectives of the workshop, introducing the key concepts of big data, sexuality and surveillance. The speakers will provide additional context and specific examples to help establish common ground for the group’s work.

Participants will then work in small groups to share cases from their regions. Using the examples as case studies, groups will be invited to identify and reflect on data practices that duly acknowledge the agency and consent of users. Examples may include practices which:
oppose the non-consensual collection of data;
empower women and sexual minorities;
display adequate care in protecting the data, privacy, and anonymity of activists and the communities they engage with;
work to expose and redress algorithmic discrimination.
The groups will be invited to report back for a final round of comments and highlights.

The moderator will tie the discussion into each of these sections to ensure the conversation is coherent, informative, and useful. The session dynamic will be 30 minutes for setting the baseline and framing the issues, 20 minutes for group work and 40 minutes for reporting back, highlighting common threads, emerging practices/scenarios and ways forward.