Panels

In this discussion we will review the dynamics and patterns of online abuse on social networks. How does a minor scuffle so quickly become an avalanche of online harassment? Why are women, people of color, and the queer and trans community disproportionately targeted? What steps can we take to build safe spaces on the internet? A killfile or block button is no longer a sufficient tool to prevent abuse, and the common advice “don’t feed the troll” ignores the contemporary climate of online abuse. We will discuss tactics to minimize online abuse and the potential for structural change.

This discussion will consider topics related to what MIT professor Rosalind Picard calls “affective computing”: emotion recognition in artificial intelligence and the use of technology to simulate empathy or respond to mood. Sensors in automobiles might respond to a stressed-out driver with softer light or upbeat music. Commercial surveillance applications increasingly measure facial movements to profile the reactions of customers. Meanwhile, data collected on social networks is used to engineer relationships: a team of social media researchers recently proposed an “early breakup warning system” for Twitter built from publicly available data alone. Affective computing is automating the largely undervalued and often gendered work known as “emotional labor.” A nanny, waitress, community manager, journalist, administrative assistant, or counselor is subject to the fallacious conflation of “doing what she loves” with labor, and is therefore often underpaid for her services.