Next week I’ll be giving a couple of talks in New York. The first will be at the International Workshop on Obfuscation at New York University. Led by Finn Brunton and Helen Nissenbaum, the workshop “convenes researchers, scientists, developers, and artists to discuss a broad range of technical, theoretical, and policy approaches to obfuscation, from tools that anonymize users’ social media data to new methods for writing code itself.” The title of my talk is “Go Rando First and Ask Questions Later: Resisting Emotional Surveillance with Noisy Feelings.” I’ll be speaking about Go Rando as an example method for obfuscating personal identity, both in an outward sense (e.g. within big data) and an inward one (e.g. how the project encourages reconsideration of one’s own reactions on a case-by-case basis). I’ll also talk a bit about my works Facebook Demetricator and ScareMail within this same context, as well as the role of obfuscatory artworks in the public perception of governmental and corporate surveillance.
The second, with my co-author Nicole Brown, will be at Theorizing the Web 2017 in Queens. Here’s the abstract:
The (In)visibility of Black Death: Questioning the Image on Social Media Feeds
From its earliest days, Facebook’s News Feed has included images uploaded by users. But the site’s early status interface began with a text-based prompt: “[Username] is…”. Though Facebook still supports textual content and users still post text-based messages, the feed is now dominated by the image. One significant visual subject of this domination has been violence against black bodies. Within the contexts of the Black Lives Matter movement and the 24-hour news cycle, social media platforms—and the increasing number of images they display—make visible how national bloodlust and anti-blackness converge to create and feed a compulsive desire to consume images of black death. But what else do these displays of anti-black violence reveal? Are emotional traumas created by repeated exposure to images that reconstruct and display black experiences with violence? We argue that this exposure distorts how one’s blackness constructs perceptions of who is human and who is not human. We also consider the role of the News Feed algorithm in this context, where being seen and metrically responded to can lead to increased visibility (and thus, power), while overexposure to the same material can eventually lead to perceptual invisibility and trauma. Critical net art practices that treat software systems as recomposable material can help us examine these issues. Specifically, we use the artwork Textbook to consider how images make black death both visible and invisible. Textbook is a browser extension that removes images from Facebook, allowing users to test for themselves how images affect their experience and reading of the site. Such a mechanism can enable resistance against the forced consumption of images of black death without erasing the subjects and actions they portray from critical inquiry.
Given how the relationships between image, metrics, and algorithms have led to a torrent of dehumanizing and socially damaging visual material on the News Feed, it is time to more critically interrogate the role of the image on social media feeds.