Obfuscates your feelings on Facebook
Please visit Install Go Rando to set up the extension in your preferred browser.
Facebook’s “reactions” let you express how you feel about a link, photo, or status. While such data might be helpful for your friends, these recorded feelings also enable increased surveillance, government profiling, more targeted advertising, and emotional manipulation. Go Rando is a web browser extension that obfuscates your feelings on Facebook. Every time you click “Like”, Go Rando randomly chooses one of the six “reactions” for you. Over time, you appear to Facebook’s algorithms as someone whose feelings are emotionally “balanced”—as someone who feels Angry as much as Haha or Sad as much as Love. You can still choose a specific reaction if you want to, but even that choice will be obscured by an emotion profile increasingly filled with noise. In other words, Facebook won’t know if your reaction was genuine or not. Want to see what Facebook feels like when your emotions are obscured? Then Go Rando!
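The substitution described above can be sketched in a few lines of TypeScript. This is a hypothetical illustration of the idea, not the extension's actual source; the name `randomReaction` is invented here:

```typescript
// The six Facebook reactions available at the time of writing.
const REACTIONS = ["Like", "Love", "Haha", "Wow", "Sad", "Angry"] as const;
type Reaction = (typeof REACTIONS)[number];

// When the user clicks "Like", substitute a uniformly random reaction.
function randomReaction(): Reaction {
  return REACTIONS[Math.floor(Math.random() * REACTIONS.length)];
}
```

Because each reaction is chosen with equal probability (1/6), the emotion profile accumulated over many clicks tends toward the "balanced" noise the extension aims for.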
What’s Wrong with Reactions?
We’ve known for years now that your “Likes” on Facebook not only tell your friends what you saw today, but also change what you see on Facebook in the future. For example, Facebook uses your “Like” activity to target ads, to decide which posts appear on your News Feed, and to manipulate your emotions as part of its own studies of human behavior. At the same time, Facebook shares your data with other corporations and government agencies, fueling increased surveillance and algorithmic decision making.
So if our “Likes” were already shared widely, what’s the harm in a user selecting “Angry”, “Sad”, or “Love”? When “Like” was the only option, it was a multi-purpose signifier that could mean many things, and was thus harder to interpret algorithmically. Facebook’s “reactions” are still reductive of human emotion, but they suggest just enough nuance to encourage algorithmic analysis of state of mind. While these analyses will be of questionable accuracy at best, they’ll still be used to generate an emotion profile for every Facebook user. When combined with other data available to state agencies and corporations, the potential abuses and misuses are significant.
For example, emotion profiles could affect one’s economic future. Amazon could use your reactions to feed dynamic pricing. Banks might see “Sad” or “Angry” customers as a higher credit risk for a loan. Or a future employer could treat a “Sad” profile as a sign to negotiate a lower salary or to skip that candidate altogether.
Civilian police use analytics software that draws on social media data for the purposes of intelligence gathering, crowd management, and threat analysis. From “Likes” to hashtags to emojis, recent articles have revealed how this data gets used to track activist locations in real time during protests, or to analyze the threat an individual poses based on how they “feel.” The addition of Facebook’s reactions into these systems will lead to further (questionable) analyses of state of mind, possibly using how one feels as partial justification for surveillance, arrest, or worse.
The US Government and other state actors have long been tracking everyone’s digital activities in an attempt to predict future security threats. As they integrate every “Angry”, “Sad”, or “Wow” we post into their prediction algorithms, that data could lead to increased surveillance, placement on watch lists, and/or rejection of individuals at the border.
Finally, this should all be considered within the context of the recent US presidential election and Brexit votes. As is now being revealed, the Trump campaign engaged the predictive analytics company Cambridge Analytica to use social media data to ascertain the personalities of individuals and groups. This allowed the campaign to glean people’s “needs and fears, and how they are likely to behave.” Such data was then used to craft custom messages for voters based on a division of “the US population into 32 personality types.” (The same company played a role in the “leave” side of the Brexit vote in Great Britain). Given the policy intentions of the Trump administration on issues like immigration, terrorism, and more, it is likely that these campaign techniques will now become government surveillance techniques.
Why Go Rando?
Go Rando adopts the strategy of obfuscation to disrupt Facebook’s increasingly fine-grained data collection practices. It is admittedly unlikely that everyone will start using Go Rando tomorrow, but if they did, it could have broad collective effects against state and corporate emotion profiling. Regardless, for any one user it provides individual benefits by disrupting Facebook’s News Feed algorithm (and thus blunting the “filter bubble” effect), resisting the site’s attempts at emotional manipulation, and confusing corporate and governmental surveillance.
Further, Go Rando provokes questions about the uses of Facebook’s “reactions.” Who benefits when you mark yourself as “Angry” or “Sad” in response to a particular post? Which groups have the most to lose? And how might the uses of this data change the nature of privacy and democracy over the coming months or years?
Finally, when you see an incongruous or “inappropriate” reaction from your friends in the future, perhaps this might be a sign of a potential ally in the battle between individual freedom and the big data state-corporate machine that seeks to use our data against us.
Go Rando will premiere as part of Blinding Pleasures, an exhibition curated by Filippo Lorenzin at Arebyte Gallery in London, UK. The exhibition will include custom software that visitors to the gallery can use to generate a custom emotion profile based on their Facebook reactions.
I did an interview with Régine Debatty of We Make Money Not Art about Go Rando, the Facebook News Feed, emotional surveillance, and more.
Go Rando is free and its source is open and hosted on GitHub.
Frequently Asked Questions
For questions about (de)installation, privacy, and usage, please see the FAQ.