Sketchy Facebook Experiment Shows It's Their Lab and We're Just the Guinea Pigs

The disclosure that Facebook tried to manipulate the emotions of users will be just the start of a scary attempt at social engineering.

Facebook is using all of us as its experimental guinea pigs, and it has reported the lab results in the Proceedings of the National Academy of Sciences.

Approximately 689,000 Facebook users had their news feeds artificially engineered in an attempt to manipulate their emotions. Facebook wanted to see whether it could make emotional states spread across social media, and do so stealthily. The resulting paper, ominously titled "Experimental evidence of massive-scale emotional contagion through social networks," sums up the experiment thus:

We show, via a massive experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. [...]

The experiment manipulated the extent to which people were exposed to emotional expressions in their News Feed. [...]

When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.

The effect Facebook found was modest. But this may mark the point at which Facebook ceased primarily operating as a social networking company and began operating as a social engineering company. And the implications, frankly, aren't great. What Facebook did is very different from, and much more troubling than, say, the emotionally manipulative headlines on viral sites like Upworthy or ViralNova. As Valleywag notes, no one consented to participate in the experiment (except via the terms of service); the site labeled it "a horror show of anti-ethics."

MIT professor Kate Crawford told the Wall Street Journal, "It's completely unacceptable for the terms of service to force everybody on Facebook to participate in experiments." NPR points out that the company could just as easily run an experiment to see what happens when it makes users feel ignored or ostracized.

Facebook's short-term game here is probably figuring out which types of posts garner the most favorable reactions from users and prioritizing them, making time spent on the site more fun. But Zuckerberg et al. will inevitably get much better at this, especially as retailers buy more sponsored ads on social media.

For what it's worth, Facebook claims that its internal review practices have "come a long way" since the study was conducted in early 2012. Which, for all we know, means it stopped publishing the results. Facebook certainly doesn't seem to think this is a big deal.

If you don't think it's a big deal either, remember that Facebook is ultimately interested in prying open your wallet. If it's working on a system to make you happy or sad, it's also working on pairing that power with targeted advertising tools. Someday soon Facebook will be actively fucking with you to try to get you to buy shit, if it isn't already.