The platform's controversial 'emotion experiment' could have serious consequences for the industry, David Benady writes.
Facebook’s "emotional contagion" experiment threatens to seriously undermine the reputation of social media and tech companies.
As regulators clamp down on perceived invasions of online privacy with "the right to be forgotten", the industry needs to take care.
The revelation that Facebook manipulated the news feeds of about 700,000 English-language users in 2012, without their consent, to test how emotions spread on the network has caused considerable outrage in the media. Chief among the concerns are the lack of consent and the clear element of manipulation.
The Information Commissioner’s Office has asked regulators in Ireland, where Facebook’s European operations are based, to look into whether this was a breach of privacy. Facebook insists it has played by the rules.
Some in the industry struggle to understand the fuss. Amy Kean, the head of futures at Havas Media, says she is not in the slightest bit outraged: "We test advertising and content all the time – optimising content to get the best results, playing with the user experience and baskets to drive the best sales, testing copy to see what gets people to click. The only difference here is that Facebook was measuring how people felt, not how much they were buying."
Others are not so sure. Dino Myers-Lamptey, the head of strategy at the7stars, thinks running the experiment without specific consent was unacceptable: "It just opens the doors up for more mass manipulation and propaganda."
He adds: "Great brands don’t try to influence people’s moods, they reflect them. Facebook has crossed a line by not seeking consent. If you consider their intentions, they were doing it for the purpose of keeping people on Facebook. That inevitably results in banning any negative comments on the site. Frightening."
While the effects of the experiment on users were small, the revelation that it took place could have huge ramifications for the industry. The recent introduction of the "right to be forgotten" ruling by the European Union is already leading to news articles being removed from Google's search engines in Europe. If suspicion of Facebook's activities snowballs, the contagion could be regulatory as well as emotional.
Neil Major, the strategy director at the social media agency Yomego, says: "Algorithms are a big part of making the service as valuable to people as possible. They did that experiment to improve the EdgeRank [which determines what is displayed and how high on the news feed] – that’s their secret sauce. It is in the DNA of tech companies to do this; it is trying to drive interaction."
He adds that, along with Google, Facebook needs to do this kind of experiment to offer the best results and prevent the algorithm being manipulated by spammers and other online ne’er-do-wells.
But Major fears that the revelation feeds into the perception that big data is part of what the campaigner Bruce Sterling calls "surveillance marketing".
As people get more knowledgeable about the activities of social networks, the question is whether they will become more or less tolerant of them. Facebook needs to tread carefully because the regulatory Big Brother is watching and is prepared to legislate.