Facebook’s disclosure last week that it had tinkered with about 700,000 users’ news feeds as part of a psychology experiment conducted in 2012 inadvertently laid bare what too few tech firms acknowledge: that they possess vast powers to closely monitor, test and even shape our behavior, often while we’re in the dark about their capabilities.
The publication of the study, which found that showing people slightly happier messages in their feeds caused them to post happier updates, and sadder messages prompted sadder updates, ignited a torrent of outrage from people who found it creepy that Facebook would play with unsuspecting users’ emotions. Because the study was conducted in partnership with academic researchers, it also appeared to violate long-held rules protecting people from becoming test subjects without providing informed consent. Several European privacy agencies have begun examining whether the study violated local privacy laws.
Facebook and much of the rest of the web are thriving petri dishes of social contact, and many social science researchers believe that by analyzing our behavior online, they may be able to figure out why and how ideas spread through groups, how we form our political views and what persuades us to act on them, and even why and how people fall in love.
Most web companies perform extensive experiments on users for product testing and other business purposes, but Facebook, to its credit, has been unusually forthcoming in teaming up with academics interested in researching questions that aren’t immediately pertinent to Facebook’s own business. Already, those efforts have yielded several important social science findings.
But there’s another benefit in encouraging research on Facebook: It is only by understanding the power of social media that we can begin to defend against its worst potential abuses. Facebook’s latest study proved it can influence people’s emotional states; aren’t you glad you know that? Critics who have long argued that Facebook is too powerful and that it needs to be regulated or monitored can now point to Facebook’s own study as evidence.