Facebook News Feed Study: Participants Selected 'Randomly,' Unable To Find Out If They Were Targeted
Facebook users hoping to find out whether they were among the nearly 700,000 people used as guinea pigs for a 2012 study will find it nearly impossible to learn not only whether their information was manipulated but also how it could be used in the future.
A study published Friday in the Proceedings of the National Academy of Sciences revealed that, for 689,003 randomly selected users, Facebook tweaked the algorithm that determines which of their friends’ status updates appear in the news feed. Some users saw fewer positive updates, others fewer negative ones. Online outrage was immediate, driven by the perception that Facebook had trampled on the privacy of unwitting users, yet the company itself has been mum on how users can avoid being swept up in such studies in the future.
With over 1.2 billion users around the world, Facebook gives its social scientists an enviable vantage point for examining an array of sociological trends, from how sports fans feel after their team loses to, as in this case, how someone feels when friends seem to be sharing only positive or only negative emotions. The problem for psychology researchers is that Facebook, while asserting that its study proved emotional contagion exists, did not obtain explicit permission from its users, as standard research protocol requires.
Instead, the company pointed to the terms of service agreement accepted by every user, which explains that an individual’s data may be used for “internal operations, including troubleshooting, data analysis, testing, research, and service improvement.” The study did not say whether users were selected anonymously, noting only that “participants were randomly selected based on their User ID.”
The study manipulated the news feeds of only a small fraction of users, but the only sure-fire way to avoid being swept up in the growing number of online research studies is not just to delete Facebook but to drop offline altogether.
MIT Technology Review reported earlier this year that as many as 15 percent of 10,000 websites are conducting controlled testing at any given moment. Facebook, it seems, is simply the only one to admit it.
“Meanwhile, don’t plan on the company to stop this sort of testing, no matter how loud the outcry over this study,” wrote Forbes contributor Dan Diamond. “There are too many business-driven reasons for Facebook to keep tweaking its platform and learning more about how its users respond to different triggers.”
Facebook did not respond to repeated requests for comment from the International Business Times.