Facebook Experiment Creates A Trust Gap
By now the pattern is familiar: Facebook (NYSE: FB) alters the news feed in some way, changes its policy on providing data to advertisers or otherwise violates users' trust. Bloggers howl. Users say they're outraged. Some demand an apology; others threaten to delete their accounts. But in the end, nothing really happens.
In Facebook's latest gaffe, the social network disclosed last week that it had manipulated news feeds as part of an experiment to determine whether negative language in posts had a lasting impact on users' moods. The answer: not really.
The research, conducted on 700,000 randomly chosen accounts in 2012, found that de-emphasizing posts with emotional words in them had minimal effect. “People produced an average of one fewer emotional word, per thousand words, over the following week,” Adam Kramer, one of the study's coauthors, said in a Facebook post.
“In hindsight, the research benefits of the paper may not have justified all of this anxiety,” he added.
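For a sense of what that effect size means, here is a minimal sketch, in Python, of the metric Kramer describes: emotional words per thousand words of a user's posts. Everything in it is illustrative; the tiny word list is a hypothetical stand-in for the LIWC dictionary the study actually used to classify words.

    # Minimal sketch (not the study's code) of Kramer's metric: emotional
    # words per 1,000 words of a user's posts. The word set below is a
    # hypothetical stand-in; the actual study classified words with LIWC.
    EMOTIONAL_WORDS = {"happy", "sad", "love", "hate", "great", "awful"}

    def emotional_rate_per_thousand(posts):
        """Return the number of emotional words per 1,000 words across posts."""
        words = [w.strip(".,!?").lower() for post in posts for w in post.split()]
        if not words:
            return 0.0
        hits = sum(1 for w in words if w in EMOTIONAL_WORDS)
        return 1000.0 * hits / len(words)

    # Kramer's "one fewer emotional word, per thousand words" corresponds to
    # this rate dropping by about 1.0 over the week after the manipulation.
    print(emotional_rate_per_thousand(["Had a great day!", "Love this song."]))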
The idea that Facebook is not just collecting data but manipulating the user experience as part of an experiment struck many as tone-deaf, or even unethical. Even Susan Fiske, the Princeton University professor who edited the study for publication, had misgivings about it.
“I was concerned,” Fiske told the Atlantic, “until I queried the authors and they said their local institutional review board had approved it -- and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time.”
Facebook's business model depends in part on news feed experiments: The "like" button, for instance, is a mechanism for flagging positive posts, the kind of content advertisers want to be associated with.
Still, if the pattern of past Facebook missteps holds, most users will calm down: The site plays too big a role in their lives for them to give it up.
"I think people appreciate the connections they have through Facebook, and over time they tend to overlook these things," said Debra Aho Williamson, an analyst for eMarketer. After each transgression, the world looks for evidence of an impact -- such as declining user growth or engagement -- and it just never happens.
"We have not seen a mass exodus from Facebook [after] anything they've done," she said.
So, users may love Facebook a little less, but they still rely on it like any other utility. “Utilities typically aren’t loved -- often, they are hated -- but everyone has to keep their lights on,” said David Berkowitz, head of marketing at social ad agency MRY.
Facebook is a social utility; it's where your friends and coworkers are, and for most people, the switching costs are just too high.
"You're just not going to stop because you're worried about a little mood experiment," said Mike Vorhaus, president of Magid Advisors. "If you get cyberattacked, people go on your page and create problems, then you may opt out."
As Facebook becomes less loved -- if not less used -- the company has responded by buying up networks that truly are loved (for now), like Instagram and WhatsApp. The flagship network remains the company's core social utility, but increasingly Facebook is a collection of social technologies targeted at different needs and demographics.
"The reality is, it costs a lot of money to disrupt yourself," Vorhaus said. "These are attempts at disrupting themselves."
A brief history of Facebook gaffes:
2006: Soon after Facebook launched the News Feed, Mark Zuckerberg issued his first public apology over a lack of privacy controls.
2007: Facebook launched “Beacon,” an ad initiative that tracked what users purchased on the Web and turned those transactions into news feed ads. A class action suit ensued, and Beacon was shut down in 2009. In 2011, Zuckerberg admitted it was a mistake.
2009: Zuckerberg responded to a furor over changes to Facebook's terms of use regarding who owns users' information.
2010: Frustrated by Facebook's confusing privacy settings, disgruntled users mounted "Quit Facebook Day," which received a lot of coverage but largely flopped as a movement.
2011: Facebook settled with the FTC over charges that it changed privacy settings without warning users. “Facebook’s innovation does not have to come at the expense of consumer privacy," then-FTC chairman Jon Leibowitz said.
2013: Facebook settled a class action lawsuit over the use of users' images and likenesses in news feed ads called “Sponsored Stories.” The company agreed to change its terms of service to make clearer that “likes” and other actions on the network can be used for advertising purposes.
2013: Edward Snowden's leaks revealed Facebook to be a participant in the U.S. government's PRISM program; the company now reports the number of data requests it receives from governments around the world.
2014: A new Facebook app update will allow the network to collect data through smartphone microphones to identify the TV show, song or movie a user may be watching. That data will be stored indefinitely.