“By now,” writes Mike Isaac, “Facebook is very, very good at saying sorry.” [“Facebook Says It’s Sorry. We’ve Heard That Before.” The New York Times, 30 June 2014] The latest apology is for conducting a secret mood-altering experiment on nearly 700,000 users. As Kashmir Hill writes, “Unless you’ve spent the last couple of days in a Faraday pouch under a rock, you’ve heard about Facebook’s controversial ‘emotion manipulation’ study.” [“Facebook Added ‘Research’ To User Agreement 4 Months After Emotion Manipulation Study,” Forbes, 30 June 2014] She continues:
“Facebook data scientist Adam Kramer ran an experiment on 689,003 Facebook users two and a half years ago to find out whether emotions were contagious on the social network. It lasted for a week in January 2012. It came to light recently when he and his two co-researchers from Cornell University and University of California-SF published their study describing how users’ moods changed when Facebook curated the content of their News Feeds to highlight the good, happy stuff (for the lucky group) vs. the negative, depressing stuff (for the unlucky and hopefully-not-clinically-depressed group). The idea of Facebook manipulating users’ emotions for science — without telling them or explicitly asking them first — rubbed many the wrong way.”
Americans have become a nation of people who mostly want to read or hear news that is in line with their personal ideologies. Apparently, it takes too much time and effort to hear all sides and form an informed opinion. It’s easier to let someone else do your thinking for you. Business leaders engaged in social media understand this phenomenon and have developed algorithms that deliver the news they think you prefer reading. Therefore, it didn’t surprise me when I read about the emotion manipulation experiment. To say that it “rubbed people the wrong way” is an understatement. It made a lot of people angry — very angry. As the president and CEO of a company that offers big data analytics services, including marketing services, it upsets me as well. I have argued for some time that the unethical use of big data could create havoc and result in poorly thought-out legislation that could hurt consumers, academics, service providers, and other groups that have much to gain from big data analytics (see, for example, “Big Data and Ethical Corporate Behavior“).

Isaac reports that Facebook “declined to comment” for his post, even though it had “to publicly apologize again.” He details a litany of Facebook apologies over the years for doing things that have raised the hackles of users. Isaac muses, “Perhaps the company should make sure it’s worth pushing the limits of what people are comfortable with” before doing stupid things that are sure to backfire when they come to light. The latest brouhaha is a good example. “In hindsight, the research benefits of the paper may not have justified all of this anxiety,” Kramer, Facebook’s data scientist, wrote on his Facebook page. He got that right.
To be fair, not everyone was upset. Patrick Lin, an associate professor in the philosophy department at California Polytechnic State University, believes the furor caused by the Facebook experiment is overblown. [“Facebook Furor Is Overblown, Says CalPoly Ethicist,” by Steve Rosenbush, The Wall Street Journal, 30 June 2014] Lin says, “It’s not clear that Facebook’s attempts to manipulate user emotion are an ethical lapse.” Rosenbush continues:
“Facebook’s core feature already manipulates users, so the experimentation isn’t at odds with the nature of the service itself, according to Dr. Lin. The default mode for users of the Facebook platform, known as Top Stories, already is designed to highlight select posts, although users have the ability to switch to a strict chronological feed of their friends’ posts, he said. From a legal perspective, Facebook probably isn’t violating any laws, ‘and ethically, they probably have a lot more flexibility than a private service that people pay for,’ Dr. Lin says. He concedes that not all ethicists are as sanguine when it comes to the power and prerogative of social media.”
Perhaps the most unethical thing that Facebook did was change its data use policy after the fact. Hill reports:
“Critics and defenders alike pointed out that Facebook’s ‘permission’ came from its Data Use Policy which among its thousands of words informs people that their information might be used for ‘internal operations,’ including ‘research.’ However, we were all relying on what Facebook’s data policy says now. In January 2012, the policy did not say anything about users potentially being guinea pigs made to have a crappy day for science, nor that ‘research’ is something that might happen on the platform.”
The timing of Facebook’s PR disaster couldn’t be worse for the company. Mark Bergen reports, “Advertisers are flirting more with paid ad offerings from Facebook and Twitter, but it’s a cautious courtship.” [“Ad Age Survey: How Advertisers Are Spending on Facebook, Twitter and YouTube,” Advertising Age, 30 June 2014] If the furor continues, marketers might shun Facebook in favor of other social media outlets. According to Bergen, Facebook is currently the king of the mountain. “Around 84% of respondents reported using Facebook,” he writes, “making it the dominant social platform. In addition, Facebook commands a greater share of digital budgets than Twitter and YouTube.” Joshua Brustein points out that the latest uproar resulted from a self-inflicted wound. “The kerfuffle exists only because Facebook published a paper about the research in a scientific journal,” he writes. [“Facebook’s Emotional Manipulation Test Was Unethical—and So Is the Rest of Social Media,” Bloomberg BusinessWeek, 30 June 2014] Brustein continues:
“By scientific standards, however, pretty much all of social media is an unethical experiment. After all, Facebook doesn’t claim to show its users an unfiltered version of what their friends are posting. It uses some mysterious cocktail of factors to decide what mix of content is most likely to keep people on the site longer. The goal, always, is to increase the number of ads seen by users. Just about every decision made by Facebook is based on what will make money for Facebook. This has been good business but wouldn’t pass muster as science.”
What would harm Facebook most is users abandoning it in favor of something like Google+. Some pundits are suggesting that users do just that. For example, Emma Fuller, an expert on social media alternatives and etiquette at Omlet, offers recommendations about “How to Break Up With Mark Zuckerberg.” She writes:
“At first, Mark — and his Facebook app — seemed like the perfect guy: He ‘shared’ regularly, ‘checked in’ often, kept you ‘fed’ with the latest news, and paid attention to what you ‘liked.’ But before you knew it, Mark began to change: He blatantly disregarded your privacy, started keeping tabs on everything you did, and even provided intimate details about you to others. Unfortunately, although you’d like to change your relationship status and totally break up with Mark, ‘it’s complicated.’ So what can you do? How can you log out from Facebook and log in to a healthier relationship? … The key is to follow these three steps:
STEP #1: Don’t Wait By the Phone. Turn off mobile notifications so that Mark doesn’t message you throughout the day. Browse and post on Facebook only when it’s convenient for you.
STEP #2: Take a Break for a Week. When you’re ready, deactivate your Facebook account for one week to see how it goes. Mark, of course, is always willing to take you back.
STEP #3: Get a Better Boyfriend.”
Of course, Fuller thinks that Omlet is that better boyfriend. Omlet, she writes, is “a fun and social chat app that allows you to easily message friends, share photos, post videos, and express yourself without Mark looking over your shoulder.” I’m sure other services are willing to step up and offer themselves as better boyfriends as well. Even some businesses have started to reconsider their Facebook presence (for example, read “What Happens When You Break Up with Facebook: Nothing,” by Evie Nagy, Fast Company, 6 May 2014). Facebook needs to toe the line more carefully in the future so that it doesn’t have to say “I’m sorry” or “Sorry to see you go.”