2014-08-04



A recently revealed study that Facebook conducted on its users triggered both legal and ethical concerns about privacy, transparency and respondent consent. A U.S. Senator called for an FTC investigation, which seems likely. What does this incident mean for the survey, opinion and marketing research profession, and what do researchers think about it?

Researchers ran an A/B experiment with Facebook users (almost 700,000 people) in early 2012, altering the number of positive and negative posts in users’ newsfeeds to see whether it affected their mood via “emotional contagion.” Apparently, it worked.
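
For illustration only, here is a minimal sketch of how the analysis behind such an A/B test might be structured: users are randomly assigned to a condition (for example, a feed with fewer negative posts or fewer positive posts), and the emotional content of their own subsequent posts is then compared across groups. The function names, condition labels and numbers below are hypothetical, not Facebook’s actual methodology or code.

```python
# Hypothetical sketch of an emotional-contagion A/B analysis (not Facebook's code).
# Users are randomly assigned to a condition; we then compare the average rate of
# positive words in each group's subsequent posts.
import random
from statistics import mean

def assign_condition(user_ids):
    """Randomly split users between 'reduced_negative' and 'reduced_positive' feeds."""
    return {uid: random.choice(["reduced_negative", "reduced_positive"]) for uid in user_ids}

def compare_positive_word_rates(records):
    """records: iterable of (condition, positive_word_rate) pairs.
    Returns the mean positive-word rate per condition."""
    by_condition = {}
    for condition, rate in records:
        by_condition.setdefault(condition, []).append(rate)
    return {cond: mean(rates) for cond, rates in by_condition.items()}

# Fabricated example purely to show the shape of the comparison:
conditions = assign_condition(range(8))
records = [(cond, random.uniform(0.04, 0.06)) for cond in conditions.values()]
print(compare_positive_word_rates(records))
```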

Some users were outraged to learn they had been a part of the study. Privacy activists were even more upset: the Electronic Privacy Information Center (EPIC), in calling for FTC action, said that Facebook “purposefully messed with people’s minds.”

Lead researcher Adam Kramer went so far as to publicly apologize, but a spokesperson for Facebook told National Journal that “companies that want to improve their services use the information their customers provide, whether their privacy policy uses the word ‘research’ or not.”

However, a big reason for activist ire, and why this incident has direct legal implications, is that Facebook appears to have altered its data use policy to allow for such research studies several months after it conducted the study, not before. The FTC will probably not be happy: the agency generally expects prior consent for material changes to such terms of use, and anything less may be deemed a “deceptive” trade practice.

The FTC’s broad authority in consumer privacy and data security is why the agency is the top legal issue for the research profession in 2014.

According to EPIC’s complaint to the FTC, “At the time of the experiment, Facebook did not state in the Data Use Policy that user data would be used for research purposes. Facebook also failed to inform users that their personal information would be shared with researchers.” EPIC said that Facebook also may have violated its consent order with the FTC, “which required the company to obtain users’ affirmative express consent prior to sharing user information with third parties.”

Senator Mark Warner (D-VA) expressed a lot less outrage, but no less concern: “I think many consumers were surprised to learn they had given permission by agreeing to Facebook’s terms of service. And I think the industry could benefit from a conversation about what are the appropriate rules of the road going forward.”

Warner sent his own questions for the FTC, including: “Does the FTC make any distinction between passively observing user data versus actively manipulating it? Should consumers be provided more of an explicit option to opt-in or opt-out of such studies? Additionally, is it appropriate for any research findings to be shared with participants prior to public dissemination?”

The answers to Warner’s questions will be of direct interest to the research profession and will bear on our ability to conduct research in the U.S. in the future.

Warner is also pondering the need for legislation to expand FTC authority. While the senator avoided stating his own preferences for what such legislation should do or look like, he did ask if the FTC requires “any additional regulatory authority or technology in order to monitor this type of data-mining.”

Stuart Pardau, an MRA member and an attorney in the research industry, was also struck by Facebook’s failure to provide notice and get respondent consent. Were Facebook users notified, and did they consent to “potentially be participants in controlled behavioral experiments of this sort?” The broader public generally understands and accepts that online behaviors and postings will be tracked, Pardau said, and such tracking is generally disclosed in a company’s privacy policies (even more so since California revised its online privacy law). By contrast, being “served up customized news feeds as part of control groups for the express purpose of drawing broader conclusions of your emotional responses” is “creepily close to Facebook treating its members like unknowing participants in a lab experiment.” As Pardau notes, such activity was not “evident or clear from Facebook’s privacy policy or other terms of use.”

Top Legal Takeaway for MRA Members

As MRA always recommends to research companies: say what you do and do what you say. Anything less may bring unwanted negative attention from regulators and policymakers, like the FTC and Senator Warner, and lead to costly legal action. Make sure that your posted privacy policy, for instance, doesn’t promise more data security protections than you actually provide or claim that you’re compliant with the U.S.-EU Safe Harbor when you aren’t.

Ethical Concerns and the MR Profession Response

Public reaction to the Facebook emotion study also sparked interesting ethical debates within the research profession.

MRA member Janet Savoie, vice president of Online Survey Solution, took umbrage at the Facebook study for violating research ethics and tarnishing the research profession. “The rules I work under as a researcher include obtaining informed consent from potential respondents and giving them the option to opt out,” which are standard ethical practices in marketing research (and a part of most research covered by the MRA Code of Marketing Research Standards). Doing research with Facebook users “without their consent is a clear breach of these ethics” and users’ privacy. “As a consumer and member of Facebook, I am angry, but as a researcher I am even angrier, because it shines a bad light on our industry. With all the carryover anger we already get from the public’s confusing survey research and telemarketing, we don’t need any more bad press.”



Jeffrey Henning, a member of the MRA Board of Directors and president of Researchscape International, felt that the Facebook emotion study ran afoul of research standards to some extent because the company failed to explain what they were doing and why. “Web businesses are constantly conducting A/B tests and more advanced behavioral experiments on their websites to improve sales conversions or increase time on the site. Since Facebook algorithmically determines what to show any individual, they no doubt were trying to determine if producing happy thoughts vs. sad thoughts in their users would increase engagement of users and, if so, by how much. If they had framed their study as a way to increase engagement, would it have received the backlash that it did? I don’t think so. Given the broad language of their terms of service, it is unlikely they violated any law, even if – as seems increasingly clear – they violated the standards for conducting psychological experiments.”

Annie Pettit, vice-president of research standards at Research Now, was also unimpressed with the response to the Facebook study. She pointed out that “every bit of research we do, every interaction we have with people, is intended to manipulate emotions,” and suggested the need to revisit MR standards in light of how Big Data analytics are changing the face of marketing research.

Kathryn Korostoff, president of Research Rockstar and an MRA member, noted that “as a market researcher, I am comfortable with information collected in an ethical way being used in aggregate form. Did Facebook reveal the names of people who posted negative comments following reading negative posts? If not, and assuming they didn’t break any laws, then I am really not concerned.” However, Korostoff was seriously concerned that “it casts a negative halo on researchers as people who manipulate emotions in order to test our theories,” but that view is too simplistic, she said. “Advertisers do A/B testing all the time: which version of an online ad got the most clicks (positive response)? This isn’t totally apples to apples, but in both cases someone is trying to see how the presentation of online information leads to positive or negative outcomes.”
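
As a rough sketch of the kind of ad A/B test Korostoff describes, comparing which of two ad versions “got the most clicks” typically comes down to comparing click-through rates, for instance with a simple two-proportion z-test. The version labels and numbers below are made up for illustration, not drawn from any study mentioned in this article.

```python
# Hypothetical two-proportion z-test comparing click-through rates of two ad versions.
from math import sqrt

def two_proportion_z(clicks_a, shown_a, clicks_b, shown_b):
    """Return the z statistic for the difference in click rates between versions A and B."""
    p_a, p_b = clicks_a / shown_a, clicks_b / shown_b
    p_pool = (clicks_a + clicks_b) / (shown_a + shown_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / shown_a + 1 / shown_b))
    return (p_a - p_b) / se

# Made-up example: version A clicked 120 times out of 5,000 impressions,
# version B clicked 90 times out of 5,000.
print(two_proportion_z(120, 5000, 90, 5000))
```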

Setting aside the legal and ethical issues, Korostoff suggested that the Facebook research “may inform how we analyze traditional survey data. For example, based on this data, it appears we need to be incorporating current social media sentiment analysis (in the form of analyzing online comments and product reviews) when analyzing results from customer satisfaction or brand awareness surveys—as what people have read online may sway their responses. A shocking implication? No. But I am happy to have some data about it rather than just a gut feel.”

Andrew Grenville, chief research officer for Vision Critical, lamented that the Facebook emotion study reinforces the prevailing public attitude toward most companies, as demonstrated by a recent Edelman study: “People feel companies are not listening to and communicating transparently with them.”

Finally, Danah Boyd, senior researcher at Microsoft Research, highlighted a little-debated concern among researchers in academia and government: the usefulness of Institutional Review Boards (IRBs) for approving research with human subjects. “IRBs are an abysmal mechanism for actually accounting for ethics in research. By and large, they’re structured to make certain that the university will not be liable. Ethics aren’t a checklist. Nor are they a universal. Navigating ethics involves a process of working through the benefits and costs of a research act and making a conscientious decision about how to move forward. Reasonable people differ on what they think is ethical. And disciplines have different standards for how to navigate ethics. But we’ve trained an entire generation of scholars that ethics equals ‘that which gets past the IRB’ which is a travesty. We need researchers to systematically think about how their practices alter the world in ways that benefit and harm people. We need ethics to not just be tacked on, but to be an integral part of how everyone thinks about what they study, build, and do.”

More to Come

Given the ongoing interest in this topic within the research profession, it will be the subject of a panel discussion at the MRA’s upcoming Corporate Researchers Conference in Chicago in September.

Howard Fienberg (@hfienberg) is director of government affairs for the Marketing Research Association (MRA). He is MRA’s lobbyist in the U.S. on behalf of the survey, opinion and marketing research profession.

