
Question:

Facebook has been in the news facing criticism of its privacy policies, its sharing of customer information with Cambridge Analytica, and its role in attempts to influence the 2016 election. In June 2014, Facebook released a study entitled “Experimental evidence of massive-scale emotional contagion through social networks.” It was published in the Proceedings of the National Academy of Sciences (PNAS), a prestigious, peer-reviewed scientific journal. The paper explains how social media can readily transfer emotional states from person to person through Facebook’s News Feed platform. Facebook conducted an experiment on members to see how they would respond when the proportion of positive and negative posts in their News Feeds was altered. The results suggest that emotional contagion does occur online and that users’ positive expressions can generate positive reactions, while negative expressions can generate negative reactions.

Facebook has two separate value propositions aimed at two different markets with entirely different goals. Originally, Facebook’s main market was its end users—people looking to connect with family and friends. At first, it was aimed only at college students at a handful of elite schools. The site is now open to anyone with an Internet connection. Users can share status updates and photographs with friends and family. And all of this comes at no cost to the users.

Facebook’s other major market is advertisers, who pay for access to information about Facebook’s users. The company regularly gathers data on users’ page views and browsing behavior in order to display targeted advertisements on behalf of its advertising partners.

The value proposition of the Facebook News Feed experiment was to determine whether emotional manipulation would be possible through the use of social networks. This clearly could be of great value to one of Facebook’s target audiences—its advertisers.

The results suggest that the emotions of friends on social networks influence our own emotions, thereby demonstrating emotional contagion via social networks. Emotional contagion is the tendency to feel and express emotions similar to and influenced by those of others. Originally, it was studied by psychologists as the transference of emotions between two people.

According to Sandra Collins, a social psychologist and University of Notre Dame professor of management, it is clearly unethical to conduct psychological experiments without the informed consent of the test subjects. Even if tests do not always measure what the researchers conducting them claim, subjects at the very least need to know that they are part of a test. The subjects of this test on Facebook were not explicitly informed that they were participating in an emotional contagion experiment. Facebook did not obtain informed consent as it is generally defined by researchers, nor did it allow participants to opt out. When information about the experiment was released, the media response was overwhelmingly critical. Tech blogs, newspapers, and media reports reacted quickly. Josh Constine of TechCrunch wrote:

“ . . . there is some material danger to experiments that depress people. Some people who are at risk of depression were almost surely part of Facebook’s study group that were shown a more depressing feed, which could be considered dangerous. Facebook will endure a whole new level of backlash if any of those participants were found to have committed suicide or had other depression-related outcomes after the study.”

The New York Times quoted Brian Blau, a technology analyst with the research firm Gartner, “Facebook didn’t do anything illegal, but they didn’t do right by their customers. Doing psychological testing on people crosses the line.” Facebook should have informed its users, he said. “They keep on pushing the boundaries, and this is one of the reasons people are upset.”

While some of the researchers have since expressed regret about the experiment, Facebook as a company was unapologetic. The company maintained that it had received consent from its users through its terms of service. A Facebook spokesperson defended the research, saying, “We do research to improve our services and make the content people see on Facebook as relevant and engaging as possible. . . . We carefully consider what research we do and have a strong internal review process.”

In the wake of these more recent events, Facebook is changing its privacy settings, but it still collects an enormous amount of information about its users and can use that information to manipulate what they see. Moreover, these practices are not listed on Facebook’s main terms of service page; users must click a link inside a separate set of terms to reach the data policy page, making the terms onerous to find. This positioning raises questions about how Facebook will use its users’ behavior in the future.


Critical Thinking Questions

1. How should Facebook respond to the 2014 research situation? How could an earlier response have helped the company avoid the 2018 controversies and keep the trust of its users?

2. Should the company promise never again to conduct an experiment of this sort? Should it go even further and explicitly ban research intended to manipulate the responses of its users?

3. How can Facebook balance the concerns of its users with the necessity of generating revenue through advertising?

4. What processes or structures should Facebook establish to make sure it does not encounter these issues again?

5. Respond in writing to the issues presented in this case by preparing two documents: a communication strategy memo and a professional business letter to advertisers.
