A popular adage states that “there is no such thing as a free lunch.” Social media platforms such as Facebook, Instagram, Google, TikTok, and LinkedIn offer users access to their platforms, where they can connect and share information with one another at seemingly no cost. These companies then collect data on consumers, either selling the information itself or using it to target consumers with third-party advertising. This is the business model of the “modern age”: the product is not the services used by individual users but rather access for third-party companies to the users themselves and their data. According to Wharton Customer Analytics and Drexel marketing professor Elea Feit, “Most companies are collecting data these days on all the interactions, on all the places that they touch customers in the normal course of doing business.” The old adage should be updated for modern times to read, “If the service is free, you are the product” (Deloitte, 2021).

Facebook is well known for providing users access to third-party-developed applications that enhance the appeal of its platform. These applications (referred to as “apps”) come in many forms, from video-game farming simulators and video summaries of a user’s past year of photographs and posts to quick quizzes that claim to provide insight into one’s personality. One of these apps, Your Digital Life, developed by Aleksandr Kogan of Cambridge University in June 2014, offered users a standard personality-quiz experience. The app was downloaded by 270,000 people and installed on their Facebook accounts. It then gave Kogan access to the data not only of the users who downloaded the app but of their friends as well. Such data included posts, status updates, and even private messages. Ultimately, the profiles of an estimated 50 million users were downloaded to Kogan’s private database.

This information was then provided to Cambridge Analytica, a political data firm, for use in various political initiatives, most famously the 2016 U.S. presidential election. The firm entered politics with the goal of “giving conservatives big data tools to compete with Democrats” (Detrow, 2018) and had high-profile conservative activists involved from its inception. Wealthy Republican donor Robert Mercer and political strategist Steve Bannon of Breitbart News sat on its board, which helped direct its politically driven activities. The firm comprised data scientists, psychologists, social media experts, and creative designers. Using the data acquired through Kogan’s app, they built psychographic profiles of Facebook users to understand their political leanings and susceptibility to various advertising methods. Cambridge Analytica then built a network of news sites and blogs, interlinked to increase the apparent legitimacy of their claims, and combined Facebook’s targeted advertising capabilities with the profiles built from Kogan’s data. Strategies were developed for different user profiles, ranging from misinformation and the warping of facts to fear campaigns and the genuine conveyance of potentially influential information. The firm claims to have been a major driver not only of President Donald Trump’s surprise victory over Hillary Clinton in the 2016 election but of the 2016 Brexit referendum as well. In a post-election piece titled “The Data Gurus Who Anticipated the Election Results,” political consultant Frank Luntz declared, “There are no longer any experts except Cambridge Analytica. They were Trump’s digital team who figured out how to win” (Zizal, 2016).

The use of Facebook’s marketing capabilities and access to consumer data is at the core of Facebook’s business model. Facebook offers companies the opportunity to market to its 2.7 billion monthly active users and boasts an accessible API (application programming interface), which allows third-party developers to build apps that integrate easily with the Facebook platform. Data collection and utilization were widespread on the platform well before this incident and are nothing new; in fact, in 2014 Facebook expanded its data protections for consumers. Facebook introduced three restrictions that year. The first limited developers from accessing data on consumers’ friends who had not downloaded the data-gathering application. Second, if an app was not used for three months, the developer lost access to the user’s data. Finally, developers who had uploaded apps to the platform prior to 2014 had to submit to an audit to confirm that data collected outside of these guidelines had been deleted. Failure to follow these procedures would result in immediate removal from the platform. Facebook’s commitment to these audits, however, was minimal. In 2016, Facebook contacted Cambridge Analytica to conduct an audit and inquire whether the data collected against policy had been deleted; the only proof required was a signed certification that the information had in fact been deleted. More recently, there have been several updates to Facebook’s user privacy practices. In a piece titled “A Privacy-Focused Vision for Social Networking,” Facebook CEO Mark Zuckerberg wrote that because “people also want to interact privately, there’s also an opportunity to build a simpler platform that’s focused on privacy first.” His new vision stressed private interactions, encryption, reduction of permanency, safety, interoperability, and secure data storage. Noticeably missing from these core ideas are data sharing and ad targeting, which are what brought Facebook into the spotlight.

The matter gained national attention in March 2018 when Christopher Wylie, a Cambridge Analytica data scientist, publicly revealed the extent to which data had been collected without consumers’ awareness and its ultimate use for political ends, which some would consider manipulation. This resulted not only in media attention but also in a drop in Facebook users, a drop in share price, outrage from privacy advocates, and investigations by lawmakers. Cambridge Analytica executives were called before Congress multiple times, but in May 2018 the company closed, saying that “the siege of media coverage has driven away virtually all of the company’s customers and suppliers.” This left Facebook and its massive stores of user information as an unanswered question for regulators. States such as Vermont and California enacted a range of regulations requiring data brokers to register with the state or requiring companies to allow consumers to opt out of data gathering. According to the New York Times, “the California Consumer Privacy Act grants people in the state the right to opt-out of the sale of their personal information. Apps, websites and other services that exchange personal data for money—or for non-monetary compensation—must prominently display a ‘Do Not Sell My Data’ notice allowing people to stop the spread of their information” (Cowan and Singer, 2020).

After testifying to Congress, Zuckerberg pledged to take action to prevent a crisis like this from happening again. He pledged to restrict developers’ data access even further and to reduce the data that users provide to apps to just the user’s name, profile photo, and e-mail address. Facebook would also require developers to obtain approval from consumers before accessing their posts or private data. The company would also conduct a full forensic audit of suspicious apps and ban developers who did not agree to the increased oversight. Many speculate that Zuckerberg’s refusal to accept blame for the scandal, and his consistent blaming of rogue developers, was a way to prepare for a legal defense that might be required later. Investigations by the Federal Trade Commission and Congress, as well as fines in other countries including the United Kingdom and Italy, indicated that questions about Facebook’s responsibility for Cambridge Analytica’s gathering of user information without consent, and the broader issues the scandal raised, were far from over.

The Facebook-Cambridge Analytica scandal raises ethical questions about whether Facebook, as the operator of the social network platform, had an obligation to inform users that their data was being collected and whether the company should have provided data protection. The scandal is an example of improper collection and misuse of user data. Data collection, however, is relatively new and unregulated, which creates ethical gray areas. Many firms collect data similar to what Cambridge Analytica collected while informing customers that they are doing so; these firms then sell that properly acquired data to third parties, who may use it for political action as well. Other organizations purchase data without an understanding of how it was acquired, even if the method was unethical, and then use it for legitimate advertising purposes. Most large corporations are involved in large-scale data acquisition and analysis. An important lesson from this scandal is that the public’s eyes were opened to how detailed the information collected about them can be and how it can be used without their knowledge or approval.


Epilogue

Facebook consented to pay a £500,000 fine, set by the data-protection watchdog in the United Kingdom, for its role in the Cambridge Analytica scandal. A Facebook lawyer stated that the company had also enacted changes to restrict app developers’ access to information, and Facebook made no admission of liability (Criddle, 2020). While this particular scandal is over, the issue noted earlier remains: large social media companies’ “surveillance business models, their increasingly central position in digital society, and the power they now hold as a result” (Cobbe, 2020). For example, legal and ethical concern persists about how “microtargeting is used across the political spectrum,” with large social media firms allowing political campaigns to “slice and dice the electorate, dividing voters into small groups,” thus raising the “prospect of campaigns using these tactics to suppress turnout among supporters of other candidates” (Cobbe, 2020).

In fairness to Facebook, the company claimed that bad actors and their questionable information cannot always be easily detected. The company walks a fine line between preserving freedom of the press and propagating dangerous propaganda. Facebook did tighten its surveillance of political advertising during the 2020 presidential race between Donald Trump and Joe Biden.

Still, critics of large social media companies claim that by their very nature and DNA, they participate in what has been termed “surveillance capitalism” (Zuboff, 2019). That is, their business models, the way they make money, are designed to collect, analyze, and use private citizens’ data and sell it to advertisers without users’ knowledge. This practice is, according to this theory, part of “an economic system centered around the commodification of personal data with the core purpose of profit-making” and the “increasing price of data having limited accessibility to the purchase of personal data points to the richest in society” (Surveillance capitalism, n.d.).

With regard to how one group of stakeholders, consumers, feels about the dilemma of using social media platforms while remaining concerned about privacy, a 2018 Pew Research Center study concluded that “on one hand, the rapid growth of the platforms is testimony to their appeal to online Americans. On the other, this widespread use has been accompanied by rising user concerns about privacy and social media firms’ capacity to protect their data. All this adds up to a mixed picture about how Americans feel about social media” (Rainie, 2018). Other stakeholders and their stakes also present a more complicated but very important view of this controversy.


Questions for Discussion

1. What role(s) did Facebook have in this case?

2. Did Facebook violate its trust and obligations to consumer stakeholders? Explain.

3. What should Facebook have done differently in this case to be a responsible firm?

4. Have your views on and practices in using Facebook changed as a result of this case? Explain.

5. If you were an ethics and consumer consultant, what would you recommend to Facebook?
