This week, we learned that Cambridge Analytica, a data mining firm, obtained the personal data of 50 million Facebook users without their consent. The story hit Facebook hard, and it has raised new privacy concerns about the social network. It was widely reported that a former Cambridge Analytica employee claimed the data “stolen” from Facebook users was used to manipulate voters through micro-targeting during the U.S. presidential election. Cambridge Analytica was also allegedly involved in manipulating the public during the Brexit referendum in the United Kingdom, as well as during political events in Kenya, Brazil, Mexico, and other countries around the world.
Cambridge Analytica reportedly used data collected by a personality quiz app on Facebook developed by a Cambridge University researcher. The data was used to predict voter behavior and then manipulate voters with propaganda. The data was also used by the firm to help political parties design their campaigns, help their leaders design speeches to talk about what people want to hear, and help political leaders take pride in what they do best: making false promises.
All of this is obviously wrong in many different ways. But contrary to what you may have read elsewhere, none of this is Facebook’s fault.
Facebook’s apps platform is built to let third-party apps gain access to your personal data on the social network. But the social network has implemented multiple measures over the years to make sure that nefarious companies like Cambridge Analytica don’t misuse its users’ data. When a user gives a third-party app access to their personal data, Facebook specifically informs them of the exact type of data they are sharing. The company even lets users limit third-party apps’ access to their personal data with a single click of a button.
The process is incredibly straightforward. But many Facebook users never bother to read the simple notice when they give a random third-party app access to their personal data. That has become a major problem for the company, and its founder Mark Zuckerberg confirmed earlier that the firm is taking steps to limit third-party apps’ access to its users’ personal data. It will also provide new tools that make it easy for users to revoke certain apps’ access to their personal data. And in the wake of the recent revelations, apps that a user hasn’t used for three months will automatically have their access revoked.
The press is blaming Facebook for all of this. But Facebook shouldn’t be held responsible for its users’ actions. You see, once a Facebook user gives a third-party app access to their personal data, the third party can do anything it wants with that data. Improper use of the data is explicitly against Facebook’s terms of service, but it is immensely difficult for Facebook to constantly monitor the millions of apps on its platform and how they are actually using its users’ data. Ultimately, it is up to you, not Facebook, to decide whether to trust third parties with your data.
Yes, there are some things Facebook could have done to help prevent this type of event. The company found out about Cambridge Analytica’s wrongdoing in 2015 and ordered the data mining firm to destroy the data because it violated Facebook’s terms of service. But instead of conducting a full audit to confirm that Cambridge Analytica had destroyed all of the data, Facebook was satisfied with a legal certification.
The problem is, Cambridge Analytica is full of vile culprits, ones that use sex workers to orchestrate outrageous strategies for its customers, mostly political parties, to take down their clients’ competitors. And The New York Times, the UK’s Channel 4 and The Guardian, some of the world’s most prestigious publications, all claim that Cambridge Analytica never did delete the Facebook user data it had gathered.
Facebook, the world’s largest and most profitable social network, should also have been more cognizant of the privacy implications of sharing its users’ data with third parties. And that’s true even if the firm is technically not responsible for what its users decide to share. The company could have built mechanisms to automatically limit third-party apps’ access to user data after a period of inactivity or after suspicious activity, for example. Even better, it could have made it impossible for third parties to access unnecessary information, such as a user’s friend lists or posts, unless the app developer went through some sort of verification process.
Facebook has a lot of work to do to rebuild trust with its users. In the interim, its executives will be busy meeting with government regulators while its engineers and security researchers work to protect its platform from the other Cambridge Analyticas out there.
The takeaway from this entire scandal for us, the billion-plus regular users who make up Facebook’s customer base, should be to increase our personal awareness of whom we share our personal data with, which companies we trust, and which companies deserve to use our personal data. Data mining is a dirty but hugely profitable business, and we should be more protective of our own privacy instead of relying on companies like Facebook to do the right thing.