Research Cannot Be the Justification for Compromising People’s Privacy

For months, we’ve attempted to work with New York University to provide three of their researchers with the precise access they’ve asked for in a privacy-protected way. Today, we disabled the accounts, apps, Pages and platform access associated with NYU’s Ad Observatory Project and its operators after our repeated attempts to bring their research into compliance with our Terms. NYU’s Ad Observatory project studied political ads using unauthorized means to access and collect data from Facebook, in violation of our Terms of Service. We took these actions to stop unauthorized scraping and protect people’s privacy in line with our privacy program under the FTC Order.

The researchers gathered data by creating a browser extension that was programmed to evade our detection systems and scrape data such as usernames, ads, links to user profiles and “Why am I seeing this ad?” information, some of which is not publicly viewable on Facebook. The extension also collected data about Facebook users who did not install it or consent to the collection. The researchers had previously archived this information in a publicly available database that is now offline.

We offer researchers a number of privacy-protective methods to collect and analyze data. We welcome research that holds us accountable and doesn’t compromise the security of our platform or the privacy of the people who use it. That’s why we created tools like the Ad Library and launched initiatives like Data for Good and Facebook Open Research & Transparency (FORT) to provide privacy-protected APIs and data sets for the academic community.

A year ago, in the summer of 2020, even before they launched the tool, we told the researchers that their Ad Observatory extension would violate our Terms. In October, we sent them a formal letter notifying them of the violation of our Terms of Service and granted them 45 days to comply with our request to stop scraping data from our website. The deadline ended on November 30, long after Election Day. We continued to engage with the researchers to address our privacy concerns and offered them ways to obtain data that did not violate our Terms.

Earlier this year, we invited researchers, including the ones from NYU, to safely access US 2020 Elections ad targeting data through FORT’s Researcher Platform. This offered the Ad Observatory researchers a more comprehensive data set than the one they created by scraping data on Facebook. The researchers had the opportunity to use the data set, which is designed to be privacy-protective, instead of relying on scraping, but they declined.

We made it clear in a series of posts earlier this year that we take unauthorized data scraping seriously, and when we find instances of scraping we investigate and take action to protect our platform. While the Ad Observatory project may be well-intentioned, its ongoing violations of protections against scraping cannot be ignored and should be remediated.

Collecting data via scraping is an industry-wide problem that jeopardizes people’s privacy, and we stated our public position on it as recently as April. The researchers knowingly violated our Terms against scraping, which we went to great lengths to explain to them over the past year. Today’s action doesn’t change our commitment to providing more transparency around ads on Facebook or our ongoing collaborations with academia. We’ll continue to provide ways for responsible researchers to conduct studies that are in the public interest while protecting the security of our platform and the privacy of people who use it.
