Meta let researchers study whether it stokes polarization. The results were polarizing.

from CJR,
8/3/23:

For much of the past decade, academic researchers have been trying to persuade Meta, the company formerly known as Facebook, to share internal data about the behavior of users on its platforms, so that they might understand how—if at all—the sites’ algorithms influence people’s political views and behavior. The company suggested that it might offer such access; back in 2018, it even launched a project designed to share data. But the amount of usable information it ended up offering to researchers was minuscule and, in some cases, significantly flawed. As I reported for CJR two years ago this month, Meta also thwarted attempts by social scientists to collect their own data through scraping, and even disabled the accounts of some researchers. All this left the impression that the company had no interest in facilitating academic scrutiny.

It was more than a little surprising, then, when social scientists last week published not one but four new studies based on user data that Meta had shared with them, part of a research project that the company launched in 2020 to analyze users’ behavior both during and immediately after that year’s presidential election.