Facebook’s algorithm is ‘influential’ but doesn’t necessarily change beliefs, researchers say

A sign at the corporate campus of Meta, parent company of Facebook, in Menlo Park, California, February 15, 2022. In four new studies, researchers found complicated results from experiments on Facebook's and Instagram's algorithms, suggesting there was no silver bullet to fixing the platforms. [Jim Wilson/The New York Times]

SAN FRANCISCO - The algorithms powering Facebook and Instagram, which drive what billions of people see on the social networks, have been in the crosshairs of lawmakers, activists and regulators for years. Many have called for the algorithms to be abolished to stem the spread of viral misinformation and to prevent the inflammation of political divisions.

But four new studies published Thursday - including one that examined data from 208 million Americans who used Facebook during the 2020 presidential election - complicate that narrative.

In the papers, researchers from the University of Texas, New York University, Princeton and other institutions found that removing some key functions of the social platforms' algorithms had "no measurable effects" on people's political beliefs. In one experiment on Facebook's algorithm, people's knowledge of political news declined when...