12 October 2024

Big Tech manipulating research into its harm to society - Opinion

Timothy Graham

For almost a decade, researchers have been gathering evidence that the social media platform Facebook disproportionately amplifies low-quality content and misinformation.

So it was something of a surprise when in 2023 the journal Science published a study that found Facebook’s algorithms were not major drivers of misinformation during the 2020 United States election.

This study was funded by Facebook’s parent company, Meta. Several Meta employees were also part of the authorship team. It attracted extensive media coverage. It was also celebrated by Meta’s president of global affairs, Nick Clegg, who said it showed the company’s algorithms have “no detectable impact on polarisation, political attitudes or beliefs.”

But the findings have recently been thrown into doubt by a team of researchers led by Chhandak Bagchi from the University of Massachusetts Amherst. In an eLetter also published in Science, they argue the results were likely due to Facebook tinkering with its algorithm while the study was being conducted.
