Nearly three years ago Meta announced it was partnering with more than a dozen independent researchers to study the impact Facebook and Instagram had on the 2020 election. Both Meta and the researchers promised the project, which would rely on troves of internal data, would deliver an independent look at issues like polarization and misinformation.
Now, we have the first results of that research in the form of four peer-reviewed papers published in the journals Science and Nature. The studies offer an intriguing new look at how Facebook and Instagram’s algorithms affected what users saw in the run-up to the 2020 presidential election.
The papers are also a notable milestone for Meta. The company has at times had a strained relationship with independent researchers and been accused of “transparency theater” in its efforts to make more data available to those wishing to understand what’s happening on its platforms. In a statement, Meta’s policy chief Nick Clegg said that the research suggests Facebook may not be as influential in shaping its users’ political beliefs as many believe. “The experimental studies add to a growing body of research showing there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization, or have meaningful effects on key political attitudes, beliefs or behaviors,” he wrote.
The researchers’ initial findings, however, appear to paint a more complex picture.
One study in Nature looked at the effect of so-called “echo chambers,” or when users are exposed to a large amount of “like-minded” sources. While the researchers confirm that most users in the US see a majority of content from “like-minded friends, Pages and groups,” they note that not all of it is explicitly political or news-related. They also found that decreasing the amount of “like-minded” content reduced engagement, but didn’t measurably change users’ beliefs or attitudes.
While the authors note the results don’t account for the “cumulative effects” years of social media use may have had on their subjects, they do suggest the effects of echo chambers are often mischaracterized.
Another study in Nature looked at the effect of chronological feeds compared with algorithmically-generated ones. That issue gained particular prominence in 2021, thanks to revelations from whistleblower Frances Haugen, who has advocated for a return to chronological feeds. Unsurprisingly, the researchers concluded that Facebook and Instagram’s algorithmic feeds “strongly influenced users’ experiences.”
“The Chronological Feed dramatically reduced the amount of time users spent on the platform, reduced how much users engaged with content when they were on the platform, and altered the mix of content they were served,” the authors write. “Users saw more content from ideologically moderate friends and sources with mixed audiences; more political content; more content from untrustworthy sources; and less content classified as uncivil or containing slur words than they would have on the Algorithmic Feed.”
At the same time, the researchers say that a chronological feed “did not cause detectable changes in downstream political attitudes, knowledge, or offline behavior.”
Likewise, another study, this one in Science, on the effects of reshared content in the run-up to the 2020 election found that removing reshared content “substantially decreases the amount of political news, including content from untrustworthy sources” but didn’t “significantly affect political polarization or any measure of individual-level political attitudes.”
Finally, researchers analyzed the political news stories that appeared in users’ feeds in the context of whether they were liberal or conservative. They concluded that Facebook is “substantially segregated ideologically” but that “ideological segregation manifests far more in content posted by Pages and Groups than in content posted by friends.” They also found conservative users were far more likely to see content from “untrustworthy” sources, as well as articles rated false by the company’s third-party fact checkers.
The researchers said the results were a “manifestation of how Pages and Groups provide a very powerful curation and dissemination machine that is used especially effectively by sources with predominantly conservative audiences.”
While some of the findings look good for Meta, which has long argued that political content is only a small minority of what most users see, one of the most notable takeaways from the research is that there aren’t obvious solutions for addressing the polarization that does exist on social media. “The results of these experiments do not show that the platforms are not the problem, but they show that they are not the solution,” the University of Konstanz’s David Garcia, who was part of the research team, told Science.
Source: www.engadget.com