Photo by Brett Jordan on Unsplash

EXPERT REACTION: Facebook and Instagram 'echo chambers' may not be driving our political polarisation

Peer-reviewed: This work was reviewed and scrutinised by relevant independent experts.

Changing the ‘echo chambers’ that exist on Facebook and Instagram may not actually change our political polarisation or attitudes, according to a series of studies published in Nature and Science. The studies challenge views about the extent to which the ‘echo chambers’ of social media drive political polarisation. The series of papers found that the platform algorithms are extremely influential in terms of what people see and in shaping their on-platform experiences, and that there is significant ideological segregation in political news exposure. However, a series of experiments during the 2020 presidential election in the US suggests that while changing or removing the algorithms does affect what people see and their level of engagement on the platforms, the changes did not notably affect political attitudes or political polarisation.

Journal/conference: Nature, Science

Research: Paper

Organisation/s: University of Texas, New York University, Meta USA

Funder: The costs associated with the research (such as participant fees, recruitment and data collection) were paid by Meta. Ancillary support (for example, research assistants and course buyouts) was sourced by academics from the Democracy Fund, the Guggenheim Foundation, the John S. and James L. Knight Foundation, the Charles Koch Foundation, the Hewlett Foundation, the Alfred P. Sloan Foundation, the University of Texas at Austin, New York University, Stanford University, the Stanford Institute for Economic Policy Research and the University of Wisconsin-Madison.

Media release

From: The content of this press release is the responsibility of the authors

First Findings from US 2020 Facebook & Instagram Election Study Released

Unprecedented research in the context of the 2020 presidential election reveals that algorithms are extremely influential in people’s on-platform experiences and that there is significant ideological segregation in political news exposure, but, among consenting study participants, changes to critical aspects of the algorithms that determined what they saw did not sway political attitudes

Today, academics from U.S. colleges and universities working in collaboration with researchers at Meta published findings from the first set of four papers as part of the most comprehensive research project to date examining the role of social media in American democracy. The papers, which focus primarily on how critical aspects of the algorithms that determine what people see in their feeds affect what people see and believe, were peer-reviewed and published in Science and Nature.

The academic team proposed and selected specific research questions and study designs with the explicit agreement that the only reasons Meta could reject such designs would be for legal, privacy, or logistical (i.e., infeasibility) reasons. Meta could not restrict or censor findings, and the academic lead authors had final say over writing and research decisions. With this unprecedented access to data and research collaboration, the team found:

1.    Algorithms are extremely influential in terms of what people see and in shaping their on-platform experiences.

2.    There is significant ideological segregation in political news exposure.

3.    Three experiments conducted with consenting participants during the 2020 election period suggest that although algorithm adjustments significantly change what people see and their level of engagement on the platforms, the three-month experimental modifications did not notably affect political attitudes.

The project was announced in 2020 after internal researchers at Meta initiated a partnership with Professor Talia Jomini Stroud, founder and Director of the Center for Media Engagement at the University of Texas at Austin, and Professor Joshua A. Tucker, co-founder and co-director of the Center for Social Media and Politics at New York University and Director of the NYU Jordan Center for the Advanced Study of Russia, around the impact of Facebook and Instagram on the 2020 U.S. elections.

“Social scientists have been limited in the study of social media’s impact on U.S. democracy,” said Stroud and Tucker. “We now know just how influential the algorithm is in shaping people’s on-platform experiences, but we also know that changing the algorithm for even a few months isn't likely to change people’s political attitudes. What we don't know is why. It could be because the length of time for which the algorithms were changed wasn’t long enough, or these platforms have been around for decades already, or that while Facebook and Instagram are influential sources of information, they are not people’s only sources.”

The core research team included 15 additional academic researchers with expertise in the four areas this project focused on: political polarization, political participation, (mis)information and knowledge, and beliefs about democratic norms and the legitimacy of democratic institutions.

The team worked with Meta researchers to design experimental studies with consenting users who answered survey questions and shared data about their on-platform behavior. The team also analyzed platform-wide phenomena based on the behavior of all adult U.S. users of the platform. Platform-wide data was only made available to the academic researchers in aggregated form to protect user privacy. Additional findings include:

Ideological segregation on Facebook

●     Many political news URLs were seen, and engaged with, primarily by conservatives or liberals, but not both.

●     Ideological segregation was higher for political news URLs posted by Pages and in Groups than for content posted by users.

●     There was an asymmetry between conservative and liberal audiences: far more political news URLs were seen almost exclusively by conservatives than were seen almost exclusively by liberals.

●     The large majority (97%) of political news URLs (posted at least 100 times) on Facebook rated as false by Meta’s third-party fact-checking program were seen by more conservatives than liberals, although the proportion of political news URLs rated as false was very low.

Impacts of removing reshared content on Facebook

●     Removing reshared content on Facebook substantially decreased the amount of political news and content from untrustworthy sources people saw in their feeds, decreased overall clicks and reactions, and reduced clicks on posts from partisan news sources.

○     Removing reshares reduced the proportion of political content in people’s feeds by nearly 20% and the proportion of political news by more than half.

○      Content from untrustworthy sources made up only 2.6% of Facebook feeds on average, but removing reshares reduced it by 30.6%.

●     Removing reshared content on Facebook decreased news knowledge among the study participants, and did not significantly affect political polarization or other individual-level political attitudes.

Impacts of altering feed algorithms from personalized to chronological

●     Replacing study participants’ algorithmically ranked feeds on Facebook and Instagram with a simple chronological ranking, meaning that they saw the newest content first, substantially decreased the time participants spent on the platforms and how much they engaged with posts there (a simplified sketch of the two ranking approaches follows this list).

○     Study participants in the Algorithmic Feed group spent 73% more time on the platforms each day than typical U.S. monthly active users; the Chronological Feed group spent only 37% more.

●     The chronologically ordered feed significantly increased content from moderate friends and sources with ideologically mixed audiences on Facebook; it also increased the amount of political and untrustworthy content relative to the default algorithmic feed. The chronological feed decreased uncivil content.

○     When presented in chronological order, political content (appearing in 13.5% of participants’ feeds on Facebook and 5.3% on Instagram on average) increased by 15.2% on Facebook and 4.8% on Instagram.

○     When participants viewed the chronological feed, content from untrustworthy sources, making up 2.6% of Facebook feeds and 1.3% of Instagram feeds on average, increased by 68.8% and 22.1%, respectively.

○     Posts with uncivil content on Facebook (estimated as 3.2% of participants’ feeds on average) decreased by 43% when participants saw a chronological feed. Posts with uncivil content on Instagram (estimated as 1.6% of participants’ Instagram feeds on average), however, did not decrease.

●     Despite these substantial changes in participants’ on-platform experience, the chronological feed did not significantly alter levels of issue polarization, affective polarization, political knowledge, or other key attitudes during the three-month study period.
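
To make the experimental contrast concrete, below is a minimal sketch of the two feed-ordering approaches compared in these studies: a purely chronological ranking versus a personalized ranking driven by a predicted-engagement score. This is an illustration only, not Meta’s actual ranking system; the Post fields and the scoring are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime          # when the post was published
    predicted_engagement: float   # hypothetical personalization score

def chronological_feed(posts: list[Post]) -> list[Post]:
    """Reverse-chronological ordering: newest content first, no personalization."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def engagement_ranked_feed(posts: list[Post]) -> list[Post]:
    """Stand-in for a personalized feed: highest predicted engagement first."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

# The same three posts, ordered two different ways
posts = [
    Post("a", datetime(2020, 10, 1), 0.9),
    Post("b", datetime(2020, 10, 3), 0.2),
    Post("c", datetime(2020, 10, 2), 0.5),
]
print([p.author for p in chronological_feed(posts)])      # ['b', 'c', 'a']
print([p.author for p in engagement_ranked_feed(posts)])  # ['a', 'c', 'b']
```

In the experiments, consenting participants had the second kind of ordering replaced with the first for the three-month study period.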

Impacts of deprioritizing content from like-minded sources on Facebook

●     Posts from politically ‘‘like-minded” sources constitute a majority of what people see on the platform, although political information and news represent only a small fraction of these exposures.

○     The median Facebook user received a majority of their content from politically like-minded sources: 50.4%, versus 14.7% from cross-cutting sources (i.e., liberals seeing content from conservatives or vice versa). The remainder came from friends, Pages and Groups classified as neither like-minded nor cross-cutting (a simplified classification sketch follows this list).

●     Reducing the prevalence of politically like-minded content in participants’ feeds during the 2020 U.S. presidential election had no measurable effects on attitudinal measures such as affective polarization, ideological extremity, candidate evaluations and belief in false claims.
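
As referenced above, here is a minimal sketch of how an exposure might be labelled like-minded, cross-cutting, or neither, given ideology estimates for the viewer and the posting source. The scale, threshold and rule are hypothetical illustrations, not the classification method used in the papers.

```python
def classify_exposure(viewer_score: float, source_score: float,
                      threshold: float = 0.1) -> str:
    """Hypothetical rule on a -1 (liberal) to +1 (conservative) scale:
    near-neutral source -> neither; same side -> like-minded; otherwise cross-cutting."""
    if abs(source_score) < threshold:
        return "neither"
    if (viewer_score > 0) == (source_score > 0):
        return "like-minded"
    return "cross-cutting"

# Example: a liberal viewer (-0.6) seeing content from a conservative Page (+0.7)
print(classify_exposure(-0.6, 0.7))  # -> cross-cutting
```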

“Asymmetric Ideological Segregation in Exposure to Political News on Facebook,” led by Professors Sandra González-Bailón and David Lazer from the University of Pennsylvania and Northeastern University, respectively, analyzed on-platform exposure to political news URLs during the U.S. 2020 election and compared the inventory of all the political news links U.S. users could have seen in their feeds with the information they saw and the information with which they engaged.

“This begins to answer questions about the complex interaction between social and algorithmic choices in the curation of political news and how that played out on Facebook in the 2020 election,” said Sandra González-Bailón.

“Reshares on Social Media Amplify Political News but Do Not Detectably Affect Beliefs or Opinions,” led by Professors Andrew Guess from Princeton, and Neil Malhotra and Jennifer Pan from Stanford, studied the effects of exposure to reshared content on Facebook during the 2020 U.S. election.

“Most of the news about politics that people see in their Facebook feeds comes from reshares,” said Andrew Guess. “When you take the reshared posts out of people's feeds, that means they are seeing less virality-prone and potentially misleading content. But that also means they are seeing less content from trustworthy sources, which is even more prevalent among reshares.”

In “How Do Social Media Feed Algorithms Affect Attitudes and Behavior in an Election Campaign?,” also led by Guess, Malhotra and Pan, the team investigated the effects of Facebook and Instagram feed algorithms during the 2020 U.S. election by comparing the standard feed to a chronologically ordered feed.

“The findings suggest that chronological feed is no silver bullet for issues such as political polarization," said Pan.

Finally, the “Like-minded Sources on Facebook Are Prevalent but Not Polarizing” study, led by Professors Brendan Nyhan from Dartmouth, Jaime Settle from William & Mary, Emily Thorson from Syracuse and Magdalena Wojcieszak from University of California, Davis, presented data from 2020 for the entire population of active adult Facebook users in the U.S., showing that content from politically like-minded sources constitutes the majority of what people see on the platform, though political information and news represent only a small fraction of these exposures. The study subsequently reduced the volume of content from like-minded sources in consenting participants’ feeds to gauge the effect on political attitudes.

“This tells us that reducing exposure to content from like-minded sources, at least in the context of the 2020 presidential election, did not substantively affect political attitudes,” said Settle.

Academics from Dartmouth, Northeastern University, Princeton, Stanford, Syracuse University, University of California, Davis, University of Pennsylvania, University of Virginia and William & Mary are the lead authors of these initial studies. The lead researchers from the Meta team were Pablo Barberá for all four papers and Meiqing Zhang for the paper on ideological segregation. Meta project leads are Annie Franco, Chad Kiewiet de Jonge, and Winter Mason.

In the coming year, additional papers from the project will be publicly released after completing the peer-review process. They will provide insight into the content circulating on the platforms, people's behavior and the interaction between the two.

###

Attachments:

Note: Not all attachments are visible to the general public

  • Springer Nature
    Web page
    Paper - “Like-minded sources on Facebook are prevalent but not polarizing”, Nature. Please link to the article in online versions of your report (the URL will go live after the embargo ends).
  • AAAS
    Web page
    Paper - "Asymmetric ideological segregation in exposure to political news on Facebook", Science. Please link to the article in online versions of your report (the URL will go live after the embargo ends).
  • AAAS
    Web page
    Paper - "Reshares on social media amplify political news but do not detectably affect beliefs or opinions", Science. Please link to the article in online versions of your report (the URL will go live after the embargo ends).
  • AAAS
    Web page
    Paper - "How do social media feed algorithms affect attitudes and behavior in an election campaign?", Science. Please link to the article in online versions of your report (the URL will go live after the embargo ends).

Expert Reaction

These comments have been collated by the Science Media Centre to provide a variety of expert perspectives on this issue. Feel free to use these quotes in your stories. Views expressed are the personal opinions of the experts named. They do not represent the views of the SMC or any other organisation unless specifically stated.

Dr John Kerr, Senior Research Fellow, Department of Public Health, University of Otago, Wellington, comments:

There is a lot to unpack in these studies, but one of the key takeaways is that some features of Facebook and Instagram, such as their newsfeed algorithms, have a substantial impact on the kinds of political news we see in our social media feeds. However, it seems that this has very little effect on people’s actual political attitudes and beliefs, essentially zero in fact!

Researchers have been concerned for some time that social media newsfeed algorithms are making people more politically extreme by only serving up content that supports ‘their side’, politically speaking. This new US research shows that Facebook and Instagram algorithms do bring up more liberal news sources for liberals and conservative news sources for conservatives, compared to a basic feed that simply shows the most recent posts from friends and groups. However, this doesn’t translate into more extreme or polarised political views. So that’s good news really: these algorithms are not pushing people into more entrenched political camps.

What makes this new research especially impressive is that Meta allowed a select group of independent researchers to run experiments on their platforms and analyse users’ data. However, this type of access is incredibly rare. In fact, social media companies have been making it harder for scientists to study and analyse their data. Facebook, Twitter, and Reddit have all reduced access to content and data for research. This lack of transparency means we will have even less insight into the prevalence, spread and impact of content like hate speech and disinformation. As noted in the Policy Forum article accompanying the new research, regulatory requirements for major platforms to share data may be necessary to 'foster opportunities of path-breaking, comprehensive scholarship that does not require a social media platform’s permission'.

It’s not clear how well this new research will translate to the Aotearoa New Zealand context. New Zealand is a very different kettle of fish in terms of politics. We tend to be less politically polarised as a society. We don’t have a big Democrat-Republican type split like you see in the US, partly due to our MMP system. Our mainstream news media is also less polarised. Unlike the US, where you get major news outlets that have a very right-wing or left-wing stance—think of channels such as Fox News or MSNBC—the bigger players here tend more towards the centre of the political spectrum.

Acknowledging these differences, the new research suggests that as New Zealand heads into the election, we are more likely to see political stories on social media that match our own views, rather than those that challenge them. And that is thanks in part to newsfeed algorithms. These algorithms are geared towards showing us content we like and engage with, to keep us glued to the newsfeed. Indeed, when the researchers in one of these studies switched off users’ algorithm-based newsfeed and gave them just a basic, uncurated newsfeed, people spent less time on the platform.

If you are worried about getting trapped in filter bubbles or echo chambers—situations where you are only seeing a slice of the news that suits your worldview—there is a relatively easy solution. Read widely. Actively seek out multiple articles on an issue and different viewpoints. Don’t rely on your social media feed to give you the full picture. Hopefully, you’ll gain a wider understanding of what is going on and, as a bonus, you will sound like you know what you are talking about at parties.

Last updated: 27 Jul 2023 9:41am
Declared conflicts of interest:
No conflict of interest.
Dr Andrew Lensen, Senior Lecturer in Artificial Intelligence, School of Engineering and Computer Science, Victoria University of Wellington, comments:

The Study

It is great to see outside academics finally being enabled to study the effects of Meta’s algorithms on political polarisation. Tech companies are notoriously siloed and protective of their data, so it is a big and exciting step to see this collaborative research take place. The outside academics did not get 'free rein', though – they weren’t allowed to see the data themselves (due to Meta’s privacy concerns) and so had to trust that Meta was supplying the data faithfully.

Meta’s Angle

In Meta's own media release, their President of Global Affairs Nick Clegg claims the studies show 'there is little evidence that social media causes harmful 'affective' polarisation', whereas the study authors more accurately state that changing the recommendation algorithm for only a few months isn’t likely to change political attitudes. This highlights a danger in these studies where the social media company has control over the study length – Clegg’s comments suggest that Facebook and Instagram play no part in polarisation, which isn’t what the studies say. We must be careful that Big Tech does not abuse the reputation of academics (as trusted independent experts) to create their own disinformation about results.
 
We still don’t know how Meta’s algorithm works, as it is a closely guarded trade secret. I would trust Meta more if they allowed independent academics to audit their algorithm to look for bias and polarisation.

Wider Implications to Aotearoa

The studies showed that using reverse chronological ordering (i.e. most recent posts first) instead of the current algorithm means users spend 'dramatically less time on Facebook and Instagram'. In New Zealand, we hear about how our rangatahi are spending more time than any other generation on social media, with flow-on effects on their schooling and wider lives. A simple change like this by Meta could reduce the addictiveness of these technologies. Social media companies won’t do this willingly: time on a platform is income, and they are businesses, after all.

This election campaign in Aotearoa is already very tech heavy. We’ve seen political parties use AI image generation for political attack ads, and I strongly believe they will be looking to ChatGPT to write social media posts, individualise letters to constituents, and more. We need to decide as a society what role we allow AI algorithms (including the ones in this study) to have in our democracy, as our existing campaigning laws were not built for the AI era.

Last updated: 27 Jul 2023 9:40am
Declared conflicts of interest:
No conflicts of interest.

News for:

International
