Are Google Search algorithms creating an online 'echo chamber' that just tells you what you want to hear?


Are Google Search algorithms creating an online 'echo chamber' or 'filter bubble' that just tells you what you want to hear? According to international research, exposure to and engagement with partisan or unreliable news on Google Search is driven primarily by users' own choices rather than by algorithmic curation. The study used a browser extension that recorded three sets of data during the 2018 and 2020 US elections: the URLs presented to users on Google Search results pages, their interactions with the URLs on those pages, and their wider online URL interactions beyond Google Search. Participants engaged with more partisan content overall than they were exposed to in their Google Search results, which suggests that the Google Search algorithm was not leading users to content that affirms their existing beliefs; instead, the authors argue, the formation of these 'echo chambers' is driven by user choice rather than algorithmic intervention.

Media release

From: Springer Nature

Social sciences: People, not Google Search, choose partisan news

Engagement with partisan or unreliable news is driven by personal content choices rather than by the content presented by online search algorithms, suggests a study published in Nature. The researchers found that Google Search does not disproportionately display results that mirror a user's own partisan beliefs; instead, such sources are actively sought out by individuals.

It has been suggested that search engine algorithms may promote the consumption of like-minded content through politically biased search rankings. Such suggestions raise concerns about the cultivation of online 'echo chambers' or 'filter bubbles' that limit users' exposure to contrary opinions and further exacerbate existing biases.

To investigate whether search results influence what people choose to read, Ronald E. Robertson and colleagues conducted a two-wave study analysing web browsing data during two US election cycles, with 262–333 participants in 2018 (with an over-sampling of 'strong partisans' in this period) and 459–688 in 2020. Participants were asked to install a custom-built browser extension that recorded three sets of data: the URLs presented to users on Google Search results pages, their interactions with the URLs on those pages, and their wider online URL interactions beyond Google Search during these two periods. The dataset includes over 328,000 pages of Google Search results and almost 46 million URLs from wider internet browsing. The partisanship of each source was scored on the basis of its wider sharing patterns among partisan audiences and compared with the self-reported political leanings of participants.
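To make the exposure-versus-engagement comparison concrete, here is a minimal, hypothetical sketch in Python. The domain scores, URL lists and helper names are all invented for illustration; the study derived its actual partisanship scores from sharing patterns among partisan audiences, not from a toy lookup table like this one.

```python
# Hypothetical sketch, not the authors' code: compares the average
# partisanship of URLs a user was shown (exposure) against URLs the
# user actually visited (engagement). Scores run from -1 (strongly
# left-leaning) to +1 (strongly right-leaning).

from statistics import mean
from urllib.parse import urlparse

# Toy domain-level partisanship scores (assumed for illustration).
DOMAIN_SCORE = {
    "leftnews.example": -0.8,
    "centrist.example": 0.0,
    "rightnews.example": 0.7,
}

def mean_partisanship(urls):
    """Average partisanship over URLs whose domain has a known score."""
    scores = [
        DOMAIN_SCORE[urlparse(u).netloc]
        for u in urls
        if urlparse(u).netloc in DOMAIN_SCORE
    ]
    return mean(scores) if scores else None

# Toy data for one participant: what search showed vs. what they chose.
exposure = [
    "https://leftnews.example/a",
    "https://centrist.example/b",
    "https://rightnews.example/c",
]
engagement = [
    "https://rightnews.example/c",
    "https://rightnews.example/d",
    "https://rightnews.example/e",
]

print("exposure:", mean_partisanship(exposure))      # ~ -0.03 (balanced)
print("engagement:", mean_partisanship(engagement))  # 0.7 (more partisan)
```

In this toy case the search results are roughly balanced while the visited pages skew partisan, which mirrors the pattern the study reports: engagement, not exposure, carries the partisan tilt.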

For both study waves, participants engaged with more partisan content overall than they were exposed to in their Google Search results. This suggests that the Google Search algorithm was not leading users to content that affirms their existing beliefs; rather, the authors suggest, the formation of these 'echo chambers' is driven by user choice rather than algorithmic intervention. Unreliable news was also less prevalent in the Google Search results themselves than in the URLs participants followed from those results or in their wider browsing, with the largest discrepancy observed among strongly right-wing users.
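The prevalence comparison can be sketched the same way. Again, the blocklist and URLs below are hypothetical stand-ins (the study relied on independent source-reliability ratings); the same function would be applied separately to the search-results, followed-results, and wider-browsing URL streams.

```python
# Hypothetical sketch: fraction of URLs in a stream that point at a
# domain on an assumed "unreliable" blocklist.

from urllib.parse import urlparse

# Assumed blocklist for illustration only.
UNRELIABLE = {"junknews.example"}

def unreliable_share(urls):
    """Fraction of URLs whose domain appears on the unreliable list."""
    if not urls:
        return 0.0
    return sum(urlparse(u).netloc in UNRELIABLE for u in urls) / len(urls)

# Applied to one toy stream; in the study this would be computed for
# search results, followed results, and wider browsing separately.
results = ["https://news.example/a", "https://junknews.example/b"]
print(unreliable_share(results))  # 0.5
```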

Journal/conference: Nature
Research: Paper
Organisation/s: Stanford University, USA
Funder: This research was supported in part by the Democracy Fund, the William and Flora Hewlett Foundation and the National Science Foundation (IIS-1910064).