Reason.com - Free Minds and Free Markets
Technology

Do Social Media Algorithms Polarize Us? Maybe Not.

A series of studies suggests it's not algorithms that are driving political polarization, ignorance, or toxicity online.

Elizabeth Nolan Brown | From the November 2023 issue

(Illustration: Arkadivna/iStock)

A safer, saner social media world is possible, former Facebook product manager Frances Haugen told members of Congress in 2021. Instead, she said, leaders at the social media company chose engagement over democracy, using algorithms that kept people glued to the site but also angry, anxious, and ill-informed.

Haugen's diagnosis of the cause of our current political dysfunction (social media algorithms) and cure ("get rid of the engagement-based ranking" of content and return to displaying posts in simple chronological order) has become dogma for many politicians, members of the press, and would-be change-makers. Doing away with algorithms would also halt hate speech and misinformation, these groups insist.

But more and more research is casting doubt on such claims. The latest comes from a collaboration between academics and Facebook parent company Meta, which set out to explore the impact of algorithms in the lead-up to the 2020 election.

The first results from this project were published in four papers in July. Michael W. Wagner, the project's independent rapporteur, called the studies works of "independent," "rigorous," and "ethical" research.

With users' consent, researchers tweaked various elements of their Facebook feeds to see what effect the changes would have on things like political outlooks and media diets. The interventions differed by study, but all assessed how Facebook's algorithms affected user experiences.

What they found cuts to the heart of the idea that Facebook could produce kinder, better informed citizens by simply relying less on engagement metrics and algorithms to determine what content gets seen. "Both altering what individuals saw in their news feeds on Facebook and Instagram and altering whether individuals encountered reshared content affects what people saw on the platforms," reported Wagner in the July 28 issue of Science. But "these changes did not reduce polarization or improve political knowledge during the 2020 US election. Indeed, removing reshared content reduced political knowledge."

In one study, led by Princeton University's Andy Guess and published in Science, select users were switched from algorithm-driven Facebook and Instagram feeds to feeds that showed posts from friends and pages they followed in reverse chronological order—just the sort of tweak social media critics like Haugen have pushed for. The shift "substantially decreased the time they spent on the platforms and their activity" and led to users seeing less "content classified as uncivil or containing slur words," concludes the study (titled "How do social media feed algorithms affect attitudes and behavior in an election campaign?").

But "the amount of political and untrustworthy content they saw increased on both platforms." Despite shifting the types of content users saw and their time spent on Facebook, the switch "did not cause detectable changes in downstream political attitudes, knowledge, or offline behavior" during the three-month study period.

Despite some limitations, "our findings rule out even modest effects, tempering expectations that social media feed-ranking algorithms directly cause affective or issue polarization in individuals or otherwise affect knowledge about political campaigns or offline political participation," the team concludes. They suggest more research focus on offline factors "such as long-term demographic changes, partisan media, rising inequality, or geographic sorting."

In another study—also led by Guess—researchers excluded re-shared content from some users' news feeds. These users wound up seeing substantially less political news ("including content from untrustworthy sources"), clicking on less partisan news, and reacting less overall. Yet "contrary to expectations," the shift didn't significantly alter "political polarization or any measure of individual-level political attitudes." Those in the experimental group also ended up less informed about the news.

"We conclude that though re-shares may have been a powerful mechanism for directing users' attention and behavior on Facebook during the 2020 election campaign, they had limited impact on politically relevant attitudes and offline behaviors," write Guess and colleagues in "Reshares on social media amplify political news but do not detectably affect beliefs or opinions," also published in Science.

In another experiment, researchers tweaked some Facebook feeds to reduce exposure to "like-minded sources" by about a third. As a result, these users indeed saw content from a more "cross-cutting" range of sources and less "uncivil" content. But this failed to alter their political attitudes or belief in misinformation.

Ultimately, the results "challenge popular narratives blaming social media echo chambers for the problems of contemporary American democracy," writes a research team led by Brendan Nyhan, Jaime Settle, Emily Thorson, Magdalena Wojcieszak, and Pablo Barberá in "Like-minded sources on Facebook are prevalent but not polarizing," published in Nature. "Algorithmic changes…do not seem to offer a simple solution for those problems."

For years, politicians have been proposing new regulations based on simple technological "solutions" to issues that stem from much more complex phenomena. But making Meta change its algorithms or shifting what people see in their Twitter feeds can't overcome deeper issues in American politics—including parties animated more by hate and fear of the other side than ideas of their own. This new set of studies should serve as a reminder that expecting tech companies to somehow fix our dysfunctional political culture won't work.


Elizabeth Nolan Brown is a senior editor at Reason.

Technology, Algorithms, Social Media, Media, News