If You Think Facebook Is Full of Dubious Outrage-Bait, Wait Til You See the Company's Critics
A business model where outrage is exploited for clicks describes both social media and the news media.
Imagine a business model where people's outrage is exploited for clicks, where emotions like affection and anger are valuable to tease out, and where, if people seem uninterested, you know you've done your job poorly.
Of course, this describes both Facebook and the news media criticizing it. Journalists, foaming at the mouth over so-called whistleblower Frances Haugen's innocuous revelations, want you to believe that this model is unique to social media sites gripped by the pursuit of the profit that accompanies expansion. The Washington Post, for example, published a report this week on how Facebook's algorithms classify "angry" react emojis as more valuable than regular old "likes," pushing "more emotional and provocative content into users' news feeds":
Starting in 2017, Facebook's ranking algorithm treated emoji reactions as five times more valuable than "likes," internal documents reveal. The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged, and keeping users engaged was the key to Facebook's business.
But what the Post and others fail to emphasize is that Facebook's algorithms also ranked the "love" emoji as more important than the "like" emoji when determining which content other users see. "Love" reacts were used far more commonly than "angry" reacts (11 billion clicks per week vs. 429 million). The algorithm, which assigns each post a score that determines its placement in a user's news feed, was built to value strong expressions of emotion and to show that type of content to other users.
This completely changes the story! It turns out that the "love" react is ~25x as important as "anger" in deciding news feed content. pic.twitter.com/7ATJD4WNXL
— Byrne Hobart (@ByrneHobart) October 26, 2021
You have to love the irony of the negative framing here, what's not said, is that the love&care emojis also get 5x value.
A deeper data science point is that these are treated as "higher value" because they are rarer, so it is a stronger signal from the user https://t.co/ygGcc4cx6b
— Aishvar (@AishvarR) October 26, 2021
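That back-of-the-envelope "~25x" figure follows directly from the numbers above: "love" and "angry" reactions carried the same 5x weight, so their relative influence is simply the ratio of weekly clicks, about 11 billion to 429 million. Below is a minimal sketch of that arithmetic and of the kind of weighted-sum scoring the documents describe; the weights and click counts come from the reporting quoted above, while the function, names, and structure are purely illustrative assumptions, not Facebook's actual code.

```python
# Hypothetical illustration -- not Facebook's actual code. The weights and
# weekly click counts come from the reporting above; everything else is assumed.

# Under the 2017 change described in the documents, every emoji reaction
# ("love", "angry", etc.) counted five times as much as a plain "like."
REACTION_WEIGHTS = {
    "like": 1,
    "love": 5,
    "care": 5,
    "angry": 5,
}

def post_score(interactions):
    """Toy engagement score: a weighted sum of one post's interactions."""
    return sum(REACTION_WEIGHTS.get(kind, 1) * count
               for kind, count in interactions.items())

# Example: a post with 200 likes, 50 "love" reacts, and 10 "angry" reacts.
print(post_score({"like": 200, "love": 50, "angry": 10}))  # 200 + 250 + 50 = 500

# Platform-wide, the weights were identical, so relative influence tracks usage:
weekly_love = 11_000_000_000 * REACTION_WEIGHTS["love"]   # "love" clicks/week
weekly_angry = 429_000_000 * REACTION_WEIGHTS["angry"]    # "angry" clicks/week
print(weekly_love / weekly_angry)  # ~25.6 -- the "~25x" figure in the tweet
```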
So, pretty similar to how the news media work: Online publications have strong incentives to write headlines and promotional copy that compel readers to click on their piece in a crowded marketplace, and to keep those readers actively engaged with the content for as many minutes as possible. The same basic incentives are at play for the Facebook engineers designing its algorithms. But the Post and others have treated these revelations as somehow explosive, portraying Zuckerberg as Frankenstein and Facebook as his monster. This narrative, that Facebook deliberately sows division so profoundly that Congress ought to regulate it, is one with plenty of staying power. The media realized that when choosing how to frame coverage of Russian interference and the Cambridge Analytica scandal back in 2016–2018. (Ironically, covering Facebook in such a negative way might itself drive traffic for some of these news sites.)
Favoring "controversial" posts—including those that make users angry—could open "the door to more spam/abuse/clickbait inadvertently," a staffer, whose name was redacted, wrote in one of the internal documents. A colleague responded, "It's possible."
The warning proved prescient. The company's data scientists confirmed in 2019 that posts that sparked angry reaction emoji were disproportionately likely to include misinformation, toxicity and low-quality news.
But there are plenty of items that could feasibly appear in someone's news feed and provoke justified anger: videos of police abusing innocent citizens; government suppression of protest movements like the one in Hong Kong; or revelations about data breaches or unlawful snooping by the state. In each of those cases, the strong reactions evoked may well have played a role in the news item going viral.
Haugen told the British Parliament earlier this week that "Facebook has been unwilling to accept even a little sliver of profit being sacrificed for safety," and that "anger and hate is the easiest way to grow on Facebook." She should probably add that that's true for media organizations, too.