Facebook Trending Topics are Curated by Editors, Not an Algorithm, Leaked Documents Reveal

Documents show no evidence of political bias, but do contradict the company's claim that topics trend organically.

The latest chapter in the Facebook "Trending Topics" saga comes from leaked internal documents indicating that Facebook operates much as most newsrooms do, with substantial style guidelines and an editorial hierarchy, but with no edict to promote or suppress any specific political viewpoint.

The documents were given to The Guardian following a report in Gizmodo earlier this week, which quoted several anonymous ex-staffers who accused the social media giant of fostering a culture that deliberately kept right-of-center news outlets and topics from appearing in the site's Trending section, a major driver of internet traffic. The timing of the leaked documents suggests the company wants the public to know its official guidelines are focused on making an attractive, user-friendly product, not on promoting a political agenda.

The conservative uproar over the story led Sen. John Thune (R-S.D.) to send a letter to Facebook CEO Mark Zuckerberg "requesting" internal documents detailing the Trending Topics curation process. The letter also summoned Zuckerberg and senior Facebook staffers to appear before the US Senate Committee on Commerce, Science, and Transportation. Thune, a long-time vocal opponent of the "Fairness Doctrine," appears to be oblivious to the irony of his attempts to strong-arm a private company after having described the powers granted to the FCC under the Fairness Doctrine as "a recipe for an Orwellian disaster."

The documents leaked to The Guardian indicate that, despite Facebook's claims to the contrary, human judgment played a far larger role than the algorithm (which supposedly detected popular stories based on organic shares by Facebook users) in determining which stories made it to the Trending section.

According to the internal documents, Facebook relies on only 10 mainstream news sites (BBC News, CNN, Fox News, The Guardian, NBC News, The New York Times, USA Today, The Wall Street Journal, Washington Post, Yahoo News or Yahoo) to determine whether a story is legitimately trending, though Facebook gave The Guardian a list of 1,000 "trusted sites," including conservative ones, used to "verify and characterize" the stories being considered for the Trending section.

Though the word "blacklist" carries some sinister connotations and was used by an ex-staffer in the Gizmodo piece to suggest a political bias at the company, its appearance as a topic ("Blacklisting items") in the Facebook guidelines has more to do with avoiding duplicate stories or misleading topics than with suppressing any particular politics.

Though the intent behind "blacklisting" topics is to keep staffers from having to review the same topic over and over again, this unfortunately phrased sentence taken out of context could certainly ruffle some feathers: "Our bias is to blacklist topics for the maximum of 24 hours in almost all cases."

The Guardian reports that several former Facebook staffers confirmed that the algorithm gave way to editorial discretion in 2014:

Specifically, complaints about the absence from trending feeds of news reports about clashes between protesters and police in Ferguson in 2014 were evidence to Facebook that – in the specific case of the trending module – humans had better news judgment than the company's algorithm. Multiple news stories criticized Facebook for apparently prioritizing Ice Bucket Challenge videos over the riots. Many said the incident proved that Twitter was the place for hard news, and Facebook was a destination for fluff.

One could argue that, by "injecting" stories about Ferguson, Facebook helped build the social media presence of the Black Lives Matter movement, but that is not evidence that the company tried to create the movement out of whole cloth. Rather, it appears that Facebook essentially operates as a newsroom, though the current and former employees interviewed by The Guardian all maintain it is not an ideological newsroom.

In a statement released today, Facebook's Vice President of Global Operations, Justin Osofsky, wrote that the company "does not allow or advise our reviewers to systematically discriminate against sources of any political origin, period." Osofsky added, "We have at no time sought to weight any one viewpoint over another, and in fact our guidelines are designed with the intent to make sure we do not do so."

However, by asserting that it fully intends to be neutral, Facebook created the unnecessary expectation that it strictly balances left and right viewpoints.

What if, as the Gizmodo piece appeared to indicate, there is no top-down official edict to prioritize liberal news? What if it's simply a matter of demographics: educated, well-to-do, East and West Coast editorial staffers probably tend to lean left and may have unconsciously prioritized left-of-center news and outlets at the expense of conservative news?

Facebook could have avoided many of its current public relations problems if it hadn't continued to sell Trending Topics as mostly driven by organic shares detected by the algorithm, which is presumably neutral code incapable of bias. But people employed by Facebook had to write that code, so even that process involves a human element, and humans carry their own personal biases no matter how hard they try to temper them.

The company appears to have determined a while ago that having editorial guidelines and using the human element makes the product better, and it probably does.