A pudgy little figure with wide hips and ample breasts, the Venus of Willendorf was discovered in 1908 but dates to the Stone Age. One of the oldest surviving artworks in the world, the limestone sculpture now resides in Vienna's Natural History Museum, where a woman named Laura Ghianda snapped a pic last December and then posted the image to Facebook.
It was promptly removed. A notice from Facebook explained that the naked figure was inappropriate for the social site.
According to the company's official policy, "photographs of paintings, sculptures, and other art that depicts nude figures" are allowed. But despite four attempts by Ghianda to appeal the image's removal, Facebook wouldn't budge.
The Natural History Museum also appealed to Facebook. "There has never been a complaint by visitors concerning the nakedness of the figurine," Christian Koeberl, the museum's director general, posted in January. "There is no reason…to cover the Venus of Willendorf and hide her nudity, neither in the museum nor on social media."
The museum's plea also failed to get a reaction from Facebook. But after news media began running with the story this week, the company finally caved. On Thursday, a spokesperson for the company told AFP that it had been a mistake to censor the Venus of Willendorf's image and apologized for the error.
As a private company, Facebook is of course entitled to remove whatever imagery it pleases. But the inability of Facebook's algorithms and human moderators to distinguish obscenity from ancient artifacts provides yet another reason to doubt Facebook's ability to police "fake news."
Lately, politicians and activists have been calling on the social network to somehow stop the spread of misinformation, to be more proactive in determining what is and isn't a credible news source, to rate stories for trustworthiness, to suss out "Russian trolls," etc. Who would make these determinations, and how, is a bit trickier.
At the same time, some authorities want to give Facebook even more reason to censor content. Yesterday the European Commission recommended that Facebook, Twitter, and similar sites be required to take down terrorism-related content within an hour of its being flagged by any European Union law enforcement agency. The commission has also been urging sites to be more proactive in removing "hate speech" and pornographic content. Yesterday's recommendation warned that if tech companies couldn't comply "voluntarily," legislation would be passed to force them.
And here in the U.S., the House of Representatives just passed legislation that would allow websites to be sued or prosecuted if sex workers post on them.
As authorities keep making ridiculous demands of user-generated content sites, expect to see a lot more situations like the removal of the Venus of Willendorf. Sites simply won't have enough incentive to take chances.