A little-known company called Cambridge Analytica (CA) is at the center of a breaking story about Facebook, the Trump campaign, and possibly shady data exchanges. But it's unclear whether CA did anything wrong here, or if it was just engaged in the kind of micro-targeted marketing common in consumer and political campaigns.
On Sunday, Facebook announced that it had suspended the accounts of CA and its parent company, Strategic Communication Laboratories (SCL). "In 2015, we learned that a psychology professor at the University of Cambridge named Dr. Aleksandr Kogan lied to us and violated our Platform Policies by passing data from an app that was using Facebook Login to SCL/Cambridge Analytica [and to] Christopher Wylie of Eunoia Technologies," explained Facebook VP Paul Grewal in a statement.
Facebook Login lets websites and apps offer the option to sign in using your Facebook account, and it lets them request and obtain data from those who do. Kogan had created a personality-test app that was downloaded by around 270,000 people, according to Facebook. This, Facebook says, gave Kogan access to such user info as "the city they set on their profile, or content they had liked, as well as more limited information about friends who had their privacy settings set to allow it."
All of this is standard. Where Kogan crossed a line and violated Facebook's terms of service was in passing this information on to third parties. When Facebook found this out, it suspended Kogan's Login account and demanded he, CA, and Wylie delete the data; all confirmed that they did. But now Facebook says it has reason to suspect that "not all data was deleted."
What's lifted this story into big news territory is the fact that the Trump campaign hired Cambridge Analytica in the summer of 2016.
Senate Intelligence Committee members immediately called for more regulation of digital political advertising and more investigations. Sen. Amy Klobuchar (D-Minnesota) wants Mark Zuckerberg to appear before a Judiciary panel. Meanwhile, the U.K.'s Information Commissioner's Office announced this morning that it's launching its own investigation. State prosecutors are wading in too.
Both U.S. and U.K. officials say they're concerned that Facebook didn't notify users about Kogan's "breach." But Facebook Chief Security Officer Alex Stamos insisted in a series of now-deleted Saturday tweets that calling it a breach was wrong.
CA issued its own series of sassy tweets, starting with "Reality Check: Cambridge Analytica uses client and commercially and publicly available data; we don't use or hold any Facebook data." It said the company "did not use any Facebook data for the 2016 Trump campaign" and opined that "advertising is not coercive; people are smarter than that."
"This isn't a spy movie," CA continued. "We're a data analytics company doing research & analysis on commercial, public and data sets for clients" that span "the political mainstream."
It also pointed out that Barack Obama's 2012 presidential campaign was "famously data-driven" and "pioneered microtargeting" of the sort CA does.
Critics of the kerfuffle over CA's actions have also been pointing to the Obama campaign. In late 2014, Facebook shifted its policy to prevent future campaigns from using the same sort of "sophisticated social targeting"—heralded then as "a powerful new form of voter outreach"—that Obama data gurus had employed.
"If you're hysterical about privacy on principle, fine. But the reason people are up in arms is because someone they dislike did it. When Obama did it, people thought it was cool." — Patrick Ruffini (@PatrickRuffini), March 19, 2018 (https://t.co/fJO2IOgreh)
CA says it didn't use the Facebook data to target potential Trump voters, and it says it didn't know that Kogan wasn't supposed to share the data his app had collected. But even if it did use the data that way, and even if it did know where it came from, the only real violation would be using Facebook user data in a way the company itself sanctioned until four years ago. Calling such actions "a project to turn tens of millions of Facebook profiles into a unique political weapon" (as The Guardian does) hovers somewhere between hyperbolic and flat-out wrong. It drastically mischaracterizes the nature and novelty of microtargeted political ads, in a way strongly reminiscent of the whole "Russian election hacking" hoopla.
There's an emerging tendency—one seen repeatedly when new forms of media emerge—to regard any form of online advertising or social media campaigns as deceptive and devious if political actors are involved. People worried about the same thing when targeted robocalls began, and when TV political ads were new, and so on, back throughout the history of politics and persuasion.
With every iteration of this, we see the same fear that people are powerless against the secret messages encoded into these ads. But as the Tufts political scientist Eitan Hersh tweeted Sunday, "every claim about psychographics etc made by or about [Cambridge Analytica] is BS." There is no precise science or devious plot here.
This would be a fine opportunity to discuss how much Facebook users really understand about who sees their data, or how we can improve Americans' privacy, or how Facebook tries to be all things to all people. Instead, a lot of people want to force this into a narrative about election influence. In doing so, they ascribe way too much power and sophistication to advertisers and political consultants, and way too little responsibility and literacy to the rest of us.