A New Book Reveals Facebook's Problems Started Way Before the 2016 Election
In Facebook: The Inside Story, even Steven Levy’s most generous conclusions about the tech giant are still pretty damning.

"It remains uncertain whether anything that happened on Facebook made a significant difference in the 2016 election," Steven Levy writes in his introduction to Facebook: The Inside Story. This is a sentence the reader has to keep in mind throughout Levy's new book, which documents how the seeds of what has gone wrong for the company were planted years before 2016 in a series of heedless or needlessly aggressive decisions that are deeply rooted in the history and culture of the company itself.
When it comes to what we actually can prove about the 2016 votes in the United Kingdom (which resulted in Brexit) and in the United States (which gave us President Donald Trump), it's never been demonstrated that Facebook and its ad policies made any difference. Even so, it's now indisputable that various political actors (including Russia) have tried to use Facebook that way, and that the company made it easy for them to try.
Levy's introductory chapter is designed to show both the reputational heights Facebook founder Mark Zuckerberg and his company briefly reached and the depths to which they've both now fallen. The author draws a brisk line between Zuckerberg's triumphant surprise visit to Lagos, Nigeria, in August 2016—where he meets a range of programmers and would-be startup entrepreneurs as well as top government officials—and the unexpected outcome of the American election in November that year. The latter inspired a gathering storm of critics to point directly at Facebook as the source of that outcome's unexpectedness.
This juxtaposition works to draw readers into the thematic heart of the book but also suggests, a bit misleadingly, that Facebook's massive loss of public regard starting in 2016 was an abrupt, unexpected one. But what the author documents in his introductory chapter and throughout the book is that Facebook's unforced errors have amounted to a car crash more of the slow-motion variety; you can't look away, but also it never seems to end.
Nothing in Levy's narrative arc for Facebook and its principal founder will come as a surprise to anyone who has followed the company's fortunes over the past 15 years. You could fill a specialty bookstore with nothing but Facebook-related books published in the last decade or so, starting with Ben Mezrich's The Accidental Billionaires, later adapted by Aaron Sorkin for the David Fincher-directed movie The Social Network. Mezrich's book, and almost all of the books published about Facebook since then, fall into two broad categories: (1) Look At The Great Things These Geniuses Have Done, or (2) Look At All The Pathological Things These Twisted Guys Have Done. With the exception of David Kirkpatrick's The Facebook Effect (a book on the company's early years that Levy credits by name in his acknowledgments), the authors of most of these other books typically have tried to distill the story of Facebook into lessons—either what not to do or what must be done—regarding social media and big tech-company successes. One way or another, they've almost all had axes to grind.
That was never going to be the kind of story Levy—whose 2011 book on Google, In The Plex, is the best journalistic account of that company's history and its impact on the tech world—would write about Facebook. Levy's books and his admirably accessible body of tech journalism for outlets from Newsweek to Macworld to Wired consistently demonstrate that he's driven by the facts rather than by any philosophical or political agenda. And that's exactly why, once Levy has layered on so many new facts about Facebook, its principals, and its various lapses and betrayals, piling on details from hundreds of interviews and putting every part of Facebook's story into one place, his most even-handed conclusions are still damning:
"The troubled post-election version of Facebook was by no means a different company from the one it was before, but instead very much a continuation of what started in Mark Zuckerberg's dorm room 15 years earlier. It is a company that benefits from and struggles with the legacy of its origin, its hunger for growth, and its idealistic and terrifying mission. Its audaciousness—and that of its leader—led it to be so successful. And that same audaciousness came with a punishing price."
And what was that price? Per Levy, "Facebook now admits that the damage [from the company's decisions to promote rapid growth over everything else] turned out far more extensive than expected, and is not easily repaired. All the while Mark Zuckerberg and his team insist, despite the scandals, that Facebook is still overwhelmingly a force for good in the world."
Even now—even after reading Levy's increasingly unhappy account of Facebook's growth in size and profitability on the one hand and its growth in scandalously negligent or callous treatment of users and their data on the other—I'm still inclined to agree that Facebook is a force for good. I'm biased: Were it not for our ability to stay in touch on Facebook while half a world apart, the woman who became my wife in 2017 and I would not be married today. But, just as important, I've seen so many instances in which individual users and communities have found constructive uses for Facebook's many features, ranging from the broadly political to the deeply personal. I've been a defender of some of Facebook's approaches to real problems, such as its Internet.org project (later known as Free Basics) and its (apparent) commitment to end-to-end encryption.
But again and again in Levy's book I stumble across things like Facebook's deployment of a mobile app called Onavo Protect, which purported to provide VPN (virtual private network) services for one's phone, but whose real purpose was to gather user data about how they used other apps. For a year or two I had that Onavo app on my own damned phone! As Levy writes, "it takes a certain amount of chutzpah to present people with a privacy tool whose purpose was to gain their data."
Far more common in the Facebook story than perverse examples like Onavo Protect are the occasions in which the company failed to anticipate problems that would arise from the markets it pushed itself into with little awareness of how its services might be misused—especially in the absence of Facebook personnel who spoke the dominant languages in those markets and who might at least theoretically identify content that violated Facebook's content policies or other problematic uses. Take Myanmar, for example. Levy shows that Facebook learned in 2014 about how its services were being used maliciously in Myanmar, a country where large-scale mobile-phone internet access was just taking off.
I learned about Facebook's impact in Myanmar at roughly the same time through my contacts in Burmese civil society, and my first reaction was just as brain-dead as Facebook's turned out to be. When I was told that incidents of civil violence were being reported on Facebook, my first thought was that this was positive—that the increasing ubiquity of smartphones with cameras was making rogue government officials and factions more accountable. What I didn't immediately grasp (my Burmese friends politely schooled me), and what Facebook took far longer to grasp, was that false reports of crime and sexual assaults were being used to stir up violence against innocents, with a growing genocidal focus on the country's oppressed and persecuted Rohingya minority. Levy then outlines how Facebook's property WhatsApp, with its end-to-end encryption features and its built-in ability to amplify messages, including pro-violence messages, exacerbated the civil-discord problem in Myanmar (just as it eventually was shown to have done in the Philippines and in Brazil):
"Facebook contracted with a firm called BSR to investigate its activity in Myanmar. It found that Facebook rushed into a country where digital illiteracy was rampant: most Internet users did not know how to open a browser or set up an email account or assess online content. Yet their phones were preinstalled with Facebook. The report said that the hate speech and misinformation on Facebook suppressed the expression of Myanmar's most vulnerable users. And worse: 'Facebook has become a useful platform for those seeking to incite violence and cause offline harm.' A UN report had reached a similar conclusion."
It's fair to note, as Facebook's defenders (who sometimes have included me) have noted, that all new communications technologies are destined to be misused by somebody. But Facebook's reckless stress on its grow-first-fix-problems-later strategy more or less guaranteed that the most harmful aspects of the misuses of these new media would be exacerbated rather than mitigated. By 2018, the company began to realize it was at the bottom of a reputational hole and needed to stop digging; when Zuckerberg testified before the U.S. Senate that year, he responded to Sen. Patrick Leahy's (D–Vt.) question about Myanmar by saying "what's happening in Myanmar is a terrible tragedy, and we need to do more."
That last line is a theme that appears again and again in the final third of the book, uttered by different top executives at Facebook. "We know we have more work to do," an exec responded when reporters revealed that Facebook's AI-fueled self-service ad product created the targeted category "Jew haters." Writes Levy: "Investigative reporters at ProPublica found 2,274 potential users identified by Facebook as fitting that category, one of more than 26,000 provided by Facebook, which apparently never vetted the [category] list itself."
When Facebook Chief Operating Officer Sheryl Sandberg, a Google veteran, met with the Congressional Black Caucus, she didn't have a particularly strong defense for Facebook's having (in Levy's words) "hosted Russian propaganda that fueled white prejudice against black people" or violating civil-rights laws "because it allowed advertisers to discriminate against African-Americans." Sandberg was reduced to "repeating, almost like a mantra, 'We will do better.'" Sandberg later told Levy that "I walked out of there saying we, and I, have a lot of work to do." This has become the paradigmatic Facebook response when any new scandal emerges from the company's shortsighted strategic choices to privilege growth over due diligence.
If there is one major exception to the company's institutional willingness to plunge ahead into new markets and new opportunities to reap revenue, it is its cautiousness in dealing with American conservatives, primarily driven by Facebook's head of global policy, Joel Kaplan. Levy writes that
"for years right-wing conservatives had been complaining that Facebook—run by those liberals in Silicon Valley—discriminated against them by down-ranking their posts. The claim was unsupported by data, and by many measures conservative content was overrepresented on Facebook. Fox News routinely headed the list of most-shared posts on the service, and even smaller right-wing sites like the Daily Wire were punching above their weight."
Facebook's intense desire to be perceived as lacking political bias seems to have led to policies and outreach efforts—including a big powwow in Menlo Park where Zuckerberg and Sandberg personally attempted to mollify prominent conspiracy theorists like Rush Limbaugh and Glenn Beck—that added up to white-glove treatment for American conservatives. Beck, at least, told Levy he was impressed with Zuckerberg's sincerity: "I sat across the table from him to try to gauge him [and he] was a little enigmatic, but I thought he was trying to do the right thing." Even so, Levy writes, "after leaving Menlo Park the conservatives returned to complaining about Facebook's treatment of them—while piling up millions of views because of their skill in exploiting Facebook's algorithms."
The attempts to accommodate conservative critics, occurring simultaneously with Facebook's promises to "do better" on other content moderation issues, illustrate the bind in which the company now finds itself. It's voicing its commitment to be more proactive in moderating malicious content and disinformation while simultaneously reassuring the squeakiest political wheels that, no, their content and policies won't be subjected to any Facebook-imposed test of factuality or truth. In other words, it's promising both to police more content and to police content less. Facebook's laboring now to create a "content oversight board," a kind of "Supreme Court" where Facebook content decisions can be appealed.
I'm skeptical of that whole oversight-board project—not least because it seems largely to sidestep the other big bucket of problems for Facebook, which is its handling of user data. I'm also skeptical about broad claims that Facebook's content algorithms and ads truly manipulate us—in the sense of robbing us of ordinary human independence and agency—but that skepticism has no bearing on the ethical question of whether Facebook should allow malign actors to exploit user data in efforts to manipulate us (e.g., by suppressing voter turnout). Whether those efforts are effective or not isn't relevant to the ethical questions, just as the fact that a drunk in a bar misses when he swings at you has no bearing on whether he can be charged with attempted assault.
Finally, I'm most skeptical as to whether anything Facebook tries to do on its own is going to either restore Facebook's public reputation or blunt the impulse of government policymakers, both in the United States and elsewhere, to impose hobbling and even punitive regulation on the entire social-media industry (and on the tech industry generally).
It's clear that one reason Zuckerberg and Sandberg have been scrambling to find an accommodation that works—first and foremost with the U.S. government but also with the European Union (E.U.) and with non-E.U. nations—is that they know they need to get out of the crosshairs, especially as more of their company's story continues to come to light. Their problem now is that Steven Levy's Facebook: The Inside Story has instantly become the indispensable single-volume resource for all policymakers everywhere when it comes to Facebook—not because it sets out to take the company down, but because the facts it reports leave readers with no choice but to recognize how the company's indisputable successes have been undermined by its indisputable systemic deficiencies.
My takeaway is that Facebook could still manage a win out of all this—if it seizes this moment as an opportunity to embrace an ethical framework that's designed for something bigger than just solving Facebook problems. I don't believe the company's bad behavior (or negligence—there's plenty of both) can tell us what to think of this whole sector of the internet economy and what industry-level regulation or law should look like. As I've argued in my own book, I believe a new ethical framework has to be built and shared, industry-wide (affecting more companies than just Facebook). It needs to be informed by all stakeholders, including users, governments, and civil society as well as the companies themselves—and it needs to privilege fiduciary obligations to users and the general public even over any commitment to growth and profitability. Part of these obligations will entail, yes, a commitment to fighting disinformation, treating it as a cybersecurity problem—even when political stakeholders complain.
That's just my view—other critics will argue for different approaches, some of which will center on more regulation or laws or other government interventions, while others will argue for fewer but better-crafted ones. But what all Facebook's critics, and the tech industry's critics, will have in common is this: going forward, we all will be citing stuff we learned from Levy's Facebook: The Inside Story.
The attempts to accommodate conservative critics, occurring simultaneously with Facebook's promises to "do better" on other content moderation issues, illustrate the bind in which the company now finds itself. It's voicing its commitment to be more proactive in moderating malicious content and disinformation while simultaneously reassuring the squeakiest political wheels that, no, their content and policies won't be subjected to any Facebook-imposed test of factuality or truth. In other words, it's promising both to police more content and to police content less. Facebook's laboring now to create a "content oversight board," a kind of "Supreme Court" where Facebook content decisions can be appealed.
Where does the idea that Facebook has some obligation to ensure that its users only put up "factual information" or it is somehow responsible if someone says something bad come from?
This whole article is nothing but a lament about how people the author doesn't like are given a platform to speak. Well fuck you, you tyrannical little bastard. Who made you the judge of what can and cannot be said on a public platform?
You know who else was a tyrannical little bastard?
Napoleon?
Bloomberg?
Adam Ant?
Bloomberg?
King Joffrey?
My son when he was three?
This whole comment is nothing but a lament about how people the commenter doesn't like are given a platform to speak. Well fuck you, you tyrannical little bastard. Who made you the judge of what can and cannot be said by writers at Reason?
Indeed. It's becoming a common observation: The right complains that Facebook censors, the left complains that Facebook doesn't censor enough.
Not hard to see which side of that divide Mike falls into, and surprisingly for a libertarian, it's not the side that objects to censorship.
Facebook got Mikey laid, so he has a soft spot for it.
Was Western Union responsible for what people said over the telegraph? Was Bell Telephone responsible for what people said over telephone lines? Was RCA responsible for what was broadcast and heard on its radios? Then why is Facebook, which is nothing more than a medium of expression, responsible for what its users say?
I do not use Facebook, but it begins to appear to me that its only “sin” is being successful.
John writes:
'This whole article is nothing but a lament about how people the author doesn’t like are given a platform to speak.'
Except that this whole article never says anything of the kind. If you ever get around to reading the book (I know, right?), you'll discover that Facebook also tried to mollify the Congressional Black Caucus. (I even mention this in the article.) But the book documents that Facebook was more solicitous of American conservatives. If you ever get around to reading the book (I know, right?), you'll discover that I reported the book's contents accurately.
Has there ever been an explanation (other than "it let the guy I hate win") why using social media/big data is bad for one side (the critics above) but ok for the other?
Never. What was Obama showing his genius and the superiority of Democrats in 2008 and 2012 became the evil Trump subverting our elections in 2016. It is fucking pathetic.
Meh. I found the stories about technically illiterate Myanmar people getting riled up against minorities... interesting?
But it's very similar to suing gun manufacturers for inner-city gun crimes.
Breaking News: Dictatorial governments use available technology to further their dictatorial ends.
Next up: How barbed wire manufacturers are responsible for concentration camps and gulags.
When it comes to what we actually can prove about the 2016 votes in the United Kingdom (which resulted in Brexit) and in the United States (which gave us President Donald Trump), it's never been demonstrated that Facebook and its ad policies made any difference. Even so, it's now indisputable that various political actors (including Russia) have tried to use Facebook that way, and that the company made it easy for them to try.
We had no evidence this caused any harm. Moreover, what is so magical about evil Russian propaganda? If Russians saying bad things should be stopped, then shouldn't the same be said of Americans who say things the nasty little shit who wrote this article doesn't like?
There is nothing special about Russian propaganda. If you are so terrified of its effect on the public that you feel it must be censored, then there is no reason why you shouldn't feel the same about Americans doing the same thing.
Ah no. It is called a free society. The marketplace of ideas sorts this out. Even evil foreigners get a chance to speak. I am sorry freedom of expression and the danger of someone saying something someone doesn't like or thinks is false terrifies the reason libertarians so much.
Moreover, what is so magical about evil Russian propaganda?
AMERICANS ARE STUPID! THEY BELIEVE EVERYTHING ON FACEBOOK! RUSSIAN INTERFERENCE IS THE ONLY POSSIBLE EXPLANATION FOR HILLARY NOT BEING CORONATED AS OUR FIRST QUEEN! TRUMP IS PUTIN'S LAP DOG! PUTIN GOT HIM ELECTED! AAAUUUGHH!
/what many people actually believe
Don't forget that everything should be democratic, but also democracy is mortally threatened by a shitty FB ad.
Unless of course it causes people to vote for their side. Then it is just enlightened public discourse.
Yep.
Damnit, I just accidentally flagged someone's post for review. Sorry. Luckily, Reason is still libertarian enough (or doesn't want the overhead) that I don't think it matters.
Or Sanders walks into the convention with 40% of the delegates.
Even evil foreigners get a chance to speak. I am sorry freedom of expression and the danger of someone saying something someone doesn't like or thinks is false terrifies the reason libertarians so much.
Democrats: Evil Russians came here and instilled a puppet dictator via Facebook.
Republicans: Facebook suppresses and omits conservative messages and kicks people off their platform (and others) for reasons that are nebulous and may violate their civil and/or Constitutional rights.
Facebook: We admit to all of it and promise to do better.
Reason: Both sides! Illegal immigrants have a right to be here!
+1
I'd give you more than one, but that's all that Disqus will let me give you.
John writes:
"We had no evidence this caused any harm."
Yes, that's what I wrote, all right. Thanks for the validation!
"It remains uncertain whether anything that happened on Facebook made a significant difference in the 2016 election," Steven Levy writes
No, it's pretty certain.
But it is an article of faith with people like this author that Hillary was robbed. They cannot let it go. It was her turn. And things have turned out so badly with the second Great Depression and all those nuclear wars Trump started before he declared himself God Emperor for life and all.
John, you are just like Hitler.
Literally
I'm biased: Were it not for our ability to stay in touch on Facebook while half a world apart, the woman who became my wife in 2017 and I would not be married today.
And you still think Facebook is a force for good?
*rimshot*
I don't like Zuckerberg but pinning that on him is going too far.
But again and again in Levy's book I stumble across things like Facebook's deployment of a mobile app called Onavo Protect, which purported to provide VPN (virtual private network) services for one's phone, but whose real purpose was to gather user data about how they used other apps
Guys, when a mega-corporation gives you a free digital tool, there's a near-100% chance that tool is being used to gather user information.
I guess what I'm so surprised about is that... everyone is surprised.
Yes, occasionally a company will give you something completely free with no strings attached to get you under the brand umbrella through loyalty. But when a company launches an entire standalone project or app, rest assured they're using it to... in some way, shape, or form follow your activities, even if those activities are in the meta.
For instance, I use WhatsApp because of the end to end encryption. But do I KNOW it's encrypted end to end? No, I merely trust it is. I do believe, however, that it is reporting other meta stuff about me. At this point, I'm willing to live with that for the convenience and usability of the app.
No, you are supposed to be a stupid sheep who is being fooled by evil tech giants and evil Russians. Don't you understand that?
The precise issue is not that Onavo Protect gathered your data. It's that it was misrepresented as a tool that would protect your privacy and your data (through a VPN) and also optimize bandwidth for low-bandwidth environments like developing countries. In general misrepresenting a consumer-facing product is taken to be a legal and policy problem in all developed democracies, including the United States.
The best part of this article is when the author whines about Facebook catering to conservatives. A business trying to please its customers. The humanity. My God, Facebook worried about expanding and making money rather than whether someone might use their platform to do awful things like tell people not to vote for Hillary.
And this is what passes for "Libertarian" thinking at reason.
And, as I indicated above, the conservatives weren't going to be assuaged by pretty much anything, but nothing they were claiming was that outlandish or even really unfounded. Alex Jones was kicked off the platform and several others because the platforms didn't like him. YouTube overtly and explicitly cracked down on non-violent gun channels on their platform *as policy*. Off the platform, James Damore was fired under similar political correctness HR bullshit. Businesses that booted Alex Jones the exact same day Facebook did have overt and (civilly and Constitutionally) questionable anti-conservative hate speech policies.
Democrats, OTOH, were suggesting that Facebook, somehow, keep Russian secret agents, known and unknown, off of Facebook.
Paul nails it below. It wasn't happening, and anyone who said it was was a conspiracy theorist, until they admitted it was happening but said no one had a right to complain.
Alex Jones is a fucking nut. Not a good hill to die on.
Alex Jones is a fucking nut. Not a good hill to die on.
Bigger nut: Alex Jones or Louis Farrakhan?
Would watch.
If you don't die on the other guy's hill, the army marching in your direction reaches your territory, and you die on your own hill. Either you're willing to defend freedom of speech when you don't like what's being said, or it's not a principle, it's just naked self-interest.
So I don't freaking CARE if Alex Jones is a nut.
Amen
It seems weird to be criticized for reporting what the book says. You have reminded me, however, that I often get read by audiences that don't read books. (I know, right?)
For a year or two I had that Onavo app on my own damned phone! As Levy writes, "it takes a certain amount of chutzpah to present people with a privacy tool whose purpose was to gain their data."
This is why we need to stop calling Tech Journalists Tech Journalists. They're arts and culture journalists--with a focus on the internet. When you're REALLY tech savvy, and you actually understand HOW these apps work, it's a lot easier to not fully trust them.
I'm fairly technical, not least because my work requires me to be. So is Steven Levy, which you'll discover if you read more of his work.
You know, there's a lot to unpack here with this book. But I don't know if anyone ever said that Fox News was being suppressed by Silicon Valley algorithms. It was... you know... all the conservative self-made personalities in the non-trad media sphere who we all watched get depersoned, canceled, de-ranked, shut out, shadow-banned, unsubscribed, unfriended by payment services and demonetized on youtube and other services.
Anyone who thinks that conservative voices were being amplified by Facebook et al. isn't paying attention.
And we saw the predictable progression of attitude on the issue:
1. It isn't happening.
2. Well, ok, it is happening but it's not as bad as you say.
3. Ok, well, yeah it is as bad as you say but suck it up, it's all hate speech and they're all private companies and can do what they want.
And everyone who said it was happening is a "conspiracy theorist" even though we now admit it did happen, but it doesn't matter because it is their property.
One almost historic example was the first episode of the Joe Rogan podcast that was demonetized by YouTube: The episode where he hosted Jordan Peterson. Just having Peterson on the show got Rogan, one of the biggest YouTubers in the history of the universe, demonetized.
Progressives make death threats every day on Twitter and the tweets are never removed. Conservatives have been kicked off Twitter without any explanation or reason.
it’s all hate speech
And the Paypal and Patreon bannings, deplatformings, and cancellations were all... uh... good faith moderation! Yeah, yeah, good faith moderation of finan... I mean... speech, yeah! The *recipients* speech! Totally not refusing legal tender based on someone's ideology. Totally just a good faith moderation of free speech.
Anyone who bases their choices in the ballot box upon Facebook ads is too stupid to breathe.
FWIW I'm presuming most of this is referring to Facebook the company, not facebook the product, 'cause no one uses that shit anymore.
"It remains uncertain whether anything that happened on Facebook made a significant difference in the 2016 election,"
Those Russkii gifs caused huge numbers of HRC voters to vote for Trump instead.
Tony is sure of that.
CNN loves to disingenuously claim that Russia spent "millions of rubles" on Facebook ads to affect the election, hoping that their useful idiot readers won't notice that 1 dollar is worth 66 rubles.
Sort of like Castro spending millions of pesos....
At Micky D's...
No one is forced to use FB. No one. I don't.
^+1
I totally agree, but just for shits and giggles, I am gonna play devil's advocate here:
Well, yeah, but you could also make the argument that no one is forced to take a job, so if you don't like the income tax, don't work. Get some land in bumfuck nowhere and grow your own food. Sell just enough to pay your property taxes and you are good. But if you wanna live a modern life, you gotta get a job. Similarly, in many circles, you gotta have Facebook to lead a modern life.
"Similarly, in many circles, you gotta have Facebook to lead a modern life."
Bull
Shit.
You may not be using Facebook, but Facebook's using you. They've got their tentacles into every little orifice on the internet and if you're on the internet they've got their tentacles into you.
"You may not be using Facebook, but Facebook’s using you. They’ve got their tentacles into every little orifice on the internet and if you’re on the internet they’ve got their tentacles into you."
Aww, poor Jerry! Wants the state to save him from the boogey man!!
Grow up and shut up.
"A New Book Reveals Facebook's Problems Started Way Before the 2016 Election"
Note the author; perhaps they started in the 1930s?
Far more common in the Facebook story than perverse examples like Onavo Protect are the occasions in which the company failed to anticipate problems that would arise from the markets it pushed itself into with little awareness of how its services might be misused.
This is kind of like saying "Far more common than projects like killing all the Jews, Hitler's projects more often included ways of finding his socks in the morning." You only need one Onavo Protect to see what totally amoral unscrupulous lying pieces of shit Zuckerberg and Facebook really are.
And the thing is, there's a long string of incidents where Facebook has been caught scraping every single bit of data they can from whatever source they can whenever an opportunity presents itself. There's no "failed to anticipate problems" problem, they simply don't give a shit.
"You only need one Onavo Protect to see what totally amoral unscrupulous lying pieces of shit Zuckerberg and Facebook really are."
Aww, cry some more.
Fucking statists demanding the government save them from themselves; stuff it up your ass.
"There’s no “failed to anticipate problems” problem, they simply don’t give a shit."
So what?
Are the users to be exonerated? The users might as well have posted their data on billboards, and now they scream that someone uses it?
Tough...
Shit.
Jerryskids, the sentences that follow the sentence you quote don't center on misuse of data. Instead, they center on use of Facebook's features to promote things like genocide and other kinds of political violence.
Don't like FB? Delete your account and log off for good. I never felt like the world needs to know my business.
Far more common in the Facebook story than perverse examples like Onavo Protect are the occasions in which the company failed to anticipate problems that would arise from the markets it pushed itself into with little awareness of how its services might be misused—especially in the absence of Facebook personnel who spoke the dominant languages in those markets and who might at least theoretically identify content that violated Facebook's content policies or other problematic uses. - so true, agree with every word
The thing that amazes me is that they publish garbage like this all the time, and still have the least moderated comments anywhere on the internet.
Did all the actual libertarians at Reason get pushed into IT positions?
"Levy's books and his admirably accessible body of tech journalism ... consistently demonstrate how he's driven by the facts rather than by any philosophical or political agenda"
False. This statement merely demonstrates that Godwin agrees with Levy's political agenda.
I was directly involved in a meeting that Levy reported on in one of his books. I was there; my information is firsthand. Levy authoritatively provided the account of only one of the participants, an account that he had to suspect was false but that he knew his audience would lap up.
Don't believe what Levy writes unless you have an independent source you trust. Godwin must bathe in a tub of shit, he's so full of it.
I'm sorry to discover that you're unfamiliar with Levy's work. Is the problem that you're unhappy he interviewed Glenn Beck but not you? Maybe you didn't seem interested in being interviewed?