Proposed Tweak to Internet Law Could Spur Seismic Shifts in Web as We Know It
A bill related to sex trafficking and Section 230 could have far-reaching consequences for web content, publishers, and apps.

A draft bill in the House of Representatives would add sex trafficking to the list of crimes excluded from the protection of the Communications Decency Act (CDA), a Geocities-era law with an important provision on internet publishing. That provision—Section 230—would prove crucial to the development of the "World Wide Web" as we know it, allowing for a world in which social networks and participatory media could thrive. The new House proposal is portrayed as a mere tweak to Section 230, one which would make it easier to catch bad guys while having little effect on online communication. Don't believe it.
Simply put, Section 230 protects web publishers and platforms—from Facebook and Reddit to The New York Times to Petfinder.com—from being legally culpable for things that third parties post or upload, at least when it comes to state crimes and civil lawsuits. (Federal criminal offenses are not afforded Section 230 protection.) If you're found to be criminally harassing someone via Twitter, the company can't be prosecuted for it. If a magazine commenter makes libelous statements, the publication can't be sued for libel. If a 16-year-old meets a 19-year-old on Facebook and they begin a sexual relationship, Facebook can't be charged with statutory rape. And so on.
"It's the reason I can't sue [Snapchat CEO] Evan Spiegel for harassment if a dude sends me unsolicited pictures of his dick on Snapchat," writes Kate Knibbs in this excellent and detailed piece about adult-advertising and Section 230. "This protection has been absolutely essential to the development of the internet in this country and really around the world," the Center for Democracy & Technology's Emma Llansó told Knibbs. Without it, web providers would "be in court all the time. And they'd run up inordinately high legal bills, even if they were ultimately successful in defending a case."
The draft House measure, sponsored by Rep. Ann Wagner (R-Missouri) and dubbed the "No Immunity for Sex Traffickers Online Act," would carve out an exception to Section 230 for sex-trafficking offenses involving minors. Supporters portray it as a way to "hold sex traffickers accountable," but we already have sufficient penalties—at the state and federal level—for people who force, deceive, or coerce others into prostitution, as well as for anyone directly involved in the prostitution (forced or not) of a minor. And nothing in Section 230 of the CDA, nor in this new proposal, affects the way we treat folks found to be sexually exploiting others.
What the change would do is make it possible for states to indict any app, website, or platform that introduces an underage person to a possible sex buyer as a conspirator in sex trafficking. And it would allow any underage person who was paid for sex to subsequently sue any website or web service remotely involved in the transaction.
To be very clear, the change would not merely apply to classified-ad sites like Backpage, or to sites and services specializing in escort advertising. Facebook, Snapchat, Instagram, and similar social platforms have all helped introduce underage sex-trafficking victims to perpetrators in recent U.S. cases. Victims often use popular email providers, messaging apps, and text messaging to communicate with clients (police have of late been fond of charging sex workers found with cell phones or laptops with felony possession of the instruments of a crime). Perhaps prosecutors won't go after these sites and services (I have my doubts), but regardless, victims can. With the proposed change, victims will have the right to sue any third-party web service that enabled their participation or exploitation in the sex trade. And in this case, victim means anyone under 18 whom someone paid for sex, regardless of whether any force, fraud, coercion, or middlemen and women were involved.
You can see how this might cause problems. The Section 230 bill deals not a whit with the people actually causing sexual exploitation; it simply opens up a new category of defendants that can be punished as child sex traffickers. It gives victims—most of whom fall prey to petty pimps with few assets, not organized criminals—a civil-suit target with much deeper pockets than the criminals who exploited them, and the same for state prosecutors with asset-forfeiture fever. And it does all this while a) defining child sex trafficking victim as anyone under 18 who accepts money or anything of value for sexual activity and b) defining child sex trafficking as a crime that need not involve any real children.
A huge number of "child sex trafficking stings" in this country involve police posing online as sex workers (using pictures of young adults, because otherwise they would all be posting child porn) and, once a customer is interested, "admitting" that they're actually underage (usually 16 or 17). The men who still agree to meet for sex are greeted by police officers and charged with things like patronizing a minor for prostitution or, increasingly, child sex trafficking. Their vehicles and sometimes other assets are seized. Imagine if cops could do this sort of "random virtue testing" (as Ars Technica's Nate Anderson aptly described it) but then go after big web publishers and platforms instead of just impounding a few cars.
Several county sheriffs have independently taken up this tactic with particular zeal, but the vast majority of such "john stings" are conducted in conjunction with an Internet Crimes Against Children, Innocence Lost National Initiative, or general human trafficking task force funded and spearheaded by the federal government. The U.S. Court of Appeals for the Eighth Circuit held, in 2013, that there need not be an actual victim involved for the offense of sex trafficking. Since 2009, the U.S. Department of Justice has been training and giving "written guidance to federal prosecutors indicating that [federal trafficking in persons law] could be used to prosecute customers seeking to pay for sex" with minors, according to Jill Steinberg, national coordinator for child exploitation prevention and interdiction. And the 2015 Justice for Victims of Trafficking Act explicitly added "patronizes" and "solicits" to the means by which someone can be guilty of the federal offense of sex trafficking of children or adults.
Law professor and blogger Eric Goldman is alarmed by the No Immunity for Sex Traffickers Online Act and its potential to decimate Section 230 and large swaths of the internet:
As you may recall, in 2013, 47 state AGs (including California's then-AG and now-Senator Kamala Harris) sent a letter to Congress complaining that Section 230 prevented them from squashing Backpage and requesting that Congress amend Section 230 to exclude all "federal *and state* crimes." Congress never responded to the letter–until now. Consistent with the AGs' request, Rep. Wagner's bill would open up Section 230 to state crimes, but only if the crimes relate "to sexual exploitation of children or sex trafficking of children"–a smaller universe than the AGs' request to open up *all* state crimes.
But the [new] bill would also go much further than the AGs' request to loosen up criminal enforcement. The bill would also open up Section 230 to civil claims "relating to sexual exploitation of children or sex trafficking of children." What does that mean? I'm not sure, but I expect crafty plaintiffs' lawyers could find dozens or hundreds of tort claims that they could argue, consistent with Rule 11, relate to this exclusion. If so, it will be open season on defendants who think Section 230 protects them.
Plus, the door would be open for states to enact new laws that could get around Section 230. For example, imagine a state currently has, or newly enacts, an existing strict liability crime, with a bonus civil cause of action, against publication of online prostitution ads. The strict liability rule might run into First Amendment concerns, but we won't know that until the court challenge. As we know, if there's not a single home for them, online prostitution ads migrate into other topics. So any classified ad or message board service–even those that are completely free–would need to prescreen most/all user postings to screen out the possibly-small percentage of those postings that violate the new law. The overall cost imposition on publishers, and associated chilling effect, attributable to the law would be huge. Note that the law doesn't limit itself to ads, so new crimes and torts could reach even non-commercial activity related to child sex trafficking (whatever that means).
This sort of "scope creep," writes Goldman, is why "'small' exceptions to Section 230 rarely remain small in practice." Goldman also notes that "Congress hasn't changed Section 230's core immunity since the beginning. In contrast, this bill would dramatically reshape Section 230's contours."
Goldman may underestimate the effect such a reshaping could have even without other changes. With Backpage's adult-ad section shuttered as of early 2017, it's hard to imagine what Rep. Wagner's proposal "seeks to restrict or what existing behavior the bill wants to stop," he writes. But the adult section's shutdown certainly hasn't ended online prostitution marketing, not even on Backpage; people are simply posting adult ads to other sections of the site. Plus there are still prostitution ads on Craigslist and other general classified sites, and all sorts of smaller, sex-work-specific advertising forums. There are sex workers—on both the higher and lower ends of the spectrum—marketing themselves and being marketed on Twitter, Tumblr, Instagram, "sugar baby" and dating websites, personal pages hosted by Blogger and WordPress, etc. And when it comes to child exploitation prosecutions, cases not infrequently involve teens enticed by a pimp/trafficker or introduced to a potential customer via social-media sites (especially Facebook) and messaging apps. Law enforcement would find no shortage of possible targets if Backpage disappeared tomorrow.
Politically, the measure has bright prospects. Not only does it have bipartisan backers, but politicians are practically allergic to voting on principle when it comes to bills that invoke the words "child sex trafficking" (see Rand Paul, staunch advocate against mandatory minimums, and the 2015 "Justice for Victims of Trafficking Act"). And as Tim Cushing writes at Techdirt, "the recent arrival of former California attorney general Kamala Harris (a newly-elected Senator) should ensure lousy, internet-damaging bills aren't limited to the House. It's a chance to make earlier complaints become debilitating statutes. Goldman notes the bill not only duplicates Rep. Wagner's previous human trafficking law (which was passed), but echoes a 'Let's Blame Backpage!' letter sent [by Harris and other attorneys general] to Congress back in 2013."
Cushing also notes that this same idea of a "narrow" exception to Section 230 immunity "has been floated as a way to tackle the revenge porn problem." Why? "It's almost always easier to locate and serve/prosecute site owners than it is to go after those actually violating laws."
I have 4 hours of mowing today. Has to be done, rain here the next 5 days. I did make a nice potato and ham chowder with garlic, onions, rosemary and thyme. Some broth, olive oil, butter, cream and pepper. Have sourdough croutons to top with Asiago cheese. Made a salad and a couple cold stouts. Good day all.
Rain is coming my way too, bro. Making spaghetti the easy way and a salad. Then it's time to chief a bowl and watch the storm.
I planted some daffodils and now I am eating a tuna salad. True story.
Testing 123
Why do all bands play that song first?
The Ken Shultz Band usually only has time for one song. But they will do encores.
I especially like their 20-minute drum solos.
That's his online banking password.
"If a magazine commenter makes libelous statements, the publication can't be sued for libel."
This one example is also, maybe mostly, about our libel laws needing to compromise with the First Amendment. Specifically, this is about needing to demonstrate malice. How can you prove to a jury that Reason had malice if they weren't the ones who made the comment?
Compare that to Canada, Australia, and the UK, where malice isn't necessary in libel claims. Go to a major news website in Australia, post a comment, and wait for it to be moderated. For major corporations with deep pockets there, it presumably costs less to hire people to pre-moderate comments than to constantly fight off libel lawsuits.
On a news story about a baby drowning in a pool, say some commenter writes, "Well, the mother should have been paying attention!". What if the mother was at work and it was the babysitter that wasn't paying attention? Now News Corp has a lawsuit on their hands--and they're gonna lose.
Point is, we don't need to look far to see what the world is like when websites are held responsible for what individual users write, upload, etc. Beyond the fundamental unfairness of holding one person responsible for another person's mens rea, websites like Reason.com probably couldn't afford to keep comments open in that environment.
Ultimately, if you don't have deep enough pockets to pre-moderate comments, hire people to sit there all day taking bad comments down, or the wherewithal to constantly fight lawsuits, you shut down comments. Ultimately, it'll turn into rent seeking. Why would the big boys want to compete with small time operators anyway? I'm sure they'd love to see some serious barriers to entry.
"Ultimately, if you don't have deep enough pockets to pre-moderate comments, "
It can be automated, I'm pretty sure, and the cost of someone to administrate such a system would be less than the cost of employing a lawyer.
So, you're saying that if the just geeked harder they could know the necessary hashtags to properly secure the series of tubes?
" if the just geeked harder they could know the necessary hashtags to properly secure the series of tubes?"
I figure it'll come down to this: They just need to geek hard enough to satisfy a judge, no more.
How would one automatically detect libelous comments? Do you even think before you post?
The idea that a computer could be automated to do something that only humans can do now is not new. Do I think before posting? A bit. I thought that Ken Shultz hadn't considered low cost automated censorship, as in expert systems.
That works for easy things: key words like swears, "lie", "sex", etc.
The tech certainly isn't here yet that would be capable of screening a post that says, for instance:
"Young woman looking for expert training, small fee"
But, you like the taste of authoritarianism, and your posts are disingenuous.
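The keyword-filter limitation described in this exchange can be sketched in a few lines. This is a hypothetical, purely illustrative blocklist and function (not any real moderation system): the euphemistic ad from the comment above passes cleanly, while an innocuous sentence trips the filter.

```python
# Naive keyword-based comment filter: flags a post if any word
# appears on a blocklist. Blocklist contents are hypothetical.
BLOCKLIST = {"sex", "escort", "lie"}

def flag(post: str) -> bool:
    """Return True if the post contains a blocklisted word."""
    words = post.lower().split()
    return any(w.strip(".,!?\"'") in BLOCKLIST for w in words)

# False negative: the euphemistic ad contains no blocklisted word.
print(flag("Young woman looking for expert training, small fee"))  # False

# False positive: an innocuous sentence trips the filter.
print(flag("The senator told a lie about sex education funding"))  # True
```

The filter only sees surface tokens, which is precisely the objection raised above: it cannot tell an innocuous mention from a violation, or catch a violation phrased in euphemism.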
There's not a lot of point in criticizing the capabilities of technology that doesn't exist. Censors will always err on the side of caution and censor needlessly. Heavy handedly. There's nothing disingenuous about my pointing out the possibility of human activities like legal vetting and censorship becoming automated in the future.
Easily. You ban all comments in the first place and remove the comments section. Easy way to ensure no libel in the comments.
Automated systems don't pick up sarcasm, and comments drip with the stuff. Major news organizations in Canada and Australia don't pay real people to premoderate comments because it's cheaper.
Postmoderation means people don't have to wait as long, but even then, all an algorithm can do is flag keywords.
Even sites in the U.S. use real moderators to tackle reported abuse. Go use a homophobic slur at SF Chron, and not only will the comment come down after a while, they'll ban your account.
The Washington Post isn't all in real time either. They don't pay people to do the moderation because they're technophobes. The technology to premoderate comments with quality is either too expensive or doesn't exist.
I doubt we'll ever get computers that understand language as well as people until we get conscious machines.
Machines read syntax. Conscious beings understand semantics as function of perspective--and even humans often screw up sarcasm in text format.
There's a difference between demeaning a group as "queer" and talking about queer politics as in the "Q" of LGBTQ. You think there's an off the shelf algorithm for that?
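The "queer" example can be made concrete. In this hypothetical sketch, a plain keyword match fires identically on a slur and on neutral political usage, so a filter built this way must either censor both or neither:

```python
import re

# A keyword matcher sees the same token in a slur and in neutral
# discussion; it cannot distinguish the two usages.
pattern = re.compile(r"\bqueer\b", re.IGNORECASE)

slur = "Get lost, you queer"
neutral = "The panel discussed queer politics and the Q in LGBTQ"

# Both strings match: syntax alone can't recover the semantics.
print(bool(pattern.search(slur)))     # True
print(bool(pattern.search(neutral)))  # True
```

Telling the two apart requires modeling context and intent, which is the semantics-versus-syntax gap the comment describes.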
"You think there's an off the shelf algorithm for that?"
Not on any shelf that's in my reach but I don't work for the CIA, the NSA, the FBI etc. If anyone is putting money into researching encoding/decoding natural language it's them. And the military, of course. These kinds of expert systems are widely touted to be putting a lot of people in hitherto safe professions, like censors or lawyers, out of business in the days to come. It needn't be expensive, it's only software, after all.
1) We're not talking about the CIA, the NSA, or the FBI. We're talking about websites that allow commenting like Reason.com and whether they would be able to afford that technology if it existed.
2) Did you not understand what I wrote about the problem of semantics being fundamentally different from the problem of syntax? There are things computers can't do. For instance, they suffer from the problem of induction, just like human scientists do. Just like that, the problem of the difference between syntax and semantics doesn't disappear because people use computers.
3) Even if it is possible to develop, software like that isn't likely to be inexpensive. And developing software generally isn't inexpensive just because it's software. Does the term "burn rate" mean anything to you? The guys that invented Google in their dorm room don't code it themselves anymore. They spend a tremendous amount of money developing software.
1) As I said, it's software, something that can be reproduced at almost zero cost. It's not a car or even a computer.
2) We have used humans as censors for many years, even though humans sometimes have trouble distinguishing sarcasm from sincerity, for example. Even the intended audience have these troubles, as you'll find comments enclosed in specious tags in these pages. This software doesn't have to be perfect, it only has to be good enough to satisfy a judge.
3) I'm sure the government has been financing research into understanding natural language for years at places like MIT. That does not alter the fact that reproducing software is essentially costless. I don't know what burn rate is, but if the guys at google are as smart as those at apple, it's your money they'll be spending, not theirs.
Well, now we can add Computer Science to the list of Things That mtrueman Doesn't Understand, along with basic chemistry and economics.
Keep us informed.
"1) As I said, it's software, something that can be reproduced at almost zero cost. It's not a car or even a computer."
The question isn't whether it can be reproduced. The question is whether it can be developed.
" 2) . . . . This software doesn't have to be perfect, it only has to be good enough to satisfy a judge."
It needs to be good enough that users will continue to post comments.
If the trolls' comments keep getting through, because the algorithm isn't good enough to reject them, and, meanwhile, people's good comments are persistently being rejected by the algorithm, people will stop posting and go somewhere else.
There's this thing called "customers". Some of those commenters are paying subscribers. Some of them are just users—and they're creating content for free. You don't even have to pay them to do it. They just show up and create free content for you! You don't want to chase them away.
"I don't know what burn rate is, but if the guys at google are as smart as those at apple, it's your money they'll be spending, not theirs."
You really have no idea what I'm talking about, do you?
They don't write their own code. They pay people to do it--expensive people. Because reproduction costs are low doesn't mean development costs are low.
"Because reproduction costs are low doesn't mean development costs are low."
I should have mentioned that government funds much development through outfits like DARPA, which I believe is entirely publicly funded. Whether or not expert systems that understand natural language will be developed is a question nobody can say for sure. As I mentioned earlier, many foresee a time when middle class professions are replaced by automated expert systems. They may be wrong, but I wouldn't dismiss their claims out of hand.
"people will stop posting and go somewhere else."
Less posting and more going elsewhere. Works for me.
"If the trolls' comments keep getting through, because the algorithm isn't good enough to reject them..."
The algorithm can be changed and improved. I understand nothing of computer science, chemistry or economics, but I think it's normal practice to rethink, adapt and improve an unsatisfactory algorithm. Computer generated random number algorithms, for example.
So, someday, someone will develop a perfect NLP algorithm, so let's pass a law that would only not bankrupt everyone with a website if everyone had that algorithm today. That's your thinking here?
"So, someday, someone will develop a perfect NLP algorithm"
I predict that before that day, someone will develop an imperfect NLP algorithm. Honestly, I don't really get too exercised over what laws are passed or not passed. People will find ways to speak out if it's important to them, they've managed to do it under much more intolerant and repressive regimes than Trump's. Also I think that censoring the web could be much cheaper and easier with yet to be developed expert systems that filter out unwanted content. Better and cheaper yet, inculcate a culture of self-censorship. You do it yourself, by posting under a pseudonym. Nobody forced you into hiding your identity; it was your choice to do so.
Some, like Reinaldo Arenas, wound up in the worst prison Cuba has to offer and, persecuted by government goons inside and outside prison, continued writing. He never felt he needed government permission to write. If free speech is important to you, then appreciate the choices made by Arenas, rather than joining in the hand-wringing over what grandstanding politicians get up to.
You're really dismissing Reinaldo Arenas' persecution and suffering, waving it away with " he never felt he needed government permission to write". Jesus Christ.
I'm not meaning to dismiss it. If neoteny wants details, s/he can look him up. I mentioned he was persecuted by government goons in Cuba's worst prison. I didn't think I had to expand on that. Arenas had an urge to write and he followed it. It's that urge that's going to keep free speech alive.
I'll lay odds contraptions will not be passing the Turing test in our lifetimes. We have to think to survive. Machines don't.
AlphaGo may not have passed the Turing test either, i.e., a master go player could conceivably know whether he was playing against a human or the computer. Failing the Turing test doesn't mean failing the task you were designed to perform - in this case winning at go.
Anyhow I don't see why it's so difficult to foresee a future where there are expert systems that are designed to keep publishers out of legal trouble. The alternative is to pay some human to do it. Although, I like my idea of getting the rubes to pay for the privilege of censoring each other.
Unlike you, I actually know things about software and I am sure it can't be automated. There is no automated system that could accurately flag and moderate comments for possible violations of libel laws. Pattern matching can be done, yes. How do you pattern match to know if statement X made about person Y is true or not? That is a research effort and far, far beyond the scope of even our most advanced AI. Maybe leave this discussion to actual technologists rather than talking out your asshole?
"How do you pattern match to know if statement X made about person Y is true or not? "
It's the person making the statement who can be censored with relative ease and lower cost.
Yes, you can censor anyone. That is easy. Doing it accurately is not. In fact it isn't possible today. Listening to you comment on what is possible with algorithms is much like listening to a chimp talk about the nuances of rocket science. You simply have no clue what you are talking about.
"Doing it accurately is not."
Good enough for government. You must have heard the phrase before if you've been commenting here for any amount of time. Your government over the past few weeks has killed dozens, including infants, 'by mistake' and not a word of fuss about it have I seen, even in these pages. We have a tremendous tolerance for government inaccuracies.
You really are a first class retard. You make ignorant indefensible comments to justify all out assaults on the first amendment and your ultimate debate fallback is "People aren't complaining enough about the bombing campaign of the month". That could be true but it is completely irrelevant to the conversation at hand.
Look up the word 'sophist' in the dictionary to see mtrueman's picture.
I suggested that automating censorship will make it cheaper in the future. If this seems retarded to you, thanks for your input. We can discuss my many failings later if you like.
OT: SpaceX successfully put the same rocket into space twice. First time that's ever been done. It was also successfully recovered, so presumably they will try for a third time.
It's pretty funny that we're building vertically landing rockets that look and act so much like the ones from 1940-50's science fiction.
What's funny about that? Some of those writers knew enough engrg. to foresee it.
This is OT: The gay people of this world should form a country like the Jews have and use that nation state to murder the fuck out of people like that scumbag who rules Chechnya and all the other motherfuckers on this planet who kill gay people. There's no way Israel would let their people be murdered the way gays are murdered around this planet. Long past time for some payback.
http://www.msn.com/en-us/news/.....spartandhp
Gays don't have the balls to go after the Islamonazis. They talk big, but really they're a bunch of pussies who would rather go after the easy targets who don't cut their heads off.
Why don't they do it stealthily, so nobody knows it's going on, let alone who's doing it? Lots of "accidents", lots of "disappeared".
Key West?
I think a lot of this is just ignorant people freaking out over new technology. Consider these examples:
1) It's 1978 and two pimps are using their wall mounted telephones to arrange a tryst with an underage girl. Should we hold AT&T accountable? Of course not.
2) It's 1997 and a white supremacist has used Kinko's self-serve copy center to print off thousands of fliers making libelous statements about a local minority owned business. Should we hold Kinko's accountable when those fliers are left on car windshields at the local mall? Of course not.
3) It's 2017 and an angry guy prints nude pictures of his ex. He stuffs them into envelopes and snail mails them to all of her neighbors. Should we hold the US Postal Service accountable? Of course not.
4) It's 2017 and someone posts a racially offensive comment on Facebook. Should we hold Facebook accountable? Absolutely! It's Facebook's fault for providing the communication platform.
Give me a break.
In 2017, facebook is new technology to you? That's not how ANY of this works.
It's not new to me, but the people who want to hold web/platform/network providers responsible for the actions of users probably think that it's new.
My point is that businesses have provided communication services for centuries but we haven't made those businesses responsible for their users' content. Now some politicians want to change that and I suspect they'll argue "but this is different" when it's not all that different. Sure, a single Reddit post can be distributed worldwide nearly instantly but it's not truly different than Usenet newsgroups and dial-up. It's just an improvement along a continuum of speed and reach. I don't know what will come next but there will be something even faster and wider. To those who had only telegraphs or pony express, the telephone must have seemed like science fiction.
Those examples are not analogous to this.
Facebook isn't primarily a communication service. (yeah, they do have private messaging, but it's not the main purpose and not the target of this law) Users upload content and FB serves it to an audience. It's more like a newspaper letting you buy space in their classifieds to put nude photos of your ex there.
It's not like this is a new thing either. Section 230 was put in the CDA over 20 years ago with these kinds of things in mind.
Facebook isn't primarily a communication service. (yeah, they do have private messaging, but it's not the main purpose and not the target of this law) Users upload content and FB serves it to an audience.
uploading content and serving it to an audience ISNT communication?
"social network", it's right in the name, cmon.
What's right in the name? Communications platform sure as hell isn't.
It's not like a phone call. It's more like broadcasting, and yes I know it's not the same as broadcasting, but it's closer to that than a phone call. If a TV station broadcast an "advertisement" bought by a 3rd party that consisted of a video of his ex having sex with him, they would have been held accountable for broadcasting that since the early days of TV. It's not a new concept.
THAT is why Section 230 was written into the law in 1996 -- to protect newsgroups and BBSs (and later blog platforms and video upload sites) from similar responsibility, so long as they remove legally problematic content upon notice.
There was also reaction to a clumsy and counterproductive "children's rights" LP platform plank published in 1990. How that thing got into the platform and who wrote it, I would pay to find out.
You are right, it isn't primarily a communication service; it is almost exclusively a communication service. Take away the ability to communicate and there is nothing left except Mafia Wars.
Baloney. Some people probably use it for IMing, but it's a stupid platform to do that with if it's all you're doing. It's primarily about looking at content posted by others for an audience.
You can (and almost everyone does) restrict who can see your posts and whose you can see.
Is sending an email to all the people in your department broadcasting instead of communication?
It is unusual for people to individually select which individuals can see a particular Facebook post or picture. That would be horribly tedious if someone did do it. The settings are usually just "my friends" or "friends of my friends" or general categories like that.
There's a fundamental difference between sending to multiple recipients and posting something where a limited number of people can see it.
It's like sending out party invitations via the mail to 10 specific members of your health club, vs. posting the party invite on a bulletin board in the health club where only members can see it. One is messaging, the other is broadcasting.
It is primarily about posting information to an audience and interacting with that audience as they comment. I.e., communication. Facebook is a communication platform.
If a magazine commenter makes libelous statements, the publication can't be sued for libel.
We're looking at you, dirtbags.
Right about now I bet you people are pining for the days when the only thing we had to panic about was clowns molesting our children as they walk alone through the woods at night.
Believe you me, the Russian Trump clowns are coming, mister!
Wait a minute, is this about the woodchipper comments or the sheeploving comments?
I'm sure I don't know what you're talking about.
Good to see ENB standing up for child labor.
Twitter is for squares. If this law drives criminal networks underground where they should be, it's a good thing.
Haha, right, things were so much better when prostitution was a street corner business, much safer.
Suited me OK.
Who cares what suits you?
Aside from unsafe street corner prostitutes, you mean?
Better to remove the federal exemption.
What if both parties perceive themselves as over 21? What then? Are we to allow such silly things as facts get in the way of individual perception?
Cushing also notes that this same idea of a "narrow" exception to Section 230 immunity "has been floated as a way to tackle the revenge porn problem."
Wait, revenge porn is a problem?
Vengeance is mine, sayeth the LORD GOVERNMENT.
Given how intertwined the taxman is with all things, all the steps necessary to commit a sex-trafficking crime probably contribute more to the public coffers than to anyone involved in the crime. Can we make those people disgorge their ill-gotten profits? If I'm using Facebook to solicit under-age prostitution, well, I didn't build that, and neither did Mark Zuckerberg. Barack Fucking Obama and his buddies Bernie and Hillary and Elizabeth Warren built that, didn't they? Sue those evil warthogs.
But this isn't about policing the intertubes, it's about making the deep pockets police them. Facebook shying away from anything remotely resembling anything that might be a liability to them isn't an unintended consequence of this bill, it's exactly the intent. First up, the drugs and the terrorism and the kiddie-diddlers, because why do you hate the children, and you just know hate speech is up next. It's free speech that they want to stamp out, the freedom to say things I disagree with.
"it's about making the deep pockets police them"
At best an intermediate step. Ultimately it's about making you police yourself. (And others.) Don't underestimate the internet commenter. It's conceivable that trusted stooges will actually pay for the privilege of censoring fellow commenters.
This article has me confused as all hell.
So the proposed Section 230 is good because it eliminates the bad shit the article spends most of its space describing?
Section 230 is the law as it stands now; the article deals with a proposed change to it. For the children.
Putting aside the mythical nature of the sex trafficking issue, the egregious thing is that this law seeks to punish people for things that other people do. Why? Because they can't catch those other people. Classic "I'm looking for my keys here because the light is better" blundering, and deeply unjust.
ENB over on Twitter has admitted that Reason speaks with one unified editorial voice, putting the lie to the cuck-commenters' oft-repeated claim that it is a diverse group of people with differing opinions, when actually it is just like The Federalist.
FTR: I love you all
Facebook is a fucking communication device PERIOD.
Example, "I'm having a drink right now", says John Doe to the rest of his family and friends who are on Facebook.
Regards
OiOiOi
Facebook is a toxic shit pipe but what does that have to do with my comment?
I once had an old Chevy I bought from a neighbor down the road.
This sort of thing was in the planning stages when the WWW was in its infancy. Only recently did I find out that a German picture book with the English title "Show Me" absolutely freaked out the Landover Baptist Temperance, Abstinence and Antisex League. Few bother to read the original Nazi program of 1920, but it states:
"The publishing of papers which are not conducive to the national welfare must be forbidden. We demand the legal prosecution of all those tendencies in art and literature which corrupt our national life, and the suppression of cultural events which violate this demand.
"24. We demand freedom for all religious denominations in the State, provided they do not threaten its existence nor offend the moral feelings of the German race."
Clearly, religious conservatism has found the entering wedge for censorship, exploiting the fact that minors are contractually incompetent by law. Americans could lighten up their scolding of German permissive attitudes just as Germans could lighten up their opposition to nuclear generating stations--to the benefit of taxpayers in both mixed economies.
RE: Proposed Tweak to Internet Law Could Spur Seismic Shifts in Web as We Know It
A bill related to sex trafficking and Section 230 could have far-reaching consequences for web content, publishers, and apps.
Hey, if internet censorship works in such paradigms of democracy and freedom like the People's Republic of China, North Korea and Cuba, it will work wonders here too.
You people need to have more faith in our obvious betters oppressing us.
By this logic, if a minor is engaged in prostitution on the street, the politicians who maintain that street should also be held accountable as sex traffickers.
This stupidly dangerous legislation isn't likely to go anywhere. The Wagner sow is just grandstanding to her ignorant, knuckle-dragging Missouri lemmings.
A few months back, some university put up a website where people could anonymously report rape and sexual abuse. The site was promptly flooded by people making fake claims of abuse against the faculty and senior staff. Not to actually accuse them, of course, but to show just how ridiculous the notion was. It seems that the right to confront your accuser is actually rather important.
I anticipate the blogs and commercial websites of these politicians and their supporters being similarly flooded, and screencapped comments offering paid sex with a trafficked minor on these websites being posted all over the internet.
DeNiro