Sen. Ron Wyden: Conservatives Are 'Totally Wrong' About Political Neutrality Under Section 230
"Section 230 has nothing to do with neutrality. Nothing. Zip. There is absolutely no weight to that argument," Wyden says. He oughta know. He wrote the damn thing.

The debate over free speech on the internet has turned everything "topsy-turvy," says Sen. Ron Wyden (D–Ore.).
"I'm up against a group of big-government Republicans who seem to think the right answer is to push the private sector—users and consumers and shareholders and managers—push them aside and, in effect, deputize the federal government as speech police in violation of the First Amendment," Wyden tells Reason.
Let's back up all the way to 1996, when the internet was barely an infant. It was that year that a younger version of Wyden was one of the co-authors of Section 230 of the Communications Decency Act. It is all of one sentence long: just 26 words promising that online platforms will not be held liable for content provided by users or other publishers. And it is, as Wyden described it on Tuesday, "about the most libertarian law on the books."
But it's also under attack these days, largely from conservatives unhappy with how Facebook, Twitter, and other social media sites have been policing content—including everything from reducing the visibility of some posts, to temporary or permanent bans on controversial figures trafficking in conspiracy theories and outright racism. Sen. Josh Hawley (R–Mo.) has taken the lead by introducing a bill to amend Section 230 under the pretense of fighting tech companies' supposed "bias" against Republicans. His bill would eliminate Section 230 liability protections from online platforms judged not to be operating in a "politically neutral" manner.
"With Section 230, tech companies get a sweetheart deal that no other industry enjoys: complete exemption from traditional publisher liability in exchange for providing a forum free of political censorship," Hawley said when he introduced the bill last week. Since then, the notion that Section 230 has always included an implicit "deal" requiring platforms take a neutral political stance has become a talking point in some parts of the political right.
What does Wyden, the author of Section 230, think of that claim?
"Totally wrong," says Wyden. "Section 230 has nothing to do with neutrality. Nothing. Zip. There is absolutely no weight to that argument."
What the law does imply, according to Wyden, is that conservative blogs and websites can put their point of view out into the marketplace, "where users and consumers will make judgments about it." In other words, you have a right to speech online, but not a right to get your speech hosted on anyone else's website. The same is true for liberal perspectives online. "It's about making sure that all the voices get heard," Wyden says.
Conservatives like Hawley—and some of the newly emergent illiberal or "post-liberal" voices on the right—feel like their voices aren't being heard. They blame tech companies like Facebook and Google for this so-called "censorship." Under Hawley's bill, the misleadingly titled "Ending Support for Internet Censorship Act," online platforms would have to hand over intellectual property to the federal government and would have to face a panel of partisan political appointees who would certify that the platform was operating in a "politically neutral" way.
As Reason's Elizabeth Nolan Brown wrote last week, the bill is essentially an attempt at resurrecting the old Fairness Doctrine—"a policy that was roundly denounced by conservatives for its chilling effect on free speech and its propensity to further marginalize non-mainstream voices—and apply this cursed policy paradigm to anything online."
Conservatives rushing ahead with a plan to put the federal government in charge of online speech makes about as much sense as if liberals thought it would be a good idea to have President Donald Trump and Attorney General William Barr policing websites, says Wyden.
He's also worried about how rolling back Section 230 would help entrench some of the very same tech companies that conservatives see as villains.
"This was an important law for the little guy when we wrote it, and it arguably is even more important today," Wyden says. Big tech companies like Facebook benefited from the freedom provided by Section 230 when they were starting up, but now that they have become dominant players online, they are trying to "pull up the ladder" behind them, he says.
"Frankly, if somebody rolled back some of Section 230, it helps the big technology people, like Facebook, hold off their small competitors," Wyden argues.
Where Wyden does believe the federal government should get more involved is on the debate over privacy and data. He's called for the Federal Trade Commission (FTC) to hold Facebook founder and CEO Mark Zuckerberg personally liable for his company's data breaches. In our interview, Wyden said he believed Zuckerberg had lied "on several occasions" about Facebook's privacy policies. "I do believe that the CEO should be held personally liable if they are found to have repeatedly misrepresented" those policies, said Wyden, who drew a parallel to how the government treats top executives at major financial institutions differently from small bankers.
In other words, Wyden's view is that the federal government should play a strong enforcement role when it comes to online privacy, but that it should have a fully hands-off policy when it comes to regulating speech online. It's an argument that not only libertarians but also conservatives and liberals interested in a free and open internet could agree with.
But the illiberal right—for which Hawley is acting as a sort of "spokesman in the Senate," according to Washington Free Beacon Editor Matthew Continetti—wants more government control over all online speech. Hawley's bill would require tech companies to get re-certified by the FTC as "politically neutral" every two years, effectively giving every new presidential administration a de facto veto over online speech. If that's not censorship, it's certainly brushing right up against it—and if you think too much of American society is already politicized, just wait until the fate of the internet is tied to the outcome of every presidential election.
It's easy to see why Wyden feels like the ground is shifting under his feet. For decades, he's been something of a rarity in Washington: a progressive Democrat with self-described "libertarian chromosomes" who has stood against executive power grabs and been a tireless advocate for free speech, especially online. Now, some of his across-the-aisle allies who used to share a skepticism of government power are missing in action, replaced by moralizing authoritarians who don't even seem to understand the obvious consequences of what they are proposing.
"For the life of me," Wyden says, "I can't understand how [rolling back Section 230] has become a big part of a political party that used to believe in less government."
Wyden is lying his ass off. It absolutely does. The whole point of 230 is that internet companies should be treated as common carriers. Holding Youtube responsible for copyright violations on their platform is like holding the phone company responsible for violations committed over the phone
That made sense as long as they really were common carriers. And to be a common carrier means you let anyone on and don't discriminate on the basis of view point or anything else. If Youtube wants to kick off every conservative, good for them. It is their platform. But if they want that freedom, then they necessarily must take the responsibility that comes with it and be responsible for anything that goes on on their platform and not get out of it if they take it down.
Every time I think reason can't get any lower, they prove me wrong. Now they are standing for giving tech companies what amounts to special immunity while also giving them the full freedom to censor their platforms. Pathetic.
The whole point of 230 is that internet companies should be treated as common carriers.
Do you support Net Neutrality?
But if they want that freedom, then they necessarily must take the responsibility that comes with it and be responsible for anything that goes on on their platform and not get out of it if they take it down.
What is it you feel they should be held responsible for that they aren't currently being held responsible for?
In fact, aren't they removing content because of threats that they'll be held accountable for hosting, for example, racist conspiracy theories?
How does threatening to hold them accountable for content discourage censorship?
Do you support Net Neutrality?
This has nothing to do with net neutrality you half wit. This is about liability. Net Neutrality is about data transfer priorities.
What is it you feel they should be held responsible for that they aren’t currently being held responsible for?
Because they are claiming the right to control that content with impunity. Why do we claim the right to hold newspapers responsible for what they print or TV stations for what they broadcast? Because they control the content and therefore assume the responsibility for it. If YouTube and Google want to exercise complete and arbitrary control over the content on their platforms, they have that right but must be treated accordingly and held accountable for that content. You can't claim control without assuming responsibility.
Net Neutrality is about data transfer priorities.
That's the "problem" Net Neutrality claims to solve. Net Neutrality is exactly about declaring ISPs "common carriers."
Because they are claiming the right to control that content with impunity.
Why shouldn't they?
Why do we claim the right to hold newspapers responsible for what they print or TV stations for what they broadcast?
What do we "hold them responsible" for, exactly? Viewpoint neutrality?
For copyright infringement and libel you fucking moron. Social media platforms are immune from such things.
>>>held accountable for hosting, for example, racist conspiracy theories?
accountable by what?
accountable by what?
Congress.
You are completely, 100% wrong. That's fine if that is your opinion on how you want it to be (I think that would be terrible), but that is NOT what 230 says, or was ever intended to say.
> Wyden is lying his ass off. It absolutely does.
Um, no it doesn't. Here are those 26 words, reproduced as accurately as I can get the server squirrels to reproduce them:
"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
Notice that there's nothing in there about neutrality. Nothing in there about fairness. Nothing in there about common carriers. Those are the 26 words that make up Section 230. The entirety of the 26 words.
Shadows and penumbras?
Tell that to BackPage
Right? They totally got railroaded with the shaft. Or shafted with the railroad. Something like that.
Why was that language made that way? Because the providers claimed to be common carriers and not publishers. Now they are claiming the power of publishers and that immunity needs to go.
All you are saying here is that "well, they have it, therefore they should keep it." Bullshit.
Why was that language made that way?
Because unlike newspapers, radios, and TVs, a web host doesn't have active control over content. They don't control what goes up - they can only take something down after the fact.
Demonstrably false. Plenty of hosts utilize comment moderation prior to any post appearing. That others choose not to do so is also a fact, but also irrelevant.
Act like an editor, get treated like a publisher. The platform should not affect that principle. In passing the CDA Congress declared themselves the arbiter of all internet speech. The idea that they cannot modify that stance is absurd.
That others choose not to do so is also a fact, but also irrelevant.
It is very relevant. You're sidestepping my observation about the fundamental difference between web platforms and newspapers. The distinction is even currently recognized in the difference between "moderated" classified ads and publisher-generated content.
Act like an editor, get treated like a publisher.
Meaning what, exactly? People keep saying this but won't say what it means. Do we currently demand viewpoint neutrality from "publishers?"
People keep saying this but won’t say what it means. Do we currently demand viewpoint neutrality from “publishers?”
You're acting in bad faith at this point.
You’re acting in bad faith at this point.
How so?
TBH, what "Act like an editor, get treated like a publisher" sounds like to me is a variation on "that's a nice little business you have there. It would be a shame if we suddenly changed the liability rules on you now, wouldn't it? Let's discuss the political slant of what you've been publishing."
Maybe you can clarify for me why eliminating Section 230 is the solution to a lack of viewpoint neutrality.
Your defending excess legal protections for chosen entities, you realize that right?
No they didn't. Where are you getting this absolute bullshit from?
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
There's a definition of interactive computer service that would shed a huge light on why this statement--
Wyden is lying his ass off.
Is true.
But--as long as you leave off all the inconvenient facts that make your position a lie, you're just fine. Just like Wyden
"He oughta know. He wrote the damn thing."
It's fascinating how Reason flip-flops between being the most cynical and most naive bastards on earth.
the most cynical and most naive bastards on earth
I.e., "libertarians?"
wow, that's fucking rich you fucking retard. Yes, you, John, know more about the intentions underlying section 230 than the guy that actually wrote it. Holy shit, that is an impressive level of narcissism there.
This is turning into the argument that politicians never lie, it seems.
Exactly! When they act like neutral common carriers, they deserve to be treated like neutral common carriers. When they act like publishers, they deserve to be treated like publishers.
This is an example of why I'm not a libertarian.
Atlas Shrugged was supposed to be a warning, Not A Newspaper!
As someone who was actually working for an internet company back in 1996, Wyden's argument is somewhat right and somewhat wrong. It was meant to protect the fledgling industry from being sued out of existence, in particular to protect ISPs from being held liable for every website they hosted.
Missing from most of these debates is the fact that massive, popular, user-content-generated sites dominating the popular culture to the point they can impact elections were missing from the internet landscape back in 1996. Heck, Bill Gates hadn't even discovered it yet - he had his revelation in 1997.
Personally, I don't think Sec 230 is the key to this. I am more in agreement with those who think the resolution is through acknowledging that the current arbitrary practices are contract violations or fraud. The services provided aren't really 'free'. The user generated content or user provided data provided to those companies IS an item of value that is exchanged for it.
Internet companies were not treated as common carriers before the passage of Section 230. Nothing in the text of that section says that they are going to be treated as common carriers. Nothing even in the non-binding legislative history around the passage of that section says any such thing.
You may want them to be common carriers. That might even be the right social policy. But there was no condition attached to Section 230 which made the deal you are describing.
You really can't overstate how dishonest Wyden is being here. No one is saying anyone has a right to a platform. What they are saying is that you only get the immunity if you act as a common carrier. It would be nice if reason had someone bright enough to understand that rather than just reprinting what amounts to a Wynden press release.
What they are saying is that you only get the immunity if you act as a common carrier.
But does the law actually say that? I'm not seeing that language in the actual law.
Nope, it does not say that. I'm not sure where John is getting this bullshit from.
I am not sure how you can be this fucking stupid not to understand what is going on.
Here's the law: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
Period.
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
The argument is that since the tech companies are heavily curating their sites and purporting to be fact finders, they've become the "information content provider" rather than the "provider ... of an interactive computer service."
Think about a newspaper: there's much more legal liability in the news section than the classifieds because everyone understands that the classifieds aren't published as alleged facts.
“information content provider” rather than the “provider … of an interactive computer service.”
You can be both simultaneously. The law allows for that. What the law states is that one isn't liable for externally generated content. It doesn't become internally generated simply by deciding to delete/block it. If they were editing the content, then that's another question entirely.
If they were editing the content, then that’s another question entirely.
^ This. And GoogleNews in particular treads that line. But the question is - so what? We don't require Fox News or MSNBC to be viewpoint neutral - why should GoogleNews be?
Fox and MSNBC don't have the power to suppress and bury each other's stories. Google does. They own the Memory Hole.
They own the Memory Hole
How so? Do they have the power to ban content on the entire internet?
Google? Kind of. If they really wanted to they could certainly make it nearly impossible to find through search algorithm blacklists, deletion of videos on youtube, etc. Plus they have the contacts to reach out to other big tech companies to get them to join in.
Whether the publisher v. platform thing is even something worth considering is a different issue, but failing to acknowledge the actual amount of power Google has in influencing what the general populace sees is the kind of thing that will result in government stepping in.
Fox and MSNBC dont have the extra legal protections. what's with the dishonesty on this topic?
The law allows for that. What the law states is that one isn’t liable for externally generated content.
Newspapers are certainly liable for "externally generated content."
It's not clear whether the language is intended to normalize websites/hosts or create an exception.
Part of the original logic is that a web platform has considerably less control than a newspaper over "externally generated content." A newspaper has to actively curate and publish things. The appearance of content on web platforms involves mere passivity on the part of the operators.
Part of the original logic is that a web platform has considerably less control than a newspaper over “externally generated content.”
You managed to say something truthful. It is a miracle of Jesus and the Virgin Mary. Now square this true fact with Google and Youtube's claims to be able to ban any content for any reason and contrary to the TOS. If they want to exercise that kind of control, they are a publisher and should be treated as one.
You managed to say something truthful. It is a miracle of Jesus and the Virgin Mary.
John - I'm having a respectful disagreement. There's no need to be a dick.
Now square this true fact with Google and Youtube’s claims to be able to ban any content for any reason and contrary to the TOS.
Should a TOS contract be enforced by LEOs, or by civil action?
If they want to exercise that kind of control, they are a publisher and should be treated as one.
Meaning what, exactly? Do we demand viewpoint neutrality from publishers?
Meaning what, exactly? Do we demand viewpoint neutrality from publishers?
WE GET TO SUE THEM WHEN THEY LIE ABOUT US. THAT'S WHAT THIS WHOLE CONVERSATION IS ABOUT!!!
whew
WE GET TO SUE THEM WHEN THEY LIE ABOUT US.
What's stopping you now? If someone posts a video on YT lying about you, guess what? You can sue them. You just can't sue YT.
[For some reason there's no reply button under Square's comment]
Why don't people sue individual reporters rather than newspapers?
Why don’t people sue individual reporters rather than newspapers?
Because
1) the newspapers have considerably deeper pockets than the reporters
2) the reporters are typically direct employees of the newspapers who are generating content according to specific standards and defined editorial viewpoints
If Reason posts articles that are false and defamatory, they are not exempt from action simply "because internet." In fact, a certain incident involving metaphorical wood chippers reveals that an internet content provider can be held responsible for user-posted content in their comments already.
But what Section 230 means is that if you search the topic on Google and Reason's false and defamatory article comes up as part of the search results, Google can't be held responsible for the defamation committed by Reason.
Square... you do realize the 230 section has been greatly expanded beyond just content issues, correct? It is used in some of the trials right now about the abuse of the ToS in violation of contract arguments.
Part of the original logic is that a web platform has considerably less control than a newspaper over “externally generated content.”
And now that's clearly no longer the case. What is to be done?
And now that’s clearly no longer the case.
It's no longer the case that YT deals in user-generated content, or it's no longer the case that I can't magically post articles in the print media whenever I want without notifying them?
It's no longer the case that web platforms exercise considerably less control than newspapers.
It’s no longer the case that web platforms exercise considerably less control than newspapers.
That directly contradicts what I just said and offers nothing showing that what I said was in any way wrong.
If I want to post a screed on YT, I can have it up this afternoon. If I want to post a screed in the NYT, well . . . tough shit.
And what exactly are we saying YT should be held liable for? You must understand that they are de-platforming certain viewpoints because they are already afraid of being held liable for "radicalizing the youth."
You think if we delete Section 230 that the result is going to be Alex Jones back on YT and not the removal of all non-mainstream political thought whatsoever?
If Jared Taylor wants to publish in American Renaissance he can have the article up this afternoon. If he wants to post a YT video well … tough shit.
Seems pretty similar to me.
You must understand that they are de-platforming certain viewpoints because they are already afraid of being held liable for “radicalizing the youth.”
You’re insane if you think the tech behavior since the election is about avoiding legal liability.
You think if we delete Section 230 that the result is going to be Alex Jones back on YT and not the removal of all non-mainstream political thought whatsoever?
You continue to not understand the basic dynamic here.
You mean the "basic dynamic" of "the illiberal right wants to destroy Big Tech by any means necessary and sees the weakening/repeal of CDA 230 as a necessary step in order to obtain that goal"?
If Jared Taylor wants to publish in American Renaissance he can have the article up this afternoon.
Yes. He can publish in his magazine that he started. And no one can prevent him, not even Google or YouTube. That's the magic of Section 230 at work, right there.
I can publish something on YT this afternoon and he can't. Why is that? Is it because they grant me a specific privilege, or is it that they used to host his content and then removed it because they felt it violated their TOS?
You’re insane if you think the tech behavior since the election is about avoiding legal liability.
Really? So when they took down all the gun videos after a mass shooting, it wasn't because they were being threatened with being held liable for disseminating information on how to kill people? They haven't been de-platforming white supremacists because they're being threatened with the elimination of Section 230 and being held responsible for "radicalization" and "promoting right-wing violence?"
You continue to not understand the basic dynamic here.
So explain it to me. What am I missing? Should YT be open to liability for having aired Alex Jones' lies about Newtown?
And what exactly are we saying YT should be held liable for? You must understand that they are de-platforming certain viewpoints because they are already afraid of being held liable for “radicalizing the youth.”
And yet YT isn't de-platforming any of the content creators telling the youth to go out and punch people, or any of the creators telling the youth to erupt in violent rebellion.
Instead, they're de-platforming journalists who are reporting that this wave of censorship is backed by a deliberately political motive and not anything to do with the standards set forth in the ToS.
Or they're de-platforming comedians who point out the manifold errors in everything Vox posts.
And editing the content is exactly why they're in trouble.
Google is editing search results.
YouTube is editing search results and individual programs and channels.
Facebook is editing search results, who is able to see posts, how posts appear.
Exactly. Is Reason an interactive computer service? Or is it an information content provider?
When you edit, you become an information content provider. Wyden and Boehm can try to pretend otherwise, but that makes them ignorant, dishonest, or both.
Wyden? How about Reason? They keep churning out article after article saying the same damn thing, and have yet to acknowledge the Project Veritas Google expose
And?
But it is well within Google's right to take down videos from their platform! A Reason article on the subject would say exactly that.
Google didn't "take down a video", they rig all their searches to favor a narrative, and feel they are 'tasked with preventing' certain political outcomes.
They are now making in-kind contributions to one political side, worth billions.
This is exactly why reason doesn't want to touch it, so simpletons like you can plead ignorance
Rigging searches to favor a narrative is editing.
Rigging searches to favor a narrative is editing.
Redefining words to fit the argument is all the rage these days.
But buddy Google is allowed those freedoms just like you or I. It's too bad that Google has chosen such an illogical political ideology to attach themselves to but that's their choice. Again, I do not support government involvement in markets of any kind including rent seeking as you mentioned, but I won't cry out for the government to step in when businesses don't act how I want them to. Even if that wasn't an unprincipled approach, it won't work the way you think it will anyway.
I'm not your buddy, pal.
By rigging searches and autocompletes the way they have demonstrated that they do, they are in violation of federal election law.
So, we either get rid of ALL laws regarding speech, content, political contributions, or we structure them to apply evenly.
Sec 230 is not even in play here. We aren't talking about Google being liable for others content, we are talking about Google suppressing or promoting content on a viewpoint discrimination basis, and basically committing a fraud upon the users by deliberately skewing search traffic results.
Try watching the Veritas video
Why half-ass it? Let's just seize the platforms and distribute them evenly among the people. Everyone gets a share. It doesn't matter what the companies would rather do with their product because they are just capitalist pigs and we know better.
Sec 230 is not even in play here.
It's literally the exact topic under discussion here and what the article was about.
If you're saying that Section 230 is completely irrelevant to the question of biased content on the internet, then you 100% agree with this article.
Apparently Reason.com decided "Illiberal Right" is their talking point of the day.
Beyond that, though, I do agree. I don't think the issue is with Section 230. It's not guaranteeing a platform for people to use for their speech, it's merely stating that web hosters are not liable for user-created content. The remedy is, as always, the free market. Find other platforms, and if they commit tortious interference to shut down competitors, hit them with repeated lawsuits.
It's not guaranteeing a platform for people to use for their speech, it's merely stating that web hosters are not liable for user-created content. The remedy is, as always, the free market. Find other platforms, and if they commit tortious interference to shut down competitors, hit them with repeated lawsuits.
^ This.
Regulation of the internet will back-fire on the Republicans.
Duh! Everything backfires on the Republicans.
Exactly. 'Big-Government Right' would do just as well.
"it’s merely stating that web hosters are not liable for user-created content. "
No, firstly that is not what it actually says, secondly what is being asserted here by Wyden and Boehm is that it also says that operator created content is likewise liability free.
what is being asserted here by Wyden and Boehm is that it also says that operator created content is likewise liability free
Where do they say that?
While I agree any viewpoint-based regulation is contrary to the purpose of Section 230 and would violate the First Amendment, the government could/should condition 230 immunity on the adoption of viewpoint-neutral due process protections that give users clear notice of any censorship rules and ensure those rules are enforced fairly.
the government could/should condition 230 immunity on the adoption of viewpoint-neutral due process protections
And what would those look like? If you publish a Nazi you have to also a publish a Communist? If you can't find a Communist, you're not allowed to publish the Nazi? Or you have to make up a Communist perspective in order to achieve balance?
And what is the opposite of libertarian? Or will libertarian simply be declared a fringe view not worthy of consideration?
Communist and Nazi are not opposites
Oh, look - you almost got my point!
And you totally missed his. What part of neutral involves determining relative positions, merits or demerits?
It doesn't matter whether you discriminate against one, or the other, or both. That you discriminate at all is the issue. Any discrimination (with the possible exception of declining to participate in outright criminality) makes you not neutral.
What part of neutral involves determining relative positions, merits or demerits?
The part where you write a law mandating neutrality, which will require a law enforcement body tasked with determining relative positions, merits or demerits.
Otherwise, let a jury figure it out.
I guess libertarians are now in favor of the positive right to not be sued?
I guess libertarians are now in favor of the positive right to not be sued?
I think you missed where the OP argued in favor of a positive legal obligation to maintain "viewpoint neutrality."
And yes - I agree with the original decision to shield platforms from liability for content posted on those platforms by people who are not them.
By the same token I don't hold sports-stadium owners responsible for crimes committed by people who happen to be in those stadiums. The right of the stadiums to let in only certain people does not make them responsible for the actions of the people they do let in.
Sports stadium analogy is the best argument on this thread.
You mean like the Dodgers being held liable for the injuries to a Giant's fan?
https://www.latimes.com/local/lanow/la-me-ln-stow-dodgers-verdict-20140709-story.html
Under due process regulations, Twitter could adopt any rule it wanted to adopt as long as it was sufficiently clear and enforced in a non-arbitrary manner.
My point is that the First Amendment protects Twitter's right to censor based on viewpoint, but not Twitter's right to do it secretly or arbitrarily.
This type of concern can already be addressed via contract enforcement generally. Don't need to abolish CDA 230 in order to get Twitter, or any company, to enforce its own contracts.
This type of concern can already be addressed via contract enforcement generally.
^ This. It's already the case that "Twitter can adopt any rule it wants to adopt as long as it's sufficiently clear and enforced in a non-arbitrary manner."
As I've said before, people who feel they have been de-platformed in violation of a site's TOS have every right to sue, and if they are correct that that's what happened, they should win. I suspect, however, that most TOS's have a clause saying "we can remove inappropriate content, and we decide what is and isn't appropriate at our own discretion."
"most TOS’s have a clause saying “we can remove inappropriate content, and we decide what is and isn’t appropriate at our own discretion.”"
Exactly, and due process protections would prohibit that type of arbitrary authority. It would be similar to requiring sites to adopt censorship rules that could withstand a "vagueness" challenge.
Exactly and due process protections would prohibit that type of arbitrary authority.
By "arbitrary authority" you mean "freedom of contract?"
Sites don't need to accept the government-granted immunity of Section 230. If they choose to accept it, it is reasonable for them to also accept a restriction on their freedom to insist on contract clauses that give them the arbitrary authority to censor.
If they choose to accept it, it is reasonable for them to also accept a restriction on their freedom to insist on contract clause that give the arbitrary authority to censor.
Why? What does viewpoint neutrality have to do with Section 230? I really just don't see any relationship between the two things at all. Eliminating Section 230 doesn't create any obligation of neutrality whatsoever - it just makes the whole internet vulnerable to silencing by the biggest corporate and political players.
Shriek posted a link to kiddie porn in the comment section. Eliminate Section 230, what keeps the Reason editorial staff from being locked up as distributors of child pornography?
I am arguing that they do not have to maintain viewpoint neutrality -- I am arguing that due process restrictions are themselves viewpoint neutral. My suggestion protects social media sites' right to discriminate based on viewpoint -- as long as they do so through rules that are clear and not arbitrarily enforced.
Here's what you said to kick off this thread:
the government could/should condition 230 immunity on the adoption of viewpoint-neutral due process protections that give uses clear notice of any censorship rules and ensure those rules are enforced fairly
How is that a change from the current state of things? Other than having enshrined this notion of "viewpoint neutrality" that will now be open to interpretation by regulatory authorities?
The point of Section 230 is to shield platforms from liability for content that they didn't create or curate. How can you demand that that content be viewpoint neutral when by definition Section 230 applies to content not generated by the host platform?
TOS's already need to be clear and consistently enforced or else the provider opens themselves up to action for contract violations. What would your added clause do to modify this state of things?
You are mixing up what I am saying. Due process regulations are "viewpoint neutral" because they do not restrict a social media site's authority to adopt censorship rules that discriminate based on viewpoint. Social media sites have a 1A right to curate content based on viewpoint. Any gov't restriction on that right is a viewpoint-based restriction.
And what would those look like?
I think he's coming at it from a censorship angle not a fairness doctrine angle. In other words, ongoing protections contingent not on going out to find or generating balancing content so much as contingent on not restricting existing or new content without some sort of mechanism for redress.
I think he’s coming at it from a censorship angle not a fairness doctrine angle.
I agree that that's where he's coming from, and my question wasn't meant to be an attack, exactly, as much as an honest question re: "how are you going to write that law defining viewpoint neutrality in a way that isn't going to have the opposite of the intended effect?"
And I have to ask - ongoing protections from what?
Section 230 liability protections.
Section 230 liability protections.
Liability for what? Viewpoint neutrality?
And what is the opposite of libertarian?
Everybody else?
So I guess that means libertarians would win half the Internet, in the name of "balance"?
So I guess that means libertarians would win half the Internet, in the name of “balance”?
While that would be awesome, something inside me feels like that's not what the outcome would be.
Viewpoint NEUTRAL.
This means that you take NO action based on the viewpoint of creator provided content. None.
You do not ban anyone, you do not promote anyone without announcing it. Your overall selection algorithms are based on popularity--most views, most talked about. Everything being based on numbers--and not content, context, feelings or any other subjective means of identification.
Do you see?
It's not about people policing speech--it's about getting them to stop.
They blame tech companies like Facebook and Google for this so-called "censorship."
LOL, we know--not suspect, know--that Facebook, Twitter, YT etc deplatform conservatives by various means. That isn't the marketplace of ideas, it's a few cranks behind the scenes deciding what is viewable to other people on what's become the de facto public square of the Internet. It's the case where the bakery that won't provide a gay wedding cake is the only one in the state, or at least the only one without rats, so that nobody buys anything anywhere else.
What it is not, is comparable to the Fairness Doctrine, which demanded equal amounts of content from both sides of the aisle. Removing s.230 wouldn't mandate equal amounts of content, just that whatever conservative content there is can't be hidden or deplatformed because the people with their fingers on the scales don't like it.
Nope. Removing 230 would mean you can try to sue YouTube over the content a user posted on YouTube. That's it. YouTube could still ban you at will because they don't like your political opinions. Why can't people understand this?
Removing 230 would mean you can try to sue YouTube over the content a user posted on YouTube. That’s it. YouTube could still ban you at will because they don’t like your political opinions.
^ This.
Worst of both worlds.
Nope. Removing 230 would mean you can try to sue YouTube over the content a user posted on YouTube. That’s it. YouTube could still ban you at will because they don’t like your political opinions. Why can’t people understand this?
Everyone understands this. What you don't seem to understand is that this would be the end of non-corporate videos on YouTube, i.e. the end of YouTube.
the end of YouTube
No, the end of user-generated content on the Internet.
No, the end of user-generated content on the Internet.
This isn't how markets work.
If every platform is suddenly responsible for anything posted on it then why would any platform allow it?
If every platform is suddenly responsible for anything posted on it then why would any platform allow it?
^ This. I.e., the original logic behind Section 230.
Seems like it'd be easier to just stop curating and get the exemption back.
Seems like it’d be easier to just stop curating and get the exemption back.
To borrow a phrase:
This isn’t how markets work.
There is no "exemption". Read the 26 words. They are very clear.
If Jared Taylor wants to publish in American Renaissance he can have the article up this afternoon. If he wants to post a YT video well ... tough shit.
Seems pretty similar to me.
You must understand that they are de-platforming certain viewpoints because they are already afraid of being held liable for “radicalizing the youth.”
You're insane if you think the tech behavior since the election is about avoiding legal liability.
No provider or user of an interactive computer service shall be treated as the publisher or speaker
Are you retarded?
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Based on the part you omitted, it seems safe to assume dishonesty.
Nice self own, dumbass
Not the end of user generated content. The end of user stolen content.
I’ll admit I have no idea what you’re saying here.
Removing s.230 wouldn’t mandate equal amounts of content, just that whatever conservative content there is can’t be hidden or deplatformed because the people with their fingers on the scales don’t like it.
What is it that leads you to believe that conservatives will be the primary beneficiaries of empowering the government to regulate content on the internet?
Yeth!
I've been waiting for Reathon to do more articleth on their favorite progreththive libertarian.
Fuck off, thlaverth!
ctrl+f civil rights act 0/0
Are you a hypocrite or a coward?
"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
It's rather simple actually.
Why the fuck are the same conservatives who argue that the 1st Amendment means just what it says, so unwilling to take these 26 words as what they say? Why are they taking pages out of the proggy playbook and trying to read intent between the lines? Section 230 means exactly what it says.
Just because Wyden wrote the damn thing doesn't mean he knows what it says - James Madison wrote the Constitution and I dare say he didn't know a thing about what it said. The fool thought there was something in there about checks and balances and guarding the guardians and limiting the scope and power of Government Almighty. Section 230 means whatever the hell the government decides it means, just as "Congress shall make no law..." means "Congress may make a law if they say they have a good reason...".
When will people stop with the first amendment whining and just sue for breach of the terms of service? If the providers are not banishing in an equal manner, they are in breach of contract, no?
If violent statements from socialists are deleted as quickly as violent statements from anarchists, no problem. If not, sue the bastards.
(Disclosure: I never got any closer to a social media platform than reading the terms of service and the privacy policy. Both were unacceptable, so I closed the site and have not been back.)
When will people stop with the first amendment whining and just sue for breach of the terms of service? If the providers are not banishing in an equal manner, they are in breach of contract, no?
^ THISTHISTHISTHISTHIS
Both were unacceptable, so I closed the site and have not been back.
^ This is how this actually works.
If the providers are not banishing in an equal manner, they are in breach of contract, no
WTF would they write that into their own contracts that nobody reads?
Never sign a contract you haven't read.
Huh?
A good basic rule to live by.
Watching the anti-social dumbfucks who chanted "lock her up," fretted about partisan pizza pedophilia, patronized faith healers and televangelists, and spread racist birtherism whine about accountability for "objectionable" speech is entertaining.
Now, back to stomping clingers in the culture war . . .
The principles in question are pretty simple.
Suppose The New York Times decides to farm out its web hosting. I set up a site to carry their content.
I am not liable for their content.
Right up until the moment when I start altering their posted content. AS SOON as I do that then I am as much liable for what gets published on their online site as they are.
Same goes for Google, Youtube, Facebook, etc.
They have long since ceased being site/platform providers and have crossed well over into the realm of publishers.
I’m not on Facebook and I don’t post content to YouTube so you’ll have to help me out here. Is editing user content something they do frequently?
Why do you think frequency matters? They both edit or otherwise exert direct control over user content.
So they aren’t editing user-generated content? They’re deleting user-generated content and you’ve decided to equate that with deletion of content? Because the dictionary definition of edit says that a thing has been edited if something has been removed from it?
They’re deleting user-generated content and you’ve decided to equate that with
~~deletion~~ editing of content?
Speaking of editing content ... edit button?
Saying deletion of content is a type of "editing" is trying to stretch the definition of "edit" beyond what is intended here.
The moderators at Twitter, Facebook, etc., do not alter the meaning of a particular item of content. They either allow it completely, or delete it completely. I suppose one could argue that Facebook has "edited" a user's timeline by deleting an item from that timeline, but the timeline itself isn't the user-generated content, it's the items on the timeline that constitute the content.
Horseshit. Choosing which PragerU videos get put in restricted mode is altering the meaning of the activity.
And timelines aren't even user content - they are publisher content generated by Facebook using user content as source material.
Interesting item about PragerU:
https://www.techdirt.com/articles/20180309/15270839397/youtube-shows-dennis-pragers-claim-discrimination-against-conservatives-is-laughable.shtml
The moderators at Twitter, Facebook, etc., do not alter the meaning of a particular item of content. They either allow it completely, or delete it completely.
No. They also engage in shadowbanning. This can be several things. On Facebook wrongthink posts may get left up, but might only show if one goes to the precise spot where they're posted--they will not show up in the feeds of any followers.
Since reading one's wall, with all the content one has selected on it is the primary way people use Facebook, the company has effectively edited out items they don't like without telling the poster.
A similar thing happens with Twitter.
I suppose one could argue that Facebook has “edited” a user’s timeline by deleting an item from that timeline, but the timeline itself isn’t the user-generated content, it’s the items on the timeline that constitute the content.
If they deleted the entire timeline it would be one thing--but going in and removing certain posts is DEFINITELY editing.
Do you people just not know what 'editing' means? Cutting things out is editing. Changing where things appear is editing.
Is that it? That you don't know what 'edit' means?
Deleting is not editing???
You have officially entered ChemJeff territory.
If by "editing content" you mean "altering the meaning of an item of content", then no, deletion of the item of content *in its entirety* is not altering the meaning of the deleted item.
Deleting is not editing???
No, it isn't. If you give me a book to publish, and I go through it and cut stuff out, rearrange things, and change some phrasing here and there, I have edited it. If I refuse to publish it altogether, or do publish it and then remove it from my catalog later, that is not editing.
You can argue that refusing to publish something is not the same as editing something. But it is exercising editorial control. When you do that you are not a neutral platform. You are - by your own admission - acting like a publisher.
Your catalog???
Again, think about what you are saying about who controls, and what it is they control.
Catalogs are run by content providers.
"your catalog" also implies ownership.
The idea you cannot be held liable for that which you own and control is offensive to liberty, it is effectively a form of nobility.
(as in, a title of nobility)
The idea you cannot be held liable for that which you own and control is offensive to liberty
Agreed. How about the idea that you can be held liable for something that you don't own and don't control? That's what Section 230 asserts.
Catalogs are run by content providers.
You're conflating two discussions. This particular subthread is about the meaning of "editing," not the meaning of "content providers." Since your definition of "content provider" depends on your definition of "editing," you are begging the question here. And in the real sense, not the "raising the question" sense.
Essentially you're saying that anyone who doesn't publish everything is a publisher and should be held liable for . . . something.
Not publishing something is just not the same thing as editing - that's why we have different words. You're having to do some Olympic-level stretching to make there be no distinction between these words.
But let's go ahead and let you steal these bases, just for the sake of argument.
What is it that you are advocating doing? How will eliminating Section 230 help alleviate the scourge of content suppression?
They don't edit content. They decide very broadly which items of content to allow or prohibit.
To take ThomasD's analogy further:
If he decided to carry NYT's web content on his service, didn't alter any of the articles, but chose to carry only the articles on news but not the articles on book reviews, then that is more analogous to what Google or Facebook are doing.
"They don’t edit content. They decide very broadly which items of content to allow or prohibit."
https://www.thefreedictionary.com/edit
Of all the stupid things you have ever said here on Reason, this one might take the prize.
Bravo.
Having you on Boehm's side has to be the most damning thing possible.
If I post a profane tweet, which runs afoul of the moderation policy, the moderators do not alter the content of the tweet, they might just delete the tweet entirely. That is what I mean.
Calling the deletion of a piece of content "editing the content" is misleading at best.
Content moderation is indeed a form of editing.
Do that and you expose your moderation process to examination, which could lead to liability for your editorial acts.
Don't edit, and you are still a neutral platform with full protection from liability.
If you are referring to "content" as "the entirety of a user's body of work", then yes, deleting one item from that corpus is a type of editing, because that would be altering the *overall meaning* of the entire corpus of work.
If you are referring to "content" as "one individual item in that body of work", then deleting that individual item is not "editing", because the deletion does not alter the meaning of that one individual item.
That is how I am referring to it.
So, it's editing, just not editing-editing.
Thanks Whoopee.
It's a fallacy of terms. Your view of "editing" only makes sense if you think that a person's Twitter timeline or Facebook feed represents some larger body of work that is supposed to convey some overall intended meaning, such as is the case with a movie or a novel. So even in your definition, deleting a scene from a movie is editing the movie, it is not editing the scene itself; because the meaning of the scene wasn't altered, the meaning of the MOVIE was altered. Get it?
So discussing whether Twitter "edits content" in this manner is now I suppose confusing, because what I meant by "edit content" and what you meant by "edit content" are two different things.
Twitter doesn't edit individual tweets. Twitter either deletes tweets in its entirety, or does not. Twitter DOES edit a person's timeline when it deletes tweets. But the timeline itself is not the user-generated content, it's the items IN the timeline that represent the user-generated content.
I suppose if someone were trying to use tweets to write a novel or something, and Twitter decided to delete a tweet in the middle of this "novel" and therefore altered the meaning of the "novel" then you might have a point. But AFAIK no one does that sort of thing.
"Twitter doesn’t edit individual tweets. Twitter either deletes tweets in its entirety, or does not."
So, I'm not 'editing a book' if I only delete full sentences?
That's stupid, even for you.
When you decide what can or cannot be said then you are exerting editorial control.
By deleting a sentence from a book, you're not editing the sentence, you're editing the book. That is my point. Because you'd be altering the intended meaning of the book, but you wouldn't be altering the intended meaning of the sentence - the sentence wasn't modified, it was simply eliminated.
So if you want to discuss Twitter "editing content", let's be more precise in what we mean by these terms.
Of all the stupid things you have ever said here on Reason, this one might take the prize.
Um . . . that list doesn't actually include entirely deleting something under the definition of "editing." If you intentionally misinterpret No. 4 and squint at it sideways, maybe a little bit, but you certainly haven't done anything to warrant that little victory dance there.
You think you get to decide what discrete unit of information elimination is not editing?
A youtube site consists of multiple videos; because they eliminate one in its entirety, you say that's not 'editing' because they simply took the whole thing.
I see that like cutting a chapter out of a book.
You think you get to decide
I see that like
Don't you hate people who think they get to decide what stuff means?
Like pretending that a bunch of unrelated videos by unrelated content-creators are "sorta-like" chapters of a monograph such that to not publish absolutely everything in the universe is just like deleting chapters out of a book?
No, if they decided to not carry any news stories that were positive to one party, and also suppressed news stories that were harmful to the other, that would be analogous. Editorial control distorting the flow of information
Editorial control distorting the flow of information
Isn't a choice to carry only news articles but not book review articles ALSO "distorting the flow of information"? How is it not?
Would you prohibit ThomasD's hypothetical NYT web hosting service from selecting which articles to host and which ones not to host, even if it did not edit any of the articles for content?
Drop the strawman. This is not about prohibiting anything. It is about treating people equally. When you act like a publisher - be it in print on dead tree or pixels online - your responsibilities should be the same.
The guy who simply hosts a website is no more responsible for its content than the kid who slings the paper onto your doorstep is responsible for its.
If either of them start making alterations - ANY alterations - to the content of the publication then they become liable.
And if they start adding content (e.g. timelines or other sorts of selective feeds or streams) then they most certainly become publishers in their own right.
If either of them start making alterations – ANY alterations – to the content of the publication then they become liable.
But that is silly, in the broad manner that you are evidently construing the concept of "alterations to content".
If Twitter were to modify individual tweets *so as to change the intended meaning of the tweet*, so that the new meaning of the tweet was different than the user's original intent, then yes, I would agree with you. Because then Twitter would basically be committing a type of fraud - they would be lying about what a user meant by his/her own tweet.
But if Twitter were to delete a tweet without modifying it, then that does not change the user's intended meaning of the tweet itself, *because the intended meaning of the tweet wasn't altered in any way*.
What you're basically saying, is that if I write 3 books, and if the publisher modifies one sentence of one book only, then ALL THREE BOOKS have been "edited" because some small change was made to the entirety of my body of work. That's a quite frankly absurd way to use the definition of "edit".
If either of them start making alterations – ANY alterations – to the content of the publication then they become liable.
Okay - rather than quibble over whether Twitter deleting tweets counts as "editing," let's just acknowledge that, yes, there is a point at which a web platform modifies content enough to now be considered a publisher. We may not agree on where that line is, but we agree that the line does exist.
So what?
What does that have to do with Section 230, which shields content providers from content provided by others? Isn't your argument simply that once Twitter starts deleting tweets, or once a comment section becomes moderated, that the platform is no longer shielded by Section 230?
Why does Section 230 need to be gotten rid of? And what will getting rid of it do to promote content neutrality?
1. The content is not "provided by others." Your own arguments here have made it abundantly clear who owns it - in terms of both control and benefit derived. Your assertion is simply unsustainable based upon the behavior of these organizations. All I'm recommending is that this be acknowledged and they be treated like anyone else who did the same or similar on any other sort of platform (be it a paper publication or a coffeehouse).
2. Section 230 does not need to be "gotten rid of" - an argument I have not made (what's with you two and the strawmen?). Again my recommendation is that it be recognized that these organizations are more often acting as information content providers than they are acting as computer service providers. Stop giving them a free pass for what they claim - entirely out of convenience - to be.
This is pretty close, except you're missing the contract aspect of it.
When you're using a platform, usually they include Terms of Service. Often these are meaningless unless there's actual monetary implications of those Terms of Service. In the case of certain people who use Youtube or Twitter to promote their content, they're relying on that network to help drive their revenue stream. If those platforms want to remove that person, they're cutting them off from that revenue stream.
If you're banned without a clear violation of Terms of Service, you then have a case to sue for breach of contract.
So to apply to your case, if you're hosting for the NYT, and you've signed a contract promising to host their stuff, and then start pulling articles that you disagree with, you're actually hurting their financial bottom line by reducing their views, and they can (and will) sue for redress.
All that needs to be done is make sure these companies are held civilly liable for either arbitrary enforcement of their terms of service, or asking them for greater clarity when it comes to their TOS.
1. You ignore that, often by these so called TOS your property becomes effectively their property (via demonetization or other forms of restrictions.)
2. Your argument about the contract with the NYT assumes many, many things not in evidence. And also supposes that I remove content, as opposed to adding, or merely re-arranging it.
Odd that you see the TOS being rather nebulous but assume the agreement with the NYT must somehow be more binding...
And, to be clear, I'm not saying that your suppositions are off-base, just that it is curious that they are so easily accepted as the status-quo.
Based upon the way the simple, plain meaning of some rather obvious words (edit, editor, publish, publisher, catalog, etc.) has tripped you two up, it is eminently clear that you are both arguing from a preferred outcome, not any sort of principle.
That's funny, I was thinking something similar about you. That you were trying to stretch the definitions of words in order to reach some pre-determined outcome and play some sort of gotcha game.
"A-ha! Twitter 'edits content' just like a book editor 'edits content' and therefore Twitter is just like a book publisher!"
Except the "editing" and the particular "content" being edited take different forms and should be understood in different ways.
A book editor working for a publisher changes the meaning of books by altering sentences, etc., within the manuscript of the book.
In the case of Twitter, there is no overall "book" that is to be edited, only individual items of content that are not modified by the moderators in the manner that a book editor might modify individual sentences. Instead the moderators either permit or reject in their entirety these individual items, without modification.
You really didn't need to go to all that trouble to tell me that you have no problem with Twitter (or Google, or Facebook, or whoever) controlling what other people can and cannot say.
On their own platforms? As long as they aren't committing fraud? No, I don't have a problem with their right to use their property as they see fit. I don't agree with every decision that they have made, but I do agree with their right to exercise their private property rights.
then why are they not equally liable for what appears on their 'private property?'
You mean, the words and videos and ideas generated by *others*? Why should they be held liable for the content and ideas of others?
If you go to a Bernie Sanders rally, and yell LIZZIE WARREN HATES THE JEWS, should Bernie Sanders be held liable for your libelous comments?
If your notion of how things are remotely reflected how things actually are, then there would have been no need for the CDA or Section 230.
Yet, somehow, and for some apparent reason, they were passed into law...
Again, I'll make this obvious for you.
I run a coffeeshop, back in the pre-internet days. I sell drinks and snacks. But I also have a wall full of shelves loaded with photocopied books and publications that I "allow" patrons to leave behind and other patrons to freely access.
Why should I be held liable for other people's copyright violations?
Now you are changing the subject to that of copyright law and intellectual property. That is an entirely different subject than what CDA 230 is about. Deleting a tweet is not stealing someone's intellectual property.
Intellectual property rights are the entire heart of this matter.
Why should I be held liable for other people’s copyright violations?
I find it hard to believe you would be. Is this something that actually happened?
You are perhaps unfamiliar with the dust up over file 'sharing' sites not that long ago?
Okay, so why do you think CDA 230 was passed in the first place?
It was an attempt to provide the same sorts of protections afforded to other common carriers, be they telephone companies or UPS/FedEx, from legal liability over things they largely did not control (what was said on the phone, or what was inside the package).
But it was poorly constructed and has now allowed these organizations to both control content and claim they are not responsible for said content.
But this has already been repeated ad infinitum, and you obviously don't give a shit. You are incapable of good faith discussion.
Because, were that the case then not just Youtube, but all of Alphabet would be owned by BMG, for all the pirated music/video content alone.
Are you under the impression that YT doesn't remove copyrighted material when requested to by the copyright holders?
Because that impression would be wrong.
when requested
LOL. They go out of their way to police all sorts of other content, their algorithms can identify and promote myriad sorts of things that they think will draw eyeballs.
Yet somehow they have to wait for a formal DMCA take down notice to deal with copyright violations.
Are you really this stupid to try to pretend that Youtube wasn't willingly and knowingly built upon gross abuse of copyright and the fig leaf of 230 protections?
Because, let's face it, the principals are what you value here.
Change those and I doubt anyone here would expect your position to remain unchanged.
Because, let's face it, when you lose an argument, you then resort to stuffing words into people's mouths.
Do you seriously believe that Dean Baquet, Joseph Kahn, Rebecca Blumenstein, Tom Bodkin (all at the NYT) spend their time "altering sentences" in pieces submitted by journalists?
The primary job of many editors is selecting acceptable content and rejecting unacceptable content, according to the preferences and mission of the publisher. That's what "editing" means these days.
You're thinking of "copy editing", which is a small subset of what editing comprises.
Wyden says. He oughta know. He wrote the damn thing.
If only the people who wrote laws controlled their interpretation and execution.
My questions for Wyden are:
Can a provider or user of an interactive computer service also be an information content provider?
And, if so, can they be liable for the content?
Or are the two always and everywhere mutually exclusive?
Can a provider or user of an interactive computer service also be an information content provider?
Yes.
And, if so, can they be liable for the content?
If they generated it? Yes.
Or are the two always and everywhere mutually exclusive?
No.
From the fourth paragraph it's apparent that the author is either incompetent or lying.
For anyone interested in what Section 230 actually says, you can find it here: https://www.law.cornell.edu/uscode/text/47/230
Thanks for the link
Under definitions
(emphasis added)
Pretending that Google, YouTube, Facebook, Twitter, etc., by the nature of their editorial acts, do not meet the definition of content providers is ludicrous.
Why not just fix it the simple way and pass a law that says nobody's posts, etc can be censored at all?
Nobody with any power wants that. The Democrats and Republicans _all_ want to censor something, whether it's wrongthink, news embarrassing to prominent political candidates, or just nude selfies.
This argument is bonkers. What it really is, is an argument against the centuries-old causes of action for libel. Make that argument.
I have a simpler solution:
A law that states that no website with over 1 million users may censor any legally permissible speech in the USA on its website or in its comment sections.
That's it. They can give SJWs filtering options if they want to block certain types of stuff. But this shit has to stop.
Ask yourself this question: If the entire media, and all the tech giants, were not slanting things so heavily politically... What do you think the political climate would look like in the USA?
I suspect the entire country would be far more towards the right/libertarian side of the spectrum. But all their manipulation is literally tilting the entire country hard to the left because of their propaganda.
Principles are cool... But sometimes winning is more important. I think a law like the above achieves freedom without doing much harm from a libertarian perspective, and it gives the companies cover for telling the SJWs it's not their choice to allow "evil" people; they have to. It even includes an exemption for small businesses. It is win win win.
The only people that would be opposed to such a law are those who are against free speech... And such people can suck a dick.
That would be an improvement, but I don't think it's ultimately a solution.
My preferred solution would still be to strike Section 230 entirely.
I dunno.
The thing is one needs to balance interests. In criminal justice, we don't want to be throwing endless innocent people into prison... But we don't want a system so lenient it can never put a guilty person away either.
Something like the above would curtail the BS to a large degree, while still maintaining a lot of freedom. I'm sure some gamesmanship would still happen, but it may be a good balance between things.
AFAIK, Section 230 is primarily about civil liability, not "criminal justice".
Apparently, giving sweetheart exemptions to legal liability to large tech companies is "libertarian" according to Wyden.
I call it cronyism and corruption.
[…] basic level of decency,” Wyden tweeted. He elaborated on his criticism of the legislation in an interview Tuesday with […]
[…] of conservatives argue the section has been misinterpreted. Sen. Ted Cruz, for example, says “neutrality” is a requirement to receive the legislation’s protections. Legal scholars, the law’s authors […]
It is amazing the amount of effort some conservatives will spend to get to say their tripe on common platforms. Look at the models from the past. Conservatives funded Fox News to get their viewpoint spread. Talk radio was the same way. I am old enough to remember when talk radio had a variety of viewpoints (Rush Limbaugh was followed by Alan Colmes). That ended and conservatives took over. So now they are complaining about getting kicked off Facebook. Well, stick a crowbar in your wallet and fund your own platform.
They already have been... The problem is nowadays the leftists are literally going after ISPs for hosting people, making them lose their payment processors, etc.
If there was no discrimination beyond the platform, I have little doubt a lot more of these businesses would have grown a lot more. I think they will eventually anyway... The question is what will happen in the meantime with all of the slanting of elections that will happen?
Also, why in God's name is it acceptable for supposedly neutral platforms to be doing this shit in the first place? Centrists, and even leftists, should be outraged about this on principle. Don't commies realize that if the "next" Facebook is owned by a conservative, it could come back to bite them in the ass?
[…] Thread 6/19/19 Members of Congress Sen. Wyden’s Twitter Thread 6/19/19 and in Reason 6/25/19 Former Rep. Chris Cox WSJ Op-ed 6/23/19 Civil society American Action […]
[…] figment of Hawley’s imagination. As Sen. Ron Wyden (D-Ore.), who wrote Section 230, recently told my Reason colleague Eric Boehm, the provision “has nothing to do with […]
[…] few controversial topics lend themselves to uniform agreement about censorship or moderation. Far from being about neutrality, Section 230 was intended to allow companies to reach their own conclusions in these gray areas. As […]