FCC Chair Brendan Carr Wants More Control Over Social Media
Carr advocates greater control over social media by federal regulators, despite a reputation for supporting free speech.

In his short time as chairman of the Federal Communications Commission (FCC), Brendan Carr has been no stranger to using his power against disfavored entities. The chairman's targets have primarily included broadcast networks and social media companies.
Recently, Carr revealed a fundamental misunderstanding about one of the most important laws governing the internet and social media.
On February 27, digital news outlet Semafor held a summit in Washington, D.C., titled "Innovating to Restore Trust in News," which culminated in a conversation between Semafor editor-in-chief Ben Smith and Carr.
"The social media companies got more power over more speech than any institution in history" in recent years, Carr told Smith. "And I think they're abusing that power. I think it's appropriate for the FCC to say, let's take another look at Section 230."
Section 230 of the Communications Act effectively protects websites and platforms from civil liability for content posted by others. It also protects a platform's decision to moderate content it finds "objectionable, whether or not such material is constitutionally protected."
Like many conservatives, Carr looks askance at social media's latitude to moderate content with what he perceives as impunity. "The FCC should issue an order that interprets Section 230 in a way that eliminates the expansive, non-textual immunities that courts have read into the statute" and "remind courts how the various portions of Section 230 operate," he wrote in a chapter of The Heritage Foundation's Mandate for Leadership, more popularly known as Project 2025.
As Reason noted earlier this month, Carr is mistaken. The Supreme Court ruled last year in Loper Bright Enterprises v. Raimondo that government agencies like the FCC do not have the authority to "clarify" or "interpret" statutes as they see fit; Congress and the courts share that job.
Smith pressed Carr on this point. "The Supreme Court has recently ruled pretty strongly," he noted, "that regulatory agencies are not allowed to go rooting around regulation, looking for new mandates to go dive into the private sector and enforce things on them. And I think a lot of people think your Section 230 aspirations are going to ultimately hit a legal wall, and it just feels like you're really interested in expanding the power of government in a way that feels new."
The comparison is "apples and oranges," Carr replied. Unlike regulatory endeavors undertaken by previous FCC regimes, "social media content moderation is regulated by Congress through Section 230, and so there's a question of how should that thing apply to social media."
This, too, is completely backward. "Section 230 allows for web operators, large and small, to moderate user speech and content as they see fit," according to the Electronic Frontier Foundation (EFF). "This reinforces the First Amendment's protections for publishers to decide what content they will distribute."
"The statute was never intended to be a legal 'stick' to impose conditions on providers' behavior," adds the Wikimedia Foundation, the nonprofit that hosts Wikipedia. "Instead it offers a legal 'carrot' to providers who do indeed take steps to moderate content by shielding them from liability for their moderation efforts."
"Section 230 grants complete immunity for publisher or speaker activities regardless of whether the challenged speech is unlawful," according to a February 2024 report from the Congressional Research Service. "In contrast, the First Amendment requires an inquiry into whether the challenged speech is constitutionally protected and may provide limited or no immunity for certain activities."
This was the explicit intent of the law: to encourage platforms to moderate by allowing them the freedom to make moderation decisions without having to go to court to justify them or fearing that a poor decision could be held against them.
In fact, in the absence of Section 230—or even if it were merely weakened—platforms could be subject to lawsuits if they remove any content, even though choosing what content to allow on your privately-owned website is protected by the First Amendment. Giants like Facebook and YouTube could likely weather that kind of onslaught, but smaller competitors could be driven out of business or otherwise made useless.
Without Section 230, Yelp—the online platform where users rate and review businesses—would be vulnerable to lawsuits from business owners who disagreed with negative reviews. "A user who believes a review violates our content guidelines can flag it for removal—this includes reviews that a final court of competent jurisdiction has deemed to be defamatory," Yelp general counsel Aaron Schur told the EFF. "We do not take sides in factual disputes, however, so we do not remove reviews that appear to reflect the experiences of the reviewer….[Section] 230 is pivotal to our business in this regard: reviews are the responsibility of the people who write them, not the platform that hosts them."
"Absent [Section] 230, websites like Yelp would be pressured to avoid liability by removing legitimate, negative reviews, and they would deprive consumers of information about the experiences of others," Schur added.
If Section 230 were weakened or revised, it would empower the government to wield control over internet platforms by opening them up to lawsuits. Smith said as much to Carr, marveling that Carr seemed "so eager to get a government agency into the business of these private companies."
"I'm eager to apply the law as passed by Congress," Carr retorted. But if he were truly eager to apply Section 230 as Congress passed it, the best way to do so would be to keep his hands off.
Questioning the Trump administration’s authority is lawfare.
Drunk or retarded?
Yes
Yeah, vulgar. Embrace the magic of "And"
Actually your childish silliness establishes a point you can't see: the Internet is like a transnational company; it will never hew to stockholder or political control UNTIL at least one country starts doing what it can, and that would be Trump and the US.
Remember the Nigerian "You've just won a million dollars"? That is a prime example. We knew where it was, we could see it fleecing folks in many countries, but there was no unified response.
Simple really.
Just repeal Section 230 and let all the soon-to-be-unemployed lawyers from the federal government fight it out in civil court.
No need for the feds to give even one tiny damn.
Nope, enforce 230. If Google, Reddit, and Facebook want to edit comments (like they admit they do), they are publishers and are now liable for the comments left up.
"...want to edit comments (like they admit they do)..."
Citation(s) please!
PS, "I disagree with the cuntents of your cumment."
Now, did I just edit your cumment?
The publisher/platform dichotomy is a red herring.
Sarc is drunk already
He’s never not drunk.
Conservatives would sure save themselves a lot of trouble if they'd just GO AFTER the source of the problem (Democrat Politicians doing 3rd party Censoring) instead of trying to prosecute the middlemen.
Otherwise this whole nation is going to lose its 1A by bipartisan agreement, differentiated only by WHO in government gets to decide what.
Don't kid yourselves, Republicans. Whatever 'government' regulation over social media you want, the Democrats are going to use it 100 times WORSE when they get the White House. As if they haven't already demonstrated that.
Prosecute the 1A violations (government shall not abridge); don't prosecute "the people" (middlemen) just because it's easy.
Demon-Craps done shit first and worst, so "Team R" should get to do TWATEVER THEY WANT!
(Twat could POSSIBLY go wrong with this BRILLIANT idea here, anyway?)
PS, EXCELLENT ARTICLE, Joe Lancaster!
There are definitely some constitutional issues when people demand restricting communication in an environment where free speech is an inalienable right.
I’ve learned that problems are unrecognized opportunities and that the one unavoidable thing that separates us is also the thing that brings us all together in peace.
The same genetic hardwiring that is responsible for evolutionary success when done properly is also responsible for our bigotry and hatred when done improperly. This represents both our human problem and opportunity.
The fields of science and logic have been developed to enable us to do this properly. They enable us to scrutinize our experiences and imagination which have become beliefs to determine if they are or aren’t real.
Each of us performs this process with varying success as we navigate life. Consciously and subconsciously we establish in our minds a picture of our reality which we use to make decisions.
You can see where this is going.
As important to life and happiness that the recognition of reality is, misunderstanding it is equally destructive.
The solution isn’t to stop doing it. Everything isn’t equal. We can’t deny this subconscious process and still evolve so we need to develop the tools to recognize and perform it properly.
I use the terms truth and reality interchangeably. I’ve found that the resistance to do so is indicative of the improper recognition of reality.
Rob, my famous 3 comments about what you say:
1) Just because speech is involved DOES NOT make it a 1A issue.
2) You increase misinformation when you restrict all players. The government must not enact foolishness such as Biden's Government Disinformation Board (what a stupid man Biden is), but it must not let the big-money and amoral business folk win over the little guy by default.
3) Your quote about evolution shows you know nothing about it. "Genetic hardwiring that is responsible for evolutionary success when done properly": does it even make sense to say 'responsible' and then deny that claim by invoking some alien agent who is the subject of 'done properly'? Who is that? And doesn't 'properly' imply something immaterial and thus totally unrelated to 'evolution'? YOU ARE TALKING NONSENSE.
Simply questioning what I have said doesn’t refute it. My statements are the culmination of decades of correctly aka properly applied logic and science.
Aside from merely questioning, what makes you believe that anything you say or think is true aka real?
Is it poorly applied logic and science that you recognize you base your perception on, or do you think that you apply logic and science “correctly” in your response to me?
These are my correctly applied logical responses that refute your 3 “famous” points.
1) Anything that restricts the freedom of speech is a 1A issue.
2) Criminalizing and prosecuting all lying does not logically or actually increase lying.
3) Evolution has a purpose. It logically must be to promote life. Therefore any actions that we take that don’t actually promote life must be counterproductive or incorrectly applied.
It’s just correctly applied logic. Good luck refuting it.
I suspect that you will have only two choices
1) agree with my correctly applied logic
2) maintain the incorrect position of your poorly applied logic
If that interpretation of Section 230 is correct and will be upheld by the courts, then 230 needs to be repealed and replaced. What we need is a 230-like law that recognizes an agreement between web site providers and regulators that protection from liability for user content is IN EXCHANGE FOR certain restrictions on their ability to censor and curate user content. I believe the current Section 230 requires that, but apparently I don't read it like judges do.
I believe the current Section 230 requires whining crybabies to SNOT sue me when I decide (on MY web site) to take down their posts that piss me off! Per our user agreement! Use of Government Almighty courts, for the whining crybabies to SUE me over MY cuntrol of MY web site, is Marxism by another name! Long live Section 230, and long live property rights!
(Outside of S-230 altogether, you are also FREE to whine and cry to the courts, if you ask me for your $0.00 that you paid me to post freely, and I refused to pay you your $0.00 back!)
Funny how Loper Bright was so good when it was decided during the Biden admin...
S230: "you're free to host what you like, and if you don't like it you can remove it."
Democrats: "We want you to say this, not that"
Republicans: "That's appalling! We want you to say that not this,"
No one here believes you are neither Democrat nor Republican, so you can be ignored. As to your self-contradicting statement itself: that is the same administration that would have given us the Government Disinformation Board. Would you really be hailing Biden then?
There is of course a 3rd response: "Do whatever you like about comments, but fully state to all users what you are in fact doing, giving them the chance to retract or rescind what you've edited their real comments to be."
When Zuckerberg complained about Biden "forcing" him to censor, that was almost certainly a lie. Just as he gave in to Biden by doing what Biden wanted, he is now doing what Trump wants. Do you not see that?! THERE IS NO DIFFERENCE AT ALL.
I rejected your view as demonic and Nazi-like after the Supreme Court and the ACLU both attacked Kamala Harris for, in effect, DOXING anybody who posted to a cause that she didn't like.
HERE IS THE HORRIBLE STORY
How Kamala Harris Earned Rebukes from ACLU and SCOTUS on Privacy
'The Breaches of Confidentiality Here Were Massive'
By Jerry Rogers
August 22, 2024
https://www.realclearpolicy.com/articles/2024/08/22/how_kamala_harris_earned_rebukes_from_aclu_and_scotus_on_privacy_1053395.html
No one here believes you are neither Democrat nor Republican, so you can be ignored.
And then you proceed not to ignore me, idiot. I don't care what people believe about my opinions.
"This was the explicit intent of the law: to encourage platforms to moderate by allowing them the freedom to make moderation decisions." This has it backwards.
Actually, platforms need no incentive to remove material; they need an incentive to keep it up, because that is when they could be held liable for defamation etc. The law is designed to protect sites when they show material (from others) which might generate liability.
digital news outlet Semafor held a summit in Washington, D.C., titled "Innovating to Restore Trust in News,"
"Semafor is a news website founded in 2022 by Ben Smith, a former editor-in-chief of BuzzFeed News and media columnist at The New York Times, and Justin B. Smith, the former CEO of Bloomberg Media Group."
BuzzFeed and NYT had a baby, and we're supposed to treat it like royalty instead of the nepo baby that it is? Sorry, we don't agree that this Leftist drivel is "news".
Want to trust anyone?
Criminalize lying.
Otherwise shut the fuck up.
He might be wrong if you took him as predicting how the Supreme Court would rule, but he's not wrong about the intended meaning of Section 230: it was only supposed to immunize a narrow range of moderation of consensus "objectionable" content. Then the judiciary decided to read that catchall, "or otherwise objectionable," as granting platforms total editorial control without assuming any liability.
Such catchalls are supposed to be interpreted as meaning the same sort of thing as the rest of the list; it's a general principle of legal interpretation.
So, if Section 230 were being interpreted as its authors meant it to be, the platforms would actually be assuming direct liability any time they engaged in more than limited moderation of content found objectionable by the users.
It's also notable that Section 230 anticipated the widespread use of third-party filters chosen by users to accomplish any moderation beyond that. But the platforms systematically defeat the use of such filters, rather than accommodating them, as Section 230 directs.