TikTok Users Say They're Being Censored After New Owners Were Announced. The Company Says It's a Tech Issue.
Here's why I believe TikTok.
TikTok creators' worst nightmares seemed to be coming true this week. Just a few days after the company announced a deal to spin off U.S. operations from TikTok's Chinese-affiliated parent company, ByteDance, some people posting political content started seeing zero views on their posts. And this attention deficit was especially stark on videos about Immigration and Customs Enforcement (ICE) actions in Minneapolis, some creators alleged.
For many TikTok creators, it was confirmation that the new investors were intent on doing the Trump administration's bidding. "TikTok is now state-controlled media," California state Sen. Scott Wiener (D–San Francisco) commented.
Or…maybe not? Low viewership counts were caused by technical difficulties, according to the spinoff company, known as the TikTok USDS Joint Venture. (USDS stands for United States Data Security.) "We're working to restore our services following a power outage at a U.S. data center," it posted on Monday morning.
That evening, it noted that "creators may temporarily see '0' views or likes on videos, and [their] earnings may look like they're missing. This is a display error caused by server timeouts," and "actual data and engagement are safe."
A Conspiracy To Censor the NFL?
What should we make of this? Could some variation on Hanlon's razor actually apply here? (Replace "stupidity" with "massive winter storm.")
I'm inclined to believe so, for several reasons.
For one thing, there's evidence that it was not just political content or anti-ICE content that got throttled by what TikTok called a "major infrastructure issue triggered by a power outage at one of our U.S. data center partner sites."
The NFL, for instance, reportedly had several posts showing zero views.
As of now, NFL video view counts appear to be back to normal—and so do views on videos that Brian Krassenstein claimed had been throttled because of their political leaning.
We've made significant progress in recovering our U.S. infrastructure with our U.S. data center partner. However, the U.S. user experience may still have some technical issues, including when posting new content. We're committed to bringing TikTok back to its full capacity as…
— TikTok USDS Joint Venture (@tiktokusdsjv) January 27, 2026
Other creators—someone posting about HIV, someone "yapping about being sick," to name a few—also reported seeing zero views on new posts. "From what I'm seeing it's across platform affecting everyone," noted Preston Stewart of the Unclassified podcast.
What's more, it seems that TikTok wasn't alone in having issues. There was a "spike in Microsoft 365 outages on Monday, during a large winter storm that caused widespread power outages throughout much of North America," notes TechSpot.
What about the fact that so many people posting about Alex Pretti's killing and ICE's aggression in Minneapolis saw their videos affected? One could easily chalk it up to timing. Pretti was killed on Saturday, January 24, and people subsequently started posting a lot about it. TikTok's tech issues happened not long after, thus sweeping up a lot of those videos.
Why didn't we hear more about nonpolitical content being affected? That could come down to salience. People who were primed to be concerned about censorship of their content quickly noticed and publicly commented on the lack of eyeballs, and those complaints quickly got picked up on other social platforms. Those whose dance videos or makeup tutorials didn't do well would likely be less inclined to attribute it to a coordinated plan, less inclined to post about it, or less likely to get attention when they did.
Supervillain Speculation
There are broader (and more speculative) reasons to be skeptical of "TikTok is now state media" narratives. Even if the new TikTok owners do eventually intend to make algorithmic changes that would benefit Republicans, it would be mighty ballsy, and dumb, to roll them out in full force during week one.
If you're a supervillain intent on turning a beloved app into a U.S. propaganda machine, it makes a lot more sense to start gradually tweaking the algorithm to send fewer eyeballs to content that's critical of President Donald Trump and more to content that benefits his administration. You would want the changes to happen slowly and subtly, giving you plausible deniability. Otherwise, you risk alienating a huge swath of your users and your creators, not to mention completely tarnishing any reputation you've built up for being a relatively neutral marketplace of values.
Even if the goal were more immediate, such as diverting attention from a federal agent's killing of Pretti or hiding negative reactions to it, the "hide everything" tack would be poor strategy. Suddenly suppressing all such content only leads creators to talk about it more, on TikTok and elsewhere. (See: the Streisand effect.)
Call me naive, but I tend to imagine my supervillains a bit smarter than this.
Are Changes Coming?
Then again, I also remain skeptical that the new TikTok will start seriously stymieing nonright viewpoints.
TikTok is a business. The new investors paid a lot for this business. And it's bad for business to engage in open and obvious suppression of certain political views.
I think it's possible that we'll start to see some subtle or less significant changes to the way content is moderated, featured, etc. New ownership always brings new priorities and sensibilities. And part of the point of the government forcing ByteDance to sell U.S. TikTok was concern that the Chinese government had potentially influenced the existing algorithm.
As part of the new venture, U.S. TikTok "will retrain, test, and update the content recommendation algorithm on U.S. user data," it said.
But I don't expect we'll even see TikTok go the way of Twitter/X and radically shift its rules or tone. In the latter case, we've got a platform helmed by someone with strong political views, a cultural agenda that he's been open about, and the ability to more or less unilaterally make decisions for the company if he so chooses. This is not the case with the new TikTok USDS Joint Venture, no matter what views or business ties certain investors and directors may have.
Follow-Up: Sweden's Criminalization of Online Sex Acts
Last May, authorities in Sweden—where paying for sex is illegal—voted to criminalize the purchase of remote sexual services. While working for an online sex business or selling custom pornographic images online is still legal, patronizing such businesses and individuals became a crime as of July 1, 2025.
"The purchase of live cam shows, where customers attend and tip, is now illegal in Sweden. So is the purchase of 'customs', where people request an OnlyFans creator to make something specifically for them," explains Amy Moretsele at Polyester. "Producing and consuming pornography remains legal, however, as it's seen as expressive material and protected under freedom of expression laws."
Moretsele talked to online sex workers about the effect the new law has had on their work and their lives.
Amanda Breden, 33, has been creating on OnlyFans for five years following a decade-long career as a glamour model in Sweden. "At that time, the platforms and companies I worked with were much more controlled by other people, and I had limited influence over how my image and content were used," she tells me.
"That's one of the reasons I felt genuinely happy and empowered when OnlyFans came along. It allowed women like me to take back control, manage our own platforms, set our own boundaries, and decide for ourselves how we wanted to work — without outside interference."
According to Amanda, most of her customers are respectful, curious, and genuinely interested in supporting creators they like. "Often, it's less about sex and more about exclusivity, communication, and feeling seen," she said.
More here.
Follow-Up: TikTok To Settle in Social Media Suit
As a landmark case against social media platforms gets underway in Los Angeles this week, defendants have been dwindling. First, Snap settled with K.G.M. (a 19-year-old woman who says she was addicted to and harmed by YouTube, Instagram, Snapchat, and TikTok as a minor) a few days before the trial started. Now, TikTok will also settle, per reports yesterday.
"A lawyer for [K.G.M.] confirmed that an agreement in principle had been reached hours before jury selection started in Los Angeles on Tuesday," reports Bloomberg Law. "The woman's similar claims against Meta Platforms Inc.'s Facebook and Instagram and Google's YouTube will proceed as planned for now. Terms of the proposed settlement have not been disclosed."
More Sex & Tech News
Does Section 230 apply to AI? Experts aren't sure. In Reason's February/March print issue, I explored why applying Section 230 to generative AI is a hotly debated issue and where the fault lines lie.
A clean slate for some trafficking victims: The "Trafficking Survivors Relief Act," a measure signed into law by Trump last Friday, "establishes a process to allow human trafficking victims to file motions to vacate their convictions and expunge their arrest records for certain criminal offenses committed as a direct result of their being trafficked."
The U.S. Sun profiles the Fokkens, 83-year-old Dutch twins with an illustrious history in Amsterdam's red-light district. "Their fascinating journey could be spun into a Netflix series, Martine [Fokkens] suggested," per the Sun. "They would want it to reflect … the humour, the rules, the mess, the loneliness, the joy and much more. They don't want a glossy empowerment lecture, or a tidy moral lesson packaged for social media. They want the real thing, told the way they lived it: funny, blunt, sometimes filthy, sometimes unexpectedly tender."
A constitution for AI: Anthropic recently released Claude's constitution, which it describes as "a holistic document that explains the context in which Claude operates and the kind of entity we would like Claude to be." The document's "primary audience is Claude, not us," notes The Argument's Kelsey Piper. The document "tries to ground the behavior it wants from Claude in arguments about why Claude should want to be trustworthy to Anthropic even if it becomes persuaded that Anthropic is making a major mistake, and not act to subvert its creators even if it both could do so and thinks it's the right thing to do." Piper continues:
Its principal authors are philosophers Amanda Askell and Joe Carlsmith, and it is very fundamentally a work of philosophy: an effort to rigorously explain one's hopes to a child who learns only from text and has vacuumed up a whole internet full of it.
Ultimately, I am skeptical that this approach will work. The fact that Anthropic is racing ahead to try to build a superintelligence by using each Claude to hasten the building of the next Claude does not seem to me like a good idea.
But I think Claude's Constitution reflects the best possible implementation of this approach — an honest, careful, nuanced document that at least doesn't fail by lying to itself or to Claude about what problem we're trying to solve. If it doesn't work, it's because nothing in this vein will work at all — and if it does work, I do like the person that it proposes that Claude be.
More for the "everything is TV" files: "Even Instagram and Substack are TV now."
States move to ban AI data centers: "In Georgia a state lawmaker has introduced a bill proposing what could become the first statewide moratorium on new datacenters in America," reports The Guardian. "The bill is one of at least three statewide moratoriums on datacenters introduced in state legislatures in the last week as Maryland and Oklahoma lawmakers are also considering similar measures."
Does videotape privacy law apply to the internet? "The Supreme Court on Monday agreed to hear a case which asks whether the 1988 Video Privacy Protection Act (VPPA) applies to users who sign up for newsletters from websites that use Meta's tracking technology," reports Jurist News. More:
The lawsuit accuses Paramount Global of violating the VPPA by disclosing digital subscribers' identities and video media information, without proper consent, to Facebook for targeted advertising purposes. Paramount Global runs 247Sports.com, through which plaintiff Michael Salazar had subscribed to a free email newsletter and viewed video clips. The website utilized Meta Pixel, which allegedly sent Salazar's Facebook ID and browsing data to Facebook after his engagement with the site.
The case asks whether a person has to rent, purchase, or subscribe specifically to video content to be considered a "consumer," or whether subscribing to any services, such as a free newsletter, triggers VPPA protections.