The Volokh Conspiracy
Class Action Not Allowed in Suit Alleging Psychological Harm to TikTok Content Moderators
From yesterday's Order Denying Motion for Class Certification, by Judge Vince Chhabria (N.D. Cal.) in Young v. ByteDance Inc.:
[1.] The primary relief sought by Young on behalf of the class is forward-looking: he wants the Court to order TikTok to institute various protective measures to mitigate the risk that content moderators will suffer psychological harm from reviewing disturbing videos and photos. But Young has no standing to seek such relief in federal court. He is no longer employed as a TikTok moderator and has no intention of returning to that work. He had already left his job by the time he sued TikTok, so the "capable of repetition" doctrine is not available to him. Furthermore, even if Young had quit after he filed suit, he has not submitted sufficient evidence to support his assertion that no moderator would remain in the job long enough to pursue a class action.
[2.] The secondary form of relief Young seeks is something he calls a "medical monitoring fund." There is no mention of this proposed fund in the motion for class certification, much less an explanation of how it would work. On reply, Young makes general reference to it, but still doesn't really explain it. At the hearing, Young's counsel explained that TikTok should be ordered to put money in a fund that would be used to benefit the class in two ways: to monitor all class members for signs of emotional distress, and to provide treatment (for example, therapy) to those who need it.
Even after this explanation, the Court lacks the information necessary to assess whether such a fund would be feasible. For example, while Young uses the phrase "medical monitoring," he seems to be asking for more than that: by saying it's for treatment in addition to monitoring, he's proposing to provide compensation for injuries that have been or will be suffered by class members. In the same breath, Young says that this is "not a personal injury case," even though he is proposing that class members receive at least partial compensation for their personal injuries. Nor has Young explained how fund administrators would be able to determine whether any mental health condition suffered by a class member was caused by watching disturbing videos or by something else in their life, or some combination of both. It's not even clear, based on the parties' briefs, whether this proposed fund should be considered retrospective relief, prospective relief, or a combination of the two. This is not to suggest that a fund could never be a proper remedy in a case like this, but Young has not adequately explained how it would work.
[3.] Young has not shown that the question of whether TikTok retained control over the moderators' work for the vendors (or exercised that control) is apt to generate the same answer across the class. Based on all the evidence (of how the software works, how moderators were trained, who trained them, etc.), perhaps a jury could answer this question the same way for all class members. But it's undisputed that TikTok had much more direct involvement in the training and supervision of moderators who worked for third-party agencies (TPAs) than of those who worked for business process outsourcing vendors (BPOs). Based on these differences, it's possible—maybe even likely—that a jury would apply the "retained control" exception to TPAs but not BPOs. Only 335 people worked as moderators for TPAs, while more than 12,000 worked for BPOs. Young was one of the few who worked for a TPA, and he's the only named plaintiff, which means that the class certification motion is contemplating a trial where the vast majority of class members could be effectively unrepresented on a key threshold question in the case.
[4.] Even the number of TPA moderators—335—overstates the size of the class that Young could (if there were not so many other problems) represent. That's because more than two-thirds of these people signed arbitration agreements with the TPAs. (An even higher percentage of moderators signed arbitration agreements with the BPOs.) The Court already ruled that equitable estoppel prevented former named plaintiff Ashley Velez from suing TikTok in light of her arbitration agreement. This would likely apply to the other proposed class members who signed arbitration agreements—or at least Young has not shown how it wouldn't.
One element of Young's negligence claim is breach—that TikTok breached the standard of care for minimizing exposure to disturbing content, and/or for implementing measures to mitigate the psychological impact of that exposure. The evidence in the record creates some concern that this question could generate different answers across the class, because moderators received different explanations/warnings about the type of content they would be exposed to, because there could be variation in the type/amount of training they received, and because there may have been different mitigation measures available at their particular jobs.
It's not clear that this concern alone would defeat class certification—it may be that the question could be presented to the jury in a way that allows it to conclude that TikTok breached its duty to all moderators (or to none of them). But this adds to the weight of the concerns discussed above and below.
[6.] An additional element of Young's claim is demonstrating that TikTok's alleged breach of the standard of care caused him harm. Young has not shown that the jury could make the same finding on this element as to the entire class (or what would remain of the class after accounting for the arbitration agreements and the differences between TPAs and BPOs).
Young responds that the harm in this case is not the actual mental condition (such as PTSD) that a moderator might suffer, but the risk of suffering from such a condition. That's fair, but then why does Young seek a fund that would compensate moderators for the harm they suffered—and without explaining how the harm could be connected to the content moderation as opposed to other potential traumas?
[7.] Up to this point, the reader may have been assuming that the same state law would apply to all proposed class members. That's not so. At the pleading stage, the Court assumed California law would apply, because neither side suggested that any other state's law would apply.
But now reality kicks in. The proposed class members worked in at least 24 different states, with fewer than 400 of those class members living and/or working in California. (It's unclear from the record how many of these 400 worked for TPAs and didn't sign an arbitration agreement.) As TikTok now contends (and Young is wrong that TikTok forfeited this argument), the law of the state where a given moderator worked will apply to that moderator's claim. That's obviously correct. See Cotter v. Lyft, Inc. (N.D. Cal. 2014). It's not clear whether there are significant differences in the law of the 24 states, but Young has not shown that there aren't.
[8.] Finally, the Court is not convinced that the class action device would be superior to individual adjudication in this context. This is not like a consumer case where each proposed class member's monetary harm is so low as to destroy any incentive to bring an individual suit. Nor is this like a wage-and-hour case where the individual recoveries could be small while proving liability and damages across the class is fairly mechanical. If a current or former moderator is suffering from PTSD based on TikTok's failure to protect them from the harm caused by disturbing content that TikTok knew its moderators would be exposed to, it seems likely that the moderator could present their evidence in an individual trial and obtain a substantial recovery, without having to deal with all the problems discussed in this ruling….
Jesse A. Cripps, Jr., Lauren Margaret Blas, Leonora Cohen, and Viola Li (Gibson, Dunn & Crutcher) represent TikTok.