The Volokh Conspiracy
Mostly law professors | Sometimes contrarian | Often libertarian | Always independent
Many thanks to the editors for inviting me to discuss my new article, Very Broad Laws, in which I develop a due process argument against extreme breadth in criminal law. My goal is to lay the conceptual groundwork for a "void-for-breadth" doctrine, whereby courts can bring the hammer down on criminal statutes so porously drafted that they effectively sweep in large swaths of everyday conduct. These criminal statutes suffer the same essential infirmity, I suggest, as vague and ambiguous ones: they fail to give ordinary people "fair notice" of how the legal system is likely to respond to their conduct. This is a due process problem because it frustrates predictability. When people lack a meaningful sense of what conduct invites serious intrusion into their lives — either because criminal statutes fail to convey what conduct they prohibit, or because they sweep so broadly that no one has a clue what the "actual" prohibition is — the rule of law is undermined.
The problem has not been lost on courts, including the Supreme Court, but the judicial response has been to treat breadth as a species of vagueness or ambiguity. This is conceptually unsound. And not surprisingly, it has produced doctrinal confusion. In response, I argue that judges should identify the proverbial spade for what it is, and attack the breadth problem directly. By enforcing constitutionally-derived limits on broad lawmaking, the hope is that courts can encourage narrower laws, just as constitutionally-derived limits on indeterminacy — the rule of lenity and the void-for-vagueness doctrine — have long encouraged more precise laws.
For a concrete example of what I have in mind, consider Yates v. United States, the well-publicized "fish case" from 2015. After a wayward boat captain instructed his crew to throw some undersized grouper back into the ocean, hoping to avoid sanctions from federal fish and wildlife authorities for a non-regulation catch, an entrepreneurial prosecutor charged him under the obstruction of justice provision of Sarbanes-Oxley (Sec. 1519) — a statute that carries a maximum sentence of twenty years, and that criminalizes the destruction or concealment of "record[s], document[s], or tangible object[s]" in order to obstruct a federal investigation.
The boat captain appealed his conviction on the theory that fish are not "tangible objects," and a majority of the Court agreed. Although fish are certainly objects, and it would be hard, abiding the normal parameters of English, to call them intangible, the Court reasoned that "ambiguity … should be resolved in favor of lenity," not "reading [Sec. 1519] expansively to create a coverall spoliation-of-evidence statute." In dissent, Justice Kagan pounced on this conflation of breadth and ambiguity. "Even in its most robust form," she wrote, "[the] rule [of lenity] only kicks in when, after all legitimate tools of interpretation have been exhausted, a reasonable doubt persists regarding whether Congress has made the defendant's conduct a federal crime." But here, "[n]o such doubt lingers." Although "the [Court] points to the breadth of [Sec. 1519] as [if] breadth were equivalent to ambiguity, … [i]t is not. Section 1519 is very broad. It is also very clear. Every traditional tool of statutory interpretation points in the same direction, toward 'object' meaning object [and including fish]. Lenity offers no proper refuge from that straightforward (even though capacious) construction."
My article (1) defends Justice Kagan's analytic position in Yates, while (2) embracing the majority's overall perception of the case. The Yates Court was right, in other words, to discern a problem here. But the problem is not one of ambiguity. In fact, it is not a linguistic problem at all. It is a normative problem — a concern about the unpalatable results that follow from reading criminal statutes in wooden, but linguistically viable, ways.
Yates is far from the only example of this pattern. Other recent cases in which the majority reins in an alarmingly broad criminal statute by pretending that the problem is linguistic — over vociferous dissenting opinions — include: Marinello v. United States (rejecting an extremely broad theory of tax obstruction), Bond v. United States (invalidating a prosecution for garden-variety assault under a federal statute criminalizing the use of "chemical weapons"), and City of Chicago v. Morales (striking down an ordinance criminalizing public association with known gang members). Nor is the phenomenon limited to the Supreme Court. On the contrary, the pattern is traceable in lower federal courts and state jurisdictions alike.
In the article itself, I explore these examples in more detail. (And I'll provide more color in subsequent posts.) In each case, however, the same core point holds true. The court would have been on sturdier footing, both conceptually and doctrinally, if it had acknowledged the normative problem — that extremely broad criminal laws subvert due process — and fashioned remedies accordingly. Generally speaking, those remedies fall into two categories. The first, which enjoys favor among courts as well as numerous scholars, is to constrain the operation of broad laws through extra-textual limiting principles. I refer to this as the "rule of narrowness." The second is to limit legislative authority to enact very broad laws in the first place — by striking down statutes (or portions of statutes) as unconstitutionally broad. This, I call the "void-for-breadth" doctrine.
Each tool has its virtues and drawbacks. The first is nimbler. It leaves the overall statutory scheme intact, while still allowing judges to push back against overzealous, unfair, or otherwise-unreasonable enforcement decisions. The second is stronger medicine — in some cases, overly so — but it also attacks the problem at its root, responding to the reality that lawmakers often have an incentive to write broad laws. In other words, because legislatures frequently want criminal statutes to be broadly drafted and enforced, it stands to reason that limits on legislative authority, above and beyond constraints on enforcement power, would be necessary to corral the problem.
Ultimately, I argue that both tools should be available, just as courts have long taken a dual approach to linguistic indeterminacy, using the rule of lenity to dispel ambiguity, and the void-for-vagueness doctrine to curb amorphousness. Which anti-breadth tool should be used in which contexts is a complex question, but I close by suggesting that it roughly tracks a dichotomy (delineated in the article) between two different modes of breadth.
In one mode, the problem is essentially one of disproportionality: relatively minor violations triggering extremely draconian penalties. In that case, the proper remedy will typically be the rule of narrowness. In the other mode, the problem is, so to speak, the criminalization of everyday life: even if the relationship between conduct and penalty is not terribly asymmetrical, the statute simply criminalizes too much. In this case, by contrast, the proper remedy will usually be the void-for-breadth doctrine.