The Volokh Conspiracy


The Reverse Spiderman Principle: With Great Responsibility Comes Great Power

A quick sketch of a longer article that I'm writing for a symposium; I'd love to have readers' suggestions and reactions!


An entity—a landlord, a manufacturer, a phone company, an Internet platform, a manufacturer of self-driving cars—is making money off its customers' activities. Some of those customers are using the entity's services in ways that are criminal or tortious. Should the entity be held responsible for the customers' activities?

This is a broad question, and there might be no general answer. But in this essay, I'd like to focus on one downside of answering it "yes": What I call the Reverse Spiderman Principle—with great responsibility comes great power. We should think about whether we want to empower such entities to surveil, investigate, and police their customers' behavior.

* * *

Let me begin by offering three examples in which courts have balked at imposing legal liability because they didn't want to require or encourage businesses to police their customers. The first came in the early 1900s, when some government officials demanded that telephone and telegraph companies block access to their services by people suspected of running illegal gambling operations. Prosecutors could have gone after the bookies, of course, and they did. But they also argued that the companies should have done the same—and indeed sometimes prosecuted the companies for allowing their lines to be used for such criminal purposes. No, held many courts; to quote one:

A railroad company has a right to refuse to carry a passenger who is disorderly, or whose conduct imperils the lives of his fellow passengers or the officers or the property of the company. It would have no right to refuse to carry a person who tendered or paid his fare simply because those in charge of the train believed that his purpose in going to a certain point was to commit an offense. A railroad company would have no right to refuse to carry persons because its officers were aware of the fact that they were going to visit the house of [the bookmaker], and thus make it possible for him and his associates to conduct a gambling house. Common carriers are not the censors of public or private morals. They cannot regulate the public and private conduct of those who ask service at their hands.[1]

If the telegraph or telephone company (or the railroad) were held responsible for the actions of its customers, then it would acquire power—as "censor[] of public or private morals"—that it ought not possess.

Now those companies were common carriers, denied such power (and therefore, the courts said, responsibility) by law. But consider a second example, Lunney v. Prodigy Services Co., a 1999 case in which the New York high court held that e-mail systems were immune from liability for allegedly defamatory material sent by their users.[2] E-mail systems aren't common carriers, but the court nonetheless reasoned that they shouldn't be held responsible for failing to block messages, even if they had the legal authority to block them: An e-mail system's "role in transmitting e-mail is akin to that of a telephone company," the court held, "which one neither wants nor expects to superintend the content of its subscribers' conversations."[3] Even if e-mail systems aren't forbidden from being the censors of their users' communications, the law shouldn't pressure them into becoming such censors.

And here's a third example: Castaneda v. Olsher, where a mobile home park tenant injured in a gang-related shootout involving another tenant sued the landlord, claiming it "had breached a duty not to rent to known gang members." No, said the court:

[W]e are not persuaded that imposing a duty on landlords to withhold rental units from those they believe to be gang members is a fair or workable solution to [the] problem [of gang violence], or one consistent with our state's public policy as a whole. . . .

If landlords regularly face liability for injuries gang members cause on the premises, they will tend to deny rental to anyone who might be a gang member or, even more broadly, to any family one of whose members might be in a gang.[4]

This would in turn tend to lead to "arbitrary discrimination on the basis of race, ethnicity, family composition, dress and appearance, or reputation," which may itself be illegal (so the duty would put the landlord in a damned-if-you-do-damned-if-you-don't position).

But even apart from the likelihood that such reactions by landlords would be illegal, making landlords liable would jeopardize people's housing options and undermine their freedom even if they aren't gang members, putting them further in the power of their landlords: "families whose ethnicity, teenage children, or mode of dress or personal appearance could, to some, suggest a gang association would face an additional obstacle to finding housing." Likewise, even if landlords respond only by legally and evenhandedly checking all tenants' criminal histories, "refusing to rent to anyone with arrests or convictions for any crime that could have involved a gang" would "unfairly deprive many Californians of housing." This "likely social cost" helped turn the court against recognizing such a responsibility on the part of landlords.

* * *

Now of course many such companies (setting aside the common carriers or similarly regulated monopolies) have great power to decide whom to deal with and what to allow on their property, even though they aren't held responsible—by law or by public attitudes—for what happens on their property. In theory, for instance, Prodigy could have decided that it wanted to kick out users who it thought were using its e-mail service for evil purposes (whether libel or, say, anti-capitalist advocacy).

But in practice, in the absence of responsibility (whether imposed by law or social norms), many companies will eschew such power, for several related reasons—even setting aside the (presumably minor) loss of business from the particular customers who are ejected:

  1. Policing customers takes time, effort, and money.
  2. Policing customers risks error and bad publicity associated with such error, which could alienate many more customers than the few who are actually denied service.
  3. Policing customers in particular risks allegations of discriminatory policing, which may itself be illegal and is, at the least, especially likely to yield bad publicity.
  4. Policing some customers will often lead to public demands for broader policing: "You kicked group X, which we sort of like, off your platform; why aren't you also kicking off group Y, whom we loathe and whom we view as similar to X?"
  5. Conversely, a policy of "we don't police our customers"—buttressed by social norms that don't require (or even affirmatively condemn) such policing—offers the company a simple response to all such demands.
  6. Policing customers creates tension even with customers who aren't violating the company's rules—people often don't like even the prospect that some business is judging what they say, how they dress, or whom they associate with.
  7. Policing customers gives an edge to competitors who can sell their services as "our only job is to serve you, not to judge you or eject you."

Conversely, imposing legal responsibility on such companies can encourage them to exercise power even when they otherwise wouldn't have. And that is in some measure so even if responsibility is accepted as a broad moral norm, enforced by public pressure, not just a legal norm. That norm would increase the countervailing costs of non-policing. And it would decrease the costs of policing: For instance, the norm and the corresponding pressure will likely act on all major competitors, so the normal competitive pressures encouraging a "the customer is always right" attitude will be sharply reduced.

Likewise, when people fault a company for errors or perceived discrimination, the company can use the norm as cover, for instance arguing: "Regrettably, errors will happen, especially when one has to do policing at scale. After all, you've told us you want us to police, haven't you?"

Accepting such norms of responsibility can also change the culture and organization of the companies. It would habituate the companies to exercising such power. It would create bureaucracies within the companies staffed with people whose jobs rely on exercising the power—and which might be looking for more reasons to exercise that power.

And by making policing part of the companies' official mission, it would subtly encourage employees to make sure that the policing is done effectively and comprehensively, and not just at the minimum that laws or existing social norms command. Modest initial policing missions, based on claims of responsibility for a narrow range of misuse, can thus creep into much more comprehensive use of such powers.

* * *

So far, there has been something of a constraint on calls for businesses to take "responsibility" for the actions of their customers: Such calls have generally involved ongoing business-customer relationships, for instance when Facebook can monitor what its users are posting (or at least respond to other users' complaints).

Occasionally, there have been calls for businesses to simply not deal with certain people at the outset—consider Castaneda v. Olsher, where plaintiffs argued that defendants just shouldn't have rented the mobile homes to likely gang members. But those have been rare. Few people, for instance, would think of arguing that car dealers should refuse to sell cars to suspected gang members who might use the cars for drive-by shootings or for crime getaways.[5] Presumably most people would agree that even gang members are entitled to buy and use cars in the many lawful ways that cars can be used, and that car dealers shouldn't try to deny gang members access to cars. (If the legislature wants to impose such responsibilities, for instance by banning sales of guns to felons or of spray paint to minors, then presumably it should do so through narrow and clearly defined rules that rely on objective criteria and don't require seller judgment about which customers merely seem likely to be dangerous.)

But now more and more products involve constant interaction between the customer and the seller. Say, for instance, that I'm driving a self-driving Tesla that is in constant contact with the company. Recall how Airbnb refused to rent to people who it suspected were coming to Charlottesville for the "Unite the Right" rally.[6] If that is seen as proper—and indeed as mandated by corporate social responsibility principles—then one can imagine similar pressure on Tesla to stop Teslas from driving to the rally (or at least to stop such trips by the Teslas of people suspected of planning to participate in the rally).

To be sure, this might arouse some hostility, because it's my car, not Tesla's. But of course Airbnb was refusing to arrange bookings for other people's properties, not its own. Its rationale was that it had a responsibility to stop its service from being used to promote a racist, violent event. But why wouldn't Tesla then have a responsibility to stop its intellectual property and its computers (assuming they are in constant touch with my car) from being used the same way? To be sure, Tesla's sale contract might be seen as implicitly assuring that its software will always try to get me to my destination. But that is just a matter of the contract—if companies are seen as responsible for the misuse of their services, why wouldn't they have an obligation to draft contracts that allow them to fulfill that responsibility?

Now maybe some line might be drawn here: Perhaps, for instance, we might have a special rule for services that are ancillary to the sale of goods (Tesla, yes; Airbnb, no), under which the transfer of the goods carries with it the legal or moral obligation to keep providing the services even when one thinks the goods are likely to be used in illegal or immoral ways. (Though what if I lease my Tesla rather than buying it outright?) Or at least we might say there's nothing irresponsible about a product seller refusing to police customers' continuing use of the services that make those products work.

But that would just be a special case of the broader approach that I'm suggesting here: For at least some kinds of commercial relationships, a business should not be held responsible for what its customers do—because we don't want it exercising power over its customers' actions.

* * *

[Again, this is of course just a sketch of what concerns me here. I'll be elaborating on this, and also discussing areas where businesses are indeed held responsible for the actions of others—certainly of their employees but also, sometimes, of tenants and other customers.[7] I'll also discuss how responsibility to prevent crime on one's property has led to pressure to institute more surveillance, for instance with many courts recognizing a duty to install cameras on commercial property. I'd love to hear any suggestions or reactions you folks might have, since I still have plenty of time to work on this.]

[1] Commonwealth v. Western Union Tel. Co., 67 S.W. 59, 60 (Ky. 1901); see also Pa. Publications v. Pa. Pub. Util. Comm'n, 36 A.2d 777, 781 (Pa. 1944) (cleaned up); People v. Brophy, 120 P.2d 946, 956 (Cal. App. 1942).

[2] The case turned on conduct that happened before the enactment of 47 U.S.C. § 230, which provided such immunity by statute. The court therefore addressed whether a libel claim was available in the first place, thus avoiding the need to determine whether § 230 was retroactive.

[3] Lunney v. Prodigy Servs. Co., 723 N.E.2d 539, 542 (N.Y. 1999).

[4] 162 P.3d 610 (Cal. 2007). On this point, the Justices were unanimous.

[5] But see Andrew Jay McClurg, The Tortious Marketing of Handguns: Strict Liability Is Dead, Long Live Negligence, 19 Seton Hall Legis. J. 777, 816 n.178 (1995) (quoting a proposal that gun sellers must, on pain of liability for negligence, "be especially alert to, and wary of, gun buyers who display certain behavioral characteristics such as … appear[ing] in unkempt clothing and hav[ing] a slovenly appearance").

[6] https://www.thedailybeast.com/airbnb-uber-plan-to-ban-unite-the-right-white-supremacist-rally-participants

[7] Cf. Morris v. Giant Four Corners, Inc., 498 P.3d 238 (N.M. 2021) (a rare case holding a gas station liable for selling gas to a drunk driver). Many courts reject such negligent entrustment claims when someone is selling goods, as opposed to merely lending them or renting them out.