The Volokh Conspiracy
Cybersecurity, European privacy, Snowden, and more
An interview with Cyber Insecurity News
The second half of my interview with Cyber Insecurity News has been posted, here. It deals with a lot of stuff from my career, including DHS, European privacy negotiations, Snowden, and cybersecurity. (The first half is here.) Here's an excerpt that captures some of my thinking on cybersecurity:
CIN: Is the government providing enough intelligence information—threats they're aware of—to companies? I know that's sometimes a complaint that companies have.
SB: Yeah, I do hear that. And I take that with a grain of salt. One thing I've learned from watching the intelligence community for 25 years is that intelligence in the abstract is almost never useful. For the intelligence to get good, you have to have a customer who understands the intelligence and how it's being collected, and can tell the intelligence officers exactly what he wants and what's wrong with the intelligence that has been collected. You don't get good intelligence if you just try to go out and steal the best secrets you can. Because you usually don't know what secrets really matter. You need a very sophisticated consumer who can say, "OK, I see what you've brought me. There are some interesting things here. But it isn't exactly what I need. Go look for X." The spies on their own would never think to look for X. They're just not as deep into the technology or the situation as the consumer. So for an exchange of intelligence to be really useful, you have to make the companies your customer. I don't think that's likely to happen. Part of the private sector's unhappiness with what they're being given is their assumption that since what they got wasn't that useful, they must not have gotten the good stuff. That's not how intelligence works, in my experience. There isn't "good stuff" just hiding out there. The good stuff—you have to dig it out as a customer. That isn't likely ever to be something that the intelligence community does for individual companies.
CIN: What can companies be doing better?
SB: I think companies need to ask themselves—and they're starting to do this—not so much what's on my checklist of technologies to deploy, but who wants my stuff? And what tactics and tools are they using this week to get it? Those are things that are knowable. Those are things that you can get from intelligence. You can get it from the private-sector forensics people as well. So knowing that there's a North Korean or Chinese campaign to get information on, let's say, wind power means that, if you're a wind power company, you suddenly have a whole new set of adversaries, and you're going to have to spend a whole hell of a lot more money on cybersecurity or lose your technology. You constantly have to do a cost-benefit analysis there. But the key is knowing who's breaking into your system, and why: Are they motivated by money, are they motivated by the desire to steal your technology? Are they criminals or governments? All of those are things that should be part of your cybersecurity planning.
CIN: You just said the magic word: "money." When I talk to CISOs at companies, if we talk long enough and they feel comfortable enough, they will almost always say something about budgets, and how the management of companies so often underestimates how much is needed. And they routinely underfund cybersecurity budgets. What do you think?
SB: I agree. But I would also like to speak up for management and the board. Their most common complaint is: "My CISO comes in and says, 'Bogeyman, bogeyman, bogeyman! Give me more money!' He tells me that the North Koreans could do these things to me. Should I be worried about that or not? I feel as though I'm chasing goal posts that will always recede." So there is a culture clash there. That's one of the reasons that I like a model that asks, "Who wants my stuff, and what are they doing to get it?" Once you have that answer, you can also ask the question, "What would it cost to make sure they don't get it? What do I have to do to defeat every one of the tools and tactics they currently are using?" Then, when I know how much that will cost, I can compare the cost of letting them win to the cost of making sure they don't. So I think you can reduce this to a cost-benefit analysis, if you're disciplined about identifying who's likely to attack companies like yours, and if you have up-to-date intelligence on how they're getting into the systems they attack.
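The cost-benefit test described in that answer reduces to a simple expected-loss comparison: estimate what a successful attack by the adversaries who want your stuff would cost, weigh it against what it would cost to defeat the tools they are currently using, and fund whichever side of that inequality is cheaper. Here is a minimal sketch in Python of that comparison; the adversary profile, dollar figures, and breach probability are hypothetical assumptions added for illustration, not numbers from the interview.

# Illustrative sketch of the cost-benefit test described above.
# All names and figures are hypothetical; the interview supplies no numbers.

def expected_annual_loss(breach_probability, impact_if_breached):
    # "Cost of letting them win": the chance of a successful attack this year
    # times the damage (stolen technology, downtime, response) if it succeeds.
    return breach_probability * impact_if_breached

# Hypothetical profile: a wind power company facing a state-sponsored
# campaign aimed at its turbine technology.
cost_to_defeat_current_tools = 2_500_000   # annual spend to counter the adversary's known tools and tactics
breach_probability_if_unchanged = 0.40     # estimated chance of losing the technology this year at current spending
impact_of_losing_technology = 15_000_000   # value of the stolen IP plus remediation

loss_if_unchanged = expected_annual_loss(breach_probability_if_unchanged, impact_of_losing_technology)

if cost_to_defeat_current_tools < loss_if_unchanged:
    print(f"Fund the defenses: ${cost_to_defeat_current_tools:,} < expected loss ${loss_if_unchanged:,.0f}")
else:
    print("Defending costs more than the expected loss; narrow the scope, insure, or accept the risk.")

The point of the exercise is the one made in the answer itself: the numbers only become estimable once you know who is likely to attack companies like yours and have up-to-date intelligence on how they are getting in.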
CIN: If you were the country's cybersecurity czar, what would be the first things you would do?
SB: I would say, "If you are in a business that the lives of a large number of Americans depend upon—pipelines, refineries, electrical power grids, water, sewage systems—you're going to have to meet certain cybersecurity standards. We're going to continue to use regulatory agencies that you are familiar with, if you are regulated already at the federal level, but we're going to find a way to make you meet these standards. And when we're done getting you to meet those standards, the people that you do business with are going to have to meet those standards. We can't afford to have weak links in our security process." I hate to say that, because I see all the downsides to regulation. It slows everything down. It raises the cost of everything. It locks in a lack of competition. But there's no obvious alternative, other than saying, "If you're hacked, we're going to expropriate your company, because you don't deserve to be in business." That would motivate people—sort of like the death penalty for companies. I don't think we really want to do that. So, better to come up with a constantly shifting set of best practices, and impose them on people who currently don't care enough to adopt them. They're not evil people; they just don't have enough skin in the game to worry about security.
CIN: What advice would you like to dispense before we conclude this conversation? Anything special you have to say to general counsel?
SB: Yes. It's from a theme that we've been talking about—the need to think about who your adversary is. And it's one of the hidden advantages in having your general counsel involved in cybersecurity. Engineers are used to protecting against inanimate nature. They're really comfortable asking whether gravity is going to bring a bridge down. They're much less comfortable asking, "What if the Russians wanted to bring it down, and they had all the explosives they needed?" They don't do adversarial thinking as a routine part of their job description. And neither do the risk managers, who are most comfortable asking questions like, "Is this a risk we can self-insure, or do we have to get insurance for it?" In contrast, if you're a good general counsel, you wake up every day and you say, "I know there are people—maybe competitors, maybe the plaintiffs' bar, maybe governments—people out there who want to do my company harm using legal tools. I need to know who they are. I need to know what tools they will use to attack, and I need an early warning as they develop new ones. When the plaintiffs' bar develops new lawsuits, I want to watch that. I hope that they try them out on somebody else first. But if they don't, my job is to figure out what we will do in response." In other words, the way of thinking about cybersecurity that I'm advocating is a way of thinking about corporate interests that lawyers are already familiar with. It should give general counsel a little bit of confidence that they have a perspective to bring to cybersecurity that isn't just, "What will the regulators think if we don't do this?" They can insist on more deeply adversary-based cybersecurity planning.
CIN: Based on your experience as an outside lawyer who advises general counsel, do you think that many GCs have a sense that cybersecurity is a really important part of their jobs, and have a seat at the table at their companies to help tackle it?
SB: Yes and yes. First, you'd be crazy as a CEO not to give your general counsel a seat at the table, because liability is a significant part of the calculation here. In practically every tabletop exercise we run, people start turning to the general counsel almost routinely. It's odd how often questions that easily could be answered by somebody else—there's no necessary connection to the law—end up getting lateraled to the general counsel for one reason or another. I'm a little less confident that general counsel always bring to the task a breadth of strategic thinking that maximizes their value to the company. They will always say, "I can tell you what our regulatory obligations are. I can tell you what our privacy statement says. I can tell you what the liability decisions of the court might be, or what the FTC would say about this." Those are all things that general counsel should be doing. I feel a little bit like a proselytizer when I say, "No, that isn't enough; what you need to bring to this, in addition to all that, is the sense that your company is engaged in an eternal battle with institutional opponents, and that your job is to make sure your company is as fully armed for that battle as possible."