The Volokh Conspiracy

Mostly law professors | Sometimes contrarian | Often libertarian | Always independent

Cryptocurrency's Structural Security Problem

Our best cybersecurity tool doesn't work very well for decentralized cryptocurrencies


In a piece for Lawfare, I explore the remarkable number of failures in cryptocurrency security. I argue that security really is worse for cryptocurrency, because the decentralization that proponents treasure makes it hard to safely disclose and fix security holes:

Software security flaws … are ubiquitous in digital products. Like writers who can't see their own typos, most coders have trouble seeing how their software can be misused. The security flaws in their work are usually found by others, often years later. Indeed, security researchers are still finding serious holes in Windows today—30 years after it became the world's dominant operating system.

Companies like Microsoft have improved their products' security by making peace with [security] researchers. There was a time when software producers treated independent security research as immoral and maybe illegal. But those days are mostly gone, thanks to rough agreement between the producers and the researchers on the rules of "responsible disclosure." Under those rules, researchers disclose the bugs they find "responsibly"—that is, only to the company, and in time for it to quietly develop a patch before black hat hackers find and exploit the flaw. Responsible disclosure and patching greatly improve the security of computer systems, which is why most software companies now offer large "bounties" to researchers who find and report security flaws in their products.

That hasn't exactly brought about a golden age of cybersecurity, but we'd be in much worse shape without the continuous improvements made possible by responsible disclosure.

And that's the problem for cryptocurrency. Responsible disclosure just won't work there, at least not as it's traditionally been understood.

[C]ryptocurrency is famously and deliberately decentralized, anonymized, and low friction. That means that the company responsible for hardware or software security may have no way to identify who used its product, or to get the patch to those users. It also means that many wallets with security flaws will be publicly accessible, protected only by an elaborate password. Once word of the flaw leaks, the password can be reverse engineered by anyone, and the legitimate owners are likely to find themselves in a race to move their assets before the thieves do.
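That race dynamic is easy to see in miniature. Here is a toy sketch (not any real wallet's code) of a hypothetical flaw of the kind the excerpt describes: a wallet that derives its key from a low-entropy seed, such as its creation timestamp. Once the flaw is public, anyone who can enumerate the small seed space recovers the key, and legitimate owners are racing thieves running the exact same loop. All names here are illustrative assumptions.

```python
import hashlib
import random

def flawed_keygen(timestamp: int) -> str:
    """Hypothetical flawed wallet: derives the private key from a
    predictable seed (the creation timestamp) instead of a secure RNG."""
    rng = random.Random(timestamp)  # deterministic, guessable seed
    key_material = rng.getrandbits(256)
    return hashlib.sha256(key_material.to_bytes(32, "big")).hexdigest()

def brute_force(target_key: str, start: int, end: int):
    """Once word of the flaw leaks, anyone can walk the small seed
    space and reconstruct the key -- owner and thief alike."""
    for ts in range(start, end):
        if flawed_keygen(ts) == target_key:
            return ts  # seed found; attacker now holds the key
    return None

# A wallet created at a roughly known time is effectively unprotected:
victim_key = flawed_keygen(1_660_000_000)
recovered_seed = brute_force(victim_key, 1_659_999_000, 1_660_001_000)
print(recovered_seed)  # prints 1660000000
```

The point of the sketch is that the "patch" for such a flaw cannot protect existing wallets; every vulnerable wallet must move its assets to a new key before someone else runs the search first.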

My very tentative decentralized solution is the "responsible rescue" of vulnerable wallets:

The Nomad hack illustrates what might be called the decentralized "rescue" of compromised wallets. The company noticed that some of the people exploiting the flaw said they were doing it to protect the assets. It issued a public appeal to "white hat hackers and ethical researcher friends" to send any funds they rescued to a wallet created for that purpose. It further sweetened the pot by offering a 10 percent bounty for returned funds and promising not to pursue legal action against those who returned funds. So far, the company reports that $32 million of the $190 million that was stolen has been returned….

[B]ut cryptocurrency rescuers are taking big legal risks…. To reassure good-faith rescuers, legal and financial incentives need to be more systematic and much more certain.