Reason.com

Free Minds & Free Markets

Why We Don't Need a Department of Technology Policy

Let's focus on overturning existing government policies that undermine security.

There seem to be embarrassing new "Internet of Things" failures every week now. Sometimes they are on the humorous side, like when a "smart toilet" was hacked to randomly flush at startled bathroom-goers. Other times they can be disturbing, as in the case of critical vulnerabilities in St. Jude Medical's implantable cardiac devices that could put users' lives in the hands of hackers. But in all cases, these failures tend to grab headlines and inflame calls for government regulation.

It's not hard to see why. When faced with some kind of public dilemma, many people immediately assume that the government alone can solve the problem. And when you throw in futuristic fears about losing control of everyday things around us, the prospect of a savior from above seems all the more necessary. But we must take care that such "solutions" don't create more problems than they supposedly solve. Such would almost certainly be the case with one recent proposal: a "Department of Technology Policy."

A 'World-Size Robot'

Recently, Bruce Schneier, a veteran of information-security research and a leading voice in technology policy, penned a long article for New York Magazine in which he argues for the creation of a new federal agency—the "Department of Technology Policy"—that would consolidate control of technological regulations into a single body. Schneier explains how the incredible rate of "smart"-device adoption has created new and unprecedented security challenges.

Few people realize just how quickly IoT devices have saturated the world around them. This will only accelerate—Schneier likens the rise of IoT technologies to building a "world-size robot," with all of the sensors, commands, and computations to match. And with an expanded connected reality comes an expanded digital threat set. Computer bugs and software vulnerabilities no longer merely endanger personal data and hardware; they can shut down connected home devices, hijack moving cars, and even cause us physical harm.

Indeed, there have been considerable security problems with connected devices. Often, the issues are theoretical: Security researchers warn the public at conferences and in journals of major vulnerabilities they discover in popular consumer routers or printers or security cameras—vulnerabilities which may or may not end up getting patched.

But sometimes these vulnerabilities are actually exploited. Last October, some of the Internet's most popular websites—Twitter, Amazon, GitHub, Reddit—were knocked offline thanks to insecure IoT devices. A malicious actor infected an army of DVRs, cameras, baby monitors, and printers with malware called Mirai, directing these devices to launch a distributed-denial-of-service (DDoS) attack on Dyn, the DNS provider those websites relied on. While the attack was short, and the fallout was mostly limited to inconvenience and lost sales, it was a major warning signal for security researchers, who could envision how such an attack might have been far more devastating.
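Mirai needed no sophisticated exploit: it spread largely by trying a short, hardcoded list of factory-default usernames and passwords against devices exposed to the Internet. A minimal sketch of that mechanism in Python (the credential list and simulated devices below are hypothetical, for illustration only, not Mirai's actual code):

```python
# Sketch: why shipping devices with unchanged factory credentials is dangerous.
# A Mirai-style scanner simply tries a fixed list of default logins.
DEFAULT_CREDENTIALS = [
    ("admin", "admin"),
    ("admin", "password"),
    ("root", "12345"),
    ("root", "default"),
]

def is_vulnerable(device_login):
    """Return True if the device accepts any factory-default credential pair.

    device_login is a callable (user, password) -> bool simulating a
    login attempt against one device.
    """
    return any(device_login(user, pw) for user, pw in DEFAULT_CREDENTIALS)

# A simulated DVR whose owner never changed the "admin"/"admin" default:
insecure_dvr = lambda user, pw: (user, pw) == ("admin", "admin")
# A simulated camera whose owner set a unique password:
secure_camera = lambda user, pw: False

print(is_vulnerable(insecure_dvr))   # True
print(is_vulnerable(secure_camera))  # False
```

The point of the sketch is how little skill the attack required: no memory corruption, no zero-day, just manufacturers shipping millions of devices with the same known passwords.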

The main problem, as Schneier sees it, is that many companies developing and selling connected devices lack the security chops to make sure their products are safe before people buy them. Technology companies like Google and Apple have large, dedicated teams to locate and patch software vulnerabilities as soon as possible—and even that process is imperfect. Now, companies with no such software experience may put IoT products on the market without the necessary testing, which could create major unexpected problems down the road. And the home consumers who buy such devices are seldom equipped to evaluate the security settings on their own.

Whose Failure?

While Schneier's essay does an excellent job of describing the new security challenges that smart devices create, it falls short on solutions. "The market can't fix this," Schneier suggests, "because neither the buyer nor the seller cares … There is no market solution, because the insecurity is what economists call an externality: It's an effect of the purchasing decision that affects other people. Think of it kind of like invisible pollution."

Like many who make "market failure" arguments, Schneier believes that the government alone can intervene to fix the problem. Specifically, he thinks an entirely new federal agency is needed, fearing that without a Department of Technology Policy nothing could compel device manufacturers to internalize the externalities of poor digital security.

But behind every suspected market failure is usually an existing government failure. Schneier himself says as much when he discusses the many laws that inhibit security research and contribute to smart-device insecurity. In particular, laws like the Digital Millennium Copyright Act (DMCA) and Computer Fraud and Abuse Act (CFAA) penalize computer scientists who try to test or report certain software vulnerabilities. These laws should be amended before we do anything else.

Perhaps more importantly, when considering how best to address market failures, we must not succumb to what economist Harold Demsetz called the "Nirvana fallacy." If you compare an imperfect existing situation with the perfect ideal of government intervention, of course the government solution will be tempting. But government bodies operate in an imperfect reality, and once created, they will generate their own set of unintended consequences, which will be very hard to turn back.

Tech Policy Touches Everything

Beating back even a single federal regulation often requires a rare combination of years of scholarly attention, unwavering political will, and random chance. Unfortunately, bad government policy can afflict society for decades by mere virtue of institutional inertia. If a newly created "Department of Technology Policy" proves useless, incompetent, or corrupt, it would be very hard, if not impossible, to set it right or shut it down. And since "technology" now touches so much of our lives, any new Department of Technology Policy runs the risk of becoming entrenched in most of the things that we do.

Often, social problems can seem unprecedented or difficult to solve when new technologies are involved. Yet again and again, society has developed legal and social solutions borrowed from some earlier problem that can be applied to the new technology. For example, the "security pollution" that Schneier describes could perhaps be addressed with common-law precedents in the courts, through voluntary standards-setting bodies, or through third-party audits.

In other situations, new technological solutions may be appropriate. Entrepreneurs and researchers are hard at work on new IoT security solutions because, after all, where there is a great social need, there is a great profit opportunity. But if a "Department of Technology Policy" preemptively blocks such research, or requires companies to dedicate resources elsewhere by mandate, these solutions may never be discovered.

Security researchers like Schneier provide a great service in bringing attention to the newest technological problems that arise. Markets are not perfect, and they certainly don't immediately fix problems in the exact ways that we might want. But where government regulation is inflexible and susceptible to capture, market processes are adaptive and biased toward improvement. Rather than wishing for a Department of Technology Policy, we should focus on overturning the bad existing government policies that undermine security. With patience and humility, we may find that what we thought was a "market failure" was in actuality a market opportunity all along.

Photo Credit: Polaris/Newscom


  • Ship of Theseus||

    Promise?

    Also, what's a terrist? Is it an Earth enthusiast?

  • DiegoF||

    Jack Russell enthusiast

  • Ship of Theseus||

    So, is it well known that dajjal and AddictionMyth are the same person? He posted the exact same thing in the Britain/Singapore article under dajjal, misspellings and all.

  • Acosmist||

    Yes, it is well known. It's also well known that he becomes so attention-starved that he literally posts "LOOK AT MEEEE!" after a while, although lately commenters have been feeding him too often for that.

    THANKS GUYS.

  • Ship of Theseus||

    But he said he was LEAVING AND NOT COMING BACK! You're saying that's not true?? I'm shocked.

  • The Elite Elite||

    Well, there are only so many trolls to kick around after Cytotoxic ran away.

  • Episteme||

    A childhood friend once randomly commented, upon seeing my terrier, "terriers – everyone's favorite dog!" I do not disagree. #terrism

  • Cynical Asshole||

    I'm sure you'll be missed. /sarc

  • chemjeff||

    A new department? To regulate Internet-connected devices?

    Oh heavens no.

    Kill this idea now, in the crib.

  • Episteme||

    Department of Things? Department-connected Devices?

  • Scarecrow Repair & Chippering||

    Someone on a tech blog wanted an FDA for IoT-related devices. I pointed out that the FDA does a lousy job, taking years to approve medicines already approved in the EU and elsewhere, taking so long to approve some new pills that the patents are almost worthless from having so little time left. Then there's the drone regulations, the FAA emergency location transponder regulations, and a zillion other examples of slow, dim, ponderous bureaucracies. If we had had this IoT bureaucracy in place since the beginning, we'd still be using dialup modems and Pentiums.

    The response was "Thalidomide".

    I used to think pretty highly of Bruce Schneier, but if he thinks a government bureaucracy can improve any industry, he's lost a fan.

  • Christophe||

    As usual, expert assumes government intervention would be directed by people as knowledgeable and well-intended as they are.

    What I find hilarious is that he just recently was involved in fighting against the key escrow bullshit the FBI was pushing.

    He's seen first-hand the ravenous demand for more power emanating from all federal agencies. What does he think happens to IoT-connected surveillance camera firmware when the FBI wants on-demand access to any footage? What happens when the NSA tells you to change your encryption to be "more" secure, backed with the threat of blocking the sale of your product?

    The IoT-security problem is real, but it's only recently been the focus of much attention. Good market-oriented solutions haven't even been tried yet.

  • timbo||

    Trump loves some big government.
    If the tards on the left really wanted to get under his skin, they could change their tune from Marxist and just start calling him a big spending, wasteful spending, massive gov't warmonger. Of course we know they are clinically brain dead but if sales of 1984 are skyrocketing, there's hope.

    And those morons will try anything to get over the butt hurt.

  • Ron||

    technology policy would probably end tech altogether; it would be a steampunker's wet dream come true

  • OldMexican Blankety Blank||

    "The market can't fix this," Schneier suggests[...]


    If I had a penny for every imbecile that comes up with that facile and idiotic statement, I would be Warren Buffett.

    But behind every suspected market failure is usually an existing government failure.


    The government IS failure turned into public policy.

  • strat||

    I know Bruce, and he is an insightful guy. I share his misgivings about the potential negative impact of what seems to be an impending flood of insecure, cheap, connected crap.

    That having been said, we don't need yet another department. What we do need is some sort of review step prior to the creation of new legislation (what the heck, let's add EOs to that too), to look for unintended security consequences should that policy be enacted.

    This is probably a good idea for any and all legislation, but I'll stick to my knitting. When you have copyright laws that prevent researchers from looking into potential flaws in mass-market products, there's a security problem. I can all but guarantee you that no legislator (well maybe 4) thought of that as "cybersecurity" legislation. When you have export control regimes that discourage the pervasive use of strong cryptography, you have a security problem.

    Since there's no sphere of human endeavor where this cybersecurity thing doesn't seem to touch, and legislators feel emboldened to manipulate all as well, I see a potential window of opportunity to have what would otherwise be some awkward conversations with people in the USG.

    Besides, now that you have the Wassenaar folks trying to regulate testing tools as if they were weapons, we have our work cut out for us. Trust me, nonproliferation of this new stuff will become a serious issue, but attackers aren't buying their tools from commercial enterprise testing software vendors.

  • Eman||

    Well, considering how the future has, in the past, gone exactly how we expected it to, government seems like just the imaginary entity for the job.
