Regulatory Science Fiction

The stories of yesterday provide hints for the lawmakers of tomorrow.

As gangster Griff Tannen emerges from the Hill Valley courthouse after a disastrous hoverboard chase, a USA Today media drone floats nearby, furiously snapping photos. The date, according to a nearby edition of the newspaper, is October 21, 2015.

This pivotal scene in 1989's Back to the Future Part II wasn't far off when it came to anticipating drone and camera technology. If anything, the movie's prop designers seem to have slightly underestimated the speed of technological progress. The drone is large and cumbersome, struggling to stay aloft under the weight of numerous camera lenses and incandescent light bulbs.

When 2015 actually did roll around, comparatively lightweight drones capable of carrying a single 360-degree camera were already ubiquitous, and the press was moving in to take advantage of the new technology. Last year 16 media organizations, including The New York Times, NBCUniversal, Getty Images, the Associated Press, and USA Today's parent company Gannett, partnered with Virginia Tech to test drones and train journalists in their use.

In December, the Federal Aviation Administration announced that every drone weighing more than 0.55 pounds and less than 55 pounds must be formally registered with the federal government, including drones purchased before the new rules were enacted. Pilots who fail to register could face civil fines of up to $27,500 and criminal penalties of up to $250,000 and imprisonment for up to three years, according to the government's FAQ page. These are the first universal drone ownership and use rules, and they pull drones out of a legal limbo in which they have long hovered.

Whether set in the future, the present, or a long time ago in a galaxy far, far away, works of science fiction offer examples of technology that may be with us sooner than we think. Such innovations are exciting, but they also pose challenges. Lawmakers should be ready for a time when facial recognition tech is more widespread and accurate, drones can be equipped with high-functioning A.I., and killer robots can fight our wars. Reading and watching more science fiction is a great way for judges and politicians to get prepared and immerse themselves in a few cautionary tales.

Star Wars Episode I: The Phantom Menace, perhaps the most lambasted movie of the Star Wars franchise, also has some pretty solid tech. The film includes a scene in which Sith apprentice Darth Maul travels to the desert planet Tatooine in order to find Queen Amidala of Naboo. When he arrives, Maul deploys three DRK-1 probe droids to aid his search.

The probe droids are unmanned aerial vehicles, a.k.a. drones. According to Wookieepedia, the all–Star Wars incarnation of Wikipedia, "the DRK-1 probe droid was a small, spherical automaton equipped with sophisticated sensor and communications packages….The DRK-1 version featured a trio of imaging sensors: a central photoreceptor, a magnetic imaging device, and a thermal imager. An antenna atop the DRK-1's dome allowed its master to relay commands to the probot, programming it to seek out individuals or information. Data was then sent back to headquarters via the transmission antennae." That's a little more sophisticated than what we have today, to be sure, but not altogether implausible as near-future technology, given the work researchers and militaries are already doing on artificial intelligence, facial recognition, and drones. It probably won't be long before drones that can fly themselves and analyze the audio and video they capture are widely available.
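
How close are we, really? With off-the-shelf tools, a hobbyist can already rig up a crude version of the probe droid's "seek out individuals" trick. Here is a minimal sketch using Python and the open-source OpenCV library's stock face detector; it reads from a default webcam (device 0 here), though a drone's streamed video could stand in for it:

```python
# A minimal face-detection loop with OpenCV: a hobbyist stand-in
# for a probe droid's imaging sensors. Device 0 is the default
# webcam; a drone's video stream URL could be passed instead.
import cv2

# Stock Haar cascade for frontal faces, bundled with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

capture = cv2.VideoCapture(0)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Returns one (x, y, width, height) box per detected face.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("probe-droid-cam", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

capture.release()
cv2.destroyAllWindows()
```

Swap the bundled face detector for one of the modern recognition models now in wide use, and the gap between this toy and the DRK-1 narrows considerably.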

In Phantom Menace, one of the drones is destroyed by the Jedi Qui-Gon Jinn, who declares it "very unusual…not like anything I have seen before." In the real world, too, most defenses against unwanted spying so far have a DIY flavor. In October, a Kentucky judge dismissed a case against a man who shot down a drone launched by his neighbor. Bullitt County Judge Rebecca Ward told the courtroom that the drone flown over the family's property "was an invasion of their privacy and that they had the right to shoot this drone." In this, Ward seemed to be in agreement with Kanye West. According to TMZ, the rapper once asked, "Wouldn't you like to just teach your daughter how to swim without a drone flying?"

It remains to be seen how state and federal law enforcement agencies intend to track down every 12-year-old who got a drone for Christmas, immediately crashed it into a tree, and then hid the wrecked carcass in his basement.

In addition to the federal registry, state and local restrictions on drone use are being considered. In some states, lawmakers have already gotten ahead of the concerns posed by these emerging and improving technologies. For instance, legislation in California, undoubtedly welcomed in the West household, prohibits using a drone to take photos or video of someone "engaging in a private, personal, or familial activity." In Mississippi, lawmakers have moved to keep those of a voyeuristic persuasion from taking advantage of drone technology, explicitly barring "peeping Toms" from using drones in spas, tanning booths, massage rooms, fitting rooms, and "any other area in which the occupant has a reasonable expectation of privacy." Dozens of states have passed legislation addressing not only drones and privacy but also the use of drones as weapons.

What happens when we put drones in the hands of the people who make the rules in the first place? There's not much appetite for restraint at the higher levels of government, that's for sure. In the wake of attacks in San Bernardino and Paris, Republican candidates Jeb Bush and Chris Christie made sure to mention that the law enforcement and intelligence communities should be fully outfitted. During the fifth Republican presidential nomination debate, Bush said "we should make sure that we give the FBI, the NSA, our intelligence communities, all the resources they need to keep us safe," and Christie argued that the government should restore "tools to the NSA and to our entire surveillance and law enforcement community."

When it comes to cautionary tales about surveillance, lawmakers need look no further than George Orwell's classic 1984. The police-piloted helicopters of Oceania "skimmed down between the roofs, hovered for an instant like a bluebottle" while "snooping into people's windows."

Even with current tech, footage captured by law enforcement drones similar to those bluebottles could be extensive. The U.S. military has developed technologies like ARGUS-IS, which allows for the persistent surveillance of up to 15 square miles by using what amounts to a 1.8-gigapixel-resolution camera unit, which automatically tracks moving objects. This is, as the PBS series NOVA explained in 2013, the "equivalent of having up to a hundred Predators look at an area the size of a medium-sized city at once."
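
The "automatically tracks moving objects" part is not black magic, either. The textbook starting point is background subtraction: model what a static scene normally looks like, then flag whatever deviates. Here's a toy sketch of that idea in Python with OpenCV (the input file name is hypothetical, and ARGUS-IS's actual pipeline is classified and vastly more sophisticated):

```python
# Toy persistent-surveillance tracker: background subtraction flags
# whatever moves in a fixed camera's view. This is the textbook
# technique, not ARGUS-IS's classified pipeline; the input file
# name here is hypothetical.
import cv2

capture = cv2.VideoCapture("overhead_footage.mp4")
# MOG2 maintains a running statistical model of the static background.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    mask = subtractor.apply(frame)  # white pixels = "something moved"
    # Drop shadow pixels (value 127) and speckle noise.
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    mask = cv2.medianBlur(mask, 5)
    # Outline each moving blob that is big enough to matter.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for contour in contours:
        if cv2.contourArea(contour) < 200:
            continue
        x, y, w, h = cv2.boundingRect(contour)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
    cv2.imshow("tracked", frame)
    if cv2.waitKey(30) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()
```

Scale that basic idea up to a 1.8-gigapixel sensor and a city-sized field of view and you have, conceptually, persistent surveillance.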

The New York Police Department (NYPD) has been using military-grade X-ray vans, which can see through walls. Perhaps unsurprisingly, the NYPD has proven reluctant to release information about its use of a technology that not long ago would have belonged in the panels of a Superman comic rather than the news pages. The implications for civil liberties are obvious.

Those hovering police patrols are quickly dismissed by the narrator in 1984, who notes lightly that "the patrols did not matter….Only the Thought Police mattered"—snoops gazing out from ubiquitous screens in homes and workplaces.

Back in the real world there is an ongoing debate about government screen-snooping, with some law enforcement officials demanding "back door" access to encrypted communications and civil libertarians warning of the privacy ramifications of such an approach. The chairman of the House Homeland Security Committee, Rep. Mike McCaul (R–Texas), who clearly hasn't revisited Orwell since high school, declared on Face the Nation in November that "the biggest threat today is the idea that terrorists can communicate in dark space, dark platforms, and we can't see what they're saying." Sen. Dianne Feinstein (D–Calif.), the top Democrat on the Senate Intelligence Committee, chimed in to agree, calling encryption the "Achilles heel" of the Internet.

Apple's Tim Cook pushed back on 60 Minutes that same month, explaining the importance of data security. "Here's what the situation is on your smartphone today, on your iPhone, there's likely health information, there's financial information. There are intimate conversations with your family, or your co-workers. There's probably business secrets and you should have the ability to protect it. And the only way we know how to do that, is to encrypt it. Why is that? It's because if there's a way to get in, then somebody will find the way in. There have been people that suggest that we should have a back door. But the reality is if you put a back door in, that back door's for everybody, for good guys and bad guys."
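
Cook's "back door's for everybody" point falls straight out of how symmetric encryption works: whoever holds the key reads the data, and an escrowed government key is mathematically just another copy of it. A minimal illustration using Python's widely used cryptography package (the plaintext is invented for the example):

```python
# Symmetric encryption with Fernet (AES-based), from the widely
# used "cryptography" package: pip install cryptography
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()  # whoever holds this reads everything
token = Fernet(key).encrypt(b"intimate conversations, business secrets")

# Without the key, the ciphertext is opaque.
try:
    Fernet(Fernet.generate_key()).decrypt(token)
except InvalidToken:
    print("wrong key: no access")

# But ANY copy of the key works. An escrowed "back door" key is
# mathematically identical to the owner's key; if it leaks, it
# decrypts for good guys and bad guys alike.
escrowed_copy = key
print(Fernet(escrowed_copy).decrypt(token).decode())
```

There is no flag in the math that distinguishes the FBI's copy of a key from a thief's, which is why "secure, but only against bad guys" is not on the menu.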

Other candidates have weighed in on the issue as well: GOP hopeful Carly Fiorina took the softest stance (and perhaps also the most Orwellian) during that same debate, arguing that when it comes to cooperation between tech companies and the FBI, "They do not need to be forced. They need to be asked."

Then there's artificial intelligence, long familiar to science fiction fans thanks to creations ranging from the citizen A.I.s of William Gibson's Neuromancer to the out-of-control ship computer in Arthur C. Clarke's 2001: A Space Odyssey. Weaponized A.I. is already a reality, whether it's the X-47B, an unmanned fighter jet capable of autonomous inflight refueling, or the SGR-A1, a South Korean sentry robot that can spot intruders on its own.

Some activists are already working to prevent or limit the use of weaponized A.I., arguing that intelligent machines should be kept out of combat altogether. Human Rights Watch, for instance, is a founding member of the unambiguously named Campaign to Stop Killer Robots. Concerns about the rise of A.I. have brought together a range of academics, business leaders, and researchers, including entrepreneur Elon Musk, Apple co-founder Steve Wozniak, and physicist Stephen Hawking, who all signed a letter last year urging a ban on autonomous weapons.

The A.I. nightmare scenario resembles the almost century-long Butlerian Jihad of the Dune novels, in which humans battled "thinking machines," resulting in widespread devastation. According to the semi-canonical Dune Encyclopedia, "the Jihad, smashing first interstellar communications, razed large and small governments planet by planet, leaving only rubble, ready for reassembly by the nimblest barbarian." The devastation of the Butlerian Jihad prompted the commandment "Thou shalt not make a machine in the likeness of a human mind" to be added to the Orange Catholic Bible, a fusion of ancient scriptures created after the conflict.

It would be a mistake for politicians to take a step as drastic as banning artificial intelligence outright, but the Butlerian Jihad and other events from science fiction do provide cautionary, if fantastical, tales about the future.

Urging lawmakers to take science fiction technology more seriously could have drawbacks. (Just think of the bizarre legislation we might get.) And science fiction doesn't always correctly anticipate tomorrow's tech. Hoverboards shaped up to be one of 2015's most popular Christmas gifts, though these two-wheeled contraptions are somewhat misnamed: Unlike Back to the Future Part II's famous floating skateboard, they remain firmly earthbound. And then Amazon pulled a bunch of them from its virtual shelves because they kept catching fire.

But lawmakers should prioritize, both in their Netflix queues and in their rulemaking. Facial recognition software and drones already exist, and research on A.I., biometrics, robotics, and weapons is not going to slow down any time soon. Debates on A.I. citizenship and homesteading on Saturn's moons can wait. Debates on surveillance and drones cannot.