Science Fiction

Cory Doctorow on Cyber Warfare, Lawbreaking, and His New Novel 'Walkaway'

The novelist, activist, and BoingBoing founder on cyber warfare, Uber-style reputation economics, and what he's likely to get arrested for someday.



Cory Doctorow, author of Down and Out in the Magic Kingdom, Little Brother, and Makers, is a three-time Prometheus Award winner, an honor bestowed on the best works of libertarian science fiction. In his most recent book, Walkaway, the super rich engineer their own immortality, while everyone else walks away from the post-scarcity utopia to rebuild the dead cities they left behind.

Reason Editor in Chief Katherine Mangu-Ward spoke with Doctorow about cyber warfare, Uber-style reputation economics, and that most overused and poorly understood of sci-fi themes: dystopia.

Edited by Todd Krainin. Cameras by Mark McDaniel and Krainin.


This is a rush transcript—check all quotes against the audio for accuracy.

Katherine Mangu-Ward: Do you think the underlying conditions for free speech, as it's associated with dubious technologies, are getting better or worse?

Cory Doctorow: There is the—there is a pure free speech argument, and there's a scientific argument that just says, you know, it's not science if it's not published. You have to let people who disagree with you—and who dislike you—read your work and find the dumb mistakes you've made and call you an idiot for having made them; otherwise you just end up kidding yourself, and then, you know, your h-bomb blows up in your face, right?

And atomic knowledge was the first category of knowledge that scientists weren't allowed to freely talk about—not trade secrets but, like, scientific knowledge. Knowing it was a crime. And so it's the kind of original sin of science. But there's a difference between an atomic secret, and a framework for keeping that a secret, and a secret about a vulnerability in a computer system. And they're often lumped together.

I was on a family holiday. We were at, like, a scuba resort in the Caribbean, on a little island called Roatan in Honduras. And there was this family of D.C.-area spooks. Like, multigenerational. And Grandpa had been with USAID when the tanks rolled into Budapest. And all of the kids worked for undisclosed three-letter agencies. And so we're, like, sitting in the pool one day and talking about cyberweapons and cyberwar.

Katherine Mangu-Ward: Like you do. On vacation.

Cory Doctorow: On vacation. That's what I do. That's my idea of a good time. So the guy said, like, "Well, what about cyberweapons? Like, why shouldn't we develop cyberweapons? Why shouldn't we wage a cyberwar?" And I said, "There's a difference between a secret bomb and a secret vulnerability in a computer operating system."

Because if I invent the h-bomb, it may be unwise. But keeping the physics of the h-bomb a secret does not make Americans more vulnerable to atomic attack than disclosing it. Maybe it would help them at the margins build slightly better bomb shelters. But it's really—it's not the same thing as me discovering a vulnerability in Windows and saying, "It would be great if I could attack former Soviet bloc countries or countries in the Middle East or jihadis or drug runners by keeping this vulnerability a secret and assuming that nobody else discovers that vulnerability and uses it to attack the people I'm charged to protect." That mistake calls into question the whole scientific enterprise. Because we really only know one way to make computers secure, and that is to publish what we think we know about why they're secure now and see what dumb mistakes our enemies and friends can locate and help us remediate.

And so you end up in this place where these vulnerabilities—that you are blithely assuming won't be independently rediscovered by your adversaries and exploited against you and yours—end up getting exploited against you and yours. And not just by state actors but by petty criminals, too. And this is one thing we're learning after the Vault 7 leaks: that a lot of those vulnerabilities were independently rediscovered and weaponized not just by governments and by the military contractors who serve them, but by, like, you know, dumb-dumbs who have crappy little identity theft rackets.

Right, you know, I guess maybe in the Reasonverse it's a function of an imperfect market that the vulnerability that could be used by someone who could make millions with it ends up migrating to a dumb-dumb who makes hundreds with it. And if we had better markets for these vulnerabilities, maybe we'd have a more efficient marketplace. And maybe it would only be exploited by people seeking high-value targets instead of targets of opportunity, where you're just, you know, scanning all of IPv4 for people who've got CCTVs—

Katherine Mangu-Ward: Or maybe the highest bidder would be the person who actually wanted to close that vulnerability, at least in some cases?

Cory Doctorow: Well, except that would be a state, right? And states, so far, are bidding on these vulnerabilities just to exploit them. Like, that would be great. And this is the crux of the argument: Keeping security vulnerabilities a secret is itself a recipe for being exploited. It means that—like a lot of contraband dysfunctions—the only people who end up knowing that secret are the people who are weaponizing it, and not the people against whom it's being exploited, until an exploit becomes so widespread that it can no longer be denied.

So Mirai, the Internet of Things worm, is now very widely known because it was very widely exploited. And before that it was a cozy secret among dumb-dumbs and creeps who used it for denial of service and blackmail.

So this is particularly salient because we only know how to make one kind of computer, and that's the general-purpose computer that can run all the programs. And computers are now at the center of all of our policy problems, because computers are at the center of every element of our lives. So in theory, we could maybe solve all of our automotive problems by changing which programs the computers in the cars can run: Don't let it run the program that runs over children, right?

Katherine Mangu-Ward: Right.

Cory Doctorow: And in theory we can fix maybe some of our aviation security problems. And we can fix some of our child pornography problems. And like, all the Four Horsemen of the Infocalypse: the mafia, child porn, terrorism, and organ— Oh, I always forget the fourth. Mafia, child porn, terrorism, and—not money laundering. Goodness me!

Katherine Mangu-Ward: I don't know, man.

Cory Doctorow: And, and–

Katherine Mangu-Ward: And famine! Just throw a regular old-school—

Cory Doctorow: Dollar sign, bar!

Katherine Mangu-Ward: Old-school horsemen in there.

Cory Doctorow: Right. YouTube buffering. I'm sure ya know. That's one of the 10 plagues of Passover, I think.

Katherine Mangu-Ward: Yeah, I know it is—it's right after the lice and before the death of the firstborn.

Cory Doctorow: That's right, yeah. So this belief that we can solve our problems by making computers that correspond to a theoretical object that doesn't exist anywhere even in theory, which is the computer that runs all the programs except for the one you don't like, necessarily involves a prohibition on disclosing defects in the design of that computer that might let you run the forbidden program. And that means that security vulnerabilities fester in those devices.

And we have a perfect storm of terribleness, in which the constellation of devices that we are designing to control what kinds of programs we can run is monotonically expanding. And the ways in which those devices can harm us are also monotonically expanding. And our ability to report security defects in those devices, and continuously improve them, is contracting.

And this is catastrophic! Right? And so I think that we can leave aside the philosophical argument about whether we should or shouldn't have free speech or what its limits are. And we can go to a purely utilitarian argument that just says: If you don't want bad firmware loads being over-the-air updated into your pacemaker, your car, the CCTV in your bedroom, and the toaster that could burn down your house, you know, we need to have a robust culture of investigation and discovery and disclosure of security defects in those products. And that is just not compatible with cyberwar, digital rights management, crypto backdoors, or anything else that requires controls on which programs computers can run.

Katherine Mangu-Ward: It almost sounds like something that's analogous to the problems with monocropping, right? Like, there's this one type of computer, it's everywhere, and we now have elaborate pesticides, and we have elaborate herbicides, and we have controls over who can plant what when and who has the intellectual property on which seeds. But on some fundamental level the problem is just that it's all the same kind of thing and therefore vulnerable to a single apocalyptic event. Is that fair, or is that overworking the analogy?

Cory Doctorow: I think it is a little, because although we don't know for sure what other computer architectures might exist, the only one that we've really been able to scale up and make good is this Turing-complete von Neumann machine. And there are some smart people at the Media Lab, where I have an appointment, who think that maybe we could do other things. But—

Katherine Mangu-Ward: Unlike, say, corn, where we have already a thousand indigenous varieties we could be using instead. Maybe there really is just the one computer and we have to be cool with that.

Cory Doctorow: Yeah, well, it's kind of like, maybe we just have one physics. And if only we had lots of different physics, then nuclear bombs wouldn't always work, right? And so if Turing completeness is, like, latent in the fabric of the universe—which seems to be this weird thing, where we try to design toy computing environments that just do a few things and they inevitably turn out to be bootstrappable into the full suite of Turing completeness—you know, like, maybe it is just there, right? Like, Magic: The Gathering is Turing complete, right?

Katherine Mangu-Ward: You really know how to speak to our viewers.

Cory Doctorow: That's right! That's right!

Katherine Mangu-Ward: That's great.

Cory Doctorow: You need a big deck. And a long time. But you could run Photoshop on it.
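Doctorow's aside about toy environments bootstrapping into Turing completeness has well-documented examples beyond Magic: The Gathering. Rule 110, a one-dimensional cellular automaton in which each cell looks only at itself and its two neighbors, was proved Turing complete by Matthew Cook. A minimal simulator (a sketch of my own for illustration, not anything from the interview) fits in a dozen lines:

```python
# Rule 110: a one-dimensional cellular automaton proved Turing complete
# (Matthew Cook, 2004). Each cell's next state depends only on itself
# and its two neighbors, yet the system can simulate any computation.

RULE = 110  # the rule number encodes the 8-entry lookup table in binary

def step(cells):
    """Apply one Rule 110 update to a list of 0/1 cells (wrapping edges)."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        pattern = (left << 2) | (center << 1) | right  # 3-bit neighborhood, 0..7
        out.append((RULE >> pattern) & 1)
    return out

if __name__ == "__main__":
    # Start from a single live cell and watch structure emerge.
    cells = [0] * 31 + [1] + [0] * 31
    for _ in range(20):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells)
```

The point of the example is how little machinery is involved: a three-cell lookup table, applied repeatedly, is already enough to host universal computation.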

Katherine Mangu-Ward: That's probably just true because we're actually all living in a simulation that was created to be—that's the condition they're testing here, right?

Cory Doctorow: I am of the stripe of science fiction writer that thinks that all of those things are useful metaphors and dangerous assumptions about reality.

Katherine Mangu-Ward: What's the danger in making that assumption about reality? I mean I don't actually have a strong view one way or the other on this question. But I am intrigued by your use of the word danger there.

Cory Doctorow: First of all, the arguments seem to me to have the reek of collegiate mind-game logic, right? Like, I had a roommate and a dear friend—actually one of the people Walkaway is dedicated to—Eric Stewart, who died unexpectedly a few years ago without any proximate cause, just didn't wake up one morning. But he was very clever. And in high school one day he sat down and said, "So the universe is infinitely prolonged, right? It goes on forever."

And I'm like, "That's what I'm understanding."

And he said, "We have finite life spans."

And I said, "Yeah."

He said, "Anything finite divided by something infinite is zero."

I'm like, "All right."

He said, "Therefore the probability of us being alive at this moment is zero."

I'm like, "Okay."

"And the only thing that is nonzero when divided by infinity is infinity."

I'm like, "Yeah."

He's like, "Therefore we have infinitely prolonged lives and we are immortal."

I'm like, "Can't argue with your reasoning but I don't think it's true."
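For what it's worth, the hole in Eric's syllogism can be stated precisely (this gloss is mine, not from the interview): in probability terms, "probability zero" does not mean "impossible."

```latex
% For a lifetime $T$ drawn from any continuous distribution with
% density $f$, every exact outcome has probability zero:
\[
  P(T = t) \;=\; \lim_{\epsilon \to 0^{+}} \int_{t-\epsilon}^{t+\epsilon} f(s)\,ds \;=\; 0,
\]
% yet some value of $T$ certainly occurs. Likewise, a finite life span
% divided by an infinite timeline gives zero without making the life
% impossible: the argument equivocates between an event of measure
% zero and an event that cannot happen, so the jump to "therefore we
% are immortal" doesn't follow.
```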

Katherine Mangu-Ward: Is it not fair to say, though, that maybe not your target demographic but your de facto demographic is, in fact, like, dudes who would like to have their minds blown by late-night conversations in dorms? Is that an unfair Cory Doctorow stereotype? Those are your guys! Right?

Cory Doctorow: No, no, no. I'm all for having the conversation. But I think that it's the difference between having a thought experiment and a religious faith, right? Thought experiments are cool and interesting. And, like, I think science fiction has this great signature move, which is that it reaches into the world and plucks out a single technology that is part of this complex matrix of technologies affecting our lives. And it imagines a world in a bottle where that one technology is the central fact. And where everything else, if it's present, is just garnish around the edges. And by magnifying it in this comic way, right, this caricature way, it surfaces some of the latent emotional and, you know, moral questions around the technology that are otherwise kind of hard to tease out from that big matrix.

It's like when the doctor goes and swabs your throat and then, you know, rubs it on a petri dish and leaves it for the weekend. What she looks at on Monday is not an accurate model of your body. It's a usefully inaccurate model of your body in which one thing is blown up as a diagnostic tool to let us understand it. You know, it's the physicist imagining the perfectly spherical cow of uniform density on a frictionless surface, right? Not because we have perfectly spherical cows or frictionless surfaces or uniform density. But that's a good first approximation for testing all your ideas against.

Katherine Mangu-Ward: And assuming that the cow is a sphere is a useful and positive thing to do when you're writing science fiction and maybe slightly less so in actual practice.

Cory Doctorow: Yeah, eventually there's chewily weird reality.

Katherine Mangu-Ward: This book, is it a prequel or is it related in some way to Down and Out in the Magic Kingdom? It seems like a similar universe. Has the political takeaway that you would want people to get out of those two different books shifted between them at all? Either because your views have changed or because the facts on the ground have changed?

Cory Doctorow: So I think that science fiction is not predictive in any meaningful way.

Katherine Mangu-Ward: It's certainly not great at it.

Cory Doctorow: I always say we're Texas marksmen: We fire the shotgun into the side of the barn and draw the target around the place where the pellets hit. Just ignore all those science fiction stories that never came true.

Katherine Mangu-Ward: Yeah.

Cory Doctorow: But I also think that prediction is way overrated. I like what Dante did to the fortune tellers. He put them in a pit of molten shit up to their nipples with their heads twisted around backwards weeping into their own ass cracks for having pretended that the future was knowable, right? If the future is knowable then it's inevitable. And if it's inevitable, why are we even bothering? Why get out of bed if the future is going to happen no matter what we do, except I guess you're foreordained to.

Right, so I'm not a fatalist. The reason I'm an activist is that I think the future, at least in part, is up for grabs. I think that there are great forces that produce some outcomes that are deterministic or semideterministic. And I think that there are other elements that are up for grabs. And I think the forces are themselves the result of elements that were once up for grabs—that's, you know, the places where we intervene become unstoppable forces later, which we can then redirect through other mechanisms and other interventions.

And so I think what science fiction does is not predictive, but sometimes diagnostic. Because across all the science fiction that has been written and is being written, and all the stuff that's being greenlit by editors or has been greenlit by editors, and all the stuff that readers can find and raise up or ignore—there's a kind of natural selection at work, where the stuff that, like, resonates with our aspirations and fears about technology and our futures gets buoyed up by market forces, by the marketplace of ideas, and becomes a really excellent tool for knowing what's in the mind of the world.

So the book itself, considered on its own, is a good way to know what's in the mind of the writer. The books that succeed tell you what's in the mind of the world. And if there's a lot of this stuff kind of coming to prominence at this moment, I think it does say something about the moment that we live in: that there's a certain amount of pessimism. There's a fear that we are being stampeded towards a mutually distrustful, internally divided future where we end up attacking each other rather than pulling together. I think even the most cynical person understands that if civilization collapses and you run for the hills, you aren't going to be a part of rebuilding it, right? That the people who are part of the rebuilding are the people who run to the middle and kind of get the power plant working again, reopen the hospital, and get the water filtration plant working again.

And so that mood of distrust and anger and zero-sumness—although I just read an article this morning about how zero-sum doesn't mean zero-sum, and that von Neumann, who coined the term, had this very specialized definition that we all use wrong, but I couldn't figure out what it was supposed to mean because it was five in the morning and I'd only gotten five hours' sleep. But this notion that my gain is your loss, that there's not enough to go around, that it's this big game of musical chairs and the chairs are being removed at speed, is a theme in a lot of the science fiction that's prominent right now.

So Walkaway is in some ways a prequel to Down and Out in the Magic Kingdom. I certainly reread Down and Out in the Magic Kingdom with a pen and a highlighter and some Post-its and made tons of notes before I started work on Walkaway, and I have a whole file of themes that I wanted to pick up.

Some of that is understanding that I've come to in, you know, the 15-plus years since I wrote it. And some of it is wanting to respond back to the people who read Down and Out in the Magic Kingdom as a utopia and who didn't understand that there were dystopic elements. It was a very mixed future. And reputation economics have the same winner-take-all problem as the Pikettian problem that the rate of growth is always less than the rate of return on capital, which produces, you know, insane runaway wealth disparity and dysfunction, with, like, misallocation of resources. Literally, because in Down and Out in the Magic Kingdom your ability to, like, run Disney World is based on how much esteem people hold you in. And so literally you can walk in and start handing out tickets. And if the people treat your tickets as though they're the right tickets, then you get to be the czar of Disney World, which is the premise of the book.

And so you get these crazy resource misallocations and all these other problems, and it's papered over with meritocracy—which, I'm sure you know, was coined as a satirical term to describe the delusion that the reason you have so much and everyone else has so little is because you're better than them. Right? And, and—

Katherine Mangu-Ward: Yet I'm sure you get all the time people coming up and saying to you, "Oh my God, you basically predicted Uber's reputational system!"

Cory Doctorow: Yeah.

Katherine Mangu-Ward: And you know, certainly while you weren't alone in thinking about those reputation mechanisms, and as you mentioned Charlie Stross has—

Cory Doctorow: Yeah.

Katherine Mangu-Ward: —a bunch of great stuff in his books about how that might look.

Cory Doctorow: Well, and I stole that from Slashdot karma.

Katherine Mangu-Ward: Right. So it feels both normal and dystopian to people simultaneously.

Cory Doctorow: But I think Uber is normal and dystopian for a lot of people, too. Like, I think of all the dysfunctions of Uber reputation economics, where it's one-sided. I can tank your business by giving you an unfair review. Where you have this kind of weird, mannered kabuki in some Ubers where people are, like, super-obsequious to try and get you to five-star them. And all of that other stuff is actually characteristic of Down and Out in the Magic Kingdom. I probably did predict Uber pretty well with what would happen if there are these reputation economies, which is that you would quickly have haves and have-nots. And the haves would be able to, in a very one-sided way, allocate reputation to the have-nots or take it away from them, without redress, without rule of law, without the ability to do any of the things we want currency to do. So it's not a store of value, it's not a medium of exchange, it's not a unit of account. Instead it's just a pure system for allowing the powerful to exercise power over the powerless.

Katherine Mangu-Ward: And yet isn't the positive spin on that: Well yeah, but the way we used to do that allocation was just by punching each other in the face?

Cory Doctorow: Well, that's one of the ways we used to. I was really informed by a book by David Graeber called Debt: The First 5,000 Years, where he points out that the anthropological story that we all just used to punch each other in the face all the time doesn't really match the evidence. There are certainly some places where they punched each other in the face, and there are other places where they just kind of got along. Including lots of places where they got along, like, through having long arguments or through, like, guilting each other.

Katherine Mangu-Ward: I don't know. I'm feeling like five stars on the Uber is better than the long arguments or the guilt.

Cory Doctorow: Well, that's because you don't have to drive Uber for a living and you've never had to worry that tomorrow you wouldn't be able to.

Katherine Mangu-Ward: That's certainly true.

Cory Doctorow: Yeah.

Katherine Mangu-Ward: Talk to me a little bit about the universe that Walkaway exists in. And in particular, I'm interested in its quasi-anarchic properties and what exists of the law and then the people who are operating outside of it, how that interaction works.

Cory Doctorow: So the mainstream of the Walkaway world is what's called Default, which is a term I stole from Burning Man. The default world is one in which the rule of law is entirely tilted in the favor of a small cadre of super-wealthy people who have game-rigged the system. And everybody else—the 99 percent—is either in this very precarious position, where some of them are needed to make the automated systems go and some of them are needed to make sure that the people who do the work don't get too uppity, because they can always be fired. And then everyone else is kind of surplus to requirements.

And a lot of them walk away, because they can escape-hatch into a kind of bohemian demimonde, where they move into brownfield sites left behind by toxic post-industrial implosion. They use drones to find the leftovers of the civilization that had once been there. And then they use software from the U.N. High Commissioner for Refugees to figure out how to recombine that to build a kind of fully automated luxury communist civilization, where you go on a scavenger hunt, you find all the stuff, and you build a huge, amazing, Dr. Seussian luxury hotel that anyone can stay in, that anyone can be the tsar of, and that anyone can kind of contribute to. And it's built sort of like a wiki, where people add things and people remove things, and you can see who added what and who removed what, and you can decide kind of collectively, through deliberation, and sometimes through, you know, shitty arguments, and sometimes through very reasonable arguments. And one of the things that a lot of the walkaway culture aspires to is that kind of rationalist mode of argument, where talking things over rigorously and iron-manning each other's arguments is the best way to win one another over and decide what the best thing to do is, and not just how to win.

And it's pretty stable, because it turns out the Default doesn't mind having an escape hatch. Bohemias are cute, right? I mean, there's a reason that loads of fast-fashion places and designers go to Burning Man to make notes on what to knock off for the runway next year. Because Bohemia is a cool thing to mine. Grunge went from, like, you know, Seattle's seedy underground to Sears in six months. Bohemians are cute. They're living labs. But then a group of scientists who've been working in Default, figuring out the secrets of practical immortality for the super-rich, decide that they don't really want to be complicit in helping the human race speciate into these infinitely prolonged, God-like humans while the rest of us are just mayflies receding in their rearview mirror.

So they engage in a Promethean act. They steal the secrets of immortality—which, after all, they discovered—and bring them to the rest of us. And then the super-rich realize that they're going to have to spend the rest of eternity with people they think of as unworthy, and that triggers the Hellfire missiles and all-out war.

Katherine Mangu-Ward: This is an analogy to open-source software development. And probably the phrase open-source is one that people use incredibly widely without really understanding what it means. People use it to just mean "vaguely collaborative."

Cory Doctorow: Right.

Katherine Mangu-Ward: It has a soft meaning.

Cory Doctorow: Spooks use it to mean just "stuff in the newspaper."

Katherine Mangu-Ward: Right. Open source, it has a whole separate meaning, which is just stuff people generally know.

Cory Doctorow: Mm-hmm.

Katherine Mangu-Ward: I know you're a part of that community—you're stuck in there quite deep—so how much of it is a metaphor? And let's not work too hard to make the parallel.

Cory Doctorow: So actually I'm working at whatever the thing is that underpins free and open-source software, which I think is, like, Coasian coordination. So I think that, like, abundance is this triangle. And up here is what we want. Right, so Keynes, you know, wrote that paper in 1930: Our grandchildren will struggle to fill their three-day work weeks, because they will be able to produce all the things that humanity could reasonably want. And he grossly underestimated the elasticity of our demand. And now you have people like Marie Kondo making a cottage industry out of, like, convincing us that really all we want is, like, a single smooth river rock that reminds us of our mother—

Katherine Mangu-Ward: And gives us joy.

Cory Doctorow: And makes us feel joy. So how much you want is obviously elastic. And it can go up and it can go down, right? And so that's one of the parameters on abundance that we have to think about.

And then over here is, like, how much we can make. So 3-D printing, automation, all that stuff. And both of those have seen significant changes in the last couple of decades. Marketing, A/B splitting, new additive manufacturing tools, automated milling, robotics: all of those have been profound changes in our world.

But all of the real action is over in this other corner, which is logistics. That's getting the stuff that people want to the people who want it, after you've made it. And figuring out how to remake it. And figuring out what happens to it when we're done with it.

So Bruce Sterling wrote this very influential essay in the mid-2000s called Shaping Things, published by MIT Press, where he posits an object called a Spime. And a Spime is a good, an object, that is immaterial: it exists as information until someone needs it, and then it's manufactured. But it's manufactured and designed in a way to gracefully decompose back into the material stream when its duty cycle is over. And it's manufactured in a way so that its use generates data about its efficacy and the ways it can be improved, so that every time it's made anew, it's better. And Spimes are a really provocative answer to the question of how we can realize the Promethean project of both the heterodox right and the heterodox left. You know, letting every peasant live like a lord. As opposed to insisting either, on the left, that every lord should be made to look like a peasant—or, on the right, that lords and peasants are an inevitable fact of the world, and there will always be lords and always be peasants, and maybe we incentivize people by having that difference.

The way that we get every peasant to live like a lord on a planet that only has one planet's worth of material is that we find better ways to connect the material that people need with the people who have it, wherever it is at any given moment. So that rather than, like, everybody having to own a car, we have cars that are services. But we also have completely negotiable, moment-to-moment things that you might need a car for. And so when there aren't cars available, the things that you can do instead of being in a car are brought to the fore.

So Google runs this data center in Belgium, in a place where two-thirds of the time it's so cool that they don't need the air conditioning, and the other third of the time they just turn it off. And their file system is so good at migrating data away from places that are shutting down and into places where it's running that it doesn't really matter. You know, a lot of places that do aluminum smelting, because it's so energy-intensive, use aluminum smelting as a kind of battery. They say, well, we need to smelt so many tons this year, and when we have lots of solar or lots of wind or lots of tidal power and we don't have anything to use it for, we smelt the aluminum then, and not at the moment when other people are trying to turn on their lights, run their air conditioning, or run their Google data centers.
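The smelter-as-battery idea is, at bottom, a scheduling problem: push deferrable load into the hours when power is cheapest or most abundant. A toy sketch (the price numbers and the greedy approach are my own illustrative assumptions, not anything Doctorow describes):

```python
# Toy demand-shifting: schedule a deferrable load (e.g. aluminum smelting)
# into the cheapest hours of the day, the way surplus wind or solar
# generation gets soaked up by flexible industrial consumers.

def schedule(prices, hours_needed):
    """Pick the cheapest `hours_needed` hours from an hourly price list.
    Returns the chosen hour indices in chronological order."""
    ranked = sorted(range(len(prices)), key=lambda h: prices[h])
    return sorted(ranked[:hours_needed])

if __name__ == "__main__":
    # Hypothetical day-ahead prices: cheap overnight and around solar noon.
    prices = [12, 10, 9, 9, 11, 20, 35, 40, 38, 30, 18, 8,
              7, 8, 15, 28, 42, 45, 44, 36, 25, 18, 14, 13]
    hours = schedule(prices, hours_needed=6)
    cost = sum(prices[h] for h in hours)
    print("run smelter during hours:", hours, "total price:", cost)
```

Real smelters and data centers face ramp-up constraints and forecasting error that this greedy one-liner ignores, but the core coordination move, matching flexible demand to moments of cheap supply, is the same.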

That kind of coordination, where something is done at the moment it's needed and at the moment it's cheap to do, is characteristic of the efficient market hypothesis; it's characteristic of planned economy theory. It's the thing that everyone is kind of shooting for. And the thing that free and open-source software has given us is the ability to coordinate ourselves very efficiently without having to put up with a lot of hierarchy, to late-bind why we're doing things or whether we're all on the same side. To be able to take things that we've done together, where we've reached a breaking point, and split them in two, and have each of us pursue it in our own direction without having to pay too high a cost or even have a lot of acrimony. That's the free software world I'm trying to imagine. Like, what would it be like to build skyscrapers the way we make encyclopedias in the 21st century?

Katherine Mangu-Ward: So you're over there imagining this world. But the place where the kind of social and political panic or preoccupation is happening is really in that other corner of your triangle, the logistics corner, where the robots are going to come and take away our jobs.

Cory Doctorow: That's the manufacturing corner.

Katherine Mangu-Ward: Right. Sorry, the manufacturing corner. Where we're currently culturally perseverating is over there.

Cory Doctorow: Sure.

Katherine Mangu-Ward: This idea that it turns out people don't want leisure, on some kind of deep level, is the thing that I'm particularly hearing on the right. Conservatives are doubling down on [the idea] that there's value in work. There's value in difference. There's value in, as you say, maybe those differences incentivizing people. And there's a conservative version of the panic that the robots are taking our jobs. Then there's a liberal version of the panic that the robots are taking our jobs, which is: What's going to happen to the truck drivers when all those jobs are automated? But why is that the thing we're panicking about now?

Cory Doctorow: Well, because I think that we tend to worry a lot about the first-order effects. And the second- and third-order effects tend to come a little too late. Gardner Dozois very famously said that the job of a science fiction writer shouldn't be to just think of the car and the movie theater and invent the drive-in, but also to infer the sexual revolution.

But I say to Gardner: Once you've inferred the sexual revolution, maybe you could spare a moment to think that the sexual revolution happening in cars meant that, for the first time, people had a reason to carry government-issued ID. Which was to get laid, right? And that the shibboleth of "papers please," which historically has been a marker of, you know, descent into totalitarian misery, became an everyday thing. And that today a lot of the database nation is the progeny of that strange moment where technology and social mores came together, thanks to movie theaters and cars and the sexual revolution, and gave us all driver's licenses. Science fiction writers do like to think past the first-order effect of what it would mean if there weren't a lot of truck driving jobs.

Katherine Mangu-Ward: Is it fair to say that science fiction writers are doing the same thing as a good economist or a good political economist in thinking about unintended consequences?

Cory Doctorow: It's not just unintended consequences, because I think making all truck drivers into desperate xenophobic populists who vote for strongman leaders was not the intended consequence of the self-driving car project, right? And yet that's the fear our political moment reflects.

It's more like the job of a science fiction writer is to, not even necessarily to map the territory, but to point out that there's territory to be mapped. There is a game we play when we argue about policy or tell stories and the game is "What's in the Frame?"

So there's this famous science fiction story called "The Cold Equations" [by Tom Godwin]. I don't know if you know it. It's taught in engineering schools all the time. And it's about a spaceship pilot who's piloting a small craft full of vaccine to a planet where there is a potential world-killing plague. And if he doesn't get the vaccine there, everybody on the planet will die. And there is a young girl who stowed away on his ship, and when he discovers her he is aghast. Because he knows that the ship doesn't have any extra fuel. It has no autopilot. It can only land if he pilots it. If there's any excess weight it will crash. And without all of that, everyone on the planet will die. And that's why he has to shove that girl out the airlock. And they spend 15 pages trying to figure out why they don't have to shove her out the airlock. And then he shoves her out the airlock.

Katherine Mangu-Ward: It sounds like a sexier version of the trolley problem, right?

Cory Doctorow: It is, it's just another sexy version of the trolley problem.

And what's out of the frame here is that the author has set up the rules of this thought experiment. And the author decided that autopilots weren't a thing. That reserve fuel wasn't a thing. That sending colonists with a supply of vaccine wasn't a thing, right? All of that stuff is out of frame.

So I think that science fiction is about pointing out, in some ways, that there are things that are out of the frame that don't properly belong out of the frame, whose ruling out is arbitrary—or customary, which is another way of saying the same thing.

Katherine Mangu-Ward: When I saw the title of the book, before I had even seen the illustration on the cover or anything, I immediately thought of "The Ones Who Walk Away from Omelas."

Cory Doctorow: Sure, yeah.

Katherine Mangu-Ward: And wondered if that was getting at this incredibly poetic kind of Ursula Le Guin scenario about, When do you have to fully disengage from society because it's based on an immoral premise? I wondered if that would be what this book was going to be doing. I think in a way it is. Was that an influence on you at all?

Cory Doctorow: Sure. I think the connection is that— So, the book was originally called Utopia, OK? And its thesis is that a utopian society, as we started saying, is one that fails gracefully, not necessarily one that works well. And I sent it to Kim Stanley Robinson for a quote. And Stan read it and gave me a wonderful quote. And then said, "And by the way I don't understand why this book isn't called Walkaway." And I was like, "You're totally right!" And I asked my editor and my agent. And they were like "Stan is totally right." And so we changed the title to Walkaway. But Stan might have been thinking of Le Guin's story.

Katherine Mangu-Ward: Last question: When you go to jail, what is it going to be for?

Cory Doctorow: What will the charge be or why will they…?

Katherine Mangu-Ward: I like that you asked that distinction and you may answer either way.

Cory Doctorow: So you know, I have lots of different kinds of privilege that I think have kept me reasonably out of harm's way. Not just being, like, a white middle-class articulate dude with half a million Twitter followers. But also, like, working at a civil liberties law firm filled with lawyers whose numbers I write on my arm before I cross borders, you know.

Katherine Mangu-Ward: Your answer can just be "I'm invincible, man."

Cory Doctorow: No. You know what I worry about a lot? Because I'm a dirty foreigner. I'm a Canadian on a green card. And as we heard in the Supreme Court this week, it is virtually impossible to not have some way in which you are technically violating immigration rules when you are on a green card, crossing borders, whatever. Just, you know, as the justices at the Supreme Court asked: if it says list all your known aliases, and I forget a childhood nickname, does that mean that I can be deported or jailed for immigration fraud? And the state's position was yes—regardless of whether or not the omission is material, the act of omission itself violates the statute and qualifies you.

And given the highly arbitrary nature of borders, and the very deep antipathy towards the people who cross them, from many of the people whose job it is to inspect those people who cross them, that's the place where I have the most worry. I really do worry a lot about that because I cross the border all the time.

And I worry that I don't know what I would do if I were required to decrypt my devices. I do a certain amount of purging before I cross borders so as to be able to decrypt my devices if I'm made to. But then, you know, there's this whole unknown area, which is, what about making you log into your cloud services? And if you don't have the password, what about calling the people who have the password and saying, Mr. Doctorow doesn't get out of immigration detention until you give us the password to this thing that he's left with you for safekeeping, to change and then give to him so he can change it back when he gets out of immigration detention?

Those are all things—those are, like, unknown unknowns, right? It's a complete black hole. I think, by design, the government has not pursued cases where those questions have come up where it looks like the courts would find that they were acting unconstitutionally, because they want to see that ambiguity flourishing. Because they have so much leverage over you when you're at the border, that ambiguity really works in their favor. I mean, after the Muslim ban, one of the things that immediately emerged when people said "What should you do if…?" was, like, nobody knows. Nobody even knows for sure.

So now I do ridiculous things. Like, there's a form, I think it's called the G-28. Border guards have discretion as to whether to allow your counsel to see you when you're in border detention. But that discretion goes away and becomes an obligation if this form has been signed and left with your lawyer before you cross the border. But it has to be on green paper.

And so I have signed many copies of this and left them in our paralegal's filing cabinet at EFF [Electronic Frontier Foundation]. And I always let a lawyer know before I cross, and I always let them know when I'm on the other side. And I hope that they check their phone. And if they see that many hours have gone by and they haven't heard from me, to try and call me. And if they don't hear from me again, they go and they get one of these green forms. I bought a ream of green paper to print these green forms. And they bring it down to the border to see if that's where I am.

Katherine Mangu-Ward: That's maybe advice for all of our viewers here, I don't know. Get a lawyer on retainer and a lot of green paper.

Cory Doctorow: Yeah, and a lot of green paper. A ream of green paper. I have some leftovers. My kid drew on a lot of it, but I still have some leftovers.

Katherine Mangu-Ward: Thank you very much, Cory Doctorow, for talking to us about your new novel, Walkaway.

Cory Doctorow: My pleasure.

Katherine Mangu-Ward: For Reason, I'm Katherine Mangu-Ward.