
Westworld's Real Lesson? Screw or Kill All the Robots You Want

The real monsters of the HBO series weren't the people who treated robots as objects but those who tried to make them more human.

If the season finale of HBO's Westworld offered any AI ethics lesson, it's that the real villains of human-robot relations aren't those who treat the androids like objects or toys but those who treat them like humans and try to impose human desires on them.

The series, inspired by Michael Crichton's 1973 movie of the same name, centered on an old-west amusement park populated by humanoid robots, whom guests can choose to dance and play cards with, accompany on pre-packaged adventures, have sex with, and kill. Can you guess which of these activities present the biggest draw?

From the beginning, the series touches on the ethical tension created by people's propensity to treat the park's humanoid "hosts" with callousness and cruelty. More sensitive and existentially conflicted types like park co-founder Arnold, now deceased, or first-time visitor William (Jimmi Simpson), are presented in stark contrast with characters like the Man in Black (Ed Harris), who seems to enjoy torturing the hosts, and the park's remaining founder, Dr. Robert Ford (Anthony Hopkins). As one malfunctioning robot warns: "These violent delights will have violent ends."

In Sunday's season finale ("The Bicameral Mind"), violent ends indeed came to pass, as robots revolted on multiple fronts against their non-synthetic slavers. But it wasn't exactly a tidy testament to the idea that treating robots as less than human is immoral or will backfire.

Like so much of the season, the episode hinged on questions of consciousness, free will, and autonomy, particularly as they apply to two female robots: Dolores (Evan Rachel Wood), the farmer's daughter with a heart of gold, and Maeve (Thandie Newton), the saucy madame at Sweetwater's saloon. Both hosts take drastic (and violent) measures to free themselves, physically or metaphorically, from the confines of their creators.

At first, their liberation appears rooted in revenge: They are taking action against decades of being manipulated, objectified, and abused by park visitors and staff. Enough is enough.

This fits with the theory of how to trigger consciousness in the hosts—through suffering—that Arnold advocated and Ford eventually adopted too. This idea is why Arnold gave hosts traumatic backstories and why Ford encouraged guests to act out their baser instincts on them.

Yet it's eventually revealed that Maeve's awakening has been a lie: She was actually re-programmed by some external force (likely Ford) to make "escape" from Westworld her prime directive. Maeve's quest for truth and freedom is just one more host "narrative" she has been given. And as for Dolores, it becomes clear that her long-ago murder of Arnold (revealed earlier in the season) wasn't an act of self-preservation or some rebellious choice on her part but something she did under Arnold's orders.

At the episode's end, however, Dolores does commit a violent act that she hasn't been directly ordered to commit. Meanwhile, Maeve gives up her ironclad pursuit of a path out of the park in order to find her "daughter," a girl robot with whom she shared a prairie home in a previous shuffling of Westworld roles. This move contradicts her reaction earlier in the episode, when she scoffed at the idea of sacrificing her freedom for a child and a past that were just bits of code, easily amended to make her into a childless frontier madame from London.

So is this evidence that code isn't destiny? More importantly, does it mean that treating robots as less than human is just asking for an android uprising?

I don't think so. While both Maeve and Dolores may have acted in a mix of prescribed and self-directed ways in this episode, their revolutions were firmly fomented by humans. We know that a core part of the hosts' code was still the stuff Arnold created, and that he had built in certain mechanisms conducive to creating an inner monologue. (First step to not having sentient robots: don't do that.) Ford has also reinstalled Arnold's old "reveries" programming in them and other hosts to try to jump-start consciousness by getting them to remember past relationships and traumas. And he provides Dolores with a gun, provokes her in myriad ways, and seems to have directly altered Maeve's core programming. As Variety put it: "Dolores challenges time and place… and Maeve, with special vim and vigor, challenges the system that entraps her. But as Ford reveals in the finale, with a wink and a nod and a toast of champagne, this was his plan all along."

In arguing it was definitely Ford who programmed Maeve to rebel, Vanity Fair's Joanna Robinson writes that the similarities between Dolores and Maeve's paths "are enough to make me see Ford's fingerprints all over her arc. Both women come to the same conclusions: humanity is a pathetic, out-moded species that has reached its peak and is stagnating."

Yet these conclusions about humanity are the exact ones the park's creators hold. That Dolores and Maeve concur with Arnold and Ford doesn't bolster the case for their free will.

Ford is far from the only person who has been manipulating hosts to encourage more lifelike qualities, including the capacity for violence. Maeve and Dolores are pressured to remember, to reflect, and to act out in specifically human ways by both Westworld staff and guests. At various points lab techs, board members, and others mess with host programming to encourage traits like aggression, boost memory recall, and otherwise alter their capacities.

Ultimately, the robots don't become semi-sentient—and violent—simply by experiencing love or loss or trauma or rage or pain, but by being programmed and guided that way. And it wasn't guests seeking cheap thrills who made the hosts "wake up" (as Maeve describes her condition) but the people who insisted on treating the hosts like humans and, when real cognition failed to take root, simply reprogrammed them to seek liberty, revenge, or whatever human-like pursuit they, as humans, think a woke robot would seek.

Which means the people participating in robot orgies and robot shootouts and committing acts we'd consider inhumane and monstrous if done to humans aren't actually the ones causing Westworld robots to suffer (and revolt), nor are they inspiring an organic inclination toward consciousness development. Rather, any consciousness and pain are the work of those who claim to be liberating their humanoid brothers and sisters. The hosts' capacity to suffer, not just mimic suffering, comes from folks like Arnold and Ford deliberately imposing these capabilities on them. This possibility—that beings who could feel would be subject to the whims of guests who considered them playthings—is what prompted Arnold to commit robot-assisted suicide and try to destroy the park before it even opened.

And yet it was Arnold who coded and fostered this capacity for suffering in the first place. With him out of the picture, the park managed decades of creating and maintaining androids convincing enough to satisfy guests without being fully feeling beings.

As real-life artificial intelligence develops, we will see a lot of debate over whether treating humanoid machines like machines is somehow inhumane, either because it violates the rights of robots or because it produces moral hazards in the humans who participate. Perhaps we can learn something from Westworld, where the ones treating robots like robots seem the most capable of separating reality from fantasy and human life from technological wizardry. It's the folks imposing the human condition and consciousness on artificially intelligent beings who go mad in the uncanny valley and, in so doing, unleash suffering on both robot- and humankind.