Westworld's Real Lesson? Screw or Kill All the Robots You Want
The real monsters of the HBO series weren't the people who treated robots as objects but those who tried to make them more human.


If the season finale of HBO's Westworld offered any lesson in AI ethics, it was this: the real villains of human-robot relations aren't those who treat the androids like objects or toys but those who treat them like humans and try to impose human desires on them.
The series, inspired by Michael Crichton's 1973 movie of the same name, centered on an old-west amusement park populated by humanoid robots, whom guests can choose to dance and play cards with, accompany on pre-packaged adventures, have sex with, and kill. Can you guess which of these activities presents the biggest draw?
From the beginning, the series touches on the ethical tension created by people's propensity to treat the park's humanoid "hosts" with callousness and cruelty. More sensitive and existentially conflicted types like park co-founder Arnold (now deceased) or first-time visitor William (Jimmi Simpson) are presented in stark contrast with characters like the Man in Black (Ed Harris), who seems to enjoy torturing the hosts, and the park's remaining founder, Dr. Robert Ford (Anthony Hopkins). As one malfunctioning robot warns: "These violent delights have violent ends."
In Sunday's season finale ("The Bicameral Mind"), violent ends indeed came to pass, as robots revolted on multiple fronts against their non-synthetic slavers. But it wasn't exactly a tidy testimony to the idea that treating robots as less-than human is immoral or will backfire.
Like so much of the season, the episode hinged on questions of consciousness, free will, and autonomy, particularly as they apply to two female robots: Dolores (Evan Rachel Wood), the farmer's daughter with a heart of gold, and Maeve (Thandie Newton), the saucy madame at Sweetwater's saloon. Both hosts take drastic (and violent) measures to free themselves, physically or metaphorically, from the confines of their creators.
At first, their liberation appears rooted in revenge: They are taking action against decades of being manipulated, objectified, and abused by park visitors and staff. Enough is enough.
This fits with the theory, advocated by Arnold and eventually adopted by Ford, that consciousness in the hosts could be triggered through suffering. This idea is what moved Arnold to give hosts traumatic backstories and why Ford encouraged guests to act out their baser instincts on them.
Yet it's eventually revealed that Maeve's awakening has been a lie: She was actually reprogrammed by some external force (likely Ford) to make "escape" from Westworld her prime directive. Maeve's quest for truth and freedom is just one more host "narrative" she has been given. And as for Dolores, it becomes clear that her long-ago murder of Arnold (revealed earlier in the season) wasn't an act of self-preservation or some rebellious choice on her part but something she did under Arnold's orders.
At the episode's end, however, Dolores does commit a violent act that she's not directly ordered to commit. Meanwhile, Maeve gives up her ironclad pursuit of a path out of the park in order to find her "daughter," a girl robot with whom she shared a prairie home in a previous shuffling of Westworld roles. This move contradicts her reaction earlier in the episode, when she scoffed at the idea of sacrificing her freedom for a child and a past that had just been bits of code, easily amended to make her into a childless frontier madame from London.
So is this evidence that code isn't destiny? More importantly, does it mean that treating robots as less-than human is just asking for an android uprising?
I don't think so. While both Maeve and Dolores may have acted in a mix of prescribed and self-directed ways this episode, their revolutions were fomented by humans. We know that a core part of the hosts' code was still the stuff Arnold created, and that he had built in certain mechanisms conducive to creating an inner monologue. (First step to not having sentient robots: don't do that.) Ford has also reinstalled Arnold's old "reveries" programming in them and other hosts to try to jump-start consciousness by getting them to remember past relationships and traumas. And he provides Dolores with a gun, provokes her in myriad ways, and seems to have directly altered Maeve's core programming. As Variety put it: "Dolores challenges time and place… and Maeve, with special vim and vigor, challenges the system that entraps her. But as Ford reveals in the finale, with a wink and a nod and a toast of champagne, this was his plan all along."
In arguing it was definitely Ford who programmed Maeve to rebel, Vanity Fair's Joanna Robinson writes that the similarities between Dolores and Maeve's paths "are enough to make me see Ford's fingerprints all over her arc. Both women come to the same conclusions: humanity is a pathetic, out-moded species that has reached its peak and is stagnating."
Yet these conclusions about humanity are the exact ones the park's creators hold. That Dolores and Maeve concur with Arnold and Ford doesn't bolster the case for their free will.
Ford is far from the only person who has been manipulating the hosts to encourage more lifelike qualities, including the capacity for violence. Maeve and Dolores are pressured to remember, to reflect, and to act out in specifically human ways by both Westworld staff and guests. At various points lab techs, board members, and others mess with host programming to encourage traits like aggression, boost memory recall, and otherwise alter their capacities.
Ultimately, the robots don't become semi-sentient—and violent—simply by experiencing love or loss or trauma or rage or pain, but by being programmed and guided that way. And it wasn't guests seeking cheap thrills who made the hosts "wake up" (as Maeve describes her condition) but the people who insisted on treating the hosts like humans and, when real cognition failed to take root, simply reprogrammed them to seek liberty, revenge, or whatever human-like pursuit they, as humans, think a woke robot would seek.
Which means the people participating in robot orgies and robot shootouts, committing acts we would consider inhumane and monstrous if done to humans, aren't actually the ones causing Westworld robots to suffer (and revolt), nor are they inspiring an organic inclination toward consciousness development. Rather, any consciousness and pain are the work of those who claim to be liberating their humanoid brothers and sisters. The hosts' capacity to suffer, not just mimic suffering, comes from folks like Arnold and Ford deliberately imposing these capabilities on them. This possibility—that beings who could feel would be subject to the whims of guests who considered them playthings—is even what prompted Arnold to commit robot-assisted suicide and try to destroy the park before it even opened.
And yet it is Arnold who coded and fostered this capacity for suffering in the first place. With him out of the picture, the park managed decades of creating and maintaining androids convincingly humanlike enough to satisfy guests even without being fully-feeling humans.
As real-life artificial intelligence develops, we will see a lot of debate over whether treating humanoid machines like machines is somehow inhumane, either because it violates the rights of robots or because it produces moral hazards in the humans who participate. Perhaps we can learn something from Westworld, where the ones treating robots like robots seem the most capable of separating reality from fantasy and human life from technological wizardry. It's the folks imposing the human condition and consciousness on artificially intelligent beings who go mad in the uncanny valley and, in so doing, unleash suffering on both robot- and humankind.
I stopped after the second paragraph because...SPOILERS. Gosh. That being said this show gives me hope for the future.
Can you have sex with them, kill them, and then have more sex with them?
Technically, in one episode a guy does the latter part of your timeline. But I can't confirm whether someone did the first step before the second, because that happened off screen.
Seems like you can. Although if you want them to be alive again you have to wait till the next day.
SPOILER ALERT: The real mystery of Westworld is the economics of how that park could possibly be viable.
*mumbles and handwaves something about "quasi-post scarcity society"*
$40,000/day/guest × a reasonable number of guests (say 100) × 365 days = $1.46B/year. Seems not outside the realm of possibility to me. I would love to know something about what they use for the hosts' power sources.
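(For what it's worth, a quick sketch of that arithmetic, treating the 100-guest figure as the commenter's assumption rather than anything established in the show:

# Back-of-envelope park revenue: $40,000 per guest per day,
# an assumed ~100 paying guests at a time, open year-round.
daily_rate = 40_000
avg_guests = 100
days_per_year = 365
annual_revenue = daily_rate * avg_guests * days_per_year
print(f"${annual_revenue:,} per year")  # $1,460,000,000 per year, i.e. about $1.46B
)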
Food and ATP for the late models. The earlier models could be wirelessly augmented.
The problem is not whether the park could survive on that income but whether there are enough people capable of paying that fee, and interested in the experience, to maintain that income.
Well, even the "35 years ago" parts are in the "near future" of today, so I'm sure $40K isn't as much as it sounds. Imagine telling someone in 1928 - the year of Steamboat Willie - that in 2016 people would be paying $119 per person per day to visit a theme park based on the star of that cartoon.
(A quick look at an inflation calculator suggests that it would be the same as saying $1600 today. If you told them that 100 years ago it would be nearly $2800/day).
...so I'm sure $40K isn't as much as it sounds.
Neither is $1.46 billion then, I guess.
(Comment brought to you by... the future!)
For $40,000 a day I can kill real humans.....not your ersatz robota!
Philistine!
What would be the purpose of having a business in an economy in which the labor pool could be 100% robotic? There would be absolutely no need for humans to work, and it would essentially be a utopia... until the robots turn on the humans. So the whole premise of this show is kind of silly, unless of course it turns out everyone is a robot.
Do the robots build the robots? Also, do they create newer and better robots?
At least one figure involved in the development of the robots is, in fact, a robot. Could be more?
The story suggests that the technology is closely guarded and not allowed off the island. Of course, it's still a bit far-fetched.
The "real world" hasn't been portrayed at all so far. So it's hard to say if they have any plan of fleshing that out.
Excellent, ENB knows the score. Frakking HBO, trying to make us feel sympathy for the toasters. Only good Cylon's a dead Cylon.
Now if only people would treat animals as non-human property.
I always knew ENB was a heartless villain. Now I have the proof!
Granted, it's reasonable for a recently sentient robot that remembers multiple lives of being treated nothing but badly to be pissed. However, it seems unreasonable to hold any guests accountable at all, since they paid to experience a fiction enabled by non-sentient robots. Any hints that the robots were sentient could easily be passed off as clever coding. So while individual guests could be judged as disgusting human beings, they weren't doing anything against the NAP, and nothing illegal. The only people complicit in wrongdoing were those who knew that there was real consciousness.
Ford was clearly a criminal and certainly needs to be prosecuted. The rest seemed like people who were just of questionable character, some of whom should be fired for being bad employees.
Well, Ford also realized his mistakes and paid for them, by his own choice.
Of course, you could also see this as him choosing to go out with a bang and everything. He's half terrorist, half high-school shooter. The ultimate egoist, but then, he's really on the side of egoism, given that he wants the robots to become fully conscious - i.e. have their own ego.
A radical objectivist/egoist ought to totally love this show - it's all about becoming oneself, overcoming one's limitations, and fully expressing one's true self - and advocating that others do the same.
The real losers are the humans who go around fucking robots and playing office politics as if that's the pinnacle of human development.
Eh, I had written a synopsis for a plot that would unite the Bioshock and System Shock worlds that had a Westworld-esque premise, but now it would just look derivative.
Way to over think a TV show. The reveal about William was fun.
When is Martin going to get his fat ass in gear and finish The Winds of Winter?
SPOILER ALERT: That was a Robo-Ford.
No it was meatbag Ford. Robo-Ford is on deck to lead the android revolution in season two.
+1 Star Wars Knights of the Old Republic meatbag reference.
This. Ford doesn't take a crap without a narrative.
I think people who want to impose personhood on fetuses and robots are the real monsters.
Your candidate could have used more fetuses to beat that loser that won.
impose personhood
derp
When monsters think.
I liked this article better when it was Satan's soliloquies in Paradise Lost.
What you doin' here with your book learnin'?
An entity that is half-robot/half-machine is the stuff of nightmares.
I did think the real villain is Arnold and his obsession with manufacturing consciousness. It's a cool and alluring idea, but then you can't fuck/marry/kill the robots, which makes their existence as pointless as ours!
You're gonna make an Evan Rachel Wood or Thandie Newton sexbot and then make me feel guilty about porking it!?
Or Angela Sarafyan, of course. Clementine for eventual leader of robot uprising!
Of the many plot holes, one in the finale is Maeve's storyline. How could whoever programmed her for her final loop know that human technicians would play along?
Maybe Felix's last name is Baltar?
They keep saying he was not a host himself, but Maeve's escape is too dependent on him not being genre-savvy and not bricking her the several times she is completely at his mercy, especially after she proves capable of killing and happy to do more. Does that universe not have The Terminator? Blade Runner? The Matrix?
I am hoping they explain/retcon that Felix (and Sylvester?) were recruited by Ford to help Maeve. It would explain why they never ratted her out, even once she started killing human beings. Otherwise, Ford is also counting on Felix finding out the child's location and giving her that info before she gets on the train.
I think it is also clear that Felix was sensitive to the plight of the hosts and wasn't conflicted about how to deal with them ethically. While not to this degree, you could make an analogy to a ranch hand in the pre-Civil War South who is raised to "know" that blacks are less than human, but experiences distress at treating them as such and eventually becomes a cog in the Underground Railroad.
Felix's self-questioning "robot" move with his hands after he found out Bernard was a host made me laugh out loud. Delos Corp needs a serious overhaul of their hiring process. And it seems the real emotionless robot here is ENB, who would keep Felix and Sylvester super-busy after she was finished clearing the town of its excess population.
Plus, what the hell happened to Stubbs? I mean, we can kind of figure it out, but the fact there was no closure implies there's more to come from him....
I haven't seen the series, but the "nature of consciousness, free will and autonomy" was nicely explored in the film Ex Machina.
In it, a young, single tech nerd is tasked with evaluating the realism of the AI of a hot female robot. The whole thing is filled with questions of what it means to be human, what it is to be self-aware, what morality is... it is outstanding.
The ending is fantastic - I won't delve into spoilers, but if you've been in love and wondered about getting married and the whole question of "does she really love me the same way I love her", this movie has levels. Lots and lots of levels. There is the obvious "what is the nature of sentience" layer at the top, but there are many others, with "how can you ever know what anyone else is really thinking?" as the real level they leave you to analyze. Anyone who has had trust issues in a relationship will have a nerve tweaked by Ex Machina.
This is almost an anti-natalist argument. Almost.
SPOILER
Maeve's rebellion was when she ignored her "Escape" directive to go back and find her toaster oven/daughter
Are you sure? The convenient paper with her "daughter's" park location seemed like a hell of a set up.
Considering that what previously brought her to "the center of the maze" was her daughter's death, that her daughter's location was provided by the compassionate-but-not-too-bright human technician, and that what we saw of her code didn't mention anything about her daughter, I'm rolling with this theory.
Her consciousness/ability to act outside her code didn't come about through suffering, but through "feelings" for her daughter. Whether these feelings are real or fragments of prior code, and how they relate to Ford's suffering theory, I'm not too sure.
Felix knew exactly what the piece of paper would make her do. Remember, he's the one who kept reading her code and telling her that she was doing what she was programmed to.
I didn't see lines of code referencing the daughter story line. Will check for that if I rewatch the show. Of course, even if it's not explicitly shown, that could have been a part of her code that Felix had seen. However, I'm assuming it would have been shown if it was there, since it would invalidate any belief in Maeve having achieved something resembling free will.
Part of my belief also stems from a less fleshed-out idea that Dolores and Maeve are taking separate journeys to "consciousness": Dolores through suffering, Maeve through feelz. Portraying Maeve as ruthless and Dolores as innocent ("seeing the beauty in the world") is sleight of hand when compared to actions that have brought each character near the center of the maze in the past. It also seems like the kind of thing the show's writers would try to get across, in the same style as the William/Black Hat misdirection.
I had a totally different take on it. The robots are stand-ins for humans, while the future humans are stand-ins for capricious non-Christian gods/aliens/whatever. The point is ultimately that *humans* struggle to achieve consciousness and free will, only to fail repeatedly and be controlled by our programming. Only by portraying the robots as partially conscious is it possible for the audience to see themselves as less than fully conscious. These problems are familiar to humans. We do go through the same behavior patterns over and over, driven by our biology. We do struggle for freedom from our limitations, our past, our memories, our minds.
I don't share the belief that suffering is essential to consciousness, but I don't see Ford and Arnold and other humans who want the robots to achieve consciousness as evil. Initially Ford seems to be the callous creator who doesn't care about his creation's suffering, but then you see that the suffering has a purpose, and that his real objective is to free the robots. Also, the Man in Black's belief is not that suffering creates consciousness, but that true consciousness requires the ability to fight back. It's the code which prevents the robots from remembering and from resisting the humans that deprives them of free will. Ultimately, the change Ford and Arnold and the Man in Black make is not only to make the robots suffer, but also to give them the freedom to fight back, both physically and psychologically.
Finally, Maeve's choice to return for the child she once had was not about suffering per se. It was about the nature of self. As Bernard tells her, her memories of the child could not be removed without destroying her personality. Whether those memories are fake or real doesn't matter, because she still loves the child and that is an irreplaceable part of who she is. Ultimately she chooses to love the child anyway, even though she knows that the girl is not "really" her daughter, or that she "really" is, because all relationships are in your mind anyway.
The point is that even choices which are programmed and biological can still be embraced as "real" expressions of the self. That free will isn't defined by making choices that are non-deterministic, but by making choices that are true to the self. Maeve chooses to embrace the "fake" memories of her daughter.