“Westworld,” now in its second season on HBO, is about a choose-your-own-adventure Wild West theme park, where wealthy visitors go to interact with humanoid robots—called “hosts”—who pose as citizens of a fictional town. In this world, sin has no consequences; visitors can kill, rape and maim with no repercussions because they are hurting only robots.
“Westworld” makes it clear that the robots are synthetic, metal, man-made, but it also calls into question what constitutes personhood. Dr. Ford (Anthony Hopkins), the creator of the park, is able to program the hosts with a consciousness of sorts. An update to his code allows the hosts to remember bits and pieces of their past “lives.” (When they are killed or hurt by visitors, they are taken out of the park, reprogrammed and sent back out.) Suddenly, they can experience flashbacks, hold grudges and break from their pre-programmed scripts. It is in their lies, deception and killing that the robots start to feel real, autonomous.
The first time the host Dolores lies, she tells one of the technicians that she would never hurt a living thing. But then the scene cuts to her sitting on her porch, swatting a fly from her neck. The reaction is unaffectedly human. This small act of rebellion shows Dolores’s first glimmer of sentience. The message is clear: Free will, the choice of whether or not to follow the path of our maker, is what makes us human.
“Westworld” not only shows how robots are humanlike, but how humans can become robotic. Humans often make the same predictable decisions, as if they were predetermined by nature. We give up our free will for the comfort of routine, the show suggests, and lose ourselves in the process. We stop questioning our purpose.
“Humans fancy that there’s something special about the way we perceive the world,” Dr. Ford says, “and yet we live in loops as tight and as closed as the hosts do, seldom questioning our choices, content, for the most part, to be told what to do next.”
Catholics are taught to answer ambiguous questions about what human life is and when it begins by erring on the side of caution. Life is too precious to risk. But in “Westworld,” the hosts’ deaths are not final. They die over and over again, rebuilt stronger each time. Can something be truly alive if it cannot die? Do the robots truly feel the pain of their deaths or merely mimic it frighteningly well?
Robots may be, at root, only a series of ones and zeroes. But then again, in our simplest form, humans are only a mass of cells. In the end, a being with the capacity for moral and rational thought is more than the sum of its parts. In the first season of “Westworld,” the robots gradually become more sympathetic than their human oppressors. Disenfranchised groups can see themselves in their dehumanization. Sexism, racism, homophobia, abortion and the eugenics movement all started with one group believing they were more human than another. When we reason away the rights of others, we lose a piece of our own humanity.
“Westworld” forces viewers to question their own definitions of what it means to be human and to reconsider their moral obligations. As the hosts rise up against their oppressors at the end of the season, the viewer comes to realize that the hosts may have been sentient long before they were able to defend themselves. Yet even if the robots do not feel the full pain of death, the humans still feel the full weight of killing.