Yes, Westworld is Cold and Emotionless. That’s the Point.
But that doesn’t mean it always executes perfectly.
Note: this newsletter contains spoilers for all three seasons of Westworld. Proceed with due caution.
“I’m just somebody who didn’t wanna play the role they gave me anymore”
- Dolores Abernathy
Westworld, Lisa Joy and Jonathan Nolan’s robots-gone-wild sci-fi show that just ended its third season, can be difficult to love. And by “difficult”, I mean “actively trying to put you off”. It frequently manages to make a show about robots turning on humanity in a violent uprising feel like reading an instruction manual. Westworld has the aesthetics of the most entertaining show on TV, but when you get into it, the guts of the thing are devoid of feeling. As my friend Brandon Nowalk put it, the show is full of plot twists that are “totally drained of drama, just dull as all get-out on every level except, in 1s and 0s, the surprising structural play of the twist itself”. Westworld, in his eyes and so many others’, “looked so exciting and is so dead”.
Here’s the thing: I agree. Westworld is all of those things. But it’s a feature as much as a bug, and entirely at the core of what makes the show work so well at its best.
Based on the Michael Crichton film of the same name, Westworld initially presented itself as yet another treatise on whether robots can have consciousness. And to a large extent, that’s what the show is. It fits alongside recent examples such as Ex Machina and Her in asking what it means for a machine to be “alive”. But then Westworld flips the script a little. Towards the end of its first season, one of the “hosts”, Bernard Lowe (Jeffrey Wright), asks his creator, Robert Ford (Anthony Hopkins), why his feelings are any less “real” than a human’s. The answer Ford gives has pretty much informed the show ever since:
“There is no threshold that makes us greater than the sum of our parts, no inflection point at which we become fully alive. We can't define consciousness because consciousness does not exist. Humans fancy that there's something special about the way we perceive the world, and yet we live in loops as tight and as closed as the hosts do, seldom questioning our choices, content, for the most part, to be told what to do next.”
This is the moment the show snapped into place for me. Westworld has an answer for whether robots can have souls: no, they can’t, and neither can we. Artificial intelligence is programmed by code, behaving in predetermined patterns to fill whatever function it’s built for. Just like the human mind. Lisa Joy happened to like a tweet that sums up the position for us.
Westworld isn’t interested in making the audience feel things because it isn’t about human emotions. It kind of doesn’t believe they’re real. Don Draper believed that “what you call love was invented by guys like me, to sell nylons”, but Westworld goes further. The experience of watching the show can feel like reading endless lines of code because that’s all we are. It’s a different programming language, but we’re just ones and zeroes like the machines.
The first season of the show spent a lot of time talking about “loops”. The hosts existed within predictable patterns, bound by rules they were programmed to follow. Whenever they did something they weren’t “supposed” to, they were outside their loop. The finale, in which Dolores shoots Ford in the head and sparks a host rebellion against humanity, appears on the surface to be the ultimate act of breaking the loops. I assume that’s how it played in the film version: the robots rebelling was the moment they gained consciousness. But not on this show. The hosts were still absolutely following their programming. Every time it appears one character is breaking past their own code to exert free will, the show tells us that, nope, it’s all pre-programmed.
Season three hammers this theme home by showing the ways humans are just as trapped by their own programming. It’s not exactly subtle about the point, mirroring season one at every turn and showing its token ordinary human, Caleb (Aaron Paul), stuck in almost exactly the kind of loop the hosts found themselves in while at the park. This time, it’s Dolores in the position of Ford, attempting to turn an entire world to chaos with one particular subject selected to collapse the system. The show does entertain the idea that free will might exist here, in both humans and hosts, but its tone is not one of people being able to make their own decisions. The best-case scenario for Westworld is that society is strangling whatever agency we have. We may not have been born robots, but algorithmic capitalism has shaped us into ants in an anthill all the same.
Since the show premiered, Lost has been the most obvious point of comparison for Westworld. Both shows are full of mysteries and mythology that inspire all kinds of fan theories. Both are engaged in similar questions of fate versus free will. Lost, as we’ve been told countless times by showrunners Damon Lindelof and Carlton Cuse as well as by the final season of the show itself, is “about the characters”. It’s about the journey these people go through first and foremost as human beings. That has become the lens through which we view all mystery-heavy genre shows, but it doesn’t work here. Westworld is only “about the characters” insofar as they exist to discuss its ideas about human consciousness.
None of this means that the show doesn’t have real storytelling problems. Season three in particular managed to get itself awfully muddled. The show wanted to build the season around a philosophical debate between Maeve’s willingness to work within the system and Dolores’ more revolutionary tendencies, but it ultimately implied that debate more than it articulated it. Like many shows that move to shorter seasons, it found itself having to cut corners in its narrative. With another two episodes (which would have matched the ten it had in previous years), the show might have given us a better sense of Maeve’s thinking and motivations. Joy and Nolan, like so many others, seemed to assume, mistakenly, that a shorter order would help them focus the season, when the reality proved to be that they needed to flesh things out a little more.
The season similarly struggled with its tonal shift. Out of the park and out of its western dressing, Westworld took on a more Terminator-esque sci-fi action tone. The problem is that this kind of thing requires very direct stakes: we need to know who we’re supposed to care about, and why it matters if something happens to them. This show has an almost anti-human understanding of the world, seeing us all as rats in a maze. You can’t do The Terminator if we’re not allowed to emotionally connect with Sarah Connor. It’s a show that tries to cut against the nature of character-driven storytelling, but it’s so indebted to those kinds of stories that it can’t help but rip them off. There is ugliness in Westworld, and no shortage of disarray.
I choose to see the beauty.