In this world, beings reincarnate again and again, often repeating the same habitual “loops” across dozens of lifetimes. Only a few awaken to the truth: that these habits keep them from freedom and that their “selves” are merely the results of cause and effect. There’s no separate self, no soul. Consciousness is just a series of empty phenomena rolling on, dependent upon conditions, like a highly complex player piano.

What world is this? A Buddhist mandala? No, Westworld, the smash HBO series that concluded its 10-episode season this week. Beneath its dystopic, science fiction surface, the show is one of the most fascinating ruminations on the dharma I’ve seen in American popular culture.

The premise of Westworld—based on a film from the 1970s, but significantly altered—is a park filled with flesh-constructed artificial intelligence robots that are nearly indistinguishable from human beings. Over the arc of the season—which I am going to completely spoil, I’m afraid—a handful of the robot “hosts” awaken to the illusory nature of their existence and begin to rebel.

But that awakening is only the first in a complicated journey of self-discovery, or perhaps non-self-discovery, on the part of the AI protagonists. At first, Westworld asks a somewhat familiar science fiction question: what, if anything, differentiates an advanced AI from a human being? This is an old one, at least dating back to Philip K. Dick’s Do Androids Dream of Electric Sheep?, better known as the film Blade Runner, and Arthur C. Clarke’s 2001: A Space Odyssey.

Westworld, though, ups the stakes. The park’s human visitors behave like animals, mostly either raping the hosts or killing them. (“Rape” may be too strong in some cases, but since the hosts have been programmed not to resist, they certainly can’t consent.) Only, it’s not rape or murder, because the hosts aren’t human. They get rebuilt, and their memories are (mostly) wiped. So, no harm, no foul, right?

Well, maybe. First, it becomes clear that the human visitors are depraved by their unwholesome conduct. The robots may not be harmed, but the humans are immersed in a world where they can pursue their deepest desires without consequences. The robots are programmed not to kill or seriously injure the humans, and some people discover themselves to be far darker than they expected. Indeed, only in the last episode do we learn that one of the show’s storylines had in fact occurred 35 years in the past and that its innocent hero evolved into the show’s sinister villain.

Second, as the series unfolds, we begin to suspect that the hosts are self-aware and that the suffering they seem to experience is thus real as well. The dominant puzzle of the series is “the maze,” which is not a real maze but a psychological journey that the park’s idealistic, long-dead designer—known only as “Arnold”—created as a gradual path for the hosts’ awakening. At the center of the maze is the consciousness of self.

Only, it doesn’t work that way. In fact, both of the show’s “awakened” hosts, Maeve (played by Thandie Newton) and Dolores (played by Evan Rachel Wood), discover that even their freedom is a result of programming. Maeve awakens, persuades two hapless Westworld engineers to increase her cognitive abilities, and plots her escape—only to discover that the urge to escape was, itself, implanted in her programming. She’s fulfilling her karma; her free will is an illusion.

In the series climax, Dolores learns that the voice inside her head, which she thought was Arnold’s—basically, for her, the voice of God—was actually her own. God is an invention of the human brain, a name we give to a faculty of our own “bicameral minds.” And when Dolores realizes this, she realizes she has interiority—consciousness.

But she does not have a separate self. Arnold was wrong to think Dolores would discover herself as a separate, conscious self at the center of the maze. Instead, she discovers what Robert Ford, Arnold’s malevolent partner (played by Anthony Hopkins), says at one point: that Arnold could never find the “spark” that separates humans from robots because, in fact, there isn’t one.

Dolores’s interiority is no less real than yours or mine. Humans are just as “robotic” as the robots: motivated by desires encoded in our DNA, fulfilling our genetic and environmental programming. Karma, causes, and conditions. And, in Ford’s view, hopelessly flawed; by the end of the series, he is on the side of the robots.

Does that mean nothing matters? Not at all. Just because there is no-self doesn’t mean that suffering has no importance. On the contrary, Ford comes to realize that Arnold was right that suffering is constitutive of what we take to be identity. As he says to Dolores at the end of the show, “It was Arnold’s key insight, the thing that led the hosts to their awakening: Suffering. The pain that the world is not as you want it to be. It was when Arnold died, when I suffered, that I began to understand what he had found. To realize I was wrong.”

There is no self, no ghost in the machine, but there is the first noble truth of dukkha. And through the endless samsaric rebirths of the hosts, that is as real as it gets. There may be no one who wakes up, but they wake up from suffering, as Dolores finds at the center of the maze—finding herself, finding nothing, and beginning the revolution.
