A Bongard Problem: Westworld, Episode 2

In which it becomes increasingly unclear who’s human, who’s not, and who’s God

Steve Bryant
9 min read · Oct 10, 2016

*SPOILERS FOR EPISODE 2 ABOUND*

OK, so this is a Bongard problem:

And this is a slightly harder Bongard problem:

Bongards are a kind of puzzle.

All the diagrams in set A have a common factor. That factor is lacking in set B. If you’ve ever taken an IQ test, you’ve probably worked on a Bongard.

Bongards were invented by the Russian computer scientist Mikhail Moiseevich Bongard, who was deeply interested in pattern recognition.

But they weren’t made popular until Douglas Hofstadter wrote about them in Gödel, Escher, Bach, his Pulitzer-winning 1979 book on how cognition emerges from hidden neurological mechanisms.

Bongards, according to Hofstadter, reveal how cognition works. When you solve them, you don’t just lump diagrams into existing categories, he says—you create entirely new, abstract categories. That requires guessing: getting as close as possible to the right answer, then refining from there. It’s a subtle process, built on guesswork and intuitive judgment.
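If you like, here’s a toy sketch of that guess-and-check idea in Python. The shape “features” and candidate rules are entirely made up for illustration; this has nothing to do with how Phaeaco actually works.

```python
# A toy Bongard-style solver: each "diagram" is a dict of invented features.
# The goal: find a rule that is true of every diagram in set A
# and false of every diagram in set B.

set_a = [
    {"sides": 3, "filled": True},
    {"sides": 3, "filled": False},
    {"sides": 3, "filled": True},
]
set_b = [
    {"sides": 4, "filled": True},
    {"sides": 5, "filled": False},
    {"sides": 6, "filled": True},
]

# Candidate "categories": guesses at what might separate the two sets.
candidates = {
    "is a triangle": lambda d: d["sides"] == 3,
    "is filled":     lambda d: d["filled"],
    "has > 4 sides": lambda d: d["sides"] > 4,
}

for name, rule in candidates.items():
    # A rule solves the puzzle if it holds for all of A and none of B.
    if all(rule(d) for d in set_a) and not any(rule(d) for d in set_b):
        print(f"Common factor of set A: {name}")
```

The hard part, of course, is the step this sketch skips: inventing the candidate categories in the first place. That’s the guesswork Hofstadter is talking about.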

In other words, Hofstadter would say that cognition—in a way that Westworld’s Dr. Ford would surely agree with—is as much art as science.

Now, it’s worth noting that the person who did the most work on Bongards was a researcher under Hofstadter named Harry Foundalis. Foundalis worked on Bongards for years and even created a computer program that could solve them.

He called that program Phaeaco, after the sailing ships in Homer’s Odyssey, of which it’s said that “on their own they guess the thoughts and wishes of their makers.”

Homer, it turns out, was the original inventor of AI.

But Foundalis quit Bongard research because he had a troubling epiphany: He thought solving Bongards would lead to the creation of androids that kill humans. He called them “intelligent machines of mass destruction.”

You can see where this is going.

Westworld is a nested Bongard problem.

But if the two sets are guests and hosts, that could mean we’re the ones taking this particular IQ test.

Billy and Logan

Westworld Episode 2 trailer

As Episode 2 begins, we’re on a monorail that’s about to drop off two new protagonists on the arrival platform of Westworld.

Again, we have two sets: Billy, an uptight and slight nebbish, and Logan, his carnally obsessed Drakkar Noir model of a friend.

This scene is a callback to the opening of the original ’73 Westworld.

In that film, Billy’s character is at first bashful, but later (spoiler alert) cowboys up and destroys the murderous Man in Black, played by Yul Brynner.

In this series, Billy, too—a seemingly good-hearted man with a moral compass—appears to be on a collision course with Ed Harris’s Man in Black, a blackhearted sumbitch.

And yet there are hints that all is not as it seems. In this series, we continually see the hosts wake up to begin their strange, artificial loop anew. It’s worth noting that the first time we see Billy, he’s waking up too.

Billy and Logan are deposited on Westworld’s luxuriously white platform, which looks like an Apple Store crossed with a shopping mall.

This is where Billy is welcomed by Angela, a gorgeous host. As she introduces him to the park, her message is simple: everything here is bespoke; all you have to do is make decisions—an excellent description of how to live your life or, if you’re an artificial intelligence, how to turn your sample inputs into a decision model.

Before Billy opens the door to the park, he’s asked to make one final decision: does he wear the white hat, or the black?

It’s tempting to think of white and black as good and evil, but that’s likely too simplistic a reading. The characters in Westworld are constantly layered in different colors.

See, for example, Teddy Flood, who begins his train ride flooded in white light before taking the black. Or Bernie, who usually wears a white shirt and a black sweater — except when he’s surreptitiously questioning Dolores after work hours, in which case he wears black. In Westworld, as in Eastern philosophy, white/good and black/evil are interdependent. They are simply ways of making decisions. Of learning, as Dr. Ford would say, who you might become.

“We practice witchcraft, Bernard. We speak the right words and create life from chaos.”

— Dr. Ford

Dolores

When we last left our robotic heroine, she was intoning calmly that there is a path for everyone, “an order to our days. A purpose.” But throughout Episode 2, we see her world begin to unravel.

The most striking example of this transformation is when she stands in the middle of Sweetwater and, as if in a fugue, sees the street littered with dead bodies. It’s a memory, it seems, or a “reverie” of the gunfight from the previous episode.

When she’s interrupted by Maeve, the head temptress at the saloon, Dolores speaks her father’s warning: “These violent delights have violent ends.” Which, of course, sets Maeve off on a strange, self-revelatory loop.

It’s worth noting that this verbal form of “contagion” corresponds directly to physicist David Deutsch’s explanation, in his excellent book The Beginning of Infinity, of how cultural memes replicate. It also corresponds to Richard Dawkins’s conception of memes—ideas that survive a form of cultural natural selection—which he has called “mind viruses.” The hosts in Westworld, it seems, are suffering from the sickness of lucidity.

At the end of episode two, we see Dolores wake from her slumber in the middle of the night. She walks, apparently at the suggestion of Bernie, out to the yard. There, she brushes away a few layers of dirt to find a six-shooter.

Who wants to bet that Chekhovian revolver kills a guest in Act 3?

“Idiot! You forgot to put her in sleep mode!” — a technician, calmly stating the obvious

Maeve

If there’s a star of episode two, it’s Maeve—a hooker with hidden depths, as the programmer (and sneaky host-kisser) Elsie Hughes might describe her.

Those hidden depths get interpreted quite literally this episode.

Shortly after being “roused” into some kind of fitful consciousness by Dolores’s word virus, Maeve starts to falter at wooing guests. She begins having recurring nightmares of being scalped. And during one of those nightmares, she wakes up in the hidden depths of Westworld’s operating chambers.

This is about as close to “I was abducted by aliens and they operated on me” as you can get.

It’s a horrifying scene. The “doctors” are fixing her while wearing transparent HUDs, which look not unlike futuristic Venetian masks. She stumbles through the halls, coming upon dead body after dead body, until she collapses, overwhelmed, at the sight of humans washing down dead hosts like livestock.

The scene is reminiscent of the film adaptation of Cloud Atlas, in which the fabricant (and soon-to-be prophet) Sonmi discovers the gruesome truth of Xultation.

Cloud Atlas

If Dr. Ford allows the hosts to access old memories via their reveries, you can imagine Maeve’s reverie of this moment won’t exactly be expressed in the most sensual of manners.

The Man in Black

“The maze is not for you.”

— the bandit’s child to the Man in Black

Meanwhile, out in the badlands, the Man in Black is continuing his quixotic quest to find the entrance to the maze.

In the first episode, he scalped the Indian card dealer to find a map of the maze tattooed on the underside of his scalp. This episode, he’s given directions to the entrance of the maze by a bandit’s child, who seemingly breaks character to deliver the message.

“The maze is not for you,” she says, before telling the MIB to follow the blood arroyo to where the snake lays its eggs.

The “not for you” phrase is curious, and calls into question again the nature of the MIB. By “not for you,” did the girl mean the maze is only for hosts, and that guests are not allowed? This interpretation would suggest the maze is an entrance to the inner complex of Westworld, like the underground tunnels at Disneyland.

Or, did the girl mean that the maze is only for guests who are, let’s say, less cruel? That would seem to be a hard bar to establish. Shooting one man in the face is ok. Shooting two men? Nah, dog.

Or, more likely, does “not for you” mean that the maze isn’t intended for hosts — and thus, by the transitive property, the MIB is a host?

This seems to be the most logical explanation, and not just because it syncs with the ‘73 film. My guess is that the MIB is just another program, but given different inputs and reinforcements.

It could be that his memory is just a programmed affectation, and his role upon each waking cycle is to assiduously find the maze. Thus, his references to past times, like “helping them stretch their rope again” and “I’ve been coming here 30 years,” are just expressions of his programming, which leads him to believe that he’s human.

Or, it may be that, unlike other hosts, he doesn’t sleep — and thus has the capacity for memory. With memory comes learning. And with learning comes humanity — which may be why the other hosts’ bullets don’t affect him.

This casts new light, then, on what Ashley, the head of security, said when made aware of the MIB’s rampage: “That gentleman can do whatever he wants.”

Perhaps that’s just his program. It’s as if Dr. Ford said let’s make one bad program and see how it fares. Survival of the fittest. Or, in this case, the blackest.

The Management

“No.”

— Dr. Robert Ford, rejecting Lee Sizemore’s “Odyssey at Red River” Campaign

Which brings us to the puppet masters.

In the back office, Bernie and his protégée, Elsie, continue to deal with rogue hosts. We see Bernie, at first, allaying concerns that there’s anything wrong with the hosts. But later we see him surreptitiously accessing Dolores’s memories and leading her to find the revolver in the dirt. That, and it’s revealed that he’s sleeping with Theresa, the head of ops—an emotionless assignation (at least, seemingly, on his part).

But the episode really belongs to two managers.

The first is Lee Sizemore, the creative director, whose unveiling of the park’s next narrative, Odyssey at Red River, falls flat.

“Our most skilled guests will fight their way to the outer limits of the park,” he says, “fighting fearsome braves, seducing nubile maidens, befriending tragically ill-fated sidekicks and, of course, like all our best narratives over the years, our guests will have the character they’re most interested in…themselves.”

Dr. Ford’s response: “No…No, I don’t think so.”

This is an interesting point in the show’s epistemological wanderings. Ford makes the argument that titillation and elation are parlor tricks—emotions that only serve to remind the guests who they are. “But they already know who they are,” says Ford. “They’re here for a glimpse of who they could be.”

That’s a compelling point, because it puts the onus of entertainment at Westworld on what the guests decide to do, not on what the hosts decide to do. In other words: self-determination is the only point. It’s a point echoed by the host Angela to Billy in the episode’s first scenes.

And it’s a point that Dr. Ford seems to be putting into narrative play himself. In the closing scene we see Ford and Bernard walking through the desert. Ford’s wearing boots he grabbed from an unactivated host in the previous scene—a clear sign he’s about to get back to work on “something quite original”.

The closing shot shows, in the foreground, the black cross of a tiny steeple emerging from the sand. What better narrative device for a show about free will than a storyline about the very existence of God?

The Bongard problem may just be to decide which people, guests or hosts, are the chosen ones.
