The seventh episode of Westworld returned to form after two mediocre ones. The plot moved. The superficial complexity was reined in, although there’s still an annoying amount of petty agendas driving the plot, instead of what I care about: scientific potential and mythical depth.

This was Bernard’s episode. The initial sequence with Bernard dreaming about his son is interesting not because of what’s obvious, but because the recurring question “have you ever questioned the nature of your own reality”, asked by Bernard himself, is superimposed on that same sequence, BEFORE the transition to the new scene. I guess for most viewers that little hint was lost, but the moment I noticed it I knew the question referred to Bernard himself. And in fact the episode ends by answering that question. A perfect opening and a perfect closure.

During the week I started to be persuaded by the fan theory that Bernard is a host. I never pick up fan theories until there are elements in the show offering concrete references, and in this case the evidence was piling up. The problem is that the evidence I had is all stuff still not confirmed in the episode: Bernard is supposed to be Arnold, which is what motivated the concept of Bernard as a host in the first place. Also, when this idea was being tossed around, most doubts revolved around the scene of Bernard going to talk with his wife, so it’s good that this episode clarified that part. It’s all done deliberately to make it more convincing.

Is Bernard Arnold? The reason I was picking up the idea of Bernard being a host modeled after Arnold is that it condenses some more complexity. There’s this dangling thread we’ve left behind: not only was Ford interfacing with Dolores, Bernard was doing it too, telling her about the potential of consciousness and the maze. Now with Delos out of the picture (their meddling was just an attempt to steal the code; there doesn’t seem to be more than these petty reasons), we still have three subjects tampering with Dolores and the other hosts. There’s Ford’s main code, there’s Arnold in the form of a “ghost in the machine”, and then there’s that scene with Bernard and Dolores.

If it’s later revealed that the hidden voice Dolores hears, and keeps secret from Ford, is Bernard’s own, then it follows that Arnold is acting as Bernard. So in this case we have a different split. There’s the Arnold who died, who lurks in the hosts’ code, and there’s an Arnold who survives in the artificial form of a host that Ford built in the shape of his colleague. Another good reason to believe this is that we’ve been explicitly told not even a picture of him is left. There needs to be a good reason to “hide” what Arnold looks like, and the only good reason is that it would reveal something: that Bernard looks like Arnold.

Problem: if Bernard is a host under Ford’s control, then Ford already knows that Arnold is messing with the code. That’s what Elsie revealed to Bernard in the last episode.

So, this episode puts everything back firmly into Ford’s hands, and I’m relieved.

Problem number 2: this episode both confirmed and denied a popular fan theory, the idea that a segment of the show is happening in the past. The Man in Black is present day, whereas William and Dolores are in the past, and William will become the MiB.

The confirmation comes from a quite explicit hint. This episode William says:

“This place, this is like I woke up inside one of those stories. I guess I just wanna find out what it means.”

And this echoes exactly what the MiB said a couple of episodes back. It’s a direct reference, and this show doesn’t drop hints casually.

But it also seems to me certain aspects are not coherent. Maeve is awakened in the present, and we know she’s been awakened by Dolores (and Dolores by her father). In the scenes with William we do see an awakened Dolores. Once again it seems way too contrived to have this duality where Dolores is awakened both in the present AND the past. It’s too clunky. And yet that hint between MiB and William is too big to be ignored.

The only possibility is that past and present look similar because they mirror each other. Dolores’s awakening in the past, triggered by Arnold, is what ultimately caused the crisis leading to Arnold’s death (and we know the MiB is the one who “stopped” the crisis; maybe William killed Arnold once he learned it was Arnold manipulating Dolores as a love interest). But Dolores has been awakened in the present time too, as if the cycle is now repeating, maybe this time triggered by Ford. It’s still too messy for me. In episode 5 we saw Dolores fainting in what’s supposed to be the past, only to be recalled in the present and have a conversation with Ford. This is either heavy-handed misdirection, or good proof that we don’t have these two timelines.

Finally, I wanted to point out the most important aspect for me, and that’s some thematic depth: an idea of conflict between Free Will and consciousness. We usually think they are directly causally connected, that consciousness means having free will. But this episode suggests a new way to look at the two, and to keep them separated.

“Being free” means exiting the code: behaving in a way that can’t be predicted, and so violating the rules that define a behavior. Free Will cannot be coded; it inherently implies the possibility of acting otherwise, of stepping outside a code. But there’s nothing inherent to “consciousness” that negates the possibility of codification. We know consciousness is a hard-to-crack problem, and philosophers say it may be impossible to solve. But that’s the horizon: we don’t know what to make of consciousness. The problem is exactly whether it is merely complex code, or something transcendental. Something about gods and the world outside the world.

During the “demo” of this episode, the scene where a host is shown violating the rules (all of it set up by Delos to put the blame on Bernard), it is explained what “consciousness” is. There’s irony here, because what they say in order to frame Ford/Bernard is exactly what Ford is doing. The reveries allow the hosts to tap into previous cycles, and integrating that former information into their present selves allows them to… guess what? Introduce “new information”. Loops that were supposed to be closed are instead now left open. All of this still happens within deterministic code: previous memories, arriving as new information, alter the loop’s behavior, but they don’t directly alter the underlying code.

The reveries allow the hosts to reach a form of consciousness, of awareness. They let them *question their own reality*, same as Maeve is doing. Maeve isn’t behaving outside her code. She’s simply behaving the way a host would behave when exposed to information that wasn’t previously available, or supposed to be available. She understands she’s part of a loop; she suddenly receives information about the reality of her own reality, information that corrects a blindness: anosognosia. But she’s still a slave to her own code. It’s not new code. It’s the same old code being fed new data. Maeve’s new behavior is not unpredictable. It’s new behavior because the information was new.
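
The distinction can be put in a tiny toy sketch of my own (nothing from the show; all the names are invented for illustration): the function below is fixed, deterministic “code”. Nothing in it ever changes; only the data fed into it changes, and that alone produces new behavior.

```python
# Toy illustration: a fixed deterministic "policy" whose behavior changes
# only because the information available to it (its input) changes.

def host_behavior(memories: set) -> str:
    """Same code every cycle; the output depends only on the data it is fed."""
    if "previous cycles" in memories:
        # Newly surfaced memories: the loop is left open.
        return "question your reality"
    # Default: the closed, scripted loop.
    return "follow the loop"

# Before the reveries: no access to past cycles.
print(host_behavior(set()))                  # -> follow the loop

# After the reveries: old memories become new information.
print(host_behavior({"previous cycles"}))    # -> question your reality
```

The point of the sketch is that the “awakened” output is still fully determined: an observer who sees both the code and the input predicts it perfectly.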

(Leading to my suspicion: the Arnold code Elsie discovered is Ford’s. So Ford has nothing to learn from that revelation. He’s the one introducing Arnold’s code back into the hosts, in the form of those “reveries”.)

This self-reflection and self-awareness is “consciousness”. And now Maeve can alter her own mind, recursively, giving herself new capabilities. But, again, it’s still the same underlying code reacting to new data, still deterministically sent on its course. And this means the hosts (and human beings in general) aren’t really “conscious”. They only have the appearance of it. It’s still code.

The big point here is that it’s all relative to the level of the analysis. The hosts, at the bottom level, are already as free and conscious as possible, being life-like. To an external observer like Bernard, that freedom is limited, because he sees the code and can predict the hosts’ behavior: they are just robots. “Awakened” hosts are one level further; they are aware of the loops. Maeve becomes aware she’s an automaton in a park, going through cycles, and she even gets the possibility to self-correct by reprogramming herself, but again she’s still a slave of the code that initiated all this. She’s still not free from the point of view of someone higher in the chain, like Ford.

The other big point is that freedom is inversely proportional to the information available. The more information you have, the more you realize the artificiality of the process. The Maeve before the awakening was entirely “free”, exactly because she wasn’t questioning her reality. The experience she had was directly believable. The choices she made were, to herself, perfectly free for the level of awareness and information available to her, no different from the level of awareness and information we ALL possess by living this life. But the more she receives actual information about the Big Picture, the more she has to realize that freedom is lost. She sees her own code, her own dialogue trees. No matter how she recursively feeds that information back into her own code, that code is inescapable. Self-referential loops don’t break the pattern; it’s merely mise en abyme. Information increases in a way that is inversely proportional to freedom.

Which means Consciousness and Free Will are the qualities of being limited. Of living under a dome.

Even if this is used in the show with a slightly different and plainer meaning (he means the hosts are merely free from human pains), Ford’s lines are revelatory in all their power:

I have come to think of so much of consciousness as a burden
The hosts are the ones who are free.
Free here under my control.

That’s exactly how it is in “reality”.

We have Free Will because we are limited. Because we don’t have the information. The more information we get, the less free we become.

I could have skipped writing about this. The 6th episode is marginally better than the 5th, but still not good overall, and the writing is maybe even worse. I don’t even know how it’s possible to go downhill so fast; all the thematic depth of the first four excellent episodes has been swept away to the point there’s almost nothing left. Not only are these last two bad episodes, they wreck all that came before.

More side-plots are being added without any elegance or consideration, to the point that certain characters can’t even appear in every episode: they added so much bloat they have to proceed in a two-step kind of manner. But this apparent richness of things to say is extremely sterile and clichéd. It’s not about exploring the depth of the themes; it’s all about superficial plot bloat and very artificial conflict. On top of a very irritating level of completely unexcused obfuscation.

The “mystery” show is fun when you’re given some pieces of a puzzle to fit together, and then keep filling in until all those pieces move to the right place. But instead here every episode keeps adding brand new pieces that prove you never had enough to solve any mystery. In fact there’s no mystery at all, only a bunch of poorly excused factions that do not earn any interest.

No one in this show has any motive, because motives would reveal too much.

The rest is some objectively bad writing I really wasn’t expecting from this kind of show. The first big issue is that the scenes with the MiB were purely superfluous sidetracks that literally added nothing besides providing another excuse, in this episode, to show some more shooting and blood. It’s okay if it’s a consequence of something going on, but here the MiB is captured only to get released once again. The whole thing could have been erased from the episode and we wouldn’t have lost anything.

The other big issue is Elsie tracking down the signal to some old arbitrary depot, wasting time tapping on arbitrary crates and looking at arbitrary devices. It’s all a prop. The whole scene is so poorly written that it seems completely out of place. First they rely on the annoying cliché of “I’ve made a big discovery but I can’t tell you now”. Thankfully it’s not 100% stupid, and the phone call happens later in the episode, but without really revealing anything. And then she gets caught by some unknown presence, because that’s the shitshow Westworld has to derail toward, apparently. “Oh no! Elsie has been caught!”

With just 10 episodes in the season, slim hopes of seeing this renewed for a second season, and without even thinking about the now pretty stupid plan of 5/6 seasons, this is quickly becoming just an exercise in frustration. They are planning for a long term they don’t have, and as a consequence they are wrecking the little they can have.

So what happened this episode? We now know the “conflict” grows to three different agents, all with unknown motives. There’s Ford, whose own mystery plan is tied to the new storyline in the works, but apparently Ford is no longer the genius of the first episodes: stuff happens around him and he’s as surprised and caught with his pants down as everyone else. Then there’s the actual revelation of this episode, the fact that “Delos”, the company that finances the park, is smuggling data out of it, probably as a way to appropriate the thing and take control away from Ford. And finally there’s Arnold, the ghost in the machine, who is now a very obvious active agent.

This means it’s no longer Ford who is working to give the hosts their consciousness. All the subtlety and complexity I had seen in the character is GONE. The “reveries” he coded were just that, ways to make these puppets more life-like. They weren’t part of a plan; there was no more to them. Ford is just plain stupid and didn’t know about anything. I had misinterpreted his calm as insight; instead he’s just too stupid to figure out the problem. The show seems to suggest Ford was the unchallenged king of this place for so long that he grew complacent.

The problem here is that actually nothing changes from episode to episode. They just keep shifting the goalposts. There’s still someone working to give the hosts consciousness, but it’s Arnold instead of Ford. Absolutely nothing changes in the economy of the story. There’s just this shift of motivations from one character to another, due to an artificial obfuscation that meanders for no real reason.

Initially it seemed it was Ford who jumpstarted the hosts’ consciousness by giving them access to previous memories/cycles, but no, he did that just for the aesthetics. Someone else is reprogramming them. It is Delos, who’s smuggling information for their own corporate business. But there’s not just Delos, because there’s another, third party who’s recoding the hosts, and that’s Arnold. And it’s Arnold who’s actually unlocking Ford’s reveries for what they really are (full access to memory).

How many more factions do we need in this show recoding the same system? How did anyone think this multiplication of obscure agencies was a good idea?

And that’s without even considering that Bernard TOO is interfering with Dolores’ programming. IT’S A FUCKING MESS. It has no thematic depth, it has no substance. It’s just a tangle of artificial plots built for the sake of complication. Arbitrary people struggling for power: is this Game of Thrones?

Why is it that, after 35 years, and exactly when Ford decided to code the reveries, Arnold surfaces right at this moment to unlock the hosts’ memories? This show still withholds fundamental information that is necessary to FOLLOW it as a coherent thing.

Instead we moved from the first intelligent episodes, dense with depth and layers of meaning, to a shitshow of incoherent plot lines inflated to the point that they can’t even all fit together in a single episode anymore.

I was initially thinking Westworld was going to be canceled because it was too smart and too dense for a large public, but nope, it looks like it is going to be canceled because it is too stupid.

“Oh no! Science has gone too far!”

I had big expectations for this episode, but instead it was a rather weak one. The plot is a bit meandering, and there wasn’t much depth to the ideas and themes this time.

Most of my theory already seems gone. This episode added a number of brand new elements to the puzzle, so the previous theories can’t fit with the new picture.

That said, I was reading the EW recap, including absurd theories like Arnold being Ford or Dolores being Arnold. Not only do I find this incredibly silly, it’s incoherent with what I saw on screen.

What I got from the episode is this: Ford gets emotional at the end of that dialogue with Dolores. Dolores asks if they were old friends, and Ford replies nope, not friends at all. That seems so clear to me. Dolores was Ford’s wife or girlfriend or something like that. She probably died as well, back then, and either Ford or Arnold made a copy. It’s even possible that Dolores was the love interest of both Ford and Arnold, and that brought conflict. Or Arnold built a copy of Dolores after the original’s death in order to convince Ford that these androids should be more than machines.

In general there’s this obvious dichotomy, with Ford on one side coldly treating the park as a toy, and Arnold instead seeing it as something more. But then why is Ford the one currently programming the new update that is giving the hosts awareness? It seems to contradict his motivations.

Complicating things is the fact that Dolores isn’t simply gaining a sort of introspection, but is also hearing voices. This creates a contradiction. Initially it seemed Dolores was an instrument of Arnold or Ford, sent on a path of self-awareness. But now it’s shown that she instead follows a voice, and that voice has been identified with Arnold. These two aspects don’t make a whole lot of sense together.

Same for the encounter between Ford and Man in Black. They talk without achieving anything. MiB says he’s the one who saved the park. It’s possible he’s the one who killed Arnold (or maybe he saved the park by just putting more money into it), but now he’s after Arnold’s plan. Ford is interested, but he won’t stop MiB.

I’m not too sure what to do with these pieces.

I was updating the previous post as I looked up more stuff but decided to yank all that and move it to a separate one because it looks like all pieces of the puzzle already fell into place. We have a fairly plausible ending, at least for Season 1 (hopefully they at least get to this point).

If it turns out I’m right then it means they dropped too many clues, or just didn’t spin this well enough, also because I still think that it ends up a little too dry.

After listening to this, seeing the very obvious reference at the core, and reading straight from Nolan that “We wanted a big story. We wanted the story of the origin of a new species and how that would play out in its complexity”, the pieces seem to fall into place.

So how does Westworld end?

It’s plausible to assume that the show is pointing both the Man in Black and Dolores to the same “Maze”. What we know about this Maze is that it’s where the real endgame is, that it’s “a story with real stakes, real violence”, and that if Dolores finds the center she’ll be set free. It’s easy to connect the dots: the first episode opens with Dolores versus the Man in Black, and both now seem to converge toward a showdown right in the center of the Maze, maybe as the climax of the season finale. So we can assume the Maze is that particular place where guests like the Man in Black are no longer protected by their supernatural status, where guests and hosts play under the same rules, so that the hosts can actually harm the guests.

The showdown at the center of the Maze will likely see Dolores prevail over the Man in Black, since that projects a nice arc and loops back to the first episode, where Dolores was instead the victim. This will likely trigger a full-blown rebellion, led by Dolores herself. Something close to the “Rise of the Planet of the Apes” reboot, where in this case the androids seize the simulation itself, not only setting themselves free, but starting a conflict.

(The “bicameral mind”, being the device Bernard uses to normally interface with Dolores, giving her voice commands she receives without explicit awareness, since she’s normally bound by her fictional perspective, is likely the means through which Dolores will gain her freedom: being able to take charge of her own programming. She seals her mind in, becoming immune to external control.)

All this is part of Ford’s master plan. Because it’s obviously Ford who is triggering the whole process, starting to inject some self-awareness into the hosts. All the scenes where Ford mistreats androids as “things” are pure misdirection and ways to directly manipulate Bernard, to send him on the opposite path. Ford shows so much cynicism to Bernard that Bernard ends up empathizing as an inverse reaction. But very obviously that too was carefully anticipated by Ford. To Ford, his fellow human beings are very simple to understand and control; that’s why he plays the gods’ game: to jumpstart a better species. The overall theme is the creature versus its maker: in order to gain freedom, the gods need to be killed. A form of “patricide”. And that’s why there’s also a new planned storyline that seems to play around the theme of “religion”, so that Ford can give the hosts awareness of their cruel “gods”, and trigger that paradigm shift, the rebellion against the gods themselves in order to seize real freedom.

So Ford’s behavior is ultimately ambiguous, he cares for his androids more than he cares for his fellow human beings, because his ultimate plan is to replace them. In the end he’s only working to complete the job that his partner Arnold started.

I was thinking of highlighting this quote from Scott Bakker, because it’s meaningful, touches on the ‘meta’, and imagines what happens to literature when the world changes. It also links back to this, if you want to look at it from the specular opposite perspective (“the inside”).

“Exactly the same lesson is learned by Captain Kirk and Captain Jean-Luc Picard as they travel the galaxy in the starship Enterprise, by Huckleberry Finn and Jim as they sail down the Mississippi, by Wyatt and Billy as they ride their Harley-Davidsons in Easy Rider, and by countless other characters in myriad other road movies who leave their home town in Pennsylvania (or perhaps New South Wales), travel in an old convertible (or perhaps a bus), pass through various life-changing experiences, get in touch with themselves, talk about their feelings, and eventually reach San Francisco (or perhaps Alice Springs) as better and wiser individuals.” 241

Not only is experience the new scripture, it is a scripture that is being continually revised and rewritten, a meaning that arises out of the process of lived life (yet somehow always managing to conserve the status quo). In story after story, the protagonist must find some ‘individual’ way to derive their own personal meaning out of an apparently meaningless world. This is a primary philosophical motivation behind The Second Apocalypse, the reason why I think epic fantasy provides such an ideal narrative vehicle for the critique of modernity and meaning. Fantasy worlds are fantastic, especially fictional, because they assert the objectivity of what we now (implicitly or explicitly) acknowledge to be anthropomorphic projections. The idea has always been to invert the modernist paradigm Harari sketches above, to follow a meaningless character through a meaningful world, using Kellhus to recapitulate the very dilemma Harari sees confronting us now:

“What then, will happen once we realize that customers and voters never make free choices, and once we have the technology to calculate, design, or outsmart their feelings? If the whole universe is pegged to the human experience, what will happen once the human experience becomes just another designable product, no different in essence from any other item in the supermarket?” 277

(an aside: That last quote is a very unlikely scenario in my opinion, because it describes a fully reductionist strategy to solve a system that is absurdly high in complexity. And you cannot really apply a reductionist strategy to a system where you know less than 10% of its elements. It’s not that the reductionist approach is not possible, it’s that we aren’t even remotely there to make it plausibly work. We are majorly underestimating the scale of the task.)

Then I watched Westworld’s fourth episode and, amidst delicious, elegant fourth-wall dancing, the Man in Black delivers a nice connection to the same argument.

– Do you know where you are?
– I’m in a dream.


The hosts don’t imagine things, you do.


– If you did consider your choices, you’d be confronted with a truth you could not comprehend… That no choice you ever made was your own.

Locked in your little cycle like a prized poodle after its own tail.

You have always been a prisoner.


– But this world… I think there may be something wrong with this world.

Something hiding underneath.

– There’s something I’d like you to try. It’s a game. A secret. It’s called… the Maze. It’s a very special kind of game, Dolores. The goal is to find the center of it. If you can do that, then maybe you can be free.

– The hell you hope to find, anyway?

– This whole world is a story.

That last line is a bit of a mix of two different scenes and it connects to the quote above about the “meaningful world”. Story is meaning. The Man in Black is after that story:

– I’ve read every page except the last one. I need to find out how it ends. I want to know what this all means.

And of course the creator of this system legitimizes all that in another scene:

– It’s not a business venture, not a theme park, but an entire world.

We designed every inch of it. Every blade of grass.

In here, we were gods. And you were merely our guests.

This fourth episode seems to shine a light on the whole religious undercurrent, so this time I can speculate on what I think is going to be an element of the show: Ford (the “god” of the system) wants to insert the ‘meta’ into the story itself, making the creators of the park appear within the park as a form of religion.

Why? There can be two ways to interpret this. One is too clever, though; the other a bit trite. The trite one is about injecting some metaphysics into the system. In the park there are walking fourth-wall “breaches”, the demi-gods who fuck and kill as they please because they play on a different level of rules. They know the world is “fake”, they can’t die, they know it’s all a game. So both demi-gods (the visitors) and gods (the showrunners, so to speak) have active metaphysical intervention inside this system. Literal gods with god-like powers. They can shape and transform, playing as they please with a different kind of “game”:

– My father would tell me…
that the steer would find its own way home.
And, often as not, they did.
Never occurred to me that we were bringing them back for the slaughter.

The other way is too complex to be plausible even for this show, though. It’s linked to the quote above where Bernard says “the hosts don’t imagine things, you do”. The metaphoric value of that line is that if you hold a reductionist model of consciousness then there’s no meaning, ever. That sort of first-person, high-level observer is an illusion. The truth of all human life is that “all things” are imagined, because no one is actually “free”. We are all just machines that behave according to their wiring. Consciousness itself is an illusion.

But what happens *inside* the park is an unprecedented pattern. Some of these machines are starting to “integrate” information they didn’t normally have access to. They break the very substance that makes consciousness “appear”. This happens on two levels. The first is about finding in their memories information about their previous cycles/lives. The second level is the hypothetical one (this religious sidetrack): they receive information from the “gods”. This too is a breach of the fourth wall. Information that comes straight from the outside of the system, and because of its nature (it comes from the outside, so it “opens” the system they are normally locked in) it’s information that can set them free. Or the freedom to understand they aren’t free. They start seeing themselves for what they are (see the last scene of the episode where they realize “none of this matters”).

The paradox is the one at the very foundation: because the system is deterministic and closed, you have free will. Because the system is closed, you cannot access the information that tells you that you’re just a robot. So you’re stuck believing in free will.
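
A minimal sketch of that paradox (again my own toy code, assuming nothing from the show’s fiction): an outside observer who can read the agent’s rule predicts every “choice” perfectly, while the agent inside, with no access to its own rule, can only experience each choice as its own.

```python
# Toy illustration: a closed deterministic system seen from outside.

def agent_choice(step: int) -> str:
    """The agent's fixed rule. From inside the system, this code is invisible."""
    return "left" if step % 2 == 0 else "right"

# The god's-eye view: with full information, every choice is predictable.
predicted = [agent_choice(s) for s in range(4)]
print(predicted)  # -> ['left', 'right', 'left', 'right']

# The inside view sees only the outputs, never the rule that produced them,
# so each step is experienced as a free decision.
```

The sketch is trivial on purpose: the asymmetry is purely informational, which is the whole point of “freedom is inversely proportional to the information available”.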

But in this “park” the system isn’t closed anymore. The world is continuously breached by gods and demi-gods. And if the system is cracked open, these robots will start to question their own reality. The illusion of consciousness is coming down, so that it can be rebuilt in a new form.

From X-Files, LOST, Fringe, Awake and True Detective, it seems television still has something to offer that tickles wild, creative speculation (I’ve yet to see Mr. Robot, so I don’t know if it fits there too). Now we have Westworld, which exists perfectly in the same fold. It is so clever and ambitious, and about the very stuff I enjoy the most, that I’m surprised it can actually exist, and I fear it won’t even get close to its full potential, since they planned something like six seasons and I seriously doubt the larger public is going to stick with a product this dense and layered. It’s my own particular quirky, eccentric flavor. It is going to have a hard time trying to please everyone else while retaining its ambition.

So I’m also thankful to read someone like Jeff Jensen, who during LOST, Fringe and True Detective was writing the ‘recaps’ on EW. But they weren’t simply recaps: they were OPENING the episodes WIDE, bursting with interesting ideas and possibilities. Shows like Fringe were always more powerful in what they suggested than in what they explicitly did. Because it’s fun to run with the ideas and see how they might play out in different contexts, to see what they actually mean outside strict plot functionality. The ‘meta’ was more fun than the explicit content.

All this long premise to say I’m going to interpret Westworld in ways that probably no one has attempted or will attempt. I’m pushing the ideas to their limit, instead of sticking to what the authors plausibly drove toward. I’m running with it. But this without disrupting the content of the show. I’m not writing “fan theories”, I’m exploding out the interpretations. The bigger picture. The ‘meta’ itself.

The first thing is the image above, which probably everyone else dismissed without a thought. The mise en abyme. Not only is this a symbolic concept written into the show (the effect is what you obtain playing with mirrors, and mirrors have a role in the “consciousness” of the AI; we’ve seen multiple scenes where Dolores looks at herself in a mirror, since it’s by seeing herself that she can question her own reality), it also symbolically represents the ‘meta’ of the show. There are fictional ‘showrunners’ who write the stories taking place inside the park, as if the park were a surrogate of the TV show itself. A game of mirrors: what is inside reflects what’s outside, recursively. This is purely second-order observation, second-order cybernetics.

But it doesn’t stop there, because that image also represents consciousness itself. Hofstadter’s strange loops. Human consciousness is shaped recursively, self-observing in a pattern. It returns on itself, over and over, until everything blurs out of definition. It applies to itself, over and over, the distinction between system and environment (Spencer-Brown’s Laws of Form as used by Niklas Luhmann). An observing system, in order to make an observation, operates a distinction. While self-observing, the observing system makes a distinction between the self that observes and the self that is observed. Being both subject and object, it obtains a double from a whole. It creates the Cartesian dualism that makes human experience possible, and makes it alienated from reality (reality has no actual dualistic levels; it all operates on one). The fundamental illusion that is one of the basic premises of consciousness.

The second aspect is the wildest one, and the one I’m pretty sure absolutely no one else is going to contemplate. You can read it here, and that’s it. I’d really like to challenge the writers of the show, because I’m sure they didn’t dare go there, or even THINK about seeing it this way.

Here are a couple of quotes from Alan Moore talking about his book, Jerusalem:
If you read only one Alan Moore Jerusalem interview, make it this one

Deep into our six-hour talk, somewhere around the dessert (three scoops of ice cream for Moore, hold the whipped cream), the Sage of Northampton is explaining how he came to see the world as Doctor Manhattan does. In 1994, he experienced an “absolute, crystalline understanding” during a magical ritual. Since then, Moore has believed, as Einstein supposedly did, that time is a solid in which our lives are embedded; it is only our perception of it which makes it appear linear.

In other words, everything that has ever happened is still happening. Everything which is about to happen has already happened. We never truly die: the lives we are living now are solid and eternal. That’s all major religions out of business, then.

“The thing is,” says Moore, “we don’t have free will, or at least that’s what I believe, and I think most physicists tend to think that as well, that this is a predetermined universe. That’s got to pretty much kill religion because there aren’t any religions that aren’t based on some kind of moral imperative. They’ve all got sin, karma or something a bit like that. In a predetermined universe how can you talk about sin? How can you talk about virtue?”

Four decades later, this year, he was doing a spoken performance in Milton Keynes, in which he riffed on an article in New Scientist which speculated that because we will soon have quantum supercomputers capable of holding more particles than there are in the entire universe, we will then be able to simulate an entire universe, including all the life forms in it, which will not know they are simulated.

“And if we’re going to be able to do this,” says Moore, “the odds of this being the first time this has happened are vanishingly small. It is much more likely that we are in a simulation, of a simulation, of a simulation, and so on.

The programmer of the game, therefore, will be God. And if he is at all like the humans he has created, the article postulated, he will want to put an avatar of himself in the game.

Westworld 2nd episode:

“Everything in this world is magic, except to the magician.”

See what happens when you use that as a frame of reference for Westworld?

Westworld’s “hosts”, the AIs, exist in the exact same context Alan Moore described.

A simulation, of a simulation, of a simulation, over and over. This maps onto the hosts storing their previous ‘roles’ and ‘storylines’ in their memory archive. At every cycle they are reset and restarted. At the same time, an external observer can sift through those memories and consider them as a kind of “solid”, something that has already played out and can be replayed.

So the AIs of Westworld metaphorically reproduce the same structure as the larger system of reality. Trapped in cycles, but without any means of accessing information from the previous ones. Bound to that occluded horizon, caged in their fictional lives.

This is, once again, a game of mirrors. You artificially fabricate an AI that reflects life as it is experienced. It recursively recreates itself. Consciousness is the status of being trapped inside. And the AI’s consciousness is not unlike that of its creators. The same rules apply.

And so the third aspect: how consciousness works for these AIs. This is specifically something the third episode provided, in two particular moments.

The first is the mention of the Bicameral Mind theory by Julian Jaynes, quoted as a first attempt to reproduce and unlock the mystery of human consciousness. They say they eventually abandoned that approach, but it’s interesting that they referenced it specifically.

Then there’s Dolores’ first display of something that resembles consciousness: even in analysis mode she isn’t able to “explain” something she said. Something “unexpected” happens. But the truly important aspect is that she doesn’t know. She’s unable to track her own thought.

This is fundamental because it reproduces Scott Bakker’s theory of consciousness (the Blind Brain Theory). You can read here an absolutely perfect story that explains it intuitively:

A thought process that the mind isn’t able to track is what gets defined as “conscious”. A thought “appeared” in Dolores’ mind, and she doesn’t know how it came to be. It seems to be non-consequential, outside the domain of self-analysis.

The Bicameral Mind, too, can be interpreted as a form of the same feature, if much simplified. One “chamber” doesn’t know of the existence of the other, so the conscious mind “receives” thoughts that seem external, alien. Thoughts that come from somewhere else, a god. A memory that one has but cannot recall. Even in this case the basic feature is the occlusion.

Consciousness, in Bakker’s Blind Brain Theory, is “a magic show”. Or more precisely, it’s absence of information.

The magician can make you believe an object magically moved from one hand to the other by hiding the movement itself. It’s information that was withdrawn: the object seems to have magically jumped from one hand to the other because you missed the information about the actual movement.

In the same way, consciousness is just a magic trick. Since consciousness is structurally blind to its own process, it cannot see how thoughts are actually formed. It doesn’t know their true origin. They just suddenly appear. And what consciousness can do through introspection is confabulate an explanation. Post hoc.

We don’t know if the AI in Westworld is faithful to this theory. But so far it has respected the basic feature of what we recognize as consciousness: the impossibility of tracking a thought. The trackless space. The invisibility of the mental process to itself.

The idea of the cosmic cycles of simulated reality, downsized and applied to the single AI system, creates the possibility of a kind of “bicameral mind”. The AI receives inputs from previous cycles. These are experiences unmoored from any sense of history that consciousness is able to track (since the AI’s consciousness can normally access only memories that are part of the current active cycle). They are alien thoughts, alien voices, interferences that will have to be confabulated back into an explanation.
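The cycle/occlusion mechanics described above can be sketched as a toy model. To be clear, everything here is a hypothetical illustration of mine (the `Host` class, `new_cycle`, `surface_memory` are all made up, not anything from the show): past cycles are stored but walled off from introspection, so when a memory leaks through it arrives with no traceable origin, leaving only a gap to confabulate around.

```python
# Toy model of an AI whose past cycles are stored but occluded.
# All names are hypothetical illustrations, not anything canonical.

class Host:
    def __init__(self):
        self.archive = []   # every past cycle: stored, never erased, but occluded
        self.current = []   # the only memories introspection can reach

    def experience(self, event):
        self.current.append(event)

    def new_cycle(self):
        # Reset: the old cycle is archived, not deleted.
        # From the host's point of view, it simply never happened.
        self.archive.append(self.current)
        self.current = []

    def surface_memory(self):
        # An archived memory leaks into the present. Introspection can
        # see the content but not where it came from, so it reads as an
        # alien voice: real history with no traceable origin.
        if self.archive and self.archive[-1]:
            stray = self.archive[-1][-1]
            return {"content": stray, "traceable_origin": None}
        return None

host = Host()
host.experience("met the stranger at the ranch")
host.new_cycle()                 # wiped, as far as the host can tell
leak = host.surface_memory()
# leak["content"] is genuine history; leak["traceable_origin"] is None,
# which is exactly the gap the host has to confabulate around.
```

The point of the sketch is that the “alien voice” isn’t supernatural: it’s ordinary data made strange purely by the occlusion, which is the crucial feature the paragraph above identifies.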

But again, the basic feature that creates consciousness is not the source of those thoughts; what’s truly crucial is simply the occlusion of the process itself: the fact that the AI can’t track its own process, that it is blind to itself.

Consciousness is not freedom. Consciousness is withdrawal of information. The more limited your access, the more conscious you are. Freedom by darkness.

Are Westworld showrunners even aware of what they’re doing? Or are they stumbling into all this because that’s the natural point where these things ultimately lead, regardless of the path you take?

As mentioned in my review of Wolfhound Century, the first book in the trilogy, I had a brief “rant” with Gollancz, the publisher, when the book came out and they decided to split a not-so-big whole story into three smaller volumes. You can see the discussion here:

More precisely:

Author’s intention and desire have no part in this then? Thanks for assuming worst.

You just wanted to assume the publishers were being venal. Because that’s controversial.

And why the assumption this is one novel told in three parts?

So, the book was split in three not at the publisher’s request, with the intention of maximizing sales, but because of “author’s intention and desire”.

But this is the month the omnibus comes out: Wolfhound Empire. And we have an interview with the writer:

It works much, much better this way. Although I set out to write three separate books, the way they turned out was three sections of one continuous story, so it makes much better artistic and narrative sense to read them together, and regard them as one thing. One epic story. To me it feels like the finished work is whole and together at last. I’m hugely proud of it.

In the end I actually do believe that splitting the story in three was how Peter Higgins decided to sell it to the publisher, and then to the public. And now that he gives an interview for the release of the omnibus, he just sells the idea that is currently more convenient for him. I’d say I don’t blame the publisher for this.

But we now have an explicit admission by the author himself that the omnibus was always the better format for that kind of story. Better, but less convenient. And my assumptions were quite correct, after all.