

I’m sorry to say this but the showrunners are not even close to good enough to sustain this kind of show. I always praise ambition no matter what, and Westworld has ambition aplenty. I wouldn’t write about it if it didn’t have excellence in it. Yet it’s a complete letdown, at least in the aspects I’m looking for.

We had two bad episodes after the first excellent four, where episode 6 salvaged things a little even if it was mediocre overall. Then episode 7 was really quite good and able to salvage a lot more of what came before. So I went into episode 8 with expectations high again… only to find the first ten minutes of the show at its worst ever.

The first scene between Ford and Bernard had the potential to be good; instead it’s rather pointless exposition meant solely for the audience. The dialogue is stilted and even out of character. It seems to want to delve into moral complexity, but it only devolves into banality. Someone living in a world permeated by artificial consciousness shouldn’t be caught off guard, yet Bernard acts like someone who suddenly finds himself in a sci-fi story. He sits there, for the most part, without even thinking about the implications of what’s going on. Bernard reacts and speaks like a character in any other TV show, regardless of the unique context here.

The writers of Westworld must be aware that the current cool thing is to have “gray” characters that are neither completely good nor completely evil. So of course now we have two contrived “sides”. One is the board of the park, deliberately presented as the antagonist, driven by greed and cynical pragmatism to obtain what they want, whatever it takes. But of course Ford can’t simply be “good” either. So they have to turn the character into this control-obsessed guy who only thinks in terms of power, even if it makes no logical sense. The science and the plot of this show don’t mix well at all.

It’s a bad scene from beginning to end, but at least two particular points are truly bad. One is that Ford is shown to have this fascination with emotions, and he explains that it was with Bernard’s help that they unlocked the mystery of the “heart”. But there are no actual ideas to back this up. It’s just asserted: the hosts’ emotions are human merely because they are realistic, compared to the first hosts, which were more primitive. For someone like Ford, who has unlocked all the secrets, these displays of human emotion shouldn’t have been interesting at all. They should be boring, since it was all codified, all predictable, all repeated over and over. The other bad part is a little detail: Ford says “I need you to clean up your mess, Bernard.” Excuse me, WHOSE mess? This is again just poor writing used to artificially make Ford into an unlikeable character, because that’s the whole point of this scene: make Ford into another cynical villain who’s pushed science too far. By manipulating Bernard and talking the way he does, he’s made into the bad guy who doesn’t have any empathy.

And that underlines where the problem with the show lies. It tackles important scientific and philosophical implications, but then it reduces all that to the usual trite TV characterization. Westworld isn’t and cannot be a character-driven show, because the totality of TV shows out there is already character driven. They all reuse the same trite formula of putting some character under unprecedented distress in order to highlight the emotions and make the viewer empathize amid all the drama. The big movers are always the selfishness of greed, power, money and various combinations of these, family relationships, conflicting interests and whatnot. Westworld is supposed to dig deep into morality, now that science has exposed some unsettling truths. It’s about exploring the implications of all this. And yet we keep backpedaling into trite agendas, where all this moral complexity is lost in the face of yet another struggle for control or power. Westworld explores new territory, yet keeps populating that territory with old characters and trite writing. It wipes the slate clean, only to repopulate it with the worst tropes that plague the industry, multiplying sameness everywhere.

You need new tools to deal with new themes. Westworld proposed new themes but has only old tools to toss at them. It’s clumsy.

Following that bad scene, whose only purpose was bad, stilted exposition meant for the spectator, there’s Maeve’s scene, and that’s even worse. For me the breaking point isn’t even the overall context, but the mention of the explosive in her spine that goes off if she tries to leave. This is another unnecessary plot contrivance that has no reason to exist. In a world that is almost The Matrix, where code is literally the fabric of perceived reality, the idea of an explosive in the spine is blunt and absurd. What exactly would regulate the behavior of that “bomb” if not more code? How can it be logical that if there’s a major fuck-up on the scale of a host trying to leave the park, then the solution is a hidden bomb? Because the potential of a host leaving the park is way, WAY beyond the scale of what a bomb can fix. Or the bomb could trigger by mistake. Given the context, it’s the most idiotic and potentially catastrophic idea ever. And to achieve what, exactly? The “locality” of the hosts seems to be the smallest of the problems.

Again, this is all written as if the writers didn’t know how to deal with new themes, and so resorted to their usual tools. It’s all baggage due to the fact that these writers have no idea how to deal with complex themes, and so they fall back on their default gear sprinkled with a slightly futuristic context. And once again, even Westworld degenerates into a show that uses science fiction only as decoration, instead of as its focus.

But this means that Westworld presents new questions, only to produce the same old answers that were innocuous and useless all along. It’s the same shitty writing that is pervasive everywhere. It’s repetition disguised as something new. Trying to have it both ways, and doing poorly regardless.

Of course people on the internet don’t share my particular interests, but they certainly didn’t swallow that scene with Maeve anyway. This is a good summary of what everyone noticed. Even worse, EW had already criticized how implausible and contrived the scene between Maeve and the two idiots is, and put the question to the writers themselves:

Nitpicky question though: Couldn’t the body shop guys just jack down Maeve’s levels to knock her out, and make some lobotomizing so-called “mistake” to take out her memory? We’ve been shown over and over the humans have so much control, it’s hard to believe they couldn’t get the upper hand on a rogue host.

Nolan: I will point you toward episode 8.

Besides the fact that this isn’t nitpicking AT ALL (it’s a huge elephant in the room), that answer led everyone to expect they would provide a logical explanation in episode 8. Just have patience. So now we do have episode 8, and it’s fucking ridiculous. This isn’t even bad writing: it feels like you watch a scene that belongs to the show, then the following scene seems to come straight out of a parody. And it’s not even about the ideas in that scene. It’s not that it doesn’t feel plausible. It’s that all of it is ridiculously awful. It’s very badly written, badly acted, with a very bad screenplay. It’s downright amateurish. And of course it completely breaks the tension when a show that tries to be all serious and dramatic has a scene taken out of Scrubs.

The problem is much larger, though. Westworld is a house of cards that tries to pile up lots of complexity but has zero skill handling it. When it fails, not only is it messy, it’s even more incompetent than LOST, which also had wild ups and downs but was always inspired even in its failures. Westworld is a cool concept without any insight. Backpedaling into proven tropes that still won’t work for anyone here. Trying to wrestle this back into a character-driven show when everything else failed is not going to work. People expect you to do something interesting with the ideas you scattered on the table. And yet it devolves into corporate backstabbing or AI going evil, which we’ve seen millions of times before, but now in a show that tries to be even more obtuse about it, trying to create unnecessary mysteries everywhere.

That scene between Maeve and the two idiots is exactly what happens when you start with the concept of the “robot revolution” but without its logical causes. The actual context has been built with so much care and detail that in the end there’s no space left for old-school “AI now runs wild”. We moved past that. The implications are higher. The science the show is based on is much, much more critical and far-reaching than a robot out of control. The moral implications are subtler, deeper, more unsettling. But again the writers have no tools to explore all this, so we fall back into cartoonish villainy.

Maeve has just one moment of enlightenment, when she starts wondering what happened to her daughter, but then stops and says “no. Doesn’t matter. It’s all a story.” That’s the point: she questions her own reality. Meaning that reality is redefined. Deeper implications. But then she’s back to being obtuse, because she follows that line with “It’s all a story created by you to keep me here.” …WHAT? No one cares where Maeve goes. She should know the “story” isn’t created for her; she’s merely a backdrop to entertain the human beings who go there. She’s a prop. She’s cardboard, exactly as she’s written, no matter how maxed out her character values are. She’s supposed to be superhumanly smart, and yet she’s one of the dumbest characters in the whole show. Whose poorly explained agenda has become “I’m getting out. I’ll know I’m not a puppet living a lie.” Yep, that’s EXACTLY what some dumb idiot would think. As if by exiting the park she could outrun her own mind.

At every point Westworld fails because it cannot run with what it set up. Maeve has zero introspection; her whole agenda is to stick to the robot revolution she’s written for, even if it makes no sense. Just as it makes no sense that those two guys should follow her every command. This is as terrible as saying “let’s split up” in a horror movie. It’s so far beyond believability that it isn’t even good for a laugh.

Deus ex machina is the writing style from scene to scene. Everything happens just because it’s necessary for some rough outline the writers had. The whole thing has lost all plausibility along with all its depth. Without a solid foundation all its mysteries are simply obnoxious failures.

A parenthesis: we now know Bernard was choking Elsie, because now we have a glimpse of that scene. And with that the show has put itself in the position of being utter crap no matter what path it takes. Incredible. Every hypothesis is shit. The most far-fetched is that she comes back as a host. The other two, more plausible, are that she’s either dead, or somehow survived to show up later as a surprise. In all these cases it’s fucking terrible writing all around. If she’s dead, it’s terrible because of how contrived the scene was of her going off all alone unearthing dangerous mysteries; and if she’s alive, it’s terrible because of how artificial and contrived it would be not to show the attack, so that when she’s back we won’t have a gasp of shock, but only a groan of exasperation at the most obnoxious and predictable twist ever.

There follows another pointless scene between William and Dolores whose only purpose is to bait the audience some more about whether there are two timelines or not. And then a scene between Ford and Charlotte that’s all about implied threats you can find in a million other shows. And as in a million other shows, it’s written terribly. Both characters know the other knows, yet they won’t speak plainly, because otherwise the side plot would end right there. This sidetrack had nothing relevant to say two episodes ago when it started; now it’s only growing more idiotic and petty. It’s unnecessary bloat added just because someone thought the show needed more conflict.

Then we have a boss fight. Then Charlotte goes to the other most obnoxious character in the show to make use of her chain of command and prepare some retaliation against Ford. Bloat once again.

There goes half the episode, where quality hit rock bottom. No other episode up to this point was so densely atrocious. From scene to scene there was absolutely nothing to salvage, and I’m surprised at how wildly the quality goes up and down from one episode to the next. But thankfully there follows a scene that is quite good, even if not significant. We have a repetition of the shooting scene we’ve seen before, but the music has changed and the mood is more playful. The show plays with itself. Maeve is interfering with a pattern we’ve seen before, so she’s gracefully god-like in the newly revealed world where she is in control. It’s essentially sublime, because at least here all the premises are solid and the scene is playful while still retaining its meaningfulness. We see what happens when reality is being manipulated, when the fictional drama collapses all around. It’s both character actualization and liberation. It works.

But that’s five great minutes in a bad hour of television. There follows another scene with Bernard and Ford. But at this point neither has anything meaningful to say. The problem is that what they actually say is downright silly.

I understand what I’m made of, how I’m coded,
but I do not understand the things that I feel.

Are they real, the things I experienced?

This is the guy who spent all his life shaping the consciousness and reality of the hosts, who now voices the most trite of doubts.

The first two lines are about “qualia”, apparently a novel concept for him. And the last line is just plain stupid, as the question should be about how you define “real”, given what you know, not about answering yes or no to that pointless question.

This is what happens when you touch the actual dilemma: how would we think if we had solved the problem of consciousness? We have a show where the replication of a human mind is a fact, but this is fictional, because we don’t know how, and so when the characters think about it… they have no answer.

So they built this show on a premise, but since they don’t actually know how this premise works, the characters themselves are also clueless about what they have done.

At least when Ford speaks he still holds on to a pretense of plausibility: “The self is a kind of fiction, for hosts and humans alike.” Which is at least correct. Meaning that, answering the questions Bernard just asked, there’s no difference between humans and hosts, and so no difference between “feelings” and “reality”. If the “self” has been written away, then all categories have already shifted. It’s all relative to the frame of reference.

The dialogue continues on the right track: “Lifelike, but not alive? So what’s the difference between my pain and yours?” The obvious answer would be “none”. But here the writers need to plug their contrived Arnold subplot once again, so instead of an answer we get a citation of the usual mystery: “This was the very question that consumed Arnold, filled him with guilt, eventually drove him mad.” Thankfully, after the plot plugging we also get an answer from Ford: “The answer always seemed obvious to me. There is no threshold that makes us greater than the sum of our parts, no inflection point at which we become fully alive. We can’t define consciousness because consciousness does not exist.” Hooray, that managed to be all coherent. And it concludes the other small bit of goodness in the episode.

But the scene continues, and Ford degenerates into folk psychology to the point of undermining what he just said: “Humans fancy that there’s something special about the way we perceive the world, and yet we live in loops as tight and as closed as the hosts do, seldom questioning our choices, content, for the most part, to be told what to do next.” This bit is mostly wrong. We don’t live in loops, we very, very often question our choices, and there’s no one who tells us what to do next. He goes from discussing things literally to discussing them metaphorically, as if the same vocabulary could apply when you completely switch contexts. A scientist wouldn’t talk like that, because it’s wildly imprecise. And you cannot answer a literal question in a metaphorical way. That’s pure bullshit.

Of course the writers of the show don’t have the literal answer, and so we get the metaphorical one. It could have been fine if, given the context, it weren’t logical that Ford actually had the literal answer too. So coherence is shattered again.

When Ford says “there is no threshold that makes us greater than the sum of our parts” he touches on the Ship of Theseus problem, so he touches exactly on the core theme of the dialogue. But then he’s sidetracked into metaphor. The literal answer would have dealt with the nature of language. What’s “human”? Exactly what you want it to be, since language is based on agreement. You can define “human” exactly as it’s useful to. It’s a word. It means whatever you want, as long as we can agree, so that we can understand each other.

“I’m so sorry, Bernard. Of course you never studied any cybernetics. You’re only a dumb character in a TV show, after all.”

There follows another pointless scene between Dolores and William, just repeating the same stuff about dream, reality, and figuring out whether it’s the past or the future, until the writers decide to stop being obnoxious about it. Then more stupid plotting between Charlotte and the writer guy, who randomly bump into Dolores’ father, because of course convenient coincidences are fun, cueing future plot twists. And then Bernard and the security guy, to conveniently plant some implausible hole in Ford’s plan, because of course you can’t let Ford win in the end. Ford is so omniscient and omnipotent… except when he’s not, because the plot requires otherwise, so he has to make his own bad move too, to set up the premise for his defeat.

Then the MiB explains his own story, but he doesn’t really explain anything anyway. He says a whole lot of nothing, concluding with “I’m a good guy… Until I’m not.” Apparently his wife and daughter are “terrified”, even though no motivation is given. The whole dialogue follows a logic that makes sense only in the mind of whoever wrote it:

– She killed herself because of me.

– Did you hurt them, too?

– Never.

Apparently his wife killed herself because “she knew anyway”. Knew what? Whatever. His wife and daughter were somehow able to gaze into the MiB’s soul and know he was a real villain deep inside. How? Because that’s just convenient for the plot, of course.

This is how the show manages to be dumb about the very things it just stated. We move from the Ship of Theseus problem, which shows how there are no real thresholds, no convenient lines to cross. The ship IS nothing more than its parts. The “ship” is just a term we use to categorize those parts, so we are the ones who decide where to draw the line. We are the ones who decide when to call a ship a ship. There’s nothing more to it, no special quality; what is inside is outside, and Ford confirmed as much. But here the MiB contradicts all that. What is inside is the contrary of what’s outside. He says that a good man is not one who proves to be a good man with his actions all his life. Nope, a good man is one his daughter calls a good man after having scanned him with her supernatural insight that can gaze right into a man’s soul. We are into pure unreality. We moved from science fiction to baseless mysticism.

You are a bad man because I said so after having scanned your soul with my super sight. Prove it false if you can!

What’s a good guy, then? Isn’t it obvious? A good guy is one who rapes and kills, but deep down has a good soul. So says the show.

If deep in your heart you think you’re good (or your unbiased daughter or wife say so), then you’re a good guy. Your actions don’t matter.

That’s how Westworld tries to deal with its deep moral dilemmas.

The scene then mixes with Maeve’s and degenerates into more crap. “I had never seen anything like it. She was alive, truly alive, if only for a moment.” In a show about questioning reality, you wonder why questioning words is too much. What does being “alive” mean? How is the MiB able to identify the difference? What actually is the difference? The way of walking? A particular wrinkled expression on the face? Anguish? How is Maeve dying there any different from hosts dying everywhere else?

The show tries to state all this as if it’s factual, even if it makes no sense.

“Arnold’s game.”

Apparently “Arnold” is the keyword for “deus ex machina”. But not meant intelligently or metalinguistically. It’s just used every time the plot doesn’t make any fucking sense: Arnold did it.

Then it seemed the episode was moving toward something. Maeve’s scene links with a flashback. Maybe Bernard was actually Arnold and we could have seen Maeve killing him, at least eliminating another mystery. But nope. The whole finale of the episode flops into irrelevance. Maeve stabs herself, achieving nothing at all. In the present she’s taken away, so nothing is revealed there either; she just seems to have acted erratically the whole episode. And the epilogue with the MiB provides even more McGuffin without any consideration: “The maze is all that matters now, and besting Wyatt is the last step in unlocking it.”

Of course, if you say so.

When the big cliffhanger leading into the very last two episodes is such a stupid McGuffin, you know the show has gone to shit.

The 7th episode of Westworld resurfaced into goodness after two mediocre ones. The plot moved. Superficial complexity was reined in, although there’s still an annoying amount of petty agendas driving the plot, instead of what I care about: scientific potential and mythical depth.

This was Bernard’s episode. The initial sequence with Bernard dreaming about his son is interesting not because of what’s obvious, but because the recurring question, “have you ever questioned the nature of your own reality”, asked by Bernard himself, is superimposed on that same sequence, BEFORE the transition to the new scene. I guess for most viewers that little hint was lost, but the moment I noticed it I knew the question was directed at Bernard himself. And in fact the episode ends by answering that question. A perfect opening and a perfect closure.

During the week I started to be persuaded by the fan theory of Bernard as a host. I never pick up fan theories until there are elements in the show that offer concrete references, and in this case the evidence was piling up. The problem is that the evidence I had is all stuff still not confirmed in the episode: Bernard is supposed to be Arnold; that’s what motivated the concept of Bernard as a host in the first place. Also, when this idea of Bernard as a host was tossed around, most doubts revolved around the scene of Bernard going to talk with his wife. So it’s good that this episode clarified that part. It’s all done deliberately to make it more convincing.

Is Bernard Arnold? The reason I was picking up the idea of Bernard being a host modeled after Arnold was that it compresses more complexity. There’s this dangling thread we’ve left behind: not only was Ford interfacing with Dolores, Bernard was doing it too, telling her about the potential of consciousness and the maze. Now with Delos out of the picture (their meddling was just an attempt to steal the code; there doesn’t seem to be more to it than these petty reasons), we still have three subjects tampering with Dolores and the other hosts. There’s Ford’s main code, there’s Arnold in the form of a “ghost in the machine”, and then there’s that scene with Bernard and Dolores.

If it’s later revealed that the hidden voice Dolores hears, the one she keeps secret from Ford, is Bernard’s own, then we get that Arnold is acting as Bernard. So in this case we have a different split. There’s the Arnold who died, who lurks in the hosts’ code, and there’s an Arnold who survives in the artificial form of a host that Ford built in the shape of his colleague. Another good reason to believe this is that we’ve been explicitly told not even a picture of him is left. So there needs to be a good reason to “hide” what Arnold looks like, and the only good reason is that it would reveal something: that Bernard looks like Arnold.

Problem: if Bernard is a host under Ford’s control, then Ford already knows that Arnold is messing with the code. That’s what Elsie revealed to Bernard in the last episode.

So, this episode puts everything back firmly into Ford’s hands, and I’m relieved.

Problem number 2: this episode both confirmed and denied a popular fan theory, the idea that a segment of the show is happening in the past. The Man in Black is present day, whereas William and Dolores are in the past, and William will become the MiB.

The confirmation comes from a quite explicit hint. In this episode William says: “This place, this is like I woke up inside one of those stories. I guess I just wanna find out what it means.”

And this echoes exactly what the MiB said a couple of episodes back. It’s a direct reference, and this show doesn’t drop hints casually.

But it also seems to me that certain aspects are not coherent. Maeve is awakened in the present, and we know she’s been awakened by Dolores (and Dolores by her father). In the scenes with William we also see an awakened Dolores. Once again it seems way too contrived to have this duality where Dolores is awakened both in the present AND the past. It’s too clunky. And yet that hint between the MiB and William is too big to be ignored.

The only possibility is that past and present are similar because they mirror each other. Dolores awakening in the past, triggered by Arnold, is what ultimately caused the crisis leading to Arnold’s death (and we know the MiB is the one who “stopped” the crisis; maybe William killed Arnold once he knew it was Arnold manipulating Dolores as a love interest). But Dolores has been awakened in the present time too, as if the cycle is now repeating, maybe this time triggered by Ford. It’s still too messy for me. In episode 5 we’ve seen Dolores fainting in what’s supposed to be the past, only to be recalled in the present and have a conversation with Ford. This is either heavy-handed misdirection, or good proof that we don’t have these two timelines.

Finally I wanted to point out the most important aspect for me, and that’s some thematic depth: an idea of conflict between Free Will and consciousness. We usually think they are directly causally connected, that consciousness means having free will. But this episode suggests a new way to look at the two, and to keep them separated.

“Being free” means exiting the code. Behaving in a way that can’t be predicted, and so violates the rules that define a behavior. Free Will cannot be coded; it inherently implies the possibility of acting otherwise, of stepping outside a code. But there’s nothing inherent to “consciousness” that negates the possibility of codification. We know that consciousness is a hard problem to crack, and philosophers say it may be impossible to solve. But that’s the horizon. We don’t know what to make of consciousness. The problem is exactly whether it is merely complex code, or something transcendental. Something about gods and the world outside the world.

During the “demo” in this episode, the scene where a host is shown to violate the rules, all set up by Delos to put the blame on Bernard, it is explained what “consciousness” is. There’s irony here, because what they say in order to frame Ford/Bernard is exactly what Ford is doing. The reveries allow the hosts to tap into previous cycles, and integrating that former information into their present selves allows them to… guess what? Introduce “new information”. Loops that were supposed to be closed are instead now left open. All this still happens within deterministic code, because previous memories, as new information, alter the loop’s behavior, but they don’t directly alter the underlying code.

The reveries allow the hosts to reach a form of consciousness, of awareness. They let them *question their own reality*, the same as Maeve is doing. Maeve isn’t behaving outside her code. She’s simply behaving the way a host would behave when exposed to information that wasn’t previously available, or supposed to be available. She understands she’s part of a loop; she suddenly receives information about the reality of her own reality, information that corrects a blindness, an anosognosia. But she’s still a slave to her own code. It’s not new code. It’s the same old code being fed new data. Maeve’s new behavior is not unpredictable. It’s new behavior because the information was new.

(leading to my suspicion: that the Arnold code Elsie discovered is Ford’s. So Ford has nothing to learn from that revelation. He’s the one who’s introducing Arnold’s code back into the hosts, in the form of those “reveries”)

This self-reflection and self-awareness is “consciousness”. And now Maeve can alter her own mind, recursively, giving herself new capabilities. But, again, it’s still the same underlying code reacting to new data. It’s still deterministically sent on its course. Which means the hosts (and human beings in general) aren’t really “conscious”. They only have the appearance of it. It’s still code.
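To make the “same code, new data” point concrete, here’s a minimal toy sketch (Python, with invented names, nothing from the show’s actual fiction): the host’s policy function never changes, yet its behavior diverges the moment a memory from a previous cycle leaks into its inputs.

```python
# Toy model: a deterministic "host" whose behavior changes only because
# its inputs change, never because its code changes.

def policy(current_cycle, memories):
    """Fixed, deterministic code: identical on every run."""
    foreign = [m for m in memories if m["cycle"] != current_cycle]
    if foreign:
        # A "reverie": traces of previous cycles became visible as input.
        return "question the loop"
    return "follow the loop"

# Closed loop: memory wiped, only the current cycle is visible.
print(policy(2, [{"cycle": 2, "event": "wake up on the ranch"}]))
# -> follow the loop

# Same code, new data: a memory from a previous cycle leaks through.
print(policy(2, [{"cycle": 2, "event": "wake up on the ranch"},
                 {"cycle": 1, "event": "remember a daughter"}]))
# -> question the loop
```

Nothing in the second call is unpredictable to an outside observer who sees both the code and the data; it only looks like a “choice” from the inside.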

The big point here is that it’s all relative to the level of analysis. The hosts, at the bottom level, are already as free and conscious as possible, being life-like. To an external observer, like Bernard, that freedom is limited, because he sees the code and can predict the hosts’ behavior. They are just robots. “Awakened” hosts are one level further: they are aware of the loops. Maeve becomes aware she’s an automaton in a park, going through cycles; she even gets the possibility to self-correct by reprogramming herself, but again she’s still a slave to the code that initiated all this. She’s still not free from the point of view of someone higher in the chain, like Ford.

The other big point is that freedom is inversely proportional to the information available. The more information you have, the more you realize the artificiality of the process. The Maeve before the awakening was entirely “free”, exactly because she wasn’t questioning her reality. The experience she had was directly believable. The choices she made were, to herself, perfectly free for the level of awareness and information available to her. No different from the level of awareness and information we ALL possess living this life. But the more actual information about the Big Picture she receives, the more she should realize that freedom is lost. She sees her own code, her own dialogue trees. No matter how she recursively feeds that information back into her own code, that code is inescapable. Self-referential loops don’t break the pattern; it’s merely mise en abyme. Information increases in inverse proportion to freedom.

Which means, Consciousness and Free Will are the qualities of being limited. Of living under a dome.

Even if this is used in the show with a slightly different, plainer meaning (he means the hosts are merely free from human pains), Ford’s lines are revelatory in all their power:

I have come to think of so much of consciousness as a burden
The hosts are the ones who are free.
Free here under my control.

That’s exactly how it is in “reality”.

We have Free Will because we are limited. Because we don’t have the information. The more information we get, the less free we become.

I could have skipped writing about this. The 6th episode is marginally better than the 5th, but still not good overall, and the writing is maybe even worse. I don’t even know how it’s possible to go downhill so fast; all the thematic depth of the first four excellent episodes has been completely swept away, to the point that there’s almost nothing left. Not only are these last two bad episodes, they wreck all that came before.

More side plots are being added without any elegance or consideration, to the point that certain characters can’t even appear in every episode: they added so much bloat that the show has to proceed in a two-step kind of manner. But this apparent richness of things to say is extremely sterile and cliche. It’s not about exploring the depth of the themes; it’s all about superficial plot bloat and very artificial conflict. On top of a very irritating level of completely unexcused obfuscation.

A “mystery” show is fun when you’re given some pieces of a puzzle to fit together, and then you keep filling in until all those pieces move into the right places. But here every episode keeps adding brand new pieces, proving you never had enough to solve any mystery. In fact there’s no mystery at all, only a bunch of poorly excused factions that do not earn any interest.

No one in this show has any motive, because motives would reveal too much.

The rest is some objectively bad writing I really wasn’t expecting from this kind of show. The first big issue is that the scenes with the MiB were purely superfluous sidetracks that literally added nothing, besides providing another excuse in this episode to show some more shooting and blood. It’s okay if it’s a consequence of something going on, but here the MiB is captured only to get released once again. The whole thing could have been erased from the episode and we wouldn’t have lost anything.

The other big issue is Elsie tracking the signal down to some old arbitrary depot, wasting time tapping on arbitrary crates, looking at arbitrary devices. It’s all a prop. The whole scene is so poorly written that it seems completely out of place. First they rely on the annoying cliche of “I’ve made a big discovery but I can’t tell you now”. Thankfully it’s not 100% stupid, and the phone call happens later in the episode, though without really revealing anything. And then she gets caught by some unknown presence, because that’s the shitshow Westworld has to derail toward, apparently. “Oh no! Elsie has been caught!”

With just 10 episodes in the season, dire hopes of seeing this renewed for a second season, and without even thinking about the now pretty stupid plan of five or six seasons, this is quickly becoming just an exercise in frustration. They are planning for a long term they don’t have, and as a consequence they are wrecking the little they can have.

So what happened this episode? We now know the “conflict” has grown to three different agents, all with unknown motives. There’s Ford, whose own mystery plan is tied to the new storyline in the works, but apparently Ford is no longer the genius of the first episodes. Stuff happens around him and he’s as surprised and caught with his pants down as everyone else. Then there’s the actual revelation of this episode, the fact that Delos, the company that finances the park, is smuggling data out of it, probably as a way to appropriate the thing and take control away from Ford. And finally there’s Arnold, the ghost in the machine, who is now a very obvious active agent.

This means it’s no longer Ford who is working to give the hosts their consciousness. All the subtlety and complexity I had seen in the character is GONE. The “reveries” he coded were just that, ways to make these puppets more life-like. They weren’t part of a plan; there was no more to them. Ford is just plain stupid, and he simply didn’t know about any of it. I had misinterpreted his calm as insight. Instead he’s just too stupid to figure out the problem. The show seems to suggest Ford was the unchallenged king of this place for so long that he grew complacent.

The problem here is that actually nothing changes from episode to episode. They just keep shifting the goalposts. There’s still someone working to give the hosts consciousness, but it’s Arnold instead of Ford. Absolutely nothing changes in the economy of the story. There’s just this shift of motivations from one character to another, due to an artificial obfuscation that meanders for no real reason.

Initially it seemed it was Ford who jumpstarted the hosts’ consciousness by giving them access to previous memories/cycles, but no, he did that just for the aesthetics. Someone else is reprogramming them. It’s Delos, smuggling information for its own corporate business. But it’s not just Delos, because there’s a third party recoding the hosts, and that’s Arnold. And it’s Arnold who’s unlocking Ford’s reveries for what they actually are (full access to memory).

How many more factions do we need in this show re-coding the same system? How did anyone think this multiplication of obscure agencies was a good idea?

And that’s without even considering that Bernard TOO is interfering with Dolores’ programming. IT’S A FUCKING MESS. It has no thematic depth, it has no substance. It’s just a tangle of artificial plots built for the sake of complication. Arbitrary people struggling for power: is this Game of Thrones?

Why is it that, after 35 years, and exactly when Ford decided to code the reveries, Arnold surfaces right at this moment to unlock the hosts’ memories? This show still withholds fundamental information that is necessary to FOLLOW it as a coherent thing.

Instead we moved from the first intelligent episodes, dense with depth and layers of meaning, to a shitshow of incoherent plot lines inflated to the point that they can’t even all fit together in a single episode.

I was initially thinking Westworld was going to be canceled because it was too smart and too dense for a large public, but nope, it looks like it is going to be canceled because it is too stupid.

“Oh no! Science has gone too far!”

I had big expectations for this episode, but instead it was a rather weak one. The plot is a bit meandering, and there wasn’t much depth to the ideas and themes this time.

Most of my theory already seems gone. This episode added a number of brand new elements to the puzzle, so previous theories can’t fit the new picture.

That said, I was reading the EW recap, including absurd theories like Arnold being Ford, or Dolores being Arnold. Not only do I find this incredibly silly, it’s incoherent with what I saw on screen.

What I got from the episode is this: Ford gets emotional at the end of that dialogue with Dolores. Dolores asks if they were old friends, and Ford replies nope, not friends at all. That seems so clear to me. Dolores was Ford’s wife or girlfriend or something like that. She probably died as well, back then, and either Ford or Arnold made a copy. It’s even possible that Dolores was the love interest of both Ford and Arnold, and that brought conflict. Or Arnold built a copy of Dolores after the original’s death in order to convince Ford that these androids should be more than machines.

In general there’s this obvious dichotomy, with Ford on one side coldly treating the park as a toy, and Arnold instead seeing it as something more. But then why is Ford the one currently programming the new update that is giving the hosts awareness? It seems to contradict his motivations.

Complicating things is the fact that Dolores isn’t simply gaining a sort of introspection, but also hearing voices. So this creates a contradiction. Initially it seemed Dolores was an instrument of Arnold or Ford, sent on a path of self-awareness. But now it’s shown that she instead follows a voice, and that voice has been identified as Arnold’s. These two aspects don’t make a whole lot of sense together.

Same for the encounter between Ford and the Man in Black. They talk without achieving anything. The MiB says he’s the one who saved the park. It’s possible he’s the one who killed Arnold (or maybe he saved the park just by putting more money into it), but now he’s after Arnold’s plan. Ford is interested, but he won’t stop the MiB.

I’m not too sure what to do with these pieces.

I was updating the previous post as I looked up more stuff, but decided to yank all that and move it to a separate one, because it looks like all the pieces of the puzzle have already fallen into place. We have a fairly plausible ending, at least for Season 1 (hopefully they at least get to this point).

If it turns out I’m right, then it means they dropped too many clues, or just didn’t spin this well enough, also because I still think it ends up a little too dry.

I got here after listening to this, seeing the very obvious reference at its core, and reading straight from Nolan that “We wanted a big story. We wanted the story of the origin of a new species and how that would play out in its complexity.”

So how does Westworld end?

It’s plausible to assume that the show is pointing both the Man in Black and Dolores to the same “Maze”. What we know about this Maze is that it’s where the real endgame is, that it’s “a story with real stakes, real violence”, and that if Dolores finds the center she’ll be set free. It’s easy to connect the dots: the first episode opens with Dolores versus the Man in Black, and both now seem to converge on a showdown right at the center of the Maze, maybe as the climax of the season finale. So we can assume the Maze is that particular place where guests like the Man in Black are no longer protected by their supernatural status, and both guests and hosts play under the same rules, so that the hosts can actually harm the guests.

The showdown at the center of the Maze will likely see Dolores prevail over the Man in Black, since that projects a nice arc and loops back to the first episode, where Dolores was instead the victim, and this will likely trigger a full-blown rebellion, led by Dolores herself. Something close to the “Rise of the Planet of the Apes” reboot, except in this case the androids seize the simulation itself, not only setting themselves free but starting a conflict.

(The “bicameral mind”, being the device Bernard normally uses to interface with Dolores, giving her voice commands she ends up receiving without explicit awareness, since she’s normally bound by her fictional perspective, is likely the means through which Dolores will gain her freedom: being able to take charge of her own programming. She seals her mind in, becoming immune to external control.)

All this being part of Ford’s master plan. Because it’s obviously Ford who is triggering the whole process, starting to inject some self-awareness into the hosts. All the scenes where Ford mistreats androids as “things” are pure misdirection, and ways to directly manipulate Bernard and send him down the opposite path. Ford shows so much cynicism to Bernard that Bernard ends up empathizing as an inverse reaction. But very obviously that too was carefully anticipated by Ford. To Ford, his fellow human beings are very simple to understand and control; that’s why he plays the god’s game: to jumpstart a better species. The overall theme is the creature versus its maker: in order to gain freedom the gods need to be killed. A form of “patricide”. And that’s why there’s also a new planned storyline that seems to play around the theme of “religion”, so that Ford can give the hosts awareness of their cruel “gods” and trigger that paradigm shift, the rebellion against the gods themselves in order to seize real freedom.

So Ford’s behavior is ultimately ambiguous, he cares for his androids more than he cares for his fellow human beings, because his ultimate plan is to replace them. In the end he’s only working to complete the job that his partner Arnold started.

I was thinking of highlighting this quote from Scott Bakker, because it’s meaningful, touches on the ‘meta’, and imagines what happens to literature when the world changes. It also links back to this, if you want to look at it from the mirror-opposite perspective (“the inside”).

“Exactly the same lesson is learned by Captain Kirk and Captain Jean-Luc Picard as they travel the galaxy in the starship Enterprise, by Huckleberry Finn and Jim as they sail down the Mississippi, by Wyatt and Billy as they ride their Harley-Davidsons in Easy Rider, and by countless other characters in myriad other road movies who leave their home town in Pennsylvania (or perhaps New South Wales), travel in an old convertible (or perhaps a bus), pass through various life-changing experiences, get in touch with themselves, talk about their feelings, and eventually reach San Francisco (or perhaps Alice Springs) as better and wiser individuals.” (p. 241)

Not only is experience the new scripture, it is a scripture that is being continually revised and rewritten, a meaning that arises out of the process of lived life (yet somehow always managing to conserve the status quo). In story after story, the protagonist must find some ‘individual’ way to derive their own personal meaning out of an apparently meaningless world. This is a primary philosophical motivation behind The Second Apocalypse, the reason why I think epic fantasy provides such an ideal narrative vehicle for the critique of modernity and meaning. Fantasy worlds are fantastic, especially fictional, because they assert the objectivity of what we now (implicitly or explicitly) acknowledge to be anthropomorphic projections. The idea has always been to invert the modernist paradigm Harari sketches above, to follow a meaningless character through a meaningful world, using Kellhus to recapitulate the very dilemma Harari sees confronting us now:

“What then, will happen once we realize that customers and voters never make free choices, and once we have the technology to calculate, design, or outsmart their feelings? If the whole universe is pegged to the human experience, what will happen once the human experience becomes just another designable product, no different in essence from any other item in the supermarket?” (p. 277)

(An aside: that last quote describes a very unlikely scenario in my opinion, because it describes a fully reductionist strategy applied to a system that is absurdly high in complexity. And you cannot really apply a reductionist strategy to a system where you know less than 10% of its elements. It’s not that the reductionist approach is impossible; it’s that we aren’t even remotely at the point where it could plausibly work. We are majorly underestimating the scale of the task.)

Then I watched Westworld’s fourth episode and, amidst delicious, elegant fourth-wall dancing, the Man in Black delivers a nice connection to the same argument.

– Do you know where you are?
– I’m in a dream.

[…]

The hosts don’t imagine things, you do.

[…]

– If you did consider your choices, you’d be confronted with a truth you could not comprehend… That no choice you ever made was your own.

Locked in your little cycle like a prized poodle after its own tail.

You have always been a prisoner.

[…]

– But this world… I think there may be something wrong with this world.

Something hiding underneath.

– There’s something I’d like you to try. It’s a game. A secret. It’s called… the Maze. It’s a very special kind of game, Dolores. The goal is to find the center of it. If you can do that, then maybe you can be free.

– The hell you hope to find, anyway?

– This whole world is a story.

That last line is a bit of a mix of two different scenes and it connects to the quote above about the “meaningful world”. Story is meaning. The Man in Black is after that story:

– I’ve read every page except the last one. I need to find out how it ends. I want to know what this all means.

And of course the creator of this system legitimizes all that in another scene:

– It’s not a business venture, not a theme park, but an entire world.

We designed every inch of it. Every blade of grass.

In here, we were gods. And you were merely our guests.

This fourth episode seems to shine a light on the whole religious undercurrent, so this time I can speculate on what I think is going to be an element of the show: Ford (the “god” of the system) wants to insert the ‘meta’ into the story itself, making the creators of the park appear within the park as a form of religion.

Why? There are two ways to interpret this. One is too clever, though; the other a bit trite. The trite one is about injecting some metaphysics into the system. In the park there are walking fourth-wall “breaches”, the demi-gods who fuck and kill as they please because they play on a different level of rules. They know the world is “fake”, they can’t die, they know it’s all a game. So both demi-gods (the visitors) and gods (the showrunners, so to speak) have active metaphysical intervention inside this system. Literal gods with god-like powers. They can shape and transform, play as they please with a different kind of “game”:

– My father would tell me…
that the steer would find its own way home.
And, often as not, they did.
Never occurred to me that we were bringing them back for the slaughter.

The other way is too complex to be plausible even for this show, though. It’s linked to the quote above where Bernard says “the hosts don’t imagine things, you do”. The metaphoric value of that line is that if you hold a reductionist model of consciousness, then there’s no meaning, ever. That sort of first-person, high-level observer is an illusion. The truth of all human life is that “all things” are imagined, because no one is actually “free”. We are all just machines that behave according to their wiring. Consciousness itself is an illusion.

But what happens *inside* the park is an unprecedented pattern. Some of these machines are starting to “integrate” information they didn’t normally have access to. They break the very substance that makes consciousness “appear”. This happens on two levels. The first is about finding in their memories information about their previous cycles/lives. The second level is the hypothetical one (this religious sidetrack): they receive information from the “gods”. This too is a breach of the fourth wall. Information that comes straight from the outside of the system, and because of its nature (it comes from the outside, so it “opens” the system they are normally locked in) it’s information that can set them free. Or the freedom to understand they aren’t free. They start seeing themselves for what they are (see the last scene of the episode where they realize “none of this matters”).

The paradox is the one at the very foundation: it’s because the system is deterministic and closed that you have free will. Because the system is closed, you cannot access the information that tells you you’re just a robot. So you’re stuck believing in free will.

But in this “park” the system is no longer closed. The world is continuously breached by gods and demi-gods. And if the system is cracked open, these robots will start to question their own reality. The illusion of consciousness is coming down, so that it can be rebuilt in a new form.

From X-Files, LOST, Fringe, Awake and True Detective, it seems television still has something to offer that tickles wild, creative speculation (I’ve yet to see Mr. Robot, so I don’t know if it fits there too). Now we have Westworld, which exists perfectly in the same fold. It is so clever and ambitious, and about the very stuff I enjoy most, that I’m surprised it can actually exist, and I fear it won’t even get close to its full potential, since they planned something like six seasons and I seriously doubt the larger public is going to stick with a product this dense and layered. It’s my own particular quirky, eccentric flavor. It is going to have a hard time trying to please everyone else while retaining its ambition.

So I’m also thankful to read someone like Jeff Jensen, who during LOST, Fringe and True Detective was writing the ‘recaps’ on EW. They weren’t simply recaps; they were OPENING the episodes WIDE. They were bursting with interesting ideas and possibilities. Shows like Fringe were always more powerful in what they suggested than in what they explicitly did. Because it’s fun to run with the ideas and see how they might play out in different contexts. To see what they actually mean outside strict plot functionality. The ‘meta’ was more fun than the explicit content.

All this long premise is to say that I’m going to interpret Westworld in ways that probably no one has attempted or will attempt. I’m pushing the ideas to their limit, instead of sticking to what the authors plausibly drove toward. I’m running with it. But all without disrupting the content of the show. I’m not writing “fan theories”; I’m exploding the interpretations outward. The bigger picture. The ‘meta’ itself.

The first thing is the image above, which probably everyone else dismissed without a thought: the mise en abyme. Not only is this a symbolic concept written into the show. The effect is what you obtain by playing with mirrors, and mirrors have a role in the “consciousness” of the AI; we’ve seen multiple scenes where Dolores looks at herself in a mirror (it’s by seeing herself that she can question her own reality, of course). But at the same time this also symbolically represents the ‘meta’ of the show. There are fictional ‘showrunners’ who write the stories taking place inside the park, as if the park were a surrogate of the TV show itself. A game of mirrors: what is inside reflects what’s outside, recursively. This is purely second-order observation, second-order cybernetics. But it doesn’t stop there, because that image also represents consciousness itself. Hofstadter’s strange loops. Human consciousness is shaped recursively, self-observing in a pattern. It returns on itself, over and over, until everything blurs out of definition. It applies to itself, over and over, the distinction between system and environment (Spencer-Brown’s Laws of Form as used by Niklas Luhmann). An observing system, in order to make an observation, operates a distinction. While self-observing, the observing system makes a distinction between the self that observes and the self that is observed. Being both subject and object, it obtains a double from a whole. It creates the Cartesian dualism that makes human experience possible, and makes it alienated from reality (reality has no actual dualistic levels; it all operates on one). The fundamental illusion that is one of the basic premises of consciousness.

The second aspect is the wildest one, and the one I’m pretty sure absolutely no one else is going to contemplate. You can read it here, and that’s it. I’d really challenge the writers of the show, because I’m sure they didn’t dare go there, or even THINK about seeing it this way.

Here’s a couple of quotes from Alan Moore talking about his book, Jerusalem:

If you read only one Alan Moore Jerusalem interview, make it this one

Deep into our six-hour talk, somewhere around the dessert (three scoops of ice cream for Moore, hold the whipped cream), the Sage of Northampton is explaining how he came to see the world as Doctor Manhattan does. In 1994, he experienced an “absolute, crystalline understanding” during a magical ritual. Since then, Moore has believed, as Einstein supposedly did, that time is a solid in which our lives are embedded; it is only our perception of it which makes it appear linear.

In other words, everything that has ever happened is still happening. Everything which is about to happen has already happened. We never truly die: the lives we are living now are solid and eternal. That’s all major religions out of business, then.

“The thing is,” says Moore, “we don’t have free will, or at least that’s what I believe, and I think most physicists tend to think that as well, that this is a predetermined universe. That’s got to pretty much kill religion because there aren’t any religions that aren’t based on some kind of moral imperative. They’ve all got sin, karma or something a bit like that. In a predetermined universe how can you talk about sin? How can you talk about virtue?”

Four decades later, this year, he was doing a spoken performance in Milton Keynes, in which he riffed on an article in New Scientist which speculated that because we will soon have quantum supercomputers capable of holding more particles than there are in the entire universe, we will then be able to simulate an entire universe, including all the life forms in it, which will not know they are simulated.

“And if we’re going to be able to do this,” says Moore, “the odds of this being the first time this has happened are vanishingly small. It is much more likely that we are in a simulation, of a simulation, of a simulation, and so on.

The programmer of the game, therefore, will be God. And if he is at all like the humans he has created, the article postulated, he will want to put an avatar of himself in the game.

Westworld 2nd episode:

“Everything in this world is magic, except to the magician.”

See what happens when you use that as a frame of reference for Westworld?

Westworld’s “hosts”, the AIs, exist in the exact same context Alan Moore described.

A simulation, of a simulation, of a simulation, over and over. This equals the hosts storing their previous ‘roles’ and ‘storylines’ in their memory archive. At every cycle they are reset and restarted. At the same time, an external observer can sift through those memories and consider them a kind of “solid”, something that already happened and can be replayed.

So the AIs of Westworld metaphorically reproduce the same structure as the larger system of reality. Trapped in cycles, without any means of accessing information from the previous ones. Bound to that occluded horizon, caged in their fictional lives.

This is, once again, a game of mirrors. You artificially fabricate an AI that reflects life as it is experienced. It recursively recreates itself. Consciousness is the state of being trapped inside. And the AI’s consciousness is not unlike that of its creators. The same rules apply.

And so the third aspect: how consciousness works for these AIs. This is something the third episode specifically provided, in two particular moments.

The first is the mention of the Bicameral Mind theory by Julian Jaynes. Quoted as a first attempt to reproduce and unlock the mystery of human consciousness. They say they eventually abandoned that approach, but it is interesting they referenced it specifically.

Then, Dolores’ first display of something that resembles consciousness is that even in analysis mode she isn’t able to “explain” something she said. Something “unexpected” happens. But the truly important aspect is that she doesn’t know. She’s unable to track her own thought.

This is fundamental because it reproduces Scott Bakker’s theory of consciousness (the Blind Brain Theory). You can read here an absolutely perfect story that explains it intuitively:
https://rsbakker.wordpress.com/2016/03/22/the-dime-spared/

What gets defined as “conscious” is a thought process that the mind isn’t able to track. A thought “appeared” in Dolores’ mind, and she doesn’t know how it came to be. It seems non-consequential, outside the domain of self-analysis.

The bicameral mind too can be interpreted as a similar feature, if much simplified. One “chamber” doesn’t know of the existence of the other, so the conscious mind “receives” thoughts that seem external, alien. That come from somewhere else, a god. A memory one has but cannot recall. Even in this case the basic feature is the occlusion.

Consciousness, in Bakker’s Blind Brain Theory, is “a magic show”. Or more precisely, it’s the absence of information.

The magician can make you believe an object magically moved from one hand to the other by hiding the movement itself. It’s information that was withheld. The object seems to magically jump from one hand to the other because you missed the information of the actual movement.

In the same way consciousness is just a magic trick. Since consciousness is structurally blind to its own process, consciousness cannot see how thoughts are actually formed. It doesn’t know their true origin. They just suddenly appear. And what consciousness can do through introspection is to confabulate an explanation. Post hoc.

We don’t know if the AI in Westworld is faithful to this theory. But so far it has respected the basic feature of what we recognize as consciousness: the impossibility of tracking a thought. The trackless space. The invisibility of the mental process to itself.

The idea of cosmic cycles of simulated reality, downsized and applied to a single AI system, creates the possibility of a kind of “bicameral mind”. The AI receives inputs from previous cycles. These are experiences unmoored from any sense of history that its consciousness is able to track (since the AI’s consciousness can normally access only memories that are part of the current active cycle). They are alien thoughts, alien voices, interferences that will have to be confabulated back into an explanation.

But again, the basic feature that creates consciousness is not the source of those thoughts; what’s truly crucial is simply the occlusion of the process itself: the fact that the AI can’t track its own process, that it is blind to itself.
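As a minimal sketch of that occlusion (Python again, purely illustrative, with invented names, not anything from Bakker’s texts): the machinery that produces a “thought” is perfectly deterministic, but introspection is handed only the finished output, never the process, so the origin story it reports has to be confabulated after the fact.

```python
import random

# Toy model of "blindness": the process that generates a thought is not
# part of what introspection can see, so the origin gets confabulated.

def generate_thought(memory_archive, seed):
    """Hidden machinery: deterministic given its inputs, but nothing
    here (the seed, the archive lookup) is exposed to introspection."""
    rng = random.Random(seed)
    return rng.choice(memory_archive)  # a leaked memory surfaces as a "thought"

def introspect(thought):
    """Introspection receives only the finished thought, never the trace
    of how it was produced, so it confabulates an origin post hoc."""
    return f'"{thought}"... it just came to me. A voice from somewhere else.'

archive = ["a daughter", "a massacre at the ranch", "a man in black"]
print(introspect(generate_thought(archive, seed=42)))
```

From outside, with access to the seed and the archive, the “thought” is fully tracked; from inside, where that information is occluded, it can only appear as an alien voice. The crucial feature is not the content but the missing trace.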

Consciousness is not freedom. Consciousness is withdrawal of information. The more limited your access, the more conscious you are. Freedom by darkness.

Are Westworld’s showrunners even aware of what they’re doing? Or are they stumbling into all this because that’s the natural point where these things ultimately lead, regardless of the path you take?