Solid first episode - which for this show means great TV. My guess is it starts off "straightforward" only to pick up steam (and confusion) as we move forward.
I realized a key underlying point last night, though, which maybe we've already discussed, but I don't recall.
Questions like:
1. Is it murder to kill a robot? I mean, it's just following code and instructions.
2. As such, and as just mentioned, you can't hold a robot accountable for its actions (it can't commit crimes).
Two poignant lines from Dolores this week (paraphrased):
1. Something to the effect that humans control and/or use the species they believe are beneath them, less intelligent, etc. This behavior can be seen in everything from our dominion over animals to the justification of slavery to the dehumanization that leads to atrocities like the Holocaust (what does it matter if we only kill animals, subhumans,... robots?).
2. The base driver of all human behavior is the will to survive, the willingness to cross nearly any line to love.
Well, the underlying existential battle here is not only "what is real" but also "what is life" ("what is real is that which can't be taken away" was another great line).
So, the idea is you can't kill a robot - it's not alive, just reading code. And the little boy (an early rendition of the host boys) who was killed seemed to be an old enough version not to come close to consciousness.
As the robots evolve, as we're seeing now, have they crossed, or will they cross, the line into sentient beings? Meaning, would it then be immoral to kill them, and, furthermore, could they even overcome humans to become a "higher level of being"?
MY POINT and, imo, the true underlying thesis behind Westworld:
We are no different than the robots. In fact, we ARE merely robots, albeit ones built of flesh and blood. To my first points, we are programmed to react to certain situations and stimuli in the same way, bound by a desire to control those we see as lower, which in turn ties into our underlying drive to survive. Above all else, morality be damned.
Just like Dolores in her new state of consciousness.
So let me ask again... is it OK to kill a robot? And what culpability does a robot bear for acting according to how it was programmed?
Because if we are no different than the robots, just a sentient, higher-intellect version of one, then the questions we ask and the judgments we make about Dolores et al. are 100% mappable to the same questions about us as humans: our behavior, our morality (and morality in general, if it even exists), and our culpability for living according to the hardwired programming that is our brain.