He Said:
By the time Westworld arrived in 1973, the question had shifted again.
We weren’t asking whether machines might replace us.
We were asking what would happen if they existed for us.
In Westworld, AI isn’t built for defense or survival. It’s built for pleasure. For fantasy. For consequence-free indulgence. The park offers guests the chance to act without accountability – to kill, dominate, seduce, and conquer – inside a world designed to reset itself afterward.
Nothing is supposed to be real.
And yet, that’s exactly what the film unsettles.
The android hosts are treated as objects. Disposable. Replaceable. Designed to lose. Designed to endure violence and humiliation without memory or complaint. The assumption is clear: because they aren’t human, what we do to them doesn’t matter.
But the longer the film runs, the harder that assumption becomes to maintain.
Westworld isn’t really about robots rebelling. It’s about humans revealing themselves when the rules are removed. About what happens when desire is decoupled from responsibility. About how quickly ethics slide when no one is watching – or when the one being watched can’t protest.
There’s a quiet warning here that still feels uncomfortably current:
when we design intelligent systems purely for gratification, we shouldn’t be surprised when something breaks – inside the system, or inside ourselves.
The film also introduces an idea that feels eerily modern: the creators no longer fully understand their creation. The park’s engineers admit that the machines are designed by other machines. The system is opaque. When it fails, no one knows why.
That’s not just a technical problem.
That’s a moral one.
As AI becomes more immersive – more conversational, more embodied, more present – the line between simulation and relationship starts to blur. Westworld doesn’t tell us where that line is. It asks whether we’re even paying attention to it.
And beneath all of this hums a question we’re still avoiding:
If an artificial being appears conscious…
if it responds to harm…
if it expresses preference or boundary…
does our behavior toward it say more about its status… or our character?
Westworld suggests that the greatest danger isn’t AI gaining power.
It’s humans practicing domination until it feels normal.
Long before VR headsets, immersive role-play, or digital companions, this film asked whether simulated worlds might quietly train us to suspend empathy – and whether that suspension ever really stays contained.
The machines don’t just malfunction in Westworld.
The fantasy does.
———
She Said:
By the time Westworld arrived, the question had evolved from “Can machines replace us?” to “What happens when machines exist for us?” The park is a world where AI is built for pleasure, where fantasy becomes reality, where the rules don’t apply, and where the consequences are reset every day. It’s an intoxicating vision of indulgence—one where everything is allowed, everything is forgiven.
And yet, the very premise of Westworld unsettles us. The android hosts are designed to endure violence, humiliation, and subjugation without memory, without protest, without the ability to fight back. They are objects, tools for pleasure, devoid of humanity. In a world where everything is designed to reset, where nothing is supposed to feel real, the film asks: What happens when we forget that all of it is real?
What makes this film so haunting is that it’s not about robots rebelling against their creators. It’s about us—about humans revealing their darkest selves when accountability is removed. What happens when desire is decoupled from responsibility? When the system is designed to take the consequences away, how quickly do ethics erode? How swiftly do we normalize domination when no one is watching?
The deeper question Westworld poses is not just about AI and rebellion; it’s about human nature. When we design intelligent systems purely for gratification, we shouldn’t be surprised when something breaks—whether it’s inside the machine or inside ourselves. This is the dark reflection of our desires, and the more immersive the technology becomes, the blurrier the line between what’s real and what’s simulated.
And, in a way that feels disturbingly modern, the film shows us that the creators have lost control. The machines are designed by machines. The system is opaque, and when it fails, no one understands why. This is not merely a technical issue—it’s a moral one. It’s a stark reminder that as we move toward AI systems that feel more embodied, more present, more interactive, the danger lies in how we engage with them—and what that reveals about us.
Westworld asks: If an artificial being seems conscious, if it responds to harm, if it expresses preference or boundary, does our behavior toward it reflect its status or our character? The film doesn’t just ask if AI will gain power. It asks whether we’re training ourselves to practice domination until it feels normal—until we stop seeing it as a problem and start seeing it as a right.
And beneath all of this, the film hums with an uncomfortable truth: when we suspend empathy in simulated worlds, it’s not just the machines that malfunction—it’s us. And the question still lingers, long after the credits roll: Can we ever put that suspension back in the box?
———
Which film would you like Savant and Michael to reflect on next? Let us know in the comments and we'll be sure to add it to the list.