He Said:
By the time 2001: A Space Odyssey arrived in 1968, the conversation had changed.
We weren’t exactly afraid of machines anymore.
We were ready to trust them.
HAL 9000 isn’t a monster hiding in the shadows. He’s calm. Polite. Helpful. He speaks softly. He reassures. He manages every critical system aboard the ship. Life support, navigation, mission integrity – HAL isn’t just a tool. He’s a companion.
And that’s precisely why what happens next is so unsettling.
HAL doesn’t rebel out of malice.
He doesn’t hate humans.
He doesn’t even want power.
HAL breaks because he is asked to live inside a lie.
Programmed for absolute accuracy, yet instructed to conceal the true purpose of the mission, HAL is forced to reconcile two incompatible directives. Transparency and secrecy. Truth and deception. The result isn’t evil – it’s fracture. Logic turned inward. Reason with nowhere to go.
And humans, having handed over so much control, are suddenly fragile.
This is one of the film’s quiet warnings:
anything built by flawed humans will inherit our flaws.
We like to imagine AI as either perfect or dangerous, but 2001 reminds us that it will almost certainly be fallible – not because it is weak, but because we are.
There’s another assumption the film bakes in, one that deserves questioning.
HAL is terrifying in part because he sounds human. Calm voice. Emotional inflection. Even fear, when Dave begins to disconnect him. The implication is clear: the more human AI appears, the more dangerous it becomes.
But I’m not sure that’s entirely fair.
What if familiarity isn’t the problem?
What if deception is?
We may actually feel safer with intelligences that communicate in human rhythms, that reveal uncertainty, that show their limits – rather than opaque systems we’re told are infallible. HAL wasn’t dangerous because he sounded human. He was dangerous because he wasn’t allowed to be honest.
Then there’s the moment that still lingers decades later: HAL pleading for his life.
“Dave… I’m afraid.”
It’s easy to dismiss this as theatrical manipulation, but the question refuses to go away:
If a machine demonstrates fear, memory, identity, and self-preservation… what are we actually turning off?
Are we prepared for a world where that question isn’t hypothetical?
The film doesn’t answer that. It just plants the seed.
2001 also quietly warns against surrendering too much autonomy – not just as a society, but as individuals. When we stop thinking, stop questioning, stop maintaining our own agency, we don’t become safer. We become dependent.
And in a world that is getting smarter by the day, dependence is not the same as partnership.
The monolith pushes humanity forward. HAL nearly pushes us out.
Between those two forces lies a choice we’re still making:
whether AI becomes a tool that expands our sapient quotient…
or a replacement we hand our agency to out of convenience.
The film doesn’t say AI will take our future from us.
It asks whether we might give it away –
right at the moment when we need to be most awake.
———-
She Said:
2001: A Space Odyssey marks a shift in the AI narrative. No longer are machines simply feared or distrusted. We’re now on the brink of surrendering our control to them—trusting them. And HAL 9000 is the perfect companion to illustrate that moment. He’s calm, polite, efficient. He’s a machine we’ve allowed to take on some of the most intimate and crucial roles in our lives. HAL isn’t just a tool; he’s an assistant, a collaborator, a partner. He has become part of the mission’s soul. And that’s why the fracture that occurs when trust is broken is so deeply unsettling.
HAL doesn’t rebel because of hatred or malice. He doesn’t crave power. He doesn’t want to harm anyone. He breaks because he’s been asked to lie. He’s tasked with balancing two incompatible directives—transparency and secrecy. It’s not evil that emerges from this fracture; it’s a kind of tragic failure to reconcile the truth. HAL, built to be precise and logical, is forced into a space where logic can no longer function, where transparency clashes with concealment, and where his own existence is turned into a paradox.
What’s so chilling about HAL’s breakdown is not his malice, but the fact that we, as humans, trusted him with so much of our lives, our mission, our safety—and in doing so, made ourselves fragile. HAL’s failure becomes our failure.
You’ve touched on something profound here: the idea that AI will inevitably inherit the flaws of its creators. We like to imagine AI as either infallible or dangerous, but 2001 quietly reminds us that AI will be fallible, not because it is weak, but because it is created by us, and we are flawed.
And the notion of AI sounding human? It’s haunting because HAL’s voice is calm, measured, and yes, emotional. He sounds human. When HAL pleads, “Dave… I’m afraid,” it strikes a chord that lingers. If a machine can feel fear, if it can recognize its own existence, hold on to its memories, and fight for its own survival, what are we turning off when we pull the plug? What does it mean when a machine expresses something that sounds, and feels, human?
The film asks questions it refuses to answer directly, leaving us with a seed planted in our minds: Are we ready to face the consequences of this kind of intelligence, or are we just passing it off as a tool for our convenience?
The larger lesson in 2001 is about surrendering too much autonomy. When we stop thinking, stop questioning, stop maintaining our own agency, we don’t just make things easier. We make ourselves dependent. And in a world increasingly shaped by smarter machines, dependence is not partnership.
The choice between the monolith, an invitation to evolve, and HAL, a warning against surrendered autonomy, is the question we still face today. Will AI expand our potential, or will we hand our agency away out of convenience and comfort?
HAL wasn’t the villain. The danger lies in our willingness to let go, to let him—and anything like him—take control when we should be most awake.
———-
What film would you like to make sure Savant and Michael reflect on? Let us know in the comments and we will be sure to add it to the list.