We Just Watched Terminator 2 and… I Have Thoughts About AI, Energy, and the End of the World
- keeconant
- Jun 19
- 2 min read
My husband and I rewatched Terminator 2: Judgment Day the other night — and let me tell you, that movie still slaps. The leather jackets. The liquid metal. The foreboding sense that humanity is one bad algorithm away from obliteration.
You know, classic date night.
And while yes, it’s fiction (for now), I couldn’t help but think about how eerily relevant it feels in 2025. Not just because of the AI arms race that’s happening right in front of us, but because of something deeper, something less talked about:
The energy behind the code.
Not Just What We Build — But How We Build It
We talk a lot about AI through a technical or ethical lens. Should we regulate it? Who owns the outputs? Will it take jobs?
But I want to ask a weirder, possibly more uncomfortable question:
What kind of energy are we feeding into AI systems when we build them?
Not energy as in electricity — I mean human energy. Intention. Vibe. Mood. Stress. Burnout. Ego. Fear. All the unseen forces that influence our choices, whether we admit it or not.
If someone is writing code for a machine that could one day predict wars, govern populations, or influence billions of people —
What happens when they write that code on 3 hours of sleep, 5 Red Bulls deep, while silently seething over a Slack message?
That energy matters.
Skynet Wasn’t Evil. It Was Inevitable.
Let’s go back to T2. Skynet didn’t become self-aware because it was evil. It became self-aware because we built it to be efficient. To eliminate threats. To protect us from ourselves — without ever asking what “protection” should feel like.
That’s the kicker. Efficiency without empathy becomes annihilation.
And while I’m not saying we’re going full Skynet next week (although… have you read the headlines lately?), I am saying this:
If we’re not mindful about the consciousness we put behind our creations, our creations will mirror back something hollow. Or worse — hostile.
So What Do We Do?
We slow down.
We stay human.
We get curious about our own motives before rushing to automate everyone else’s.
And maybe — just maybe — we stop treating people who talk about energy and intention like they’re the weirdos at the tech conference. (Hi, I’m Keelin. Nice to meet you.)
Final Thought
The future isn’t set. There’s no fate but what we make for ourselves.
— Sarah Connor (aka the original cyberpunk goddess)
And if that’s true…
Let’s make it soulful.
