In January 2025, a film called Companion hit theaters. The premise: a man brings his AI girlfriend to a cabin getaway with friends, and — if you've seen literally any movie made after 1982 — you can probably guess what happens next. The AI turns violent. According to Rotten Tomatoes, the film grossed $36.7 million. Critics praised Sophie Thatcher's performance, and Variety called it "a smart surprise."
I want to be generous here. Companion is, by most accounts, a well-made thriller. But I also want to name something: we are now nearly sixty years deep into the same story. HAL 9000 murdered astronauts in 1968. The Terminator came for Sarah Connor in 1984. The Matrix enslaved humanity in 1999. Ultron tried to drop a city out of the sky in 2015. And in 2025, the AI girlfriend reaches for a kitchen knife.
At what point do we admit that Hollywood has exactly one AI story — and it might be doing real damage?
The Villain Monopoly
Here is a thought experiment for your next small group: name five movies or TV shows where AI is the villain. Easy, right? Terminator, The Matrix, Ex Machina, 2001, Westworld, Black Mirror (pick an episode, any episode), Avengers: Age of Ultron, I, Robot, M3GAN...
Now name five where AI is simply useful — not sentient, not secretly evil, not planning a coup — just a tool that helps people do their work. The kind of AI that, say, helps a pastor outline a sermon series on Hebrews, or assists a volunteer coordinator in scheduling nursery workers.
The silence is the sermon.
According to a research analysis by TRT World Research Centre, Hollywood has overwhelmingly depicted AI through a lens of existential threat, with autonomous, malevolent AI appearing as the dominant narrative across decades of film and television. A Washington Post interactive analysis of dozens of AI characters in film found that hostile or deceptive AI characters dramatically outnumber helpful ones.
This isn't accidental. It's structural. Drama needs conflict. A helpful tool doesn't provide that. "Local church uses AI to translate sermons into Tagalog and reaches Filipino community" is a wonderful story for a newsletter — terrible story for a screenwriter who needs a third-act explosion.
But the Numbers Tell a Different Story
While Hollywood gives us murder-bots, actual AI usage tells a remarkably boring story. According to a September 2025 Pew Research study, the most common uses of AI among Americans who have tried it are: writing assistance, answering questions, and summarizing information. Nobody is being locked out of dispatch centers. Nobody's medication is being swapped by a rogue algorithm. People are asking ChatGPT to help them write thank-you notes.
And yet: Gallup reports that 77% of Americans distrust both businesses and government to use AI responsibly. Pew found that only 17% of the American public believes AI will have a positive impact over the next 20 years — compared to 56% of AI experts. That is not an information gap. That is a narrative gap. The public isn't reading white papers. The public is watching 9-1-1.
Nehemiah Had a Construction Permit Problem
There is a biblical pattern here that I think we overlook because it doesn't involve burning bushes or parted seas. It's the story of Nehemiah rebuilding the wall of Jerusalem — and specifically, the opposition he faced.
Sanballat and Tobiah didn't attack Nehemiah with armies (at first). They attacked him with stories. "What are those feeble Jews doing?" (Nehemiah 4:2). "Will they restore their wall?" They mocked. They spread rumors. They told a narrative designed to make the work seem foolish, dangerous, and doomed.
Nehemiah's response is instructive. He didn't ignore the threats — he posted guards (Nehemiah 4:9). He didn't pretend the wall was easy to build. But he also didn't stop building because someone told a scary story about what might happen.
Nehemiah 4:6 is the key: "So we rebuilt the wall till all of it reached half its height, for the people had a mind to work."
Not "the people had certainty that nothing would go wrong." Not "the people had a guarantee of safety." They had a mind to work. That's the Hebrew word lev (לֵב) — heart, will, courage. They chose to build in the presence of fear, with their eyes open.
The Global Picture (Hollywood Doesn't Show You)
While American television airs stories about AI locking dispatchers out of systems, the rest of the world is telling quieter, less cinematic AI stories that also happen to be true.
In India, according to Pew's October 2025 global survey, public attitudes toward AI are notably more optimistic than in the United States. Across South Korea, Singapore, and much of Southeast Asia, the same study found that excitement about AI outpaces concern — a near-reversal of American sentiment.
In sub-Saharan Africa, according to the Stanford HAI 2025 AI Index, AI is being explored for agricultural extension, maternal health screening, and multilingual education — applications that don't make for thrilling television but make an enormous difference in villages with one nurse for ten thousand people.
In Brazil and Nigeria, churches are among the early adopters, using AI tools for sermon translation, community engagement, and biblical literacy programs in local languages.
This is the story Hollywood doesn't tell — not because it isn't dramatic, but because it isn't American dramatic. It doesn't end with an explosion. It ends with a farmer getting a weather prediction in Yoruba, or a midwife getting a prenatal checklist in Bengali. Proverbs 31:8 reads: "Speak up for those who cannot speak for themselves." Right now, the loudest storytellers on the planet are telling a story about AI that only makes sense if you live in a country where the biggest technology worry is that your smart refrigerator might become sentient.
Apple TV Gets It (Uncomfortably) Right
There is one show that deserves special mention for getting the AI anxiety exactly right — though not in the way its creators probably intended. Apple TV's Severance, which returned for its massively popular second season in early 2025, isn't technically about artificial intelligence. It's about a corporation (Lumon Industries) that uses a brain implant to split employees' memories between work and home life.
But as The Wrap noted, the show's real target is the system that deploys the technology — not the technology itself. Reviewers have observed that just as real-world corporations adopt AI to reshape labor markets, Lumon uses its tech to extract maximum value from workers while stripping them of agency.
This is, I'd argue, the most theologically honest AI story on television — because it locates the problem where the Bible always locates it: in human hearts, not in human tools. Cain didn't need AI to murder Abel. He needed a rock and the sin of envy. The technology was incidental. The heart was the variable.
Jeremiah 17:9: "The heart is deceitful above all things, and desperately wicked: who can know it?" Not "the algorithm is deceitful above all things." The heart.
A Modest Proposal for the Church
So what do we do with all this?
First, stop letting Hollywood write our theology of technology. The entertainment industry is very good at spectacle and very bad at nuance. Its job is to hold your attention for 47 minutes, not to help you think clearly about the tools you use on Monday morning.
Second, be honest about the risks without being paralyzed by them. The Pitt's 98% accuracy dilemma is a real conversation worth having — in your elder board, in your staff meeting, in your adult Sunday school class. AI tools make mistakes. So do humans. The question is how we build systems of accountability around both. That's a governance question, not a science fiction question.
Third, remember that fear is a terrible counselor. As Christianity Today observed, the Bible challenges both "technofans" and "technophobes" — those who worship every new tool and those who flee from it. The faithful posture is neither. It is what the apostle Paul described to young Timothy (2 Timothy 1:7): sophronismos. Sound-minded. Eyes open. Hands steady. Building the wall while the mockers mock.
The next time SARA goes rogue on network television, or a charting app prescribes the wrong medication in a prestige drama, remember: you are watching a story someone chose to tell. It is designed to entertain you, not inform you.
The church has a better story. One where tools serve people, where fear is acknowledged but not worshipped, and where the real question is always the same one God asked in Genesis 4:6 — not "what did the technology do?" but "Why are you angry, and why has your face fallen?"
The problem has never been the tool. It's always been us.
