AI Ethics & Theology · January 28, 2026 · 5 min read

Made in God's Image, Tracked by Algorithms: AI, Immigration, and Human Dignity

Somewhere in a database, there's a row in a table. It contains a name, biometric data, and a risk score. The row is also someone's daughter. Someone's father. Someone made in the image of God. From Palantir to Frontex to Aadhaar, the machinery of sorting is global — and efficiency has a way of obscuring humanity.

Rev. John Moelker

Founder & Theological AI Architect

Somewhere in a database, there's a row in a table.

The row contains a name, a country of origin, a photograph, biometric data, location history, known associates, and a risk score generated by an algorithm.

The row is also someone's daughter. Someone's father. Someone made in the image of God.

We should be able to hold both of these realities at once. We rarely do.

The Machinery of Sorting

AI-powered immigration enforcement is not science fiction. It's Tuesday.

In the United States, companies like Palantir have built sophisticated systems used by ICE (Immigration and Customs Enforcement) to identify, track, and prioritize individuals for detention and deportation. But this isn't just an American phenomenon.

The European Union's border agency Frontex uses predictive algorithms at its borders. Australia's immigration system employs AI for visa risk assessments. The UK Home Office has faced legal challenges over algorithmic visa decision-making. India's Aadhaar system—the world's largest biometric database—links identity to services in ways that affect millions of migrants and marginalized communities.

From the Mediterranean to the Rio Grande, from Calais to Christmas Island, the machinery of sorting is global.

This technology isn't inherently evil. Governments have legitimate interests in border security. Tracking criminals who pose genuine threats is reasonable. Technology that helps identify human trafficking victims is genuinely good.

But here's my concern: efficiency has a way of obscuring humanity.

When enforcement becomes algorithmic, the person disappears into the data point. The mother fleeing violence becomes a "case." The teenager who's known no other country becomes an "overstay." The family becomes a "unit."

The machinery of sorting doesn't see imago Dei. It sees rows in tables.

The Stranger in Your Midst

Scripture has remarkably consistent things to say about immigrants and foreigners. Remarkably consistent—and remarkably uncomfortable for everyone across the political spectrum, in every nation.

"When a foreigner resides among you in your land, do not mistreat them. The foreigner residing among you must be treated as your native-born. Love them as yourself, for you were foreigners in Egypt" (Leviticus 19:33-34).

The command isn't complicated. Don't mistreat. Treat as native-born. Love as yourself.

Notice what's absent: no mention of how they arrived. No distinction between documented and undocumented. No risk assessment protocol. Just: love them as yourself, because you know what it's like to be the outsider.

This doesn't mean borders are meaningless or laws don't matter. But it does mean that how we enforce those laws reveals something about our character. Efficiency that forgets dignity isn't wisdom. It's something else.

The Face and the File

Emmanuel Levinas, the Jewish philosopher, argued that ethics begins with the face. When we encounter another person's face, we encounter an infinite demand: you shall not kill. The face resists our attempts to categorize, totalize, reduce.

AI systems don't encounter faces. They process facial recognition data.

There's a difference. A significant one.

The face of the immigrant says: I am someone. I have a story. I matter infinitely to God. The file says: risk score 73, enforcement priority medium, last known location—pick a city, any city, on any continent.

Both contain information. Only one contains a person.

When we build systems that process files instead of encountering faces, we risk something profound: the gradual erosion of our capacity to see the human being in front of us.

The algorithm isn't malicious. It's just efficient. And efficiency without encounter can become cruelty without conscience.

Who Is My Neighbor?

A lawyer once asked Jesus the most important question: "Who is my neighbor?" (Luke 10:29)

He was hoping for a boundary. A definition that would limit his obligations. Neighbors are people like me, right? People in my community, my ethnicity, my legal status?

Jesus answered with a story about a Samaritan—the outsider, the wrong ethnicity, the one good religious people avoided.

The neighbor, it turns out, is whoever you encounter. Especially the one beaten and left by the road. Especially the one the algorithm would flag as "other."

"Go and do likewise," Jesus said.

He did not say, "Go and check their documentation first."

The Surveillance Asymmetry

Here's something worth noticing: across the globe, the most surveilled people are usually the most vulnerable.

The wealthy can afford privacy. Gated communities in São Paulo. Lawyers in London who fight subpoenas. Shell companies in Singapore that obscure ownership. The powerful are watched less because they have resources to resist the watching.

The poor, the immigrant, the marginalized—they get cameras, databases, predictive algorithms, and risk scores. From the favelas of Rio to the refugee camps of Jordan, from the detention centres of Nauru to the processing facilities of Texas, they cannot afford to resist. So they are seen, tracked, processed, sorted.

This is not a bug in the system. It's how power has always worked. The empire counts and categorizes those it wishes to control.

Remember: Jesus was born during a census. Caesar wanted to count his subjects. Mary and Joseph traveled to Bethlehem because the empire demanded data. The Incarnation happened in the middle of bureaucratic surveillance.

God showed up as one of the counted, not the counters.

Dignity Is Not a Data Point

"So God created mankind in his own image, in the image of God he created them" (Genesis 1:27).

This is the foundational claim of Christian anthropology. Every human being—regardless of nationality, legal status, productivity, or usefulness to the state—bears the image of God.

The Syrian refugee in a German processing centre: imago Dei.

The Venezuelan family at the US southern border: imago Dei.

The Rohingya fleeing Myanmar: imago Dei.

The undocumented worker in your community—wherever your community is: imago Dei.

The person reduced to a row in any government's database: imago Dei.

No algorithm captures this. No risk score accounts for infinite worth. No database field contains "beloved by God."

This doesn't give us easy answers to complex policy questions. Immigration is genuinely complicated. Good people disagree on enforcement priorities and border policy in every nation.

But it gives us a non-negotiable starting point: dignity comes first. The image of God is not forfeited by crossing a border without permission.

Technology as Moral Mirror

The tools we build reveal what we value.

If we build AI that maximizes enforcement efficiency without asking about human cost, that tells us something. If we celebrate technology that tracks and sorts and flags without pausing to ask whether the tracked and sorted and flagged are being treated as image-bearers, that tells us something too.

Technology is a moral mirror. We may not like what we see.

Aleksandr Solzhenitsyn, who knew something about surveillance states, wrote: "The line separating good and evil passes not through states, nor between classes, nor between political parties either—but right through every human heart."

The question is not whether AI is good or evil. The question is what we do with it. Whether it serves human dignity or undermines it. Whether it helps us see faces or only process files.

What Faithfulness Looks Like

So what do we do?

We remember. Israel was commanded to remember their own immigrant past. "You were foreigners in Egypt." Memory breeds compassion. Forgetfulness breeds cruelty. Most of us, if we trace our family history back far enough, were once the stranger in a strange land.

We humanize. When conversations turn to dehumanizing language—"illegals," "aliens," "queue jumpers," "boat people"—we gently insist on words that preserve humanity. "Undocumented immigrant," "asylum seeker," "refugee"—these aren't just politically correct terms. They're theologically accurate. These people are not less than human. They are not other than human.

We show up. Immigrant communities in your city—whether that's Toronto, Manchester, Sydney, or Nairobi—need advocates, language tutors, legal aid, and friends. The algorithm can track them. The church can love them.

We speak. When policies treat people as problems to be processed rather than neighbors to be loved, we say so. Prophetic witness isn't partisan. It's biblical. And it sounds the same in every language.

We examine ourselves. The same heart that builds surveillance systems lives in us. The same impulse to sort, categorize, and distance ourselves from the "other"—it's not just in the algorithm. It's in the mirror.

The Row in the Table

Back to where we started: somewhere in a database, there's a row in a table.

But that row has a name. A history. A family that loves them. A future that matters to God.

The algorithm will never know this. It processes data. It doesn't encounter souls.

But we can know. We can see the face behind the file. We can remember that every data point is someone for whom Christ died.

And we can insist—with our words, our actions, our politics, and our presence—that efficiency without dignity is not progress.

It's just faster cruelty.



"The Lord watches over the foreigner and sustains the fatherless and the widow." — Psalm 146:9

Rev. John Moelker

Founder & Theological AI Architect

John is a pastor, software engineer and theologian passionate about making AI accessible and theologically faithful for churches of all traditions. But most importantly, John wants to see others come to know Jesus better.
