The Altar Is Already Built
My daughter came home from school last spring and told me her English teacher had used ChatGPT to grade their essays. Not to assist. Not to check. To grade. The teacher had fed thirty-two student papers into a chatbot and handed back the results as if they were evaluation, as if the thing had actually read what the children wrote and understood what they meant.
She's eleven. She'd spent two weeks on that essay.
I didn't say anything inflammatory at the parent-teacher conference. I asked calm questions. But what I was thinking — what I couldn't shake — was that something sacred had been desecrated. Not in a dramatic, thunderclap way. In a quiet, bureaucratic way. A human being, created in the image of God, had poured thought and effort and budding personality onto a page, and an algorithm had reduced it to a rubric score.
That's where we are. And most of us are treating it like a productivity story.
What Silicon Valley Is Actually Claiming
The tech industry's pitch for artificial intelligence has quietly escalated from "helpful tool" to something that would have been recognized as religious language in any other context. Sam Altman has written about AI solving death — not metaphorically, but literally extending human lifespan indefinitely. Ray Kurzweil, Google's chief futurist, has spent decades predicting the Singularity: a point at which machine intelligence transcends human intelligence and transforms the human condition entirely.
These are not engineering claims. They are eschatological claims. They are describing salvation — deliverance from the fundamental conditions of human existence: suffering, limitation, mortality.
The Israelites at the foot of Sinai knew Moses was coming back. They knew. But he was taking too long, and they were afraid, and so they took their gold and made something they could see. Something they could point to. Something that felt like control in a situation that felt uncontrollable.
We have more gold than they did. And we're making something much more sophisticated. But the psychological move is identical: we are building an idol to manage the terror of our own finitude.
The Difference Between a Tool and a God
Scripture doesn't forbid tools. It forbids the worship of them — the transfer of ultimate trust from the Creator to the created. And that transfer is precisely what the most fervent AI advocates are asking for.
When Elon Musk founds a company explicitly to merge human consciousness with machine intelligence, he's not talking about a better search engine. When Sam Altman says he believes AI will eventually be smarter than all of humanity combined and that this is something to look forward to rather than fear, he's describing a being to which humanity becomes subordinate. That's not a tool. That's a successor god.
The distinction matters practically, not just theologically. A tool is subordinate to human judgment and human values. It does what we direct it to do, and we remain responsible for the outcomes. An AGI — an artificial general intelligence that exceeds human capability across domains — is definitionally not subordinate to human judgment, because its judgment exceeds ours. We would be, in the most literal sense, worshipping something we made that then told us what to do.
Exodus 32 is not subtle about how that ends.
What Faithful Engagement Looks Like
None of this means AI is intrinsically evil or that Christians should refuse to use it. The Israelites' gold was not sinful. The problem was what they made with it and what they then bowed down to.
A spell-checker is a tool. An AI that writes your child's college essay and passes it off as their own work is something else — not just cheating, but the substitution of a machine's output for a child's formation. A medical AI that helps radiologists catch cancers they'd otherwise miss is a genuinely useful tool. An AI that replaces the physician-patient relationship — the covenant of care, the human presence at a moment of fear — is something else.
The question isn't capability. The question is where we locate our trust and what we surrender to get the efficiency.
My daughter rewrote that essay. She turned it in late, and her teacher gave her a lower grade for the tardiness. I told her the grade didn't matter. What she wrote mattered. What she thought mattered. The algorithm that evaluated the first version couldn't have known that — couldn't have cared about it — because knowing and caring are not things that code does, no matter how sophisticated the architecture.
We were made in an image. The things we make were not. Keeping that distinction clear is not technophobia. It's faithfulness. And the church needs to say so — loudly, clearly, without apology — before the altar gets any bigger.