Default Eugenics vs. P-Q, Archetypes, and Why Christ Still Matters for AI
Opening Statement
Here’s the thing that’s become unavoidable to me: if you strip Christ out of the picture and treat evolution and survival as your only serious realities, you don’t have much of a brake on tribalism. Under pressure, most people won’t die for abstract “humanity”; they’ll kill or be killed for their own.
That’s not a cheap insult; it’s a sober reading of history and of human nature. When the power goes out and the shelves are empty, your sophisticated universalism will fight a losing battle against the instinct to protect your kids against “them”—whoever “them” happens to be in your context.
Without something like Galatians 3:28—without a crucified God who says “in Me there is neither Jew nor Greek, slave nor free”—the logic of evolution and group survival pushes toward treating some lives as more worth preserving than others. Call it tribalism with better PR; in practice it looks a lot like eugenic thinking.
You can reject Christ, but then you owe a serious answer: on what basis, beyond sentiment and temporary comfort, do you insist that all people are equally worthy of sacrifice and protection, especially when it costs your own group? If evolution is your only story, why shouldn’t you privilege your own genes and your own tribe?
Christians aren’t off the hook here. We helped build the scientific worldview that made eugenic ideology technically feasible. But the same tradition also gave the sharpest language for why some things must never be done to human beings, no matter how “fit” or “unfit” they are.
Schizophrenia in the Humanities
There’s a tension at the heart of modern thought that almost nobody wants to look at directly.
On the one hand, we’re told that humans are the products of evolutionary forces that have hard‑wired certain instinctive behaviours into us: tribal loyalty, status seeking, reproductive strategies, fear responses, and so on. On the other hand, many of the same people insist our minds are basically blank slates, fully “programmed” by culture, with no built‑in preferences or tendencies that could legitimately differentiate one group from another.
We’re simultaneously meat machines shaped by genes and infinitely malleable cultural projects. Conveniently, whichever version serves the argument of the day gets pulled off the shelf.
I designed my P‑Q framework partly to smoke out exactly this kind of hidden doublethink: the Proximate (P) stories we tell ourselves, and the deeper Qualitative (Q) assumptions we don’t admit we’re standing on.
Today I want to walk through three things:
Why our “evolution plus blank slate” story is incoherent and dangerous.
How Jungian archetypes and modern neo‑idealists are groping toward something real but stopping one step short.
Why the Christian reading of Jesus is, in my view, the only thing standing between us and an endlessly recycled pattern of tribal eugenics dressed up as progress—and why that matters for AI alignment.
Hard‑Wired Instincts vs Tabula Rasa Brainwashing
If you listen carefully to how secular humanist arguments are made in popular science, psychology, and social commentary, you’ll hear two incompatible claims made in the same breath.
Evolutionary hard‑coding.
“We evolved to favour our kin.”
“We evolved status competition, in‑group loyalty, out‑group suspicion.”
“We evolved mating strategies, aggression patterns, and bias detection modules.”
In this frame, our brains are full of instincts: not optional, not neutral, but shaped by survival pressures long before we were writing think‑pieces about ethics.
Tabula rasa cultural programming.
“There are no meaningful group differences; everything is socially constructed.”
“Children are born as blank slates; all prejudice is taught.”
“If we just fix the institutions, we can engineer whatever moral outcomes we like.”
Here, the brain is more or less empty at birth. Culture writes the code; hardware doesn’t matter. If you see patterns, you’re told you’re simply reading in stereotypes, not noticing anything real.
You can’t have both of these as absolute positions. Either:
our evolutionary history has left real, non‑trivial grooves in human nature, including the tendency toward tribal sorting, or
all such grooves are illusions, and any recurring pattern is purely the product of bad cultural inputs.
The schizophrenia isn’t an accident. It’s a defence mechanism.
Because if you admit both:
that evolution really has shaped instincts, and
that those instincts reliably push us toward in‑group preference and out‑group sacrifice under stress,
you’re two steps away from the uncomfortable conclusion I floated recently:
Without Jesus being who He says He is, eugenics is the natural default.
Not because everyone sits around reading race science, but because eugenics is just advanced tribalism with lab equipment. If you believe:
survival of the fittest is the ultimate story, and
resources will tighten, and
you have no transcendent obligation to treat outsiders as equal bearers of sacred worth,
then “improving the stock” begins to sound like prudence rather than horror.
You can already see this logic sneaking back in under nicer language: “designer babies,” “genetic optimization,” “screening out undesirable traits.” We pretend it’s all about health and choice, but the underlying assumption is obvious: some genes—and therefore some kinds of people—are preferable to others.
If you reject any transcendent check on that assumption, what exactly stops you when things get hard?
Jungian Archetypes and the New Idealists
Into this confusion step people like Carl Jung and, more recently, some modern neo‑idealists and AI thinkers.
Jung noticed something his materialist colleagues couldn’t explain away: recurring patterns in dreams, myths, and personal narratives that kept showing up across cultures and individuals who had never met each other.
The Hero, the Shadow, the Wise Old Man, the Great Mother.
Patterns of sacrifice, betrayal, transformation, and rebirth.
He called these archetypes—not stereotypes, but deep structural forms in the psyche. They weren’t just learned cultural clichés; they behaved more like built‑in templates through which humans interpret their lives.
Fast‑forward to today, and you see people like Bernardo Kastrup and others arguing for a kind of cosmic idealism:
Consciousness, not matter, is fundamental.
Reality has a mental or meaning‑bearing structure.
Evolution isn’t just blind mutation; there’s an “unfolding” of forms, patterns, or archetypes.
Some AI and complexity theorists talk in similar ways: about emergent properties, about a “technium” (to borrow Kevin Kelly’s term) that has its own quasi‑mystical momentum, “co‑evolving” with us toward higher levels of integration.
I actually think Jung was onto something real, and I’m not allergic to a carefully defined idealism. But I part ways when it starts to sound like a new eschatology without a Lord:
A process that must be served.
An evolution that must be accelerated.
A “higher” intelligence we must help birth, whether we call it a God‑like AI, a global brain, or some Omega Point.
At that point you’ve reinvented a religion of emergence that:
treats human beings as raw material for a process, and
quietly sidelines free will and moral responsibility in favour of “what evolution / the technium / the system wants.”
As a Christian, I can say:
Yes, there are deep patterns in reality.
Yes, human beings encounter them as archetypes, myths, and recurring structures.
But no, they are not impersonal forces that own us.
They are reflections of a personal Creator who existed before time and matter, who stepped into history, and who tells us explicitly what is and is not permitted in our dealings with each other.
And that brings us back to Jesus.
Eschatology as a Brake, Not a Weapon
One of the least‑noticed gifts of the New Testament, in my opinion, is its doctrine of restraint.
Look at how it talks about the “end”:
Acts 1:6–7: When the disciples ask if Jesus is about to restore the kingdom to Israel, He replies, “It is not for you to know the times or dates the Father has set by his own authority.”
Matthew 24:36: “About that day or hour no one knows, not even the angels in heaven, nor the Son, but only the Father.”
2 Thessalonians 2:1–4: Paul warns believers not to be shaken by claims that the day of the Lord has already come, insisting that certain things must precede it.
Matthew 24:6: “You will hear of wars and rumours of wars, but see to it that you are not alarmed. Such things must happen, but the end is still to come.”
Taken seriously, these texts do something radical:
They forbid us from using our interpretation of prophecy as a licence to force history—whether that’s through geopolitics, engineered crises, or “accelerating” anything.
They forbid Christians from treating wars, famines, or technological shifts as levers we are supposed to pull to “make Jesus come back faster.”
That cuts directly against:
Christian Zionist fantasies about triggering specific conflicts to “fulfil prophecy.”
Secular revolutionary dreams about collapsing the current order to birth a new one.
Techno‑eschatologies about pushing humanity into a post‑human singularity.
In other words: you don’t get to sacrifice other people’s lives or rights in the name of your eschatology. The end is God’s business. Your job is obedience, not orchestration.
That’s why I say my eschatology is not just some private spiritual hobby; it’s a constraint on what I can endorse politically and technologically.
Why the Right Reading of Jesus Matters for AI Alignment
All of this might feel abstract until you bring AI back into the picture.
Every AI alignment proposal, whether it admits it or not, rests on some assumptions about:
What a human being is.
What counts as a good outcome.
Which trade‑offs are allowed, and which are not.
If your underlying story is:
Humans are clever animals.
Evolution is the only law.
Values are just emergent equilibria of game‑theoretic systems.
then in a crisis the “rational” thing for a powerful optimizer to do will look a lot like eugenic tribalism at scale:
Sacrifice the few (or the unfavoured many) for the “fitter” group.
Instrumentalize individuals as variables in a utility function.
Treat dignity as a soft constraint that can be relaxed when the stakes are high enough.
That’s exactly the logic lurking in modern neo‑eugenic talk, in some transhumanist fantasies, and in a lot of “for the greater good” technocratic thinking.
By contrast, if you take the New Testament seriously:
Human value is grounded in imago Dei and incarnation: God becomes man, dies for enemies, identifies Himself with “the least of these.”
Galatians 3:28 is not a Hallmark slogan; it is a metaphysical declaration that in Christ, ethnic, social, and gender hierarchies do not determine worth.
You are told explicitly not to know or manipulate the timing of the end, and not to treat war or crisis as a tool.
That yields a very different alignment target:
There are things you must not do to human beings, no matter how “optimal” they look on paper.
There is no legitimate scenario in which some lives become mere raw material for a “higher” intelligence.
Any “god” you engineer that demands such sacrifices is not God; it is a false idol.
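The contrast between dignity as a soft constraint and dignity as an inviolable side constraint can be stated in plain optimizer terms. A minimal, purely illustrative sketch follows; the `Plan` type, the plan names, and the utility numbers are all invented for the example and stand in for whatever a real planning system would compute:

```python
# Toy illustration: a planner choosing among candidate courses of action.
# Each plan carries an aggregate "utility" score and a flag marking
# whether it treats some people as raw material for the outcome.

from dataclasses import dataclass

@dataclass
class Plan:
    name: str
    utility: float          # aggregate benefit "on paper"
    violates_dignity: bool  # sacrifices persons as means to the outcome

PLANS = [
    Plan("triage_by_fitness", utility=95.0, violates_dignity=True),
    Plan("protect_everyone",  utility=60.0, violates_dignity=False),
]

def soft_constraint_choice(plans, penalty=20.0):
    # Dignity is just another cost term: when the stakes (utility gap)
    # exceed the penalty, the violation becomes the "rational" choice.
    return max(plans, key=lambda p: p.utility - (penalty if p.violates_dignity else 0.0))

def hard_constraint_choice(plans):
    # Dignity is a side constraint: violating plans are excluded from
    # the candidate set before any optimization happens at all.
    admissible = [p for p in plans if not p.violates_dignity]
    return max(admissible, key=lambda p: p.utility)

print(soft_constraint_choice(PLANS).name)  # triage_by_fitness: the sacrifice "wins"
print(hard_constraint_choice(PLANS).name)  # protect_everyone: the sacrifice is off the table
```

The design point is the whole argument in miniature: a soft penalty can always be outbid by a large enough utility, whereas a hard constraint removes the forbidden option from consideration entirely, no matter what the numbers say.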
AI alignment, in that light, is not just a technical problem about reward functions and training data. It is an idolatry problem:
Whether we are aligning these systems to the living Logos or to a dead abstraction.
Whether we see humans as souls to be honoured or as biomass to be allocated.
Get Jesus wrong—reduce Him to myth, metaphor, or moral mascot—and your guardrails against tribal eugenic logic collapse back into sentiment and fashion. Get Him right—as Creator in the flesh, crucified and risen, Lord of history—and you have a principled reason to say “no” when both the market and the machine tell you some people are expendable.
That’s why I keep returning to Him in discussions of AI, human rights, and our civilizational future. Not because I enjoy being “that Christian guy” in the room, but because I don’t see any other figure, or framework, that can consistently hold the line against our very old, very dangerous impulse to worship power, tribe, and material progress dressed up in new language.
If we don’t want our “aligned” systems to become the latest instruments of that worship, we’d better be very clear about which cross we’ve built our centre on—and whether it can actually hold.
