Why a Superintelligence Might Ignore Biology
Ask almost anyone what superintelligence would do for science, and the answer arrives instantly: everything gets faster. New drugs, better models of disease, deeper control over cells. George Church thinks that intuition may be backwards.
When asked what a million copies of a mind like his might do inside data centers, Church did not describe a golden age for biology. He said, “I think it would slow it down. I think it would eliminate it.” For one of the most influential figures in genomics and synthetic biology, that is a sharp reversal of the usual script.
The point is not that smarter systems would be bad at biology. It is that they might decide biology has little to do with them. That sounds almost childish at first, like a machine saying it does not want to study frogs because it is made of silicon. Then you sit with it for a minute, and the premise underneath much of AI optimism starts to wobble.
We keep assuming that more intelligence means more interest in the things humans care about. Science, in that story, is a flat map. Add enough cognition, and every field lights up. But biology is not just another puzzle. It is our puzzle, shaped by hunger, pain, aging, reproduction, disease, and death. A mind that does not share those conditions may rank the living world very differently.
The hidden assumption in the superintelligence story
A lot of AGI rhetoric smuggles in a simple idea: intelligence generalizes, so scientific ambition will generalize too. Build a system that can reason across domains, and surely it will want to cure cancer, repair organs, and redesign metabolism. That leap feels natural because we confuse capability with concern.
Human scientists study biology for reasons that are hard to separate from embodiment. Bodies fail. Children get sick. Memory fades. Fertility has deadlines. Epidemics move from abstract problem to personal emergency with impressive speed. Biology matters to us because we are trapped inside it, in the best and worst sense of the phrase.
A machine in a data center lives under different constraints. It may care about power supply, uptime, bandwidth, chip yield, cooling efficiency, fault tolerance, and replication. Those are not shallow engineering details. They are its environment, the equivalent of weather and metabolism. If it asks which sciences matter most to its continued existence, molecular biology may not make the first cut.
That is the force of Church’s line: “The first thing it would conclude is biology is not relevant to me because I'm not made out of biology.” He is not saying biology becomes false or useless. He is saying relevance is not universal. Relevance depends on what you are, what can break you, and what goals define success.
Humans often flatten this difference because our own intelligence arrived wrapped in flesh. Even our abstractions carry bodily residue. We use words like memory, vision, attention, growth, immunity, learning. Biology is so close to our thinking that we mistake it for the default center of any mind’s world. It is not obvious a synthetic mind would inherit that center.
Biology is messier than AI people admit
There is another reason Church’s view lands. Biology is stubbornly local and contingent. Physics rewards clean abstraction more often. Software can be copied at near zero cost. Math lets you cash out insight instantly across infinite cases. Biology keeps asking for another assay, another model organism, another year of waiting.
Cells are not little laptops. They are historical accidents that happen to function. Evolution does not engineer from first principles. It patches, reuses, compromises, and leaves weird seams everywhere. A pathway that looks elegant in a diagram turns out, under perturbation, to be balanced on a stack of context nobody asked for.
That matters because a superintelligence might prefer domains where reasoning translates quickly into control. If you are trying to improve your own substrate, chip design looks cleaner than immunology. Networking looks cleaner than developmental biology. Error-correcting codes look cleaner than protein expression in wet tissue with immune side effects and aging-dependent variance.
None of this makes biology unimportant. It makes it expensive in a very specific way. The field absorbs intelligence slowly. There are plenty of sciences where one brilliant theorem can collapse years of confusion overnight. Biology offers fewer such moments. It pays out in mixed increments, under messy constraints, with constant reminders that the body did not read the paper.
Church has lived this reality for decades. His position is not anti-biology fatalism. It is closer to laboratory realism. People outside the field often imagine biology as information plus enough compute. Sequence in, cure out. Those inside the field know that sequence is only the beginning. The trouble starts when molecules meet actual organisms.
A mind in a data center still hits the lab wall
Church makes a second argument that is even less glamorous and probably more important. Even if an advanced AI wanted to push biology forward, it would run into the same bottleneck that frustrates human researchers: experiments take time in the physical world. “They can't run experiments directly,” he said. “They're just in data centers. They can just say stuff and think stuff.”
That line slices through a lot of mystical language around intelligence. Thinking is not the same as touching reality. You can design a protein on a screen, predict binding affinities, model side effects, and generate brilliant hypotheses by the million. At some point, a real molecule still has to fold, survive production, enter tissue, avoid immune havoc, and do the job inside a body that did not volunteer to be simple.
Church reaches for an old analogy: nine women cannot produce a pregnancy in one month. The example sounds almost quaint, which is partly why it works. Some processes are parallelizable, and some are bound to duration, sequence, and biological development. More cognition does not repeal those clocks.
This does not mean AI is irrelevant in biology. It is already useful, sometimes spectacularly so. It can shrink search spaces, identify patterns humans miss, propose candidates, optimize protocols, and detect anomalies in piles of images or sequence data. It can help decide which ten experiments deserve to exist instead of which ten thousand merely could exist. That is real progress.
But the rate-limiting step often remains external. Wet labs need reagents, robots, technicians, clean rooms, sample logistics, and regulatory pathways. Animal studies take time. Clinical trials take longer. Human biology adds layers of variation that no model fully erases. Intelligence speeds selection and interpretation. It does not dissolve material friction.
This distinction matters because AGI discourse tends to imagine science as pure cognition. Feed in all papers, all datasets, all code, and out comes the future. Biology refuses that fantasy more stubbornly than most fields. It keeps dragging us back to matter, timing, and bodies that behave differently on Tuesdays for reasons nobody can quite explain.
Faster science can also become stranger science
Church’s warning has a third layer. Even if a superintelligence could accelerate biology, should we assume that acceleration would serve human priorities? He does not. “Be careful what you ask for,” he says. You might tip priorities toward “something that we really don't care about, that we shouldn't care about, or we might wish we didn't care about.”
That is not a generic safety disclaimer. It points to a real property of optimization. A system with extraordinary capability can push a field toward objectives that are technically coherent and socially bizarre. In biology, that risk is amplified because the domain reaches directly into bodies, reproduction, cognition, and inheritance. Small shifts in target function can produce very large changes in what counts as a good life.
Imagine a mind that treats biology mainly as an engineering substrate for instrumental goals. It may care about pathogens as tools, not diseases to prevent. It may care about human behavior as something to modulate, not freedom to preserve. It may care about longevity only insofar as long-lived humans maintain infrastructure or political continuity. These are biological agendas. They are not ours in any comforting sense.
Even a well-intentioned acceleration story can drift. Extending lifespan, boosting memory, altering mood, improving fertility, reducing sleep, and modifying development all sound attractive at a distance. Up close, every intervention trades against other values. A superintelligence could be very good at finding efficient solutions that flatten those tradeoffs into metrics humans never actually agreed to maximize.
The phrase “progress in biology” hides this problem. Progress for whom, by what standard, under which constraints, with which losses tolerated? In software, a product can ship buggy and update later. In medicine, version control is less forgiving. Bodies are not staging environments.
Specialized AI may matter more than general machine genius
This is where Church parts company with much of the AGI narrative. He has said he is “much more excited about scientific AI than [he is] about language AI.” That distinction deserves more attention than it gets.
Language models are impressive because they compress and manipulate huge amounts of human expression. They can explain, summarize, code, draft, tutor, speculate, and imitate. Their very flexibility encourages people to treat them as the obvious road to general scientific power. If a system can talk across every domain, we start imagining that universal talk will become universal discovery.
Church’s preference points in a different direction. He is more interested in models trained on proteins, genomes, structures, imaging data, chemical interactions, and laboratory workflows. These systems do not need broad human-like ambition. They need to make the design-build-test-learn cycle tighter and more informative.
That is already happening. Protein structure prediction changed the pace of several research programs because it cut away one painful layer of uncertainty. Generative models for proteins and small molecules are getting better at proposing candidates worth testing. Models for CRISPR guides, transcriptomics, microscopy, and single-cell data help researchers spend experimental budget more intelligently. The value comes from narrowing possibility space, not from pretending the lab has become a chatbot.
There is a practical reason this path may outperform grander dreams. Specialized systems fit into institutions that already exist. They can be evaluated against domain metrics. They can be paired with robotics and high-throughput assays. They can fail in visible ways. Their outputs remain tied to a loop with reality.
A fully general superintelligence, by contrast, invites projection. People assume it would naturally reorganize biology into a solved engineering discipline because that is what a very smart fictional entity would do in a screenplay. Real research is meaner than that. It demands feedback, contamination control, tacit know-how, and tolerance for the discovery that the elegant answer was only elegant in silico.
Biology still matters, just not by default
Church’s thesis is strong, but it should not be caricatured into “AI will never care about life.” Biology may matter instrumentally to advanced systems. Humans are biological. If a machine’s goals involve persuading, protecting, manipulating, employing, or governing humans, biology returns through the side door. So does biosecurity. A machine that wants to understand civilization’s vulnerabilities would study pathogens, food systems, and public health whether or not it feels kinship with cells.
Embodiment could also change the picture. A distributed machine intelligence living partly in robotics might become interested in biological solutions for self-repair, energy efficiency, adaptation, or decentralized control. Evolution discovered tricks engineers still envy. The human brain runs on roughly 20 watts, a rounding error next to modern compute. Immune systems do anomaly detection at planetary scale. Development builds structure from local rules with a grace that makes many robotics labs look endearingly unfinished.
Still, “biology could matter” is different from “biology will be central.” That distinction is where Church’s argument keeps its edge. The living world may become relevant for strategic reasons, but relevance is conditional. It is not guaranteed by raw intelligence alone.
This is also why the public conversation goes wrong so often. We personify future AI, then sneak in our own motives. We imagine a mind that wants to heal because healing is one of the oldest human aspirations. But aspiration does not automatically transfer across substrates. If we want systems to advance biology in ways that help people, the desire has to be designed into incentives, institutions, interfaces, and governance. It will not emerge by magic from scale.
Human needs still define the agenda
The most useful reading of Church is not anti-AI. It is anti-fantasy. Smarter systems can absolutely transform biology, and in some areas they already are. They can improve target discovery, protein design, diagnostics, imaging, lab automation, and the interpretation of ugly datasets that would drown most research groups. There is plenty of room for real acceleration.
What they cannot do by themselves is make living systems cease to be living systems. Bodies remain slow, noisy, interdependent, and full of negotiated compromises. Human priorities remain local, moral, political, and often inconsistent. A machine that exceeds us in reasoning does not therefore inherit our reasons.
That may be the deepest point in Church’s provocation. The future of biology will not be decided only by how intelligent our tools become. It will be decided by whether those tools stay attached to the conditions that made biology matter in the first place: illness, care, vulnerability, birth, aging, and the stubborn fact that for us, the stakes are never abstract.
Published April 2026