"Fold Everything and Give It to Everyone"
The meeting where the default logic died
A product manager could have turned AlphaFold into a machine for printing money.
DeepMind had just crossed a threshold that structural biology had chased for decades. Predicting a protein’s 3D shape from its amino acid sequence had been a half-century problem, partly because proteins fold through a bewildering set of physical constraints, and partly because experimental structure determination is slow, expensive, and unevenly distributed. In the room, John Jumper did the obvious extrapolation: if the system worked this well, DeepMind could predict structures for every known sequence in a month, then a billion sequences, then two billion.
The normal next step in tech was sitting right there. Build a service. Let scientists submit sequences. Charge for access. Offer premium throughput to pharma. Add enterprise terms, usage tiers, and maybe a startup-style spinout if the spreadsheets looked cheerful enough.
Demis Hassabis went the other direction. “We should just run every protein in existence. And then release that. Why didn’t someone suggest this before? Of course that’s what we should do. Why are we thinking about making a service and then people submit their protein? We just fold everything. And then give it to everyone in the world.”
That quote matters because it captures a fork in the road most people never see. The story of AI is usually told as a race to build capability. The more consequential question is what organizations do when capability arrives. AlphaFold is one of the clearest cases where a lab looked at a scarce, valuable technical breakthrough and chose distribution over extraction.
The business model they declined
If you can predict protein structures reliably, selling access is not a weird idea. It is the obvious one.
Protein structure sits upstream of a lot of expensive work. It shapes how biologists think about function, how chemists reason about binding, how drug discovery teams narrow search spaces, and how academic labs decide which experiments deserve precious time on a cryo-EM instrument. A paid prediction service would have slotted neatly into existing budgets. Compared with the cost of a failed experiment or a delayed lead compound, API fees would have looked modest.
DeepMind could have built a very convincing business around that choke point. Every new sequence becomes a metered request. Every pharma company becomes an account. Every dependency becomes recurring revenue. Silicon Valley has taught itself to see these moments as destiny.
Instead, DeepMind and EMBL’s European Bioinformatics Institute built the AlphaFold Protein Structure Database and made it free. DeepMind also released the AlphaFold code. That combination is easy to miss if you flatten it into the generic word “open.” It was not just source availability. It was a decision to package a frontier model into a public resource that ordinary scientists could actually use, without needing a cluster, a machine learning engineer, or a week of preprocessing.
Ewan Birney at EBI put it plainly: “Demis called us up and said, ‘We want to make this open. Not just make sure the code is open, but we’re gonna make it really easy for everybody to get access to the predictions.’”
That last part is the hinge. A lot of “open” work in AI is open the way a stripped-down race car is open: technically visible, practically inaccessible. The code exists. The compute bill does too. AlphaFold’s release did the less glamorous, more important thing. It removed friction at the point where science actually happens.
From model to public utility
A protein structure predictor is impressive. A database of predicted structures changes behavior.
Before AlphaFold, obtaining a structure often meant waiting months or years, if the experiment worked at all. X-ray crystallography requires crystallizing the protein, which can be maddening. NMR is specialized and limited. Cryo-EM has become transformative, but the instruments are expensive, scarce, and not ideal for every target. Even excellent labs ration their shots.
AlphaFold did not eliminate those methods. It changed what happens before them. A researcher with a sequence can now look up a predicted fold, inspect confidence scores, compare domains, and generate hypotheses in minutes. Maybe the prediction is good enough to guide mutagenesis. Maybe it reveals a binding pocket worth probing. Maybe it saves six months chasing a false assumption. Maybe it tells you that you still need the experiment. All of those outcomes are useful.
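That lookup-first workflow is concrete enough to sketch. The snippet below is a minimal illustration, assuming the AlphaFold Protein Structure Database’s public REST endpoint (`https://alphafold.ebi.ac.uk/api/prediction/<accession>`) and AlphaFold’s published pLDDT confidence bands; the exact JSON field names are assumptions worth verifying against the live API documentation before relying on them.

```python
# Sketch: check the AlphaFold DB for a precomputed structure instead of
# running any inference yourself. Endpoint and response fields are assumed
# from the public API; confirm against alphafold.ebi.ac.uk before use.
import json
import urllib.request

API_BASE = "https://alphafold.ebi.ac.uk/api/prediction"


def prediction_url(uniprot_accession: str) -> str:
    """Build the API URL for a UniProt accession's precomputed prediction."""
    return f"{API_BASE}/{uniprot_accession}"


def plddt_band(plddt: float) -> str:
    """Map a per-residue pLDDT score to AlphaFold's standard confidence bands."""
    if plddt >= 90:
        return "very high"
    if plddt >= 70:
        return "confident"
    if plddt >= 50:
        return "low"
    return "very low"


def fetch_prediction(uniprot_accession: str) -> dict:
    """Fetch prediction metadata (model URLs, confidence) for one accession.

    The API returns a JSON list of entries; we take the first.
    """
    with urllib.request.urlopen(prediction_url(uniprot_accession)) as resp:
        return json.loads(resp.read())[0]


# Example usage (network call, so left commented here):
#   entry = fetch_prediction("P68871")  # human hemoglobin subunit beta
#   print(entry.get("pdbUrl"))          # assumed field: link to the model file
```

The point of the sketch is the shape of the workflow, not the details: the expensive step (folding) already happened once, upstream, so the researcher’s step is a cheap lookup plus a confidence check.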
That is why precomputing almost everything was such a sharp move. If DeepMind had built a paid service, scientists would still have gained access to a powerful tool. By predicting around 200 million protein structures and making them browsable online, DeepMind changed the default workflow for biology. You do not start by asking whether you can afford an inference call. You start by checking whether the answer is already there.
Janet Thornton, who helped build structural bioinformatics as a field, described the feeling well: “That is fantastic. It’s like drawing back the curtain and seeing the whole world of protein structures.”
The metaphor is more literal than it sounds. Structural biology used to illuminate islands. AlphaFold sketched a continent.
Launch day showed what access actually means
People sometimes talk about openness as if it were a moral style. AlphaFold’s release made it measurable.
Accounts from the launch describe traffic climbing from a few hundred users in the early phase to roughly 100,000 concurrent users as the wave spread globally, beginning with heavy activity in Japan and then rolling across time zones. Hassabis wrote at the time, “Can’t quite believe it’s all out. What an absolutely unbelievable effort from everyone. We’re gonna all remember these moments for the rest of our lives.”
Those numbers matter for more than theater. They reveal demand that had been pent up by cost, tooling, and geography. A structural biology resource at that scale is not just useful to elite labs in Boston, Cambridge, Basel, or Tokyo. It becomes useful to students, underfunded groups, researchers outside major centers, and adjacent fields that never had the budget or expertise to become structure-first disciplines.
This is the part many AI companies still underestimate. Distribution is not a press release. Distribution is interface, hosting, uptime, search, metadata, integration, and trust. It is the boring scaffolding that determines whether a discovery becomes a field-wide habit or remains a citation.
AlphaFold succeeded as open science because it behaved like infrastructure. That required product discipline as much as scientific ambition. The database had to feel less like a demo and more like a place people would return to on Tuesday afternoon when a sequence looked strange.
Open science usually means papers. This was something else.
Academic science has strong traditions of openness, but they are often narrow. You publish the paper. You deposit the dataset if the repository exists. You share code if the dependencies are not a small horror show. That model works, up to a point. It does not usually deliver industrial-scale public goods overnight.
AlphaFold did.
The leap was not simply financial, though money mattered. It was organizational. DeepMind used the habits of a high-capability engineering lab—compute planning, model optimization, data pipelines, product polish, and global infrastructure—to create a scientific commons. That is unusual enough to deserve its own category.
It also exposes a blind spot in the way AI firms describe openness. Releasing weights or code can be valuable, but it does not automatically widen participation. If access still depends on specialized hardware, arcane setup, or insider knowledge, the benefits flow to people who already have power. DeepMind’s choice was more expansive because it shipped the expensive part on behalf of everyone else.
Eric Schmidt later said, “They released the structures of 200 million proteins. These are gifts to humanity.” The phrasing is grand, but not wrong. The gift was not only information. It was prepaid labor at planetary scale.
That said, calling it a gift can obscure the strategy.
Why it made sense for DeepMind
This was not naive altruism. It was a coherent move inside DeepMind’s larger project.
First, AlphaFold was a credibility engine. DeepMind had spent years making systems that dazzled people inside AI and felt abstract to everyone else. Protein folding was different. It was technically difficult, scientifically prestigious, and legible to the public. By releasing a huge corpus of useful predictions, DeepMind turned a benchmark victory into a durable proof that machine learning could accelerate real science. No enterprise sales deck would have bought the same legitimacy.
Second, the economics worked because Google sat behind the curtain. Precomputing hundreds of millions of structures is expensive. Hosting and maintaining a public database is expensive. The salaries of the researchers and engineers who made AlphaFold possible were already being paid by an organization with a giant cash engine. Most startups cannot make a similar decision because their backers expect the frontier system itself to become the business.
Third, letting others create the downstream breakthroughs increased DeepMind’s influence instead of reducing it. John Jumper put it beautifully: “The moment AlphaFold is live to the world, we will no longer be the most important people in AlphaFold’s story.” That sounds modest. It was also strategically brilliant. Once thousands of labs start using your work to make their own discoveries, your technology stops being a proprietary feature and starts becoming a standard reference point.
There is a deeper reason this matters. Scientific tools gain authority when they survive contact with many users, many organisms, and many failure modes. A private service can sell capability. A public resource can accumulate legitimacy.
Giving up control created a stronger kind of control
The usual platform instinct is to stay in the loop. You keep every request flowing through your servers. You monitor demand. You price by urgency. You preserve optionality because optionality is another word for leverage.
DeepMind broke that pattern. By precomputing structures and releasing them widely, it reduced its control over the moment of use. Researchers did not need to ask permission. They did not have to justify a budget line. They could simply build on top.
Paradoxically, that made AlphaFold harder to dislodge.
Think about the difference between a toll road and a map. A toll road can be profitable, but you only matter when people pass through your gate. A map becomes embedded in planning itself. Once a field absorbs a public resource into its routines, the resource shapes questions, methods, training, and expectations. Students learn with it. Reviewers assume it. Labs design around it.
That is why the strongest comparison is not to a cloud API. It is to infrastructure such as GenBank, BLAST, or the human genome reference. Those resources are so foundational that their existence disappears into the background of the discipline. They stop feeling like products and start feeling like the floor.
DeepMind’s choice turned AlphaFold from a valuable tool into part of biology’s ambient environment. For a research organization, that can be a more consequential win than capturing direct revenue from each interaction.
The model is powerful, but it does not travel easily
It would be comforting to treat AlphaFold as a template every frontier lab can copy. It is not.
One reason is patronage. Open science at this scale needs someone willing to absorb cost without demanding immediate returns. Google could do that. Most companies cannot. Even among rich companies, few have the patience or mission structure to justify it internally.
Another reason is the shape of the problem. Proteins are unusually well suited to this kind of release. They are discrete biological objects with standard representations, established benchmarks, and huge public sequence databases. You can precompute a giant archive and know that millions of future queries will line up with what you stored. Many scientific domains are messier. They involve dynamic systems, proprietary inputs, local context, or feedback loops that make “just run everything once” impossible.
There are also real questions around misuse, though they are easy to state too vaguely. Structural knowledge can support beneficial work and potentially harmful work. Biology has always carried some dual-use concern, and AI may lower the cost of combining public information in unsettling ways. That is a serious issue. It is also worth remembering that scientific secrecy has costs of its own. If a handful of firms privately control major biological inference systems, access becomes uneven, validation becomes harder, and public-interest research gets pushed to the back of the line.
The tension is not openness versus safety in the abstract. It is which kinds of openness, for which artifacts, under which governance, create more public value than risk.
AlphaFold landed in a zone where the upside was unusually broad and the downside manageable enough to accept. That should not be generalized carelessly. It should be studied carefully.
A new precedent for AI in science
What AlphaFold really introduced was a different answer to the question of where value lives.
The standard software answer is that value lives in the service boundary. You own the model, meter access, and capture payment at the edge. AlphaFold suggested another possibility: sometimes value compounds faster when the expensive computation is pushed upstream and turned into a commons. The organization that pays for it may earn reputation, talent attraction, political goodwill, scientific partnerships, and strategic centrality that exceed what a narrower commercial product would have delivered.
That is not a universal law. It is a specific pattern. It works best when the artifact can be broadly reused, the public good is obvious, and the sponsor has pockets deep enough to survive not charging for the thing itself.
Still, it is a bigger challenge to the current AI industry than many people noticed. Companies love to talk about empowering researchers, accelerating discovery, and democratizing access. AlphaFold showed what those phrases look like when taken literally. It looked like shipping the outputs, not just the model card. It looked like building a public interface instead of a billing system. It looked like deciding that the fastest way to matter was to become commonplace.
There is a quiet confidence in that choice. You only give away the crown jewels when you believe the act of giving will increase your standing. DeepMind understood that in science, being indispensable often matters more than being exclusive.
The deeper lesson
The most interesting part of the AlphaFold story is not that a famous lab behaved generously. It is that generosity, in this case, was also sound institutional design.
By releasing code, predictions, and a usable database, DeepMind compressed the distance between breakthrough and benefit. It let thousands of other people decide what AlphaFold would become, which is exactly how major scientific tools earn their place. The result was not merely applause. It was adoption, citation, dependence, and a durable change in how biology starts.
Plenty of AI companies claim they want to help science. Very few have shown a willingness to turn a frontier capability into shared infrastructure before the monetization team can get to it. AlphaFold did, and that decision made the system larger than any product plan could have. The database turned a celebrated model into a shared starting point, and that is much harder to replace than a premium API.
End of entry.
Published April 2026