Biologists are finally beginning to corral molecules, cells and whole organisms to carry out complex computations. These living processors could find use in everything from smart materials to new kinds of artificial intelligence.
WHAT’S the difference between a thimbleful of bacteria and a supercomputer? Believe it or not, the bacteria contain more circuits and more processing power.
That is perhaps not so surprising when you consider that all life computes: from individual cells responding to chemical signals to complex organisms navigating their environment, information processing is central to living systems. What’s more intriguing, however, is that after decades of trying we are finally starting to corral cells, molecules and even whole organisms to carry out computational tasks for our own ends.
That isn’t to say biological computers will replace the microchips you find in your smartphone or laptop, never mind supercomputers. But as bioengineers get to grips with the wet and squishy components nature provides, they are beginning to figure out where biological computers might ultimately be useful – from smart materials and logistics solutions to intelligent machines powered by tiny amounts of energy.
If the applications seem unusual and eclectic, that is the point. “Biocomputing is not competing against conventional computers,” says Angel Goñi-Moreno at the Technical University of Madrid in Spain. “It’s a radically different point of view that could help us tackle problems in domains that were simply not reachable before.” It might even force us to rethink our assumptions about what computing is, and what it can do for us.
For decades, computing has been dominated by silicon chips. These are made up of billions of tiny switches called transistors that encode data in bits, or binary digits. If a switch lets electrical current flow, this represents a 1; if it blocks the current, it represents a 0. What makes chips so powerful is the way they are wired up. Transistors are arranged into logic gates, which take one or more bits as input and then output a single bit based on simple rules. By piling millions of these simple operations on top of each other, it is possible to carry out incredibly complicated computations.
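To get a feel for how simple gates stack up into something more capable, here is a minimal sketch in Python. The gate functions and the half-adder example are our own illustration, not anything from the research described in this article: two one-bit gates combine into a small circuit that adds two bits.

```python
# Minimal sketch: composing simple logic gates into a half adder.
# Purely illustrative; not drawn from any biocomputing system.

def AND(a: int, b: int) -> int:
    return a & b

def XOR(a: int, b: int) -> int:
    return a ^ b

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits: returns (sum, carry)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```

Chain enough of these small rule-following units together, whether in silicon or, in principle, in engineered cells, and arbitrarily complex computations become possible.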
This has brought us a long way, and yet it isn’t the only way. At their heart, computers are just information processors, and there is growing recognition that nature is rich with such capabilities. The most obvious example lies in the nervous systems of complex organisms, which process data from the environment to direct all kinds of sophisticated animal behaviour. But even the tiniest cells are replete with intricate biomolecular pathways that respond to incoming signals by switching genes on and off, producing chemicals or self-organising into complex tissues. And ultimately, all of life’s incredible feats rely on DNA’s ability to store, replicate and transmit the genetic instructions that make them possible.
Building biological computers
Biological systems also have some peculiar advantages over existing technology. They tend to be far more energy efficient, they sustain and repair themselves, and they are uniquely adapted to processing signals from the natural world. They are also astonishingly compact, of course. “The incredible thing about biology is that if you take all the DNA that’s in 1 millilitre of bacteria, there’s enough information storage for the entire internet and there are as many circuits as billions of [silicon] processors,” says Chris Voigt, a synthetic biologist at the Massachusetts Institute of Technology.
We have been trying to leverage these abilities since the 1990s. In the past 20 years, armed with new and more powerful tools to engineer cells and molecules, researchers have finally begun to demonstrate the potential of using biological material to build computers that actually work.
At the core of the approach is the idea that cellular processes can be thought of as “biological circuits”, says Voigt – analogous to the electrical ones found in computers. These circuits involve various biomolecules interacting to take an input and process it to generate a different output, much like their silicon counterparts. By editing the genetic instructions that underpin these processes, we can now rewire these circuits to carry out functions nature never intended.
In 2019, a group at the Swiss Federal Institute of Technology in Zurich built the biological equivalent of a computer’s central processing unit (CPU) from a modified version of the protein used in CRISPR gene editing. This CPU was inserted into a cell where it regulated the activity of different genes in response to specially designed sequences of RNA, a form of genetic material, letting the researchers prompt the cell to implement logic gates akin to those in silicon computers.
A group at the Saha Institute of Nuclear Physics in India took things a step further in 2021, coaxing a colony of Escherichia coli bacteria to compute the solutions to simple mazes. The circuitry was distributed between several strains of E. coli, each engineered to solve part of the problem. By sharing information, the colony successfully worked out how to navigate several different mazes.
To be clear, these circuits operate orders of magnitude slower than electronic ones and are rudimentary in comparison. Their power lies in the opportunity they offer to implement programs that interface directly with living systems, says Voigt. They could be used to create everything from tiny robots that treat disease inside the body to complex, multi-step biomanufacturing processes. “You don’t have to beat computers to be useful. The real valuable stuff early on is just simple control over the biology,” he says.
However, thinking about cellular processes in terms of circuits could be short-sighted, says Goñi-Moreno: “We are trying to force our electrical engineering mindset into living systems and that’s not necessarily how they work.” Most biological systems aren’t limited to the binary logic of classical computers, nor do they work through problems step by step, as computer chips do. They are full of duplications, strange feedback loops and wildly different processes operating side by side at various speeds.
Fungi could potentially be connected with standard electronics (Image: Andrew Adamatzky)
Failing to account for this complexity often results in biological circuits not performing as expected, says Goñi-Moreno, and it means the full functionality of cells isn’t being exploited. Conversely, finding ways to model and rewire biochemical interactions within and between living cells could bring more ambitious goals into reach, he says. To that end, Goñi-Moreno is trying to create multicellular communities of soil bacteria that can switch between removing different pollutants depending on which is more prevalent.
Biology’s powers of computation might also be exploited in ways that are entirely divorced from their natural context. Heiner Linke at Lund University in Sweden has been experimenting with a radically different approach to biocomputing, using tiny protein filaments propelled around a maze by molecular motors. The approach is aimed at a class of notoriously hard computational challenges known as NP-complete problems, for which no efficient general-purpose algorithm is known – in practice, finding the answer means checking a vast number of possible solutions.
These problems crop up in everything from logistics network planning to computer chip design. But they present a challenge for conventional computers because the sequential way in which they operate means that as the problem gets bigger, the time taken to check every solution rises exponentially.
Linke’s approach offers a workaround. The structure of the maze is carefully designed to encode the problem that needs to be solved, with every possible path through it corresponding to a potential solution. For instance, this could involve exploring the quickest route a truck could take between multiple stops. As the filaments whizz around the maze, they explore every option, so that by counting the number of filaments that pop out at predetermined exits, you can work out which path represents the correct answer.
The beauty of the approach, says Linke, is that the filaments explore all routes simultaneously. That means solving a bigger problem doesn’t require more time, just more filaments. And because they use about 10,000 times less energy per operation than electronic transistors, scaling up is far more feasible than with conventional computers.
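To make the idea concrete, here is a toy software analogue – a sketch of our own, not Linke’s device – that encodes a small subset-sum-style puzzle as a layered maze and tallies where the “filaments” come out. The numbers, the target and the maze layout are assumptions chosen purely for illustration, and where the real device explores all routes physically in parallel, this loop simply enumerates them one by one.

```python
# Toy software analogue of network-based biocomputing (illustration only).
# Each junction in the "maze" lets an agent either pick up the next number
# or skip it; every route is one candidate solution, and the exit an agent
# reaches equals the sum of the numbers it picked up along the way.

from itertools import product

numbers = [2, 5, 9]   # hypothetical problem instance
target = 11           # does some subset of the numbers sum to 11?

exit_counts: dict[int, int] = {}
# A real device sends many agents through all routes at once;
# here we just enumerate every include/skip decision sequence.
for route in product((0, 1), repeat=len(numbers)):
    exit_gate = sum(n for n, take in zip(numbers, route) if take)
    exit_counts[exit_gate] = exit_counts.get(exit_gate, 0) + 1

print("agents per exit:", exit_counts)
print("solution exists:", exit_counts.get(target, 0) > 0)
```

Reading off which exits received agents is the equivalent of counting filaments at the maze’s predetermined exits: an occupied exit at the target value means a valid solution exists.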
Re-engineering biological systems
But Michael Levin at Tufts University in Massachusetts thinks we are missing a trick by building biocomputers from the bottom up. Living systems already carry out jaw-dropping computational feats at every level of biology, he says, and our biggest limitation isn’t our struggle to create and control biological circuits, but rather our inability to harness what already exists in nature. “The things that we’re going to create from scratch are, for a really long time, going to be pitiful compared to what’s already out there,” he says.
Levin thinks we need to shift focus from trying to re-engineer biological systems to finding ways to interface with what is already there. That means thinking of them not as building blocks but more as “materials with an agenda” whose natural behaviour we can redirect to our own ends, he says.
Consider the genetic circuitry some organisms use to regrow limbs. We are decades away from being able to recreate that. But Levin’s lab has shown it can manipulate the electrical communication that cells use to decide how and where to grow. This allows the researchers to trigger the innate circuitry that governs how groups of cells self-organise, getting tadpoles to grow eyes on their guts and frogs to sprout extra legs.
While that doesn’t amount to computing per se, Levin argues that it demonstrates how we could bend nature’s pre-existing circuitry to new ends. With a bit of imagination, similar approaches could be used to solve a wide range of computational tasks. “You can make [biological systems] do new things, but not necessarily by completely rewiring from scratch,” he says. “What you’re doing is shaping the competencies that are already there.”
The esoteric field of fungal computing shows the potential. Fungi have an impressive ability to sense things like pH, chemicals, light, gravity and mechanical stress, says Andrew Adamatzky at the University of the West of England in Bristol, UK. They appear to use spikes of electrical activity to communicate, which opens up the prospect of interfacing them with conventional electronics.
Fungi are attracting attention as lightweight, eco-friendly materials for industries including construction and fashion, suggesting they could ultimately be used to create smart wearable devices or even intelligent buildings that sense and respond to environmental conditions, says Adamatzky.
A more obvious place to look for biological computation is inside the most powerful computer we know: the brain. Advances in tissue engineering mean scientists can grow, from stem cells, complex clusters of neurons akin to miniature brains, known as “brain organoids”. Meanwhile, breakthroughs in miniature electrodes that transmit signals to brain cells and decode their responses mean we have begun to experiment with these organoids’ memory and learning capabilities.
Earlier this year, a team led by Thomas Hartung at Johns Hopkins University in Maryland outlined its vision for a new field the researchers dub “organoid intelligence”. The goal is the inverse of artificial intelligence: rather than making computers more brain-like, they will attempt to make brain cells more computer-like. “It is not science fiction anymore,” says Hartung.
Lab-cultured brain cells are being repurposed for computing (Image: Thomas Hartung, Johns Hopkins University)
In 2021, researchers at Melbourne-based start-up Cortical Labs showed they could train human brain cells cultured on top of a silicon chip to play the video game Pong. They translated game data, such as the positions of the ball and paddle, into spikes of electrical activity that were fed into the mass of cells via electrodes. The electrical responses of the neurons were then recorded and used to control the movement of the paddle.
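As a rough sketch of how such a closed loop might be wired up in software – every function name, electrode layout and mapping here is hypothetical, not Cortical Labs’ actual system – the core cycle is: encode the game state into stimulation, record the cells’ spiking response, decode it into a paddle move, then feed the outcome back.

```python
# Hypothetical sketch of a closed-loop "game-in-a-dish" cycle.
# None of this is Cortical Labs' code or API; the electrode layout,
# encoding and decoding schemes are invented for illustration only.

def encode(ball_x: float, ball_y: float, paddle_y: float) -> list[int]:
    """Map normalised game state (0-1) to indices of electrodes to stimulate."""
    return [int(ball_x * 7), 8 + int(ball_y * 7), 16 + int(paddle_y * 7)]

def decode(spike_counts: list[int]) -> int:
    """Map spike counts from two recording regions to a paddle move (+1 up, -1 down)."""
    up_activity = sum(spike_counts[:4])
    down_activity = sum(spike_counts[4:])
    return 1 if up_activity >= down_activity else -1

# One cycle of the loop (stimulation and recording hardware not shown):
#   stimulate(encode(ball_x, ball_y, paddle_y))
#   move = decode(record_spikes())
#   feedback: predictable input if the paddle hit the ball, noise if it missed
```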
Brain cells have unique capabilities that make them promising building blocks for intelligent machines, says Cortical’s chief scientific officer Brett Kagan. They learn quickly, unlike today’s AI, which has to be trained on myriad examples. They also readily absorb many different kinds of data and use minuscule amounts of energy. And while there are question marks around whether we can recreate human-like intelligence in silicon, we know you can do it with neurons, says Kagan. “The how might be very complicated, but it’s a very different place to start from,” he says.
A near-term application, Kagan says, could be to exploit neurons’ exceptional pattern recognition skills for cybersecurity applications like monitoring internet traffic for unusual activity that might indicate a hack. In the longer term, he thinks they could help power machines that operate in the real world, such as robots or smart sensors. “What do brains do better than machines?” he asks. “Sensing and responding to rapidly changing environments is definitely one thing.”
Organoid intelligence
Cortical Labs is now developing biOS, or Biological Intelligence Operating System, a software environment in which anyone with basic coding skills can program tasks for what it calls its “DishBrains”.
But Madeline Lancaster at the University of Cambridge, who was a pioneer of brain organoids, thinks that repurposing them for computational tasks is still a long way off. Keeping them alive for extended periods remains a struggle, she says, and much of the computational power of brains comes from the complex hierarchical structures neurons form, something we are far from being able to replicate. “It’s a little bit jumping the gun,” she says. “At the moment it’s just a bunch of neurons kind of randomly connected.”
The truth is that all of these approaches to biological computing are far from going mainstream. “We have not yet learned how to harness organic computers for anything particularly useful, other than demonstration projects that lead to publishing academic papers,” says Rob Carlson, managing director at venture capital firm Bioeconomy Capital. Our ability to manipulate biology is still rudimentary compared with our capacity to design and build silicon chips, he says.
Even so, Carlson thinks the enormous potential of biological computation and the billions being poured into biotech will bring rapid progress over the next few years. One thing is certain: we have barely scratched the surface of what is possible. “There are biological computers within us, all around us, everywhere,” says Levin. “There’s so much richness already out there.”