Computers That Are Alive?!

How biocomputing could be the dark horse of computing

Published February 24, 2020 in Computation · 5 min read

The year is 2020 in the Kingdom of Computing. The king, classical computing, has ruled since 1953 as the kingdom's first ruler. But the king is getting older, and the kingdom is weighing potential heirs to the throne. While the eldest prince, quantum computing, has been the people's favorite to succeed the king, another heir has emerged: biological computing, or biocomputing.

Synthetic biology is an emerging field at the intersection of computer science and biology that deals with programming cells (for example, directing them to move around the body) and developing artificial biological systems. As research in this area advances and becomes more established, it no longer seems outlandish to think that we will be able to manipulate biological systems to the degree that they essentially function as a new form of computing.

To understand how biocomputers would work, it is crucial to understand classical computers first. A classical computer is the kind you are using to read this article: the kind that uses something called bits as its building blocks.

A classical computer performs elementary functions. At the core of all those functions are transistors: physical switches that can either block or allow the flow of data. Classical computers work by manipulating the most basic form of data, bits. Bits are binary, meaning each one is either 0 or 1, reflecting the states of "on" or "off." Combinations of bits are used to represent more complicated pieces of data. Transistors combine to form logic gates, which then connect to form modules that can carry out simple operations, such as adding two numbers. And once a computer can add, it can also multiply, since multiplication is repeated addition.
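As a sketch of that layering, here is a toy Python version that builds gate functions, wires them into an adder, and then gets multiplication by repeated addition. It is illustrative only; real hardware does this with transistors, not function calls:

```python
# Toy sketch of the transistor -> gate -> arithmetic-module layering.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add three 1-bit values; return (sum_bit, carry_out)."""
    s1 = XOR(a, b)
    sum_bit = XOR(s1, carry_in)
    carry_out = OR(AND(a, b), AND(s1, carry_in))
    return sum_bit, carry_out

def add(x, y, width=8):
    """Ripple-carry addition: chain full adders bit by bit."""
    carry, result = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

def multiply(x, y):
    """Multiplication as repeated addition, using only add()."""
    total = 0
    for _ in range(y):
        total = add(total, x)
    return total

print(add(13, 29))     # 42
print(multiply(6, 7))  # 42
```

The ripple-carry adder here is the simplest possible design; real CPUs use faster adder circuits, but the principle of composing simple gates into arithmetic is the same.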

So, when you are "multitasking" on your laptop, the CPU (central processing unit) in your computer is essentially switching between very small computations at blazing speed. If your computer runs at 2.5 gigahertz per core, then each core (one part of the CPU) is completing 2.5 billion clock cycles per second!
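As a quick back-of-the-envelope check on that figure (with the caveat that one clock cycle is not exactly one computation on modern hardware, which can retire several instructions per cycle):

```python
# Sanity check: a 2.5 GHz core completes one clock cycle
# every 0.4 nanoseconds.
clock_hz = 2.5e9                  # 2.5 gigahertz
seconds_per_cycle = 1 / clock_hz  # 4e-10 seconds

print(f"{clock_hz:.1e} cycles per second")            # 2.5e+09 cycles per second
print(f"{seconds_per_cycle * 1e9:.1f} ns per cycle")  # 0.4 ns per cycle
```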

The process described above has become increasingly complicated as computer parts shrink, for two main reasons. First, shrinking components to pack in more transistors gives rise to quirky quantum mechanical effects that are not yet fully understood. Second, classical computer architecture executes processes sequentially, meaning no two operations in a single stream can occur simultaneously. Modern computers get around this by adding more "cores" to the CPU so that the cores can work together to execute more complex computations.

However, as the complexity of the problems being solved by computers increases, the resources a classical computer requires can grow exponentially. One way around this was to adopt a new approach: parallel computing. In parallel computing, a single large problem is split into pieces, and many operations are carried out at the same time, letting the machine tackle more complex problems.
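The idea can be sketched in a few lines: split one big job into chunks and hand the chunks to multiple workers at once. This toy Python example uses threads purely to illustrate the structure; in CPython, threads will not actually speed up pure-Python arithmetic, but the divide-and-combine pattern is the same one real parallel systems use:

```python
# Parallel-computing pattern: split a big task (summing a range)
# into chunks, process the chunks concurrently, combine the results.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    chunk = n // workers
    # Divide [0, n) into one sub-range per worker; the last worker
    # picks up any remainder.
    bounds = [(i * chunk, (i + 1) * chunk if i < workers - 1 else n)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, bounds))

print(parallel_sum(1_000_000) == sum(range(1_000_000)))  # True
```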

In fact, the biggest promise of quantum computers is to execute certain computations at a much faster rate, allowing them to solve a variety of problems that are nearly impossible to solve using a classical computer. However, the widespread use and overall development of quantum computers have been limited for several reasons. The most prominent is the need to cool a quantum computer to near absolute zero (-273 °C) to minimize noise in the computations. Such an environment is tough to maintain, and any noise that creeps in makes the processed data inaccurate and unreliable. Despite all of these hurdles, quantum computers are being hailed as the next generation of computing. But what if there were another type of computing that could take that title instead?

🥁🥁🥁 … Biocomputing

Even though quantum computers are the ones gaining all the mainstream attention, the field of biocomputing has also been progressing and has produced some remarkable breakthroughs. While a complete, integrated biocomputing system has not yet been developed, biocomputing has produced remarkable results in individual components, like data storage, transistors, and more.

In 2012, scientists at Harvard were able to store 700 terabytes of data in one gram of DNA! They achieved this by mapping each DNA base to a binary value (0 or 1), replicating how modern computers represent data. This research showed that living organisms can store data far more densely than our hard drives and solid-state drives, and the approach holds a lot of potential going forward. Since then, DNA has been used as data storage in other research.
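A toy version of the encoding idea the article describes, with each base standing for one binary digit, might look like the following. The specific base-to-bit mapping here (A or C for 0, G or T for 1) is an illustrative assumption, not the exact published scheme:

```python
# Toy DNA data storage: one base per bit. The encoder uses one
# representative base per bit value; the decoder accepts all four.
BIT_TO_BASE = {"0": "A", "1": "G"}                    # assumed mapping
BASE_TO_BIT = {"A": "0", "C": "0", "G": "1", "T": "1"}

def text_to_dna(text):
    """Encode UTF-8 text as a strand of bases, one base per bit."""
    bits = "".join(f"{byte:08b}" for byte in text.encode("utf-8"))
    return "".join(BIT_TO_BASE[b] for b in bits)

def dna_to_text(strand):
    """Decode a strand back into text, 8 bases per byte."""
    bits = "".join(BASE_TO_BIT[base] for base in strand)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

strand = text_to_dna("Hi")
print(strand)               # AGAAGAAAAGGAGAAG
print(dna_to_text(strand))  # Hi
```

At one bit per base, the density comes from the physical size of a base pair, which is why a single gram of DNA can hold so much more than a hard drive platter.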

In 2013, bioengineers at Stanford University recreated a fundamental component of computers, the transistor, using biological materials. The team used the genetic materials DNA and RNA to build a biological analogue of the transistor, the "transcriptor." By arranging transcriptors in particular configurations, the scientists were able to replicate logic gates like AND, NAND, OR, XOR, NOR, and XNOR. The biological equivalents of these gates are called Boolean Integrase Logic (BIL) gates. Integrases are specific enzymes that direct the signal through the transcriptor, acting as amplifiers as RNA travels along the DNA, much like actual transistors in the electrical circuit of a computer.
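Those six gate behaviors are easy to model in software. A classic result worth noting is that every one of them can be composed from NAND alone, which is one reason reproducing even a single universal gate in a biological substrate matters so much. This sketch models only the truth tables; the biological implementation is of course entirely different:

```python
# The six BIL gate behaviors, all composed from NAND
# (a universal gate), operating on bits 0 and 1.

def NAND(a, b): return 1 - (a & b)

def NOT(a):     return NAND(a, a)
def AND(a, b):  return NOT(NAND(a, b))
def OR(a, b):   return NAND(NOT(a), NOT(b))
def NOR(a, b):  return NOT(OR(a, b))
def XOR(a, b):  return AND(OR(a, b), NAND(a, b))
def XNOR(a, b): return NOT(XOR(a, b))

# Print each gate's truth table for inputs (0,0), (0,1), (1,0), (1,1).
for gate in (AND, NAND, OR, NOR, XOR, XNOR):
    print(gate.__name__, [gate(a, b) for a in (0, 1) for b in (0, 1)])
```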

In 2016, scientists at Lund University performed parallel computation using nanoparticles. The scientists used nanoparticles to create a circuit-like structure in which molecules act like the electrons of a current, traveling through the circuit to perform computations. These pathways look quite different from the logic gates and circuits in modern computers but hold great potential. One of the scientists working on the project, Heiner Linke, explained: "The fact that molecules are very cheap and that we have now shown the biocomputer's calculations work leads me to believe that biocomputers have the prerequisites for practical use within ten years."

In the same year, researchers at MIT showed off a "biological computer" that had all the essential elements of a computer. The study showed that, using biological materials like bacteria, enzymes, and more, the team could create a computer that performs basic computations and stores data in genetic material. The main promise highlighted by this work is the energy and space efficiency of living organisms compared to modern engineered solutions. With rapid advances in the field of AI, a great deal of electricity is used to train machine learning models with billions of parameters, and biological computers could significantly help optimize the energy usage of computing's various components.

With all of this promising advancement in a seemingly quiet area of research, can biocomputing go on to be the next big thing? At the moment, it is too early to tell. While the foundations of biological computers are being laid through the development of individual components, major roadblocks may still appear. Biocomputing may come with unique quirks of its own, as quantum computing did, which could slow its development or render the concept infeasible. The scale of this technology is still unknown, and it remains to be seen how it would be implemented.

With that being said, there is an undeniable upside if this technology were to succeed. Biological systems are among the most efficient systems known to humans, as already demonstrated by the data-storage density and energy efficiency of early biological computers. However, biological systems operate at a very small scale; most of them have to be understood at the nanoscale (one billionth of a meter). The field of nanotechnology is also on the rise, but the safety and ethics of using nanoparticles are still being investigated. Similarly, the ethics and safety of synthetic biology could turn out to be a controversial topic.

When it comes to biocomputing, we have only begun to scratch the surface of how we can use biological systems to our advantage to develop a whole host of unique and much-needed solutions to the problems of modern computing. If the field of biocomputing can continue to produce groundbreaking work and keep developing, it may replace quantum computing as the next generation. However, it is essential to note that the two types of computing are fundamentally different, and each will play a different role in the future.

TL;DR

  • Biocomputing, rather than quantum computing, could lead the next generation of computing.
  • While a lot of progress has been made on the individual components of a biocomputer, there is still a long way to go.
  • Using biological systems for computation is very energy- and space-efficient compared with other types of computing.