On July 11th the UK government announced it would pledge almost £1 million to fund several university research groups’ participation in an international collaboration to create the world’s first synthetic yeast. David Willetts, the UK’s science minister, made the announcement during the week of the sixth international Synthetic Biology conference, hosted this year by Imperial College London.
Willetts is expected to make further announcements about a £60 million investment in the field as a whole, some of which will go towards the creation of a centre of ‘innovation and knowledge’ charged with finding uses for the deluge of synthetic organisms set to be created in the coming years. Areas expected to benefit from the emerging field of synthetic biology include vaccine design and production, biofuels, crops, diagnostic medicine, and nanotechnology.
But what is synthetic biology? Arguably just a decade old, it is a discipline in its infancy whose implications could be more far-reaching than anyone can realistically envisage at present. Even the name strikes many as a puzzling oxymoron; biology: the study of living things that emerge as part of the natural order; synthetic: artificial and man-made. How does one make fake life? And if something artificial can come to life, in what way could it then be perceived, or indeed distinguished, as synthetic? Such thoughts sound like they belong to the realm of science fiction, conjuring images of Frankenstein’s creation, but this is not something that may one day come to fruition – it is happening now, and it is changing the way biologists do biology.
The go-to analogy for explaining both what synthetic biology is attempting to achieve and how it intends to do so is that of a car. One way of understanding how the inner workings of a car come together to give rise to its functionality is to disassemble it entirely and rebuild it piece by piece. This gives a deeper awareness of the role each part plays and of its relation to the whole.
The legendary physicist Richard Feynman famously stated, ‘What I cannot create, I do not understand’, and it is with this sentiment that the field aims to determine more acutely the nuances that conspire to make living matter from non-living matter. An oft-touted long-term goal of synthetic biology is to have such a complete understanding of how each part of a living system works that one could simply sit at a computer and redesign it to improve its functionality, or better still, design new organisms from scratch.
The obvious problem with the car comparison, however, is that living things, even the simplest, are vastly more complex than any smart car you are ever likely to run into. The parts that make them up interact on the nanoscale, their number and variety are mind-boggling, and their design has arisen over billions of years through the process of evolution. In spite of these challenges, real progress has been made by researchers working in the field.
Notable examples are the range of medical biosensors now coming to market. By a process called gene ‘tagging’, bacteria can be modified to light up in response to specific external stimuli. This is done by attaching the gene for a light-emitting fluorescent protein to the gene for the protein that would ordinarily be produced in response to the stimulus under scrutiny. Rather than a patient being subjected to often invasive testing, the results of which can take a good deal of time to process, gels embedded with such synthesised biosensors can simply be smeared onto the area of concern and a diagnosis made within minutes on the basis of whether or not the gel lights up.
A novel approach to clearing landmines was pioneered by altering the genetic code of thale cress so that the pigment in its leaves grows purple, rather than the ordinary green, in the presence of nitrogen dioxide, a gas given off by buried mines. Rather than prodding along the ground with a stick and hoping not to be blown apart, one could scatter the seeds of such a plant over land suspected to contain mines and return when the plants have reached maturity to identify the dangerous locations.
To quell the spread of malaria in Sub-Saharan Africa, which is estimated to kill more than half a million people annually, swarms of mosquitoes have been created without the ability to carry the disease. Such mosquitoes would compete for resources with those that can carry it, so while not combating the cause of the disease directly, the measure seeks to drastically reduce its spread by this route.
An unsettling concern raised by the latter two examples is the possibility, if not the probability, that these organisms could cross-breed with wild populations, giving rise to disastrous and wholly unforeseeable consequences. The beauty of redesigning life at the genetic level is that safeguards against such eventualities can be built in; for instance, the organisms’ reproductive capabilities can be written out, ensuring they cannot mix with nature. However, film buffs may recall that something similar was done in ‘Jurassic Park’ and (SPOILER ALERT!) it failed.
Two factors explain why these developments are possible now: the cost and ease of nucleotide sequencing and synthesis, and the current level of computing power.
In the wake of the Human Genome Project, the speed at which it became possible to read off and write nucleotides – the chemicals on the double-helix backbone of DNA whose sequences act as a store of genetic information – grew ever quicker and more accurate, while the cost diminished to incredible levels. The Human Genome Project cost approximately £3 billion in total, but thanks to the technology it spawned and the invigorating blaze it conferred on this entire domain of science, the same task would now cost less than £1,000 if carried out tomorrow. Such progress makes the famous ‘Moore’s law’, which observes a doubling of computing power every eighteen months and has held for the past forty years, look almost sloth-like. Yet without Moore’s law continually driving computing power to new heights, the design, virtual representation and, most importantly, modelling of biological systems would be severely limited.
The international effort to create synthetic yeast by 2018 sees researchers at the Universities of Cambridge and Edinburgh and at Imperial College working on chromosome 13 of the yeast Saccharomyces cerevisiae, which contains 16 chromosomes in total (we humans have 46). More commonly known as brewer’s yeast, it has been used in brewing and wine making for thousands of years.
A more thorough understanding of this yeast could allow it to be modified to create cells that give higher alcohol yields and are more robust to external factors such as temperature and the quality of the water in which the alcohol is brewed. If the efficiency rises sufficiently to allow scaled-up production, it is not unreasonable to imagine this giving rise to a sustainable source of biofuel. The primary output of the process could even be changed to a different product entirely, such as medicinal drugs, vaccines, fertilisers, or chemicals for all manner of industrial purposes. The possibilities are potentially endless.
As Saccharomyces cerevisiae was the first eukaryotic cell (cells a major step up in complexity compared with bacterial cells, crucially having a membrane surrounding their genetic material in an area called the nucleus) to have its genome sequenced, in the mid-nineties, it is quite fitting that it is set to be the first synthetic eukaryote as well.
The claim to creating the first artificial life form was made by a team led by the pioneering US geneticist Craig Venter in 2010. There the genome of Mycoplasma mycoides, a parasitic bacterium that commonly resides in the lungs of cows and goats, was synthesised in its entirety and inserted into a host cell whose own genetic material had been removed. True to theory, the cell then ‘booted up’ and worked like any other Mycoplasma mycoides.
However, it could be argued that this exercise, which took a team of 20 scientists ten years at a cost of forty million dollars, was somewhat unremarkable: given that they removed the genetic code from a host cell only to replace it with a replica created in the lab, it should not have been too surprising to find that it worked. Yet one must adopt a philosophical attitude when considering the implications of the experiment. The team synthesised some inanimate chemicals, inserted them into some other (arguably) inanimate chemicals, and somehow everything conspired to cross the boundary from the non-living to the living. A Pandora’s box worth of questions is unearthed, each a challenge even to begin answering, the chief one being: what is life?
On the scales we are used to living at, such a question seems trivial. It is reasonably clear that dogs, trees and Putin are alive, while pianos, stones and Lenin are not. But at the scales probed by synthetic biology, the point at which non-living matter becomes living is a mire of grey areas. When do interacting chemicals take on the extra dimension that renders them no longer inert? It is very difficult to tell; indeed, it is so fiddly to pin down that mainstream science long ago gave up trying to settle on a definition. It was hard enough ten years ago, but with the rise of this field the goalposts for what life may be are shifting at breakneck speed.
The strides being made by the field, though, are not without controversy, and have met a non-negligible groundswell of opposition. As one might imagine, when tampering with nature – and moreover, when modifying organisms at the genetic level – the concerns that come to the fore include public safety, environmental impact, the level of controls in place, and the ethical implications raised. On the basis of such reservations, over one hundred environmental and civil society groups, such as ‘Friends of the Earth’, the ‘International Center for Technology Assessment’ and the ‘ETC Group’, together issued the publication [source] The Principles for the Oversight of Synthetic Biology in 2012. In it, they call for a global moratorium on the release and commercial use of synthetic organisms until more rigorous and robust biosafety measures can be established.
Such scepticism is understandable, and particularly interesting is the extent to which those in the field care about public awareness of their work. Philosophers of science and academics working in science ethics are being closely involved as the field progresses, as the scientists are keen to ensure misunderstandings about the nature of synthetic biology do not take root and result in the kind of PR disaster that befell GM foods in the UK. Given that the science employed in synthetic biology makes GM techniques look like child’s play, such caution is not unfounded in the least.
The most clichéd of phrases, one that rears its well-worn head in almost any popular article chronicling developments in the field, concerns the extent to which researchers are ‘playing god’. Such accusations could be levelled at anyone in history who used newly discovered or partially understood laws of nature for practical ends; but given that this is fundamental biology, where new life can be created outside the bounds of evolution – an act more usually associated with deities – perhaps the authors of those articles can be excused.
Calum Grant is a freelance writer based in London. He has a background in physics and biology research, and also works in science and mathematics teaching.
Photos by SLU Madrid Campus