3D-printed adjustable volume straw pipette

Pipettes, tools for dispensing specific volumes of liquid, are quite possibly the most used tools in any bio lab. Most biologists buy their pipettes from a manufacturer, and a set will usually set you back $500-$1000*. Or you can do what Kwalus did and make your own using a 3D printer, a balloon, a bit of duct tape, and a spring.

3D Printed Pipette

Check out this video of it in action:

Via: Thingiverse

*Note: If you’re in need of quality (read: calibrated) micropipettes, I highly recommend genefoo; they’re DIYbio friendly and affordable.

UK Government Cultivates Synthetic Biology with Cash Grants

July 11th saw the UK government announce a pledge of almost £1 million to fund several university research groups’ participation in an international collaboration to create the world’s first synthetic yeast. David Willetts, the UK’s science minister, made the announcement during the week of the sixth Synthetic Biology conference, which is this year being hosted by Imperial College London.

Willetts is expected to make further announcements about a £60 million investment in the field as a whole, some of which will go towards the creation of a centre of ‘innovation and knowledge’ to find uses for the deluge of synthetic organisms set to be created in the coming years. Areas expected to benefit from the emergent field of synthetic biology include vaccine design and production, biofuels, crops, diagnostic medicine, and nanotechnology.

But what is synthetic biology? Arguably just a decade old, it is a discipline in its infancy whose implications could be more far reaching than anyone can realistically envisage at present. The name itself comes across to many as a puzzling oxymoron; biology: the study of living things that emerge as part of the natural order; synthetic: artificial and man-made. How does one make fake life? And if something artificial can come to life, in what way could it then be perceived, or indeed distinguished, as being synthetic? Such thoughts sound like they belong to the realm of science fiction, conjuring images of Frankenstein’s creation, but this isn’t something that may one day come to fruition. This is happening now, and it’s changing the way biologists do biology.

The go-to analogy that best explains both what synthetic biology is attempting to achieve and how it intends to do so is that of a car. One way of understanding how the inner workings of a car come together to give rise to its functionality is to disassemble it entirely and rebuild it piece by piece. This allows a deeper awareness of the role each part plays in relation to the whole.

The legendary physicist Richard Feynman famously stated, ‘What I cannot create, I do not understand’, and it is with this sentiment that the field aims to more acutely determine the nuances that conspire to make living matter from non-living matter. An ultimate long-term goal of synthetic biology is often touted as having such a complete understanding of how each part of a living system works that one could simply sit at a computer and redesign it to improve its functionality, or even better: design new organisms from scratch.

The obvious problem with the car comparison, however, is that living things, even the simplest, are infinitely more complex than any smart car you’re ever going to run into. The parts that make them up interact on the nano-scale, their number and variety are mind-boggling, and their design has arisen over the course of billions of years through the process of evolution. In spite of these challenges, real progress has been made by researchers working in the field.

Notable examples are the range of medical biosensors now coming to market. By a process called gene ‘tagging’, bacteria can be modified to light up in response to specific external stimuli. This is done by attaching the gene for a light-emitting fluorescent protein to the end of the gene for the protein which would ordinarily be produced in response to the stimulus under scrutiny. Rather than a patient being subject to often invasive testing, the results of which can take a good deal of time to process, gels embedded with such synthesised biosensors can simply be smeared onto the area of concern and a diagnosis made within minutes on the basis of whether or not the gel lights up.

A novel approach to clearing land mines was pioneered by altering thale cress’s genetic code so that the pigment in the leaves would grow purple, as opposed to the ordinary green, when grown in the presence of nitrogen dioxide, a gas given off by buried mines. Rather than prodding along the ground with a stick and hoping not to be blown apart, the seeds of such a plant could be scattered over land suspected to contain mines and returned to once the plants have reached maturity in order to identify the dangerous locations.

In order to quell the spread of malaria in Sub-Saharan Africa, which is estimated to kill more than half a million people annually, swarms of mosquitoes were created without the ability to carry the disease. Such mosquitoes would compete for the resources of those that can carry it, so while not combating the cause of the disease completely, such measures seek to drastically reduce the effectiveness of its spread by these means.

Unsettling concerns which spring to mind with the latter two examples are the possibility, if not the probability, that these organisms could cross-contaminate species already in the wild, giving rise to disastrous and wholly unforeseeable consequences. The beauty of redesigning life at the genetic level is that safeguards to exclude such eventualities can be built in. For instance, their reproductive capabilities can be written out, thus ensuring they can’t mix with nature. However, film buffs may recall this is similar to what was done in ‘Jurassic Park’ and (SPOILER ALERT!) it failed.

The reasons these developments can be made now are twofold: the cost and ease of nucleotide sequencing and synthesis, and the current level of computing power.

In the wake of the Human Genome Project the speed at which it was possible to read and write nucleotides, the chemicals on the double-helix backbone of DNA whose sequences act as a store of genetic information, became ever quicker and more accurate, and the costs involved diminished to incredible levels. The total cost of the Human Genome Project came in at approximately £3 billion, but thanks to the technology it employed and the invigorating blaze it conferred on this entire domain of science, the same feat would now cost less than £1,000 if carried out tomorrow. Such progress makes the famous ‘Moore’s law’, which observes the continual doubling of computing power every eighteen months and has held for the past forty years, look almost sloth-like. However, without Moore’s law continually driving computing power to new bounds, the design, virtual representation, and most importantly the ability to model biological systems would be severely limited.
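To put that comparison in rough numbers, here is a back-of-the-envelope sketch. The £3 billion and £1,000 figures come from the text above; the ten-year window between the two price points is an assumption for illustration:

```python
# Actual improvement in sequencing cost (figures from the article).
hgp_cost = 3e9       # Human Genome Project, ~£3 billion
today_cost = 1e3     # sequencing a genome today, ~£1,000
actual_improvement = hgp_cost / today_cost  # ~3,000,000x

# If sequencing cost had merely tracked Moore's law (halving every
# 18 months) over an assumed ten-year window:
years = 10
moore_improvement = 2 ** (years * 12 / 18)  # ~100x

print(f"Actual improvement:      {actual_improvement:,.0f}x")
print(f"Moore's-law improvement: {moore_improvement:,.0f}x")
```

Even on these rough assumptions, sequencing outpaced Moore’s law by a factor of tens of thousands, which is why Moore’s law looks sloth-like by comparison.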

The international effort towards creating synthetic yeast by 2018 sees researchers at the Universities of Cambridge and Edinburgh and at Imperial College working on chromosome 13 of the yeast Saccharomyces cerevisiae, which contains 16 chromosomes in total (we humans have 46). More commonly known as brewer’s yeast, it has been used in brewing and wine making for thousands of years.

A more thorough understanding of this yeast could allow it to be modified in the future to create cells giving higher alcohol yields, and ones more robust to external factors such as temperature and the quality of the water the alcohol is brewed in. If the efficiency rises sufficiently to allow scaled-up production, it is not unreasonable to imagine this giving rise to a sustainable source of biofuel. The primary output of the process may even be changed to a different product entirely, such as medicinal drugs, vaccines, fertilisers, or various chemicals for all manner of industrial purposes. The possibilities are potentially endless.

As Saccharomyces cerevisiae was the first eukaryotic cell (cells a major step up in complexity compared with bacterial cells, crucially having a membrane surrounding their genetic material in an area called the nucleus) to be sequenced, in the mid-nineties, it is quite fittingly set to be the first synthetic eukaryotic cell as well.

The claim to creating the first artificial life form was made by a team led by the pioneering US geneticist Craig Venter in 2010. There, the genome of Mycoplasma mycoides, a parasitic bacterium which commonly resides in the lungs of cows and goats, was synthesised in its entirety and inserted into a vacated host cell of a closely related species. True to theory, the cell then ‘booted up’ and worked as any other Mycoplasma mycoides would.

However, it could be argued that this exercise, which took a team of 20 scientists ten years at a cost of forty million dollars, was somewhat unremarkable. Given that they removed the actual genetic code from a host cell only to replace it with an exact replica created in the lab, it shouldn’t have been all too surprising to find that it worked. Still, one must bear a philosophical attitude when considering the implications of this experiment. Consider that the team had synthesised some inanimate chemicals, inserted them into some other (arguably) inanimate chemicals, and somehow everything conspired to cross the boundary from the non-living to the living. A Pandora’s box worth of questions is unearthed, each presenting many challenges to even begin to answer, the chief one being: what is life?

On the scales we’re used to living at, such a question seems trivial. It is reasonably clear that dogs, trees and Putin are alive, while pianos, stones and Lenin are not. But when one delves down to the scales probed by synthetic biology, the point at which non-living matter becomes living matter becomes a mire of grey area. When do interacting chemicals take on the extra dimension which renders them no longer inert? It is very difficult to tell; indeed, it is so fiddly to pin down that mainstream science long ago gave up on trying to come up with a definition. It was hard enough ten years ago, but with the rise of this field the goalposts for what life may be are shifting at breakneck speed.

The strides being made by the field, though, are not without controversy, and have met a non-negligible groundswell of opposition. As one could imagine, when tampering with nature, and moreover when modifying organisms at the genetic level, concerns that come to the fore include public safety, environmental impacts, the level of controls in place, and the ethical implications that are raised. On the basis of such reservations, over one hundred environmental and civil society groups, such as ‘Friends of the Earth’, the ‘International Center for Technology Assessment’ and the ‘ETC Group’, together issued the publication The Principles for the Oversight of Synthetic Biology in 2012. In it, they call for a global moratorium on the release and commercial use of synthetic organisms until more rigorous and robust biosafety measures can be firmly established.

Such scepticism is understandably warranted, and particularly interesting is the extent to which those in the field care about public awareness of their work. Philosophers of science and academics working in science ethics are being closely involved as the field progresses, as the scientists are keen to ensure misunderstandings about the nature of synthetic biology don’t take root and result in the kind of PR disaster that befell GM foods in the UK. Given that the science employed in synthetic biology makes GM techniques look like child’s play, such reservations are not unfounded in the least.

The most clichéd of phrases, one that undoubtedly rears its well-worn face in any popular article chronicling developments in the field, is the extent to which researchers are playing ‘god’. Such accusations could be levelled at anyone in history who used newly discovered or partially understood laws of nature for practical ends; but given that this is fundamental biology, where new life can be created outside the bounds of evolution, acts more usually associated with deities, perhaps the authors can be excused.

Calum Grant is a freelance writer based in London. He has a background in physics and biology research, and also works in science and mathematics teaching.

Photos by SLU Madrid Campus  

DIYBio and Bioart: liberating the laboratory

Life is becoming like raw material, waiting to be engineered – Oron Catts

What is the biolab used for today? Discovering pharma-cures, creating the next pest-resistant crop, synthesizing recombinant bacteria to glow, sense, detect, digest…? We see something slightly different: where artists work alongside scientists to create universally beautiful sculptures, where a public laboratory performance is held to question Monsanto’s patented crops, and where biologists take respite from their day-jobs to conduct experiments they truly care about, but have limited funding to do. Most importantly, we are witnessing technology liberated from these laboratories, and biology becoming more and more accessible to the average citizen.

We, as creators of DIYsect: Filming Biotinkering for the Web, seek to document these game-changing people and put them all in a single web series where each episode focuses on a different theme within biotechnology. We don’t plan to profit from these episodes, nor do we plan to charge a single penny to our viewers. Like our subjects, we believe that knowledge should be open to all.

Stay tuned for our release in early 2014, and support us on Kickstarter:

http://www.kickstarter.com/projects/260055886/diysect-filming-biotinkering-for-the-web

Crowdsource the Cure for Cancer: One slide at a time.

For their latest and possibly most ambitious crowdsourcing project, Cellslider, Zooniverse has partnered with Cancer Research UK to analyze archival cancer research data.

While computers have gotten better at image analysis, the majority of this type of analysis is still done by real people and is quite time-intensive. By harnessing the collective power of hundreds of thousands of people, Zooniverse hopes to speed up this process to discover new methods of treatment and detection.

So, next time you reach for your phone to play Angry Birds, think about spending that time curing cancer instead.

Printable iPhone to microscope mount

Thingiverse user Boogie has uploaded a pretty cool but simple design for a 3D-printable iPhone to microscope adapter. According to Boogie, you can also use it with your Android phone with the help of a few rubber bands. If you print this out and take pictures with it, let us know how they turn out.

Via Thingiverse

Hackerspaces @ the_beginning (the book)

If you were studying the evolution of hackerspaces, the year 2008 would stick out as the “Cambrian explosion”. It was when the movement grew from 2 or 3 spaces in Europe to hundreds of hackerspaces around the world. And it was in 2008 that Bre Pettis, Astera Schneeweisz, and Jens Ohlig began chronicling the beginning of these new spaces. Three years later, they are now offering that chronicle free for the world to read. If you are planning on starting a hackerspace or similar creative space and your mind is filled with excuses for why you shouldn’t start one, this book is a must-read, showing you really have no excuse not to.

Download HackerSpaces: The Beginning!

Via Hackerspaces Blog


A Quick Look Inside Genspace: a DIYbio Lab

This past week I was invited to NYC along with other members of the DIYbio community to take part in an FBI/DIYbio workshop. Before I address the question of “What in the world does the FBI want with DIYbio?”, I wanted to share a video I took of Genspace, which was co-hosting the workshop.

For those who don’t know what Genspace is, it’s a community bio lab located in the Metropolitan Exchange building at 33 Flatbush Ave, surrounded by a wonderful community of artists, designers and architects. I’ve seen pictures before, but they don’t quite do justice to their lab setup. They have everything you could possibly want out of a BSL-1 lab.

Sugar Shot to Space

A rather intriguing question was posted in an e-mail discussion forum devoted to sugar propellants. One of the contributors innocently asked, “Would it be possible to launch a sugar-propelled rocket into Space?” This query set into motion a flurry of activity which sought to answer the question, at least from a theoretical perspective. A number of experienced amateur rocketeers pondered it in depth, ran computer simulations, and concluded that it might be possible, but barely. Due to the low performance of sugar propellant, which has a “specific impulse” of about one half that of professional rocket propellants, the goal of reaching Space was shown to be very challenging.
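To see how punishing a halved specific impulse is, consider the ideal (Tsiolkovsky) rocket equation, Δv = Isp · g0 · ln(m0/mf). A minimal sketch with assumed round numbers, not project figures: roughly 130 s for sugar propellant versus 260 s for a professional propellant, and a vehicle that is 80% propellant by mass:

```python
import math

G0 = 9.81  # standard gravity, m/s^2


def delta_v(isp_s: float, mass_ratio: float) -> float:
    """Ideal velocity change from the Tsiolkovsky rocket equation."""
    return isp_s * G0 * math.log(mass_ratio)


# A vehicle that is 80% propellant has m0/mf = 1 / (1 - 0.8) = 5.
mass_ratio = 5.0

dv_sugar = delta_v(130.0, mass_ratio)  # assumed sugar-propellant Isp
dv_pro = delta_v(260.0, mass_ratio)    # assumed professional-propellant Isp

print(f"Sugar: {dv_sugar:.0f} m/s, professional: {dv_pro:.0f} m/s")
```

On these assumptions sugar delivers roughly 2,100 m/s of ideal Δv versus about 4,100 m/s for the professional propellant, and gravity and drag losses still have to come out of that budget, which is exactly why reaching Space on sugar is so marginal.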

A conventional single-stage rocket would not be capable, at least not one that would boost a decent-sized payload. A two-stage rocket would be needed. The key advantage of a two-stage vehicle over a single-stage one is efficiency. A single-stage rocket would propel a vehicle to a very high velocity in the lower, densest part of the atmosphere, losing a lot of energy to aerodynamic drag. With a two-stage approach, a vehicle can coast following the first burn and soar to an altitude beyond much of the densest air before firing the second stage. An alternative suggestion was then submitted for discussion: why not a single-stage rocket that would behave as a two-stage rocket? Deemed a “dual-phase” rocket, two serial propellant charges would be separated by a common bulkhead and share a common nozzle. Following burnout of the first charge (or phase), the bulkhead would be breached, allowing the two chambers to act as one. The second charge would then fire, with the motor behaving in conventional manner. Simpler than dealing with the complexities of staging, at least in theory. And so out of this innocuous discussion, the Sugar Shot to Space project was born, with a mandate to demonstrate that theory and reality could be merged. Amateur rocketeers with a passion for sugar propellant and a commitment to accomplish something on the edge of feasibility could pull this one off.
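The payoff from coasting above the dense air can be illustrated with a simple exponential-atmosphere drag model. This is an illustrative sketch only: the drag area, the 1 km/s velocity, and the 8.5 km scale height are assumed numbers, not Sugar Shot figures:

```python
import math

RHO_SEA = 1.225        # sea-level air density, kg/m^3
SCALE_HEIGHT = 8500.0  # approximate atmospheric scale height, m


def drag_force(altitude_m: float, velocity_ms: float, cd_area_m2: float) -> float:
    """Drag = 0.5 * rho(h) * (Cd*A) * v^2, with exponentially thinning air."""
    rho = RHO_SEA * math.exp(-altitude_m / SCALE_HEIGHT)
    return 0.5 * rho * cd_area_m2 * velocity_ms ** 2


CD_AREA = 0.05  # assumed Cd * frontal area for a smallish rocket, m^2
V = 1000.0      # same 1 km/s airspeed at both altitudes

drag_sea = drag_force(0.0, V, CD_AREA)       # dense air near the ground
drag_20km = drag_force(20000.0, V, CD_AREA)  # after coasting to 20 km

print(f"Drag at sea level: {drag_sea / 1000:.1f} kN")
print(f"Drag at 20 km:     {drag_20km / 1000:.1f} kN")
```

At the same speed, drag at 20 km is roughly a tenth of the sea-level value, which is why firing the second phase after a coast wastes far less of the rocket’s limited energy.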

That was over five years ago. Over the ensuing years there arose many unexpected challenges, technical as well as organizational, constrained by the limitations of being an all-volunteer, minimum-budget project delving into a little-known technology. It turns out that dual-phase rocket operation is simple in theory, but engineering a workable solution is more than a tad challenging. We learned the hard way that we knew even less about sugar propellant than we thought we knew. For example, we were aware that sugar propellant is brittle, but how brittle, and how would that play out in a large-scale rocket motor? Brittleness can be a bad thing, resulting in sudden, unexpected and potentially catastrophic fracture under certain conditions.

Fully understanding those conditions in order to mitigate the risk demands a great deal of unglamorous effort. And despite a passion for the goal of reaching Space (a dream nearly every amateur rocketeer shares), many volunteers were simply over-constrained with regard to available spare time, as many have full-time jobs and a life outside rocketry.

Realizing that the approach of reaching Space “in one giant leap” was fraught with hurdles that would likely lead to a disappointing end, the project was eventually reborn as a “program”. Instead of trying to reach Space in a single attempt, the new tactic was to apply an incremental “Apollo”-style approach, moving forward cautiously step by step. Three key projects were identified for the program: one-third scale, two-thirds scale, and then the full-scale “Space” rocket. Tackled this way, we could learn as we progressed, developing scalable hardware and methods.

In hindsight, this appears to have been a wise change of course. As things unfolded, the one-third scale “Mini Sugar Shot” required several static test firings before a successful firing was achieved. The difficulties were mainly a consequence of the very demanding “mass fraction” requirement needed to reach Space on a low-performance fuel, and secondly of the unexpectedly severe thermal loading the rocket chamber experienced during the second-phase burn. The first of these, which demands that most of the liftoff mass (at least 80%) be propellant, dictates that the lightest of materials be used. Gone by the wayside was the inherent comfort of beefy metal motor casings; only lightweight composite materials could fit the bill. The second issue, made all the more complicated by the first, was eventually resolved through the development of a lightweight ablative material that lined the motor chamber and served to insulate it from the torrent of hot, highly pressurized and speedy exhaust gases seeking escape to greater entropy through the chamber and out the nozzle.

What challenges associated with the use of sugar propellant lie ahead for the Sugar Shot to Space team as we graduate toward the next project, the two-thirds scale “Double Sugar Shot”?

We’ve learned that the brittleness of sugar propellant can lead to a catastrophic result. Encouraged by experiments indicating that storage methods, such as deep freezing, can inhibit the formation of brittleness, we hope this negative trait can be tamed. We’ve also learned through our Mini Sugar Shot experience that we have a pretty good handle on casting sugar propellant: one-third scale propellant “grains” of about a kilogram each (twelve per motor firing), and of high quality, were consistently produced. Will that same casting technology allow us to cast the much larger grains while maintaining similar and consistent quality? Consistency is imperative to the scaled-up design of a rocket motor, as the burning characteristics and other traits affecting the “internal ballistics” are directly affected. The sheer quantity of propellant needed for Double Sugar Shot, 90 kg for each firing, leads to an unprecedented challenge of safe and efficient mass production. Whatever method we develop should be scalable to the full-sized, appropriately named “Extreme Sugar Shot”, which has a projected motor capacity of 450 kg. That’s a lot of sugar propellant. Other questions come to light when considering mass production of propellant, such as “what happens if sugar propellant is accidentally ignited?” Controlled experiments were performed to gain a better understanding of this critical aspect of sugar propellant usage. It turns out that the risks and consequences of such a mishap can likely be mitigated by intelligent design of the propellant handling and casting apparatus, and by appropriate response to such an event.

What grain configuration would be most suitable for our requirements? The ubiquitous BATES configuration, as used in most amateur rockets, or some untried geometry, such as a “star”-shaped core? Which oxidizer-to-fuel ratio would be best? Stick with the tried and true (65/35) or seek to optimize? These and many other questions remain to be answered, and many challenges undoubtedly lie ahead before we succeed in taming sugar propellant. The only certainty is that success, if that’s to be the fate of the Sugar Shot to Space team, and we truly believe it will be, will demand an unrelenting commitment to achieve an extraordinary goal with a decidedly ordinary rocket propellant.
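For a sense of scale, here is the oxidizer/fuel split at the traditional 65/35 ratio mentioned above, applied to the 90 kg and 450 kg batch figures from the text (a trivial sketch; per-grain breakdowns and casting losses are ignored):

```python
def split_65_35(batch_kg: float) -> tuple[float, float]:
    """Oxidizer and fuel masses at the traditional 65/35 O/F ratio."""
    return batch_kg * 0.65, batch_kg * 0.35


# Batch sizes from the article: Double Sugar Shot and Extreme Sugar Shot.
for name, batch in [("Double Sugar Shot", 90.0), ("Extreme Sugar Shot", 450.0)]:
    oxidizer, fuel = split_65_35(batch)
    print(f"{name}: {oxidizer:.1f} kg oxidizer, {fuel:.1f} kg fuel")
```

Even the smaller project means casting nearly 60 kg of oxidizer-laden propellant per firing, which makes the mass-production and accidental-ignition questions above very concrete.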

You can find out more about Richard Nakka and his team on their website SugarShot.org

You can find this article and many more in Issue 01 of Citizen Science Quarterly