Living cells are computers in their own right, though quite different from their digital counterparts. At first glance a cell looks like an unruly chemical soup, with myriads of reactions taking place all over its volume; a digital chip is quite static by comparison. Cellular function is governed by random processes, such as reactions between molecules that happen to collide, whereas digital chips cannot tolerate such randomness and rely on precisely timed, controlled events. One could picture a computer as a highly structured factory: assembly lines running across the floor, with machines on either side working under strict instructions to produce the final output. A cell is nothing of the sort; it looks more like a room full of machines and parts floating around aimlessly, waiting for the right pairs to meet so that production can proceed. Yet beneath this veil of chaos, fundamental rules govern the processing of information in living cells: programmable rules written in the language of DNA, with its four-letter alphabet A, T, C, G. That is what brings cells and digital computers together.

Which raises the question: how much information can a cell actually process? Let's put it in context by estimating the amount of instructions a cell carries for responding to various stimuli. Each DNA base is one of four letters and therefore encodes 2 bits of information, and the human genome is approximately 3 billion base pairs long. That works out to roughly 715 MB of code per cell.

Beyond the innate ability of cells to process a plethora of stimuli, progress in our understanding of DNA manipulation, and in our ability to synthesize custom DNA sequences, has allowed us to engineer complex biochemical circuits from scratch to perform specific computations. The nature of those circuits is quite malleable and limited mainly by the will and ingenuity of the designer.
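The genome estimate above is a quick back-of-the-envelope calculation; here is a sketch of it in Python, assuming 3 billion base pairs and four equally likely letters per base (the round numbers are the article's, not exact figures):

```python
import math

# Assumed figures from the estimate above (not exact biological values).
base_pairs = 3_000_000_000       # approximate length of the human genome
bits_per_base = math.log2(4)     # 4 possible letters (A, T, C, G) -> 2 bits each

total_bits = base_pairs * bits_per_base
total_bytes = total_bits / 8
mebibytes = total_bytes / (1024 ** 2)

print(f"{total_bits:.0f} bits ~= {mebibytes:.0f} MB")  # ~= 715 MB
```

The 715 figure comes from dividing by 1024^2 (mebibytes); dividing by 10^6 instead would give 750 in decimal megabytes.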