Biotechnology: Science, Engineering, and Ethical Challenges for the Twenty-First Century (1996)

Chapter: BIOTECHNOLOGY APPLICATIONS TODAY AND TOMORROW

Suggested Citation: "BIOTECHNOLOGY APPLICATIONS TODAY AND TOMORROW." Frederick B. Rudolph, et al. 1996. Biotechnology: Science, Engineering, and Ethical Challenges for the Twenty-First Century. Washington, DC: Joseph Henry Press. doi: 10.17226/4974.


Part 2—
Biotechnology Applications
Today and Tomorrow

Biotechnology makes it possible to do things that previously lay in the realm of science fiction: manipulate genes, grow human tissues and organs outside the body, make endless supplies of drugs extracted from rare plants, destroy polluting chemicals. However, the road from experimental technology to reliable product is not always very smooth.

In this section, authors with expertise in different areas of biotechnology discuss both promising applications in health care and environmental cleanup and practical hurdles that need to be overcome before techniques that work in the laboratory can be made to work reliably in the real world.

In Chapter 5, Eric Tomlinson of GeneMedicine, Inc. (one of many new companies in the emerging biotechnology industry), provides an overview of the potential applications of biotechnology in health care. He suggests that, of all the new technologies, the most significant is the ability to manipulate the building blocks of life itself: to add or remove genes from cells and to transplant genes from one organism to another.

Savio L. C. Woo of Baylor College of Medicine focuses on the implications of this ability to manipulate genes for the treatment of disease, known as gene therapy. In Chapter 6, Woo clearly explains how gene therapy works (how genes are manipulated, packaged, and delivered to the patient) and what clinical effects the therapy has had to date.

Although gene therapy was originally conceived as a way to treat diseases that are wholly genetic in origin (such as hemophilia), Woo argues that it has potentially far wider applications, for example, in the treatment of cancer, infectious diseases such as hepatitis and acquired immune deficiency syndrome, cardiovascular diseases, and perhaps even neurodegenerative diseases such as Alzheimer disease.

Robert Nerem of the Georgia Institute of Technology describes the potential of biotechnology to develop bioartificial skin, cartilage, blood, and whole organs such as the pancreas and the liver. Pointing out that the need for tissues and organs for transplants far outstrips the supply available from donors, Nerem suggests in Chapter 7 that these techniques, which are collectively referred to as tissue engineering, may offer a solution to this perennial problem.

In Chapter 8, Michael Shuler of Cornell University analyzes the practical difficulties involved in manufacturing biotechnological products on a commercial scale. Living cells are complex systems that often respond to subtle environmental changes in unpredictable ways, he argues, which can make it difficult not only to control product quality but also to predict how a product will behave when administered to a human subject.

Both Nerem and Shuler stress the need for an engineering-systems approach to solving such problems. Shuler urges the use of mathematical models to enable bioengineers to maximize a system's potential, thus improving production efficiency and lowering costs.

Shifting the focus from health care to the environment, in Chapter 9 Gene Parkin of the University of Iowa discusses bioremediation, the use of living organisms such as bacteria to destroy pollutants. This technology made headlines when it was used in Prince William Sound, Alaska, to ameliorate some of the environmental damage caused by the Exxon Valdez oil spill. Parkin explains the processes involved in bioremediation and examines some of the problems that must be solved for the technology to become a viable way of cleaning up environmental damage.



Effect of the New Biologies
on Health Care

ERIC TOMLINSON

Human health care is being affected by the new biologies, particularly by the products that are emerging from the classical pharmaceutical industry and the new biotechnology industry. The pharmaceutical industry has global sales in excess of $200 billion per annum. The biotechnology industry, which had zero sales 5 to 6 years ago, now has sales of about $6 billion per annum. The goal of the biotechnology industry is to capture the remaining market by producing superior and effective products.

Today we are witnessing the advent of protein therapeutics and the ability to understand protein structure. We can look at the structure of DNA itself. We know that when DNA expresses a gene product it has to bend and that there are certain proteins present inside the nucleus that cause that bending to occur. Without such bending of DNA, the overall expression would be totally inefficient. We are also able to observe the structure of cells. We know that when macromolecules go into cells, they are recognized at the cell surface and go through defined pathways. Interaction of a ligand with a surface receptor results in many different types of trafficking events taking place within a cell.

We now know at the structural molecular level, not just at the phenomenological level, much of what happens within the cell in terms of cell processing, including knowledge about the secretion events that occur after a gene product is expressed. The pharmaceutical and biotechnology industries are exploiting this knowledge to try to gain access to target cells, be it for proteins or genes or biologically active RNA.


TABLE 5-1 Advances in Molecular and Cell Biology

Recombinant DNA and hybridoma technologies
Control of gene expression
Gene amplification (polymerase chain reaction)
Embryo stem cell manipulation
Efficient gene transfer
Protein and carbohydrate engineering
Instrumental analysis

What are these advances in molecular cell biology, and how are they affecting the biotechnology industry? Table 5-1 lists some of the advances that have been made. When the history of this era is written, the ability to manipulate mammalian embryonic stem cells will be seen as the most important of these. We are now able to go into a cell, alter its genetic makeup, and control cell differentiation. This step, first taken perhaps 20 years ago, produced results such as those shown in Figure 5-1, which shows two 11-day-old rat embryos. The embryo on the left is nondifferentiated because the gene for growth hormone was removed: some differentiation occurred, but then growth stopped. When the gene coding for growth hormone was returned to the embryonic cells, the embryo developed normally, as shown on the right.

FIGURE 5-1 Two 11-day-old rat embryos. The embryo on the left is nondifferentiated.

FIGURE 5-2 The development of the modern pharmaceutical-biotechnology industry.

This has given rise to hope, expectation, concern, and interest, as well as to the development of gene therapy and of so-called gene doctors. In 1985 Business Week devoted a cover story to this topic. The article stated that "hereditary diseases may soon be cured by manipulating human genes." Since 1985 we have seen the appearance of companies, such as my own and 40 other gene therapy companies worldwide, that are looking not just at genetic disorders to be treated with genes but also at treating acquired disorders caused by environmental factors or by a combination of environmental and genetic factors.

What is the effect of all these new biologies on human health care and particularly on health care products? Disease itself is now being understood, diagnosed, and treated at an increasingly higher order of genetic structure, function, and regulation. This means we are getting closer to understanding and controlling gene function within the body. Most drugs act directly or indirectly on gene expression. Antisense molecules affect gene function and gene structure, and the use of genes themselves as drugs is soon to be with us.

The development of the modern pharmaceutical-biotechnology industry in terms of modern medicines is shown in Figure 5-2. We all are aware of conventional low-molecular-weight drugs, such as aspirin or acetaminophen, which were discovered by trial and error or by serendipity. Recombinant therapeutic proteins were developed over the past 15 years. The newer developments include so-called targeted drugs, which are based on protein structure and are the low-molecular-weight agonists or antagonists of those proteins. The development of anticodon nucleotides, such as antisense and triple helix-forming DNA, and now the development of cDNA as a drug and even the use of living cells are part of the physician's future armamentarium.

Recombinant Therapeutic Proteins

The first recombinant therapeutic protein was marketed about 6 years ago. Many different recombinant proteins have since emerged, for either therapeutic or diagnostic purposes. These developments resulted from an ability to understand how to manipulate DNA structure and expression. Polymerase chain reaction technologies and the development of efficient expression vectors have enabled scientists in the pharmaceutical-biotechnology industry to obtain high yields of protein from bacteria and now perhaps from plants. The creation of transgenic animals enables us to recreate human disease in animals so that we can efficiently and effectively study a putative drug product.

We are now successfully treating disease with therapeutic proteins. We have an adequate treatment for adult Gaucher's disease, a storage disorder caused by an enzyme deficiency: the liver accumulates substrates because it lacks the enzyme that can break those substrates down. Glucocerebrosidase, initially prepared by extraction rather than by recombinant procedures, has alleviated this disease.

Myoscint, an antimyosin cardiac imaging agent, is a protein-based diagnostic agent. When myocardial infarction occurs, heart muscle cells rupture and myosin is exposed for the first time to the blood compartment. An antimyosin antibody, tagged with a gamma emitter, is administered intravenously and accumulates at the physical site of the infarction, where it can be detected by gamma scintigraphy during or just after myocardial infarction. This product has been followed by many new in vivo diagnostic imaging agents that are enabling the correct diagnosis and effective treatment of many different disorders.

Protein Agonists And Antagonists

Modern drug design involves designing a drug to fit a specific protein that is expressed by a specific gene. A protein structure may then be simulated on the basis of x-ray crystallography and, via computer, drug molecules that could fit an active site in this protein are added to the structure. This process, combined with the advent of modern combinatorial chemistries, is dramatically changing the pharmaceutical industry's way of discovering drugs. Figure 5-3 shows the so-called design bicycle, which describes the flow of activities needed within the pharmaceutical-biotechnology industry to develop such protein agonist and antagonist low-molecular-weight drugs. The cycles are based on protein design and engineering and on drug design and synthesis and are a model for how the industry is restructuring itself (Blundell et al., 1990).

FIGURE 5-3 The design bicycle showing the flow of activities needed within the pharmaceutical-biotechnology industry to develop protein agonist and antagonist low-molecular-weight drugs.

Anticodon Nucleotides

The concept that RNA function, or even DNA function, can be inhibited by a complementary strand of RNA or DNA (antisense therapy) resulted from our improving ability to map the structure and function of the human genome, to better understand gene function and structure, to analyze disease at the genetic level, and to create transgenic animals and cells. The paradigm is that DNA is transcribed into messenger RNA and then translated into a protein. If that protein is an aberrant protein, as often occurs inside a tumor mass for example, then one can block the function of its messenger RNA by administering an antisense molecule that is complementary in its structure to the messenger RNA. This interaction then blocks the production of the aberrant protein. Clinical trials focusing on various molecular and disease targets are underway, largely brought about by the smaller biotechnology companies.
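The complementarity at the heart of the antisense idea can be sketched in a few lines of code: an antisense strand is simply the reverse complement of a stretch of the target messenger RNA. The sequence below is invented for illustration and is not a real therapeutic target.

```python
# Sketch of antisense design: complement each base of the target mRNA,
# then reverse the result so both strands read 5'->3'.
RNA_COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def antisense(mrna: str) -> str:
    """Return the antisense RNA strand for a given mRNA sequence (5'->3')."""
    return "".join(RNA_COMPLEMENT[base] for base in reversed(mrna))

target = "AUGGCUUAC"          # hypothetical fragment of an aberrant mRNA
print(antisense(target))      # -> GUAAGCCAU
```

The antisense strand can then base-pair with the target mRNA, which is the physical interaction that blocks translation of the aberrant protein.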

Gene Therapy

The developments in the use of genetic material itself, and of living cells containing genetic material, as therapeutic modalities have resulted from our ability to efficiently transfer genes into cells. Homologous recombination is a very fertile research area, and the ability to harvest and to control the quality of cells has greatly enhanced our ability to consider living cells as therapeutic modalities.

Initial gene therapy programs in the late 1980s and early 1990s attempted to treat genetic disease: we tried to understand errors in genetic function and then to correct them by using a gene as a therapeutic modality. However, most diseases are caused by both environmental and genetic factors. Very few diseases are purely genetic in origin; Tay-Sachs disease and, to some extent, cystic fibrosis are examples. Cholera and botulism are examples of diseases caused only by environmental factors. However, most of the diseases that will debilitate us as we get older, and from which we will eventually die, arise from a polygenic predisposition that is affected by environmental factors. For example, some of us are predisposed to rheumatoid arthritis; environmental factors such as cold or damp weather will affect the inflammatory response and the development of this disease. Cancer is another example in which a polygenic predisposition may well be sparked by some environmental factor. The gene therapy industry is beginning to think that there may be a whole suite of diseases, and clinical indications within each disease, that can be treated using genes as therapeutics.

The advent of gene therapy was sparked by the case of the "bubble boy," David, a 5-year-old boy who had severe combined immunodeficiency disease (he lacked the enzyme adenosine deaminase). David had to live in a plastic chamber (a bubble) to be protected from adventitious viruses, which his immune system could not fight. Unfortunately, he died before gene therapy was able to help him. Some years later the first human clinical trial in gene therapy took place in a girl named Amy Harper. Her white blood cells were taken from her body, the gene that codes for adenosine deaminase was introduced into those cells, and the cells were returned to Amy, who is alive and well more than 4 years later.

The method of putting the gene into Amy Harper's blood cells, which has now been used with many other patients with different clinical indications, was based on modifying a retrovirus. Harmful viral genes are removed and replaced with a useful gene, such as the gene coding for adenosine deaminase. The virus then infects cells, and healthy protein rather than harmful virus is produced; this is the rationale behind retroviral-mediated ex vivo gene therapy. We were involved in developing a genetically altered skin patch. The skin patch is transduced with retroviruses so that it contains a gene coding for growth hormone. This patch is layered onto normal skin and can produce and deliver its healthy protein into the body of the patient.

Viral gene delivery systems are based on the fact that viruses effectively transfer genes into humans. The human cold virus, adenovirus, enters human fibroblast cytoplasm through a coated pit in the fibroblast surface. People have tried to use adenovirus to transfer human genes into patients, but there are safety concerns about the use of defective viruses for gene therapy. What is emerging is the use of plasmid DNA for gene therapy. Plasmids are circular pieces of DNA derived from bacteria. Several companies, including our own, are inserting a therapeutic gene into a plasmid and then administering this genetic software as a pharmaceutical. Plasmid DNA can enter the nucleus but does not become integrated into chromosomal DNA; it remains episomal, or extrachromosomal (Figure 5-4). The administered therapeutic gene is transcribed into its messenger RNA, which is then translated into a therapeutic protein. The protein can be secreted from the cell to have an endocrine effect, as with insulin, factor VIII, or factor IX; it can remain within the cell to have an autocrine effect and become a structural protein (for example, a receptor for low-density lipoprotein that assists in treating hyperlipidemia); or it can leave the cell and work in a paracrine fashion on neighboring cells, as a growth factor does. The surface of plasmid gene expression systems can be altered so that the system targets a particular cell type; numerous research groups are looking at placing sugars onto the plasmid surface that recognize sugar receptors on cells. The use of plasmid-based gene expression systems coupled with advanced gene delivery methods is expected to be the next wave in gene therapy.

FIGURE 5-4 The use of a plasmid (administered DNA vector) as a pharmaceutical.
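The expression path described for a plasmid gene, from coding DNA to messenger RNA to protein, can be sketched as a toy computation. The codon table below is a tiny subset of the real genetic code, and the gene sequence is invented for illustration.

```python
# Toy sketch of gene expression: coding-strand DNA -> mRNA -> protein.
# Only four codons are modeled; real translation uses the full 64-codon table.
CODON_TABLE = {"AUG": "Met", "GAA": "Glu", "UUU": "Phe", "UAA": "STOP"}

def transcribe(coding_dna: str) -> str:
    """mRNA carries the coding-strand sequence with T replaced by U."""
    return coding_dna.replace("T", "U")

def translate(mrna: str) -> list:
    """Read the mRNA three bases at a time until a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE[mrna[i:i + 3]]
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

gene = "ATGGAATTTTAA"                  # hypothetical therapeutic gene
print(translate(transcribe(gene)))    # -> ['Met', 'Glu', 'Phe']
```

Whether the resulting protein then acts in an endocrine, autocrine, or paracrine fashion depends on the protein itself, not on this expression path, which is the same in each case.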

Gene therapy evokes a variety of concerns in people. All new and interesting areas of science require a great deal of scientific endeavor, as well as a working relationship among scientists, clinicians, regulatory bodies, and, of course, reimbursement agencies and patients, for this type of treatment to be successfully introduced. Protein therapy had to go through the same process.

Prescription For The Future

There is a long way to go: we still do not fully understand gene function, nor do we know the structure of all genes and their controlling elements. Nonetheless, there is a driving force toward developing drugs or modalities that control gene function and gene expression. Excellent scholarship in molecular and cell biology is taking place worldwide. A remarkable feature of the United States is the ability of scientists and others to commercialize that scholarship, to understand how that science can be developed into products. Scientists lend their names to biotechnology and pharmaceutical companies and can own part of the process—the American dream—and that process is now benefiting human health care.

TABLE 5-2 Therapeutic Protein Classes

Monoclonal antibodies
Interferons
Anticoagulants and thrombolytics
Colony stimulating factors
Dismutases
Erythropoietin
Human growth hormone
Interleukins
Tumor necrosis factor
Vaccines


A listing of the leading classes of therapeutic proteins (Table 5-2) shows that most product activity is still focused on cardiovascular diseases, diseases of the alimentary tract, antibiotics, and antibacterials. The central nervous system remains the next key target, with the problem of delivering molecules to it as yet unresolved. Perhaps gene medicines will be the prescription for the future.

References

Blundell, T. L., M. S. Johnson, J. P. Overington, and A. Sali. 1990. Knowledge-based protein modelling and the design of novel molecules. Pp. 209-249 in Protein Design and the Development of New Therapeutics and Vaccines, J. B. Hook and G. Poste, eds. New York: Plenum Press.

Tomlinson, E. 1991. Impact of the modern biologies on the medical and pharmaceutical sciences. J. Pharm. Pharmacol. 44 (suppl. 1):147-159.



Gene Therapy:
Beyond Genetic Diseases

SAVIO L. C. WOO

Gene therapy has three principal components. The first component is the therapeutic genes themselves. The gene discovery program, however, is not really within the domain of gene therapy. Therapeutic genes will continue to be discovered by investigators studying genes of interest to them. The discovery of new therapeutic genes will also result from the Human Genome Project. Thousands of human genes will be isolated, each with the potential to be a therapeutic gene. Dozens of human genes with therapeutic potential are already known.

Gene therapy refers to the development of science and technology for the delivery of therapeutic genes into the human body. This is the second principal component of gene therapy: how do we deliver the therapeutic genes to the proper organs with specificity, efficacy, and safety? This is the primary focus of gene therapy research today.

If we accomplish these goals, where will we go in the future? Once we can deliver a gene to a particular organ, we will want it to be expressed at a therapeutic level. The next step will be to regulate the expression of these genes. This is the third principal component of gene therapy, which will be important in the future: the development of expression and regulation vectors.

Ex Vivo Strategy

There are two paradigms for gene therapy. The first one, the ex vivo strategy, is very straightforward. I will use the liver as an example because I am interested in metabolic diseases and the liver is the organ that I study. As shown in Figure 6-1, a piece of the liver can be surgically removed from the patient and dispersed into single cells in culture. Using any sort of delivery vehicle, an appropriate therapeutic gene can be transduced into the cells, yielding genetically reconstituted cells that can be transplanted back into the patient. Because the cells were originally derived from the patient, this is autologous transplantation and there should be no immunological rejection of the graft by the host. This procedure has been completed in laboratory mice, rats, rabbits, dogs, and nonhuman primates. Five human patients are now undergoing this procedure as part of Dr. James Wilson's treatment of familial hypercholesterolemia at the University of Pennsylvania.

We have contributed to the development of this technology. However, when working with "large animal" models such as dogs, a piece of the liver is dispersed into billions of cells. Because these cells are cultured at a few million cells per dish, we need thousands of tissue culture plates. After having performed a few of these complicated procedures, I realized that this type of gene therapy is not the best approach for postmitotic tissues requiring the delivery of therapeutic genes to billions of cells.
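The scale problem described above is simple arithmetic, and a back-of-the-envelope version makes the point. The numbers below are illustrative orders of magnitude taken from the text's "billions of cells" and "a few million cells per dish," not measured values.

```python
# Why ex vivo hepatic gene therapy needs thousands of culture dishes:
# divide the cell yield of a dispersed liver by the capacity of one dish.
cells_from_liver = 5e9      # "billions of cells" from one dispersed liver (assumed)
cells_per_dish = 5e6        # "a few million cells per dish" (assumed)

dishes_needed = cells_from_liver / cells_per_dish
print(f"{dishes_needed:,.0f} tissue culture dishes")   # -> 1,000 tissue culture dishes
```

Even with these conservative round numbers the procedure requires on the order of a thousand dishes per patient, which is why the ex vivo route is impractical for whole postmitotic organs.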

The approach will be useful, however, for tissues with stem cells. For example, a few cells can be removed from bone marrow, transduced, and then returned, and they will proliferate and take over a particular organ system. The ex vivo strategy will be very useful for this type of gene therapy.

FIGURE 6-1 Ex vivo strategy for hepatic gene therapy.

In Vivo Strategy

Future progress will need to be made on the second gene therapy paradigm, the in vivo strategy. With this strategy, therapeutic genes are delivered directly into the target organ, which is a far less complicated clinical procedure than the ex vivo strategy. The key element of in vivo gene therapy is the delivery vehicle. Three types of delivery vehicles are being explored: recombinant viruses, DNA protein complexes, and DNA liposomes. I will present data on the use of recombinant viruses as therapeutic gene delivery vehicles in living animal models.

Use Of Recombinant Viruses: The Retrovirus

The first recombinant virus used for gene therapy was the retrovirus originally developed by Dr. Richard Mulligan at the Massachusetts Institute of Technology. The retrovirus is a very simple RNA virus (Figure 6-2). It has two long-terminal-repeat (LTR) elements that control the expression of the viral genes. There is a packaging signal at one end and the viral genome is in the middle. There are only three viral genes, coding for the core proteins, the envelope proteins, and the reverse transcriptase enzyme.

FIGURE 6-2 Retroviral structure. LTR, long terminal repeat.

Dr. Mulligan removed all the viral genes and replaced them with a therapeutic gene, so that the recombinant virus contained the regulatory elements, the packaging signal, and a therapeutic gene. Using this recombinant virus, Dr. Mulligan demonstrated that the therapeutic gene can be transduced into a variety of cells in culture. The biology of the virus is such that when the cell is transduced, it incorporates the viral genome into the cellular chromosomes. The trait thus transferred becomes a permanent genetic trait of the cell; when the cell divides, the recombinant viral genome will divide with it.

One nice feature of this virus system is that because the transducing virus contains no viral genes, it is no longer capable of replication. Therefore, no continuous virus propagation occurs in the host, which is a very important concept in gene therapy.
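The vector construction described above can be sketched as a simple data manipulation: starting from the wild-type genome (two LTRs, a packaging signal, and three viral genes), the viral genes are removed and a therapeutic gene takes their place. The element names follow the text (the three viral genes are conventionally called gag, pol, and env); the data structure itself is only an illustration, not a model of real cloning steps.

```python
# Sketch of building a replication-incompetent retroviral vector.
wild_type = ["LTR", "psi", "gag", "pol", "env", "LTR"]   # psi = packaging signal
VIRAL_GENES = {"gag", "pol", "env"}

def make_vector(genome, therapeutic_gene):
    """Strip all viral genes, then insert the therapeutic gene after psi."""
    backbone = [element for element in genome if element not in VIRAL_GENES]
    backbone.insert(backbone.index("psi") + 1, therapeutic_gene)
    return backbone

vector = make_vector(wild_type, "factor IX")
print(vector)  # -> ['LTR', 'psi', 'factor IX', 'LTR']

# No viral genes remain, so the vector can transduce a cell once
# but cannot direct production of new virus particles there.
assert not VIRAL_GENES & set(vector)
```

The final assertion is the point of the design: because the transducing particle carries no viral genes, no continuous virus propagation can occur in the host.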

Using this particular system, we treated a dog with a genetic deficiency in coagulation factor IX, which causes very severe hemophilia in animals. In normal dogs, blood clots in about 8 to 10 minutes. In dogs with factor IX deficiency, blood clots in about 50 to 60 minutes. If these dogs were in the wild, they would all die from bleeding. A colony of these dogs has been well maintained and characterized at the University of North Carolina under the care of Dr. Kenneth Brinkhous. We collaborated with Dr. Brinkhous to try to deliver, via the retroviral vector, a canine factor IX gene to hemophilic dogs with the deficiency. The gene was delivered externally into a subcutaneous port implanted in the dog. The port was connected to a catheter inserted into the portal vasculature. Virus containing the therapeutic gene was infused slowly into this port, driven by a peristaltic pump. No anesthesia was given and our dog patient was wide awake. During the 1-hour procedure, the patient stood on all four feet, wagging his tail.

What were the results of this therapeutic procedure? The hemophilic dog began to make a small amount of clotting factor in the blood. The amount was very small, but the change in clotting time was dramatic. After gene therapy, clotting time was dramatically reduced from 50 to 60 minutes to 20 minutes (Figure 6-3). Expression of the therapeutic gene in this experiment lasted for 15 months, and the clotting time remained at 20 minutes.

This experiment shows that we can deliver genes to postmitotic tissue in which the cells have very long half-lives. In man, hepatocyte half-life is measured in years. We do not know how many years, but once we deliver the genes, they can continue to function and provide a therapeutic effect for a long time.


FIGURE 6-3 Whole-blood clotting times in a hemophilia B dog
after retroviral-mediated hepatic gene transduction in vivo.

Advantages And Limitations Of Retroviral Vectors

There are both advantages and limitations to using a retroviral vector for in vivo gene therapy. One advantage is that retroviral vectors have a very broad host-cell range, which enables the use of this therapy to treat a variety of diseases in different organs. This advantage is also a limitation, however, because it means that retroviruses lack tissue specificity and can only deliver therapeutic genes regionally, not systemically.

Another advantage is that the use of retroviral vectors results in permanent gene transduction, which potentially permits permanent gene therapy for genetic disorders. The limitation is that it transduces only dividing cells. In postmitotic tissues where the cells are not dividing, we have to stimulate the cells to divide before we can introduce the gene. Therefore, a surgical partial hepatectomy still had to be performed on our dog patient with hemophilia. Much research is currently under way on the use of chemicals to stimulate transient hepatocyte proliferation, so that the virus can be given to the liver cells without surgery.

This virus has an established safety record, at least for the ex vivo approach. Most of the approximately 50 approved clinical protocols of gene therapy in man use a retrovirus as the gene delivery vehicle. A limitation is that the virus has moderately low viral titer. If we have a preparation of recombinant virus at titers of 10⁶ plaque-forming units (pfu)/mL, we consider it an excellent preparation. However, the liver contains billions of hepatocytes, and therapy will require many liters of this virus. Many laboratories are trying to increase the viral titer; success in this area can be expected in the near future.
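The arithmetic behind this volume problem can be sketched quickly. The cell count below is an assumption chosen for illustration (the text says only "billions of hepatocytes"); the titer is the 10⁶ pfu/mL figure given above.

```python
# Back-of-the-envelope dose volume for retroviral liver gene delivery.
# Assumptions (illustrative, not from the text): ~2e11 hepatocytes in a
# human liver, and one infectious particle needed per target cell.
RETRO_TITER_PFU_PER_ML = 1e6   # an "excellent" retroviral preparation
HEPATOCYTES = 2e11             # assumed order of magnitude

volume_ml = HEPATOCYTES / RETRO_TITER_PFU_PER_ML
volume_liters = volume_ml / 1000.0
print(f"required volume ≈ {volume_liters:.0f} liters")  # ≈ 200 liters
```

Even at only one particle per cell, the required volume runs to hundreds of liters, which is why raising the titer matters so much.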

Adenoviral Vectors

As we look at the advantages and limitations of retrovirus vectors, we need to be aware of alternative vectors for gene delivery that can provide the same advantages with fewer limitations. Adenovirus, a new vector used for gene therapy, normally infects the pulmonary epithelium and causes the common cold, but it also has a very broad host-cell range in culture.

Shown in Figure 6-4 is a simplified genome map, with map units 0 to 100, of this 36-kilobase-pair DNA virus. Investigators have replaced a segment of the viral genome with the therapeutic gene. The replaced segment encodes a very important viral gene called E1A; this is, therefore, an E1A-deficient recombinant adenovirus. The E1A gene product is necessary for the expression of the rest of the viral genome, so without E1A the virus cannot replicate and thus can be used for gene delivery. This construct is conceptually very different from that of the retrovirus: in the retrovirus, all viral genes have been removed; in the adenovirus, all but one of the viral genes remain in the vector.

What are the advantages of this vector? We took an adenoviral vector containing the β-galactosidase gene of E. coli and injected it directly into the portal vasculature of laboratory mice without previous partial hepatectomy. We then sectioned the liver and stained it to look for blue cells that would indicate the presence of β-galactosidase. In the control mouse, given a control virus, shown on the left in Figure 6-5, there are no blue cells. In the mouse given the vector containing the β-galactosidase gene, shown on the right, about half the cells are blue. If we performed the same experiment with a retroviral vector, we would be lucky to see 1 percent blue cells. This vector, therefore, is extremely efficient for gene delivery, and its activity level as shown by enzymatic assay is tremendous.

FIGURE 6-4 Recombinant adenoviral vectors for gene therapy.

FIGURE 6-5 Mouse liver cells; blue indicates the presence of β-galactosidase. Left: mouse treated with control virus. Right: mouse treated with β-galactosidase gene.

Therefore, we made a recombinant adenovirus vector containing the canine factor IX gene and infused it into the portal vasculature of the hemophilic dogs. We measured the production of canine factor IX in the blood of the hemophilic animal (Figure 6-6). This is an antigen-negative model, so there is no factor IX in the blood at time zero. Just 1 day after a single infusion of the virus, this hemophilic dog had accumulated 30 to 50 µg/mL of canine factor IX protein in the blood. The normal level of canine factor IX in an unaffected dog is about 10 µg/mL, so after one infusion of the therapeutic virus, this hemophilic dog was making 3 to 5 times the normal level of factor IX in the blood. Needless to say, after 1 day the clotting time in this dog was normal. This example shows that the adenovirus vector is a tremendously efficient method of gene delivery that can result in therapeutic levels of gene expression.

The problem is that after 1 week the level started to drop; in 3 weeks it dropped 100-fold. The curve then became biphasic; after 3 to 6 months the level returned to zero. So although this is a very efficient vector, it does not persist. Why not? It could be because the adenovirus is a lytic virus. Its business is to infect a cell, replicate, lyse the cell, and exit. It has no mechanism that enables it to persist in cells.
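To put the reported kinetics in perspective, one can back out an effective half-life, assuming simple first-order decay (a simplification; the text notes the curve is actually biphasic):

```python
import math

# A 100-fold drop over 3 weeks (21 days), per the text.
fold_drop = 100.0
days = 21.0

# Under first-order decay, N(t) = N0 * 2**(-t / t_half), so
# t_half = t * ln(2) / ln(fold_drop).
t_half = days * math.log(2) / math.log(fold_drop)
print(f"effective half-life ≈ {t_half:.1f} days")  # ≈ 3.2 days
```

An effective half-life of only a few days is far too short for treating a chronic disorder, which is what motivates the search for the cause of the loss.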

Another possibility is that these cells transduced by recombinant virus are expressing low levels of the viral proteins, even in the absence of E1A, and are being eliminated by the host's immune system. To determine whether this is the case, we performed the same experiment with another dog under immunosuppression.

The results were dramatic. Without immunosuppression there was a 100-fold loss of factor IX in 3 weeks. When we used cyclosporine A, the factor IX level persisted during the same period (Figure 6-7). These results suggest that the major contributing factor to the lack of persistence of the adenovirus delivery is viral antigen expression and subsequent elimination by the host's immune system.

It is therefore very important to think about the need to further modify the adenovirus vector backbone. If we could delete more viral genes so that the virus would no longer be able to express any viral function, what kind of persistence or duration of therapeutic period could we then achieve? The scientific community is working in this direction.

FIGURE 6-6 Plasma levels of canine factor IX in hemophilia B dogs after in vivo hepatic gene delivery.

FIGURE 6-7 Bioassay after adenovirus injection with and without cyclosporine.

Advantages And Limitations Of Adenoviral Vectors

The adenoviral vector for in vivo gene therapy has the advantage of very broad host-cell range, but it lacks tissue specificity. It is extremely efficient for gene transduction but lacks persistence. It transduces nondividing cells (with no surgery necessary) but it is a cytopathic virus.

When we did the experiment in the mice using the β-galactosidase virus, we achieved 100 percent blue hepatocytes in vivo with a single shot, which is a remarkable result. When we increased the dose 5- or 10-fold, however, the animals died before our eyes. So this is a cytopathic virus, which presents the problem of overdosing. Overdosing is nothing new in medicine; every medicine, including aspirin, is toxic in overdose. This demonstrates the importance of considering the safety of vector systems for gene delivery.

Another advantage of the adenoviral vector is that it has very high viral titer. When we isolate these vectors, they are in the range of 10¹¹ pfu/mL, 10⁵ times higher than the retrovirus. All we need is a few milliliters of viral solution: one shot in the arm and it's done.
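The titer numbers quoted here can be combined with the earlier liver estimate. The target particle count below is the same illustrative assumption used before (one pfu per hepatocyte, roughly 2×10¹¹ cells), not a figure from the text:

```python
# Comparing the two vector titers quoted in the chapter.
ADENO_TITER_PFU_PER_ML = 1e11   # adenoviral preparation
RETRO_TITER_PFU_PER_ML = 1e6    # an excellent retroviral preparation

ratio = ADENO_TITER_PFU_PER_ML / RETRO_TITER_PFU_PER_ML
assert ratio == 1e5  # the 10^5-fold advantage cited in the text

TARGET_PFU = 2e11   # assumed: ~2e11 hepatocytes, one pfu each
dose_ml = TARGET_PFU / ADENO_TITER_PFU_PER_ML
print(f"adenoviral dose ≈ {dose_ml:.0f} mL")  # ≈ 2 mL
```

A dose of a few milliliters, versus hundreds of liters for the retrovirus, is what makes the "one shot in the arm" remark plausible.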

We are trying to further engineer the adenovirus to give it persistence and to reduce cytopathicity. However, the development of this second-generation vector is going to take some time.

Gene Therapy For Solid Tumors

We are now considering treating solid tumors with gene therapy. The adenovirus vector is desirable because we want to deliver therapeutic genes to a lot of cancer cells. Because the purpose is to deliver toxic genes into the cancer cells to kill them, it does not matter if the effect is transient. Toxicity also does not matter, because we want to kill the cancer cells anyway.

Kenneth Culver reported a couple of years ago that a Herpes simplex virus thymidine kinase (ADV-tk) gene was delivered into rat gliomas. Glioma cells had been injected stereotaxically into a hemisphere in the rat brain; the tumor cells had proliferated and grown into small tumors a few days later. (Such tumors continue to grow and eventually the animal dies from the brain tumor.) A retroviral vector containing the ADV-tk gene was introduced into the tumor directly by stereotaxic injection. The animals were then treated with either buffer or ganciclovir. Animals treated with buffer continued to grow large tumors and died. Tumors in animals treated with ganciclovir regressed.

I wanted to do the same experiment with the adenoviral vector. I thought that the limitations of the retrovirus vector could be offset with our adenoviral vector. The retrovirus is very inefficient with very low titers, which is why Culver did not use the retrovirus to cure the tumor and instead injected virus-producing mouse cells so that there would be enough virus production in vivo to treat the tumor.

Eight patients are now in clinical trials using that particular protocol, which is to inject virus-producing mouse cells into the tumor in the human brain. That is a very inefficient process, and we think we can enhance the efficiency of gene delivery with the adenoviral vector.

In the original report, the authors argued that using a retrovirus is desirable for this type of gene therapy because biologically it transduces only dividing cells. It would deliver the killer gene only to the rapidly dividing tumor cells, not to the normal surrounding cells, thus providing a margin of safety. How do we deal with this if we use adenovirus? Ganciclovir by itself is not cytopathic. It diffuses in and out of cells just like any nucleoside, but if it diffuses into a cell that contains ADV-tk, it will be phosphorylated to become ganciclovir phosphate, which is a nucleotide. The nucleotide will then be incorporated into DNA in the dividing cells during DNA replication, causing chain termination that leads to cell death. Thus, the mechanism of ganciclovir toxicity is itself targeted to dividing cells, which makes the additional safety margin provided by the retrovirus redundant.

Using The Adenoviral Vector To
Destroy Tumor Cells

With this in mind, we created an adenoviral vector containing the ADV-tk gene and used it in a glioma model in mice. We injected the tumor cells into the brain, and after 20 days the tumor had grown to a large size. The results are shown in Figure 6-8: on the left is a mouse with a huge brain tumor. If we had waited another day or two, this animal would have died. On the right is a mouse after gene therapy, and there is no evidence of a brain tumor. In Figure 6-9 we see on the top left the tumor that we resected from the brain. The tumor is about one-third the size of the entire brain. On the right is the brain after gene therapy; it looks perfectly normal, and there is no evidence of tumor. There is a small black spot on the left hemisphere caused by our injecting the virus with a little charcoal to mark the site of injection. This brain shows no anatomic evidence of tumor.

FIGURE 6-8 Glioma model in mice. Left: mouse with a large brain tumor.
Right: mouse after gene therapy.

FIGURE 6-9 Brains from mice from gene-therapy experiment. Left: tumor (top) and
brain from untreated mouse. Right: brain from mouse that received gene therapy.

What about pathologic evidence? Shown in Figure 6-10A is the brain from the animal treated with ADV-tk virus and given buffer afterwards. The tumor grew to enormous proportions, filling the entire hemisphere. It grew so big that it squeezed the rest of the brain and the ventricles actually disappeared. Figure 6-10B is a high-power section that shows an aggressively growing tumor that looks nothing like a normal brain. When the same experiment was done with animals treated with ganciclovir, the ventricles appear and there is no evidence of tumor (Figure 6-10C). The charcoal that was used to mark the site of injection can be seen under higher magnification (Figure 6-10D). This section looks like a normal brain with neurons, but there may still be a few residual cells that look like tumor cells. In some animals there is no evidence at all of residual tumor (Figure 6-10E, F).

We designed an experiment to provide quantitative data. We injected a large number of mice with tumor cells, divided the animals into two groups, and injected one group with a control virus, the other with a therapeutic virus. We then subdivided them into two smaller groups: one was treated with buffer, the other with ganciclovir. After 20 days (before the animals would have died if they had not had gene therapy) we killed all the animals, looked at the brains, and measured the tumor sizes.

FIGURE 6-10 A, B: Growth of brain tumor treated with ADV/tk plus buffer; C, D: regression of brain tumor in mice treated with ADV/tk plus ganciclovir; and E, F: ablation of brain tumor in mice treated with ADV/tk plus ganciclovir.

FIGURE 6-11 ADV-tk gene therapy for C6 gliomas in nude mice.

Each point in Figure 6-11 represents one animal. The animals that received the control virus and no ganciclovir grew huge tumors. The animals that received the control virus with ganciclovir had much smaller tumors, which means that ganciclovir somewhat retarded the growth of the tumor but did not cause tumor regression. The animals treated with the therapeutic virus but no ganciclovir developed big tumors. The animals treated with the therapeutic virus and ganciclovir all had tumor regression. There were no exceptions. The results show the power of this technology in delivering killer genes to tumor cells to treat solid tumors.

Criteria For Gene Therapy For Brain Tumors

With these results, Baylor College of Medicine is preparing to propose a clinical trial using adenoviral-mediated gene therapy for brain tumors. As we look into the future, what are the criteria for using this gene therapy? In our view, the first criterion is that the tumor has to be a primary tumor or there has to be only limited metastasis, because the technology at this time involves injection of the therapeutic genes directly into the solid tumor. The second and third criteria are that the tumor must be accessible for injection and it must be nonresectable. The final criterion is that the patient must have a very poor prognosis with no possible alternative treatment. Brain tumors qualify for the trial because, more often than not, they satisfy all four criteria. Many gliomas are very aggressive and are not treatable by alternative methods.

What are our goals? There could be three: first, a long-term cure. If we can achieve that, we will all celebrate. Second, short-term regression would have tremendous value for the patients and their families. Third, palliation, which would improve the quality of patients' lives.

Future Of Gene Therapy

As we continue to look at gene therapy for cancer, are we going to stop at the brain? Can it be more generally applied? If you think about it, this is a virus that we can use to deliver genes into any solid tumor. The effect of ganciclovir is the same in all kinds of cells; it is not tumor specific. Therefore, why not think of other tumors as well? Indeed, Baylor College of Medicine has initiated an extensive program of gene therapy for solid tumors that includes many clinical investigators who collaborate with us. We are now working on tumors of the head and neck, colon, prostate, bladder, breast, and skin, among others. We have preliminary results for the head and neck tumors and for colon metastases in the liver.

What will gene therapy have in store for us in the future? What are the future challenges? Gene therapy started out to be a godsend technology for the cure of genetic diseases. It began with the first human trial under Dr. French Anderson. I have already talked about hemophilia. In cystic fibrosis, clinical trials are ongoing. As I have shown here, gene therapy for genetic disorders has really blossomed. We have developed technologies to deliver therapeutic genes to complement a genetic deficiency in living animals; will we be able to use the same technology to deliver killer genes to get rid of activities we do not want?

We are now entering the era of gene therapy for acquired diseases such as cancer. Why should we stop with cancer? We should think about infectious diseases, such as acquired immune deficiency syndrome and hepatitis. All we need to do is kill the cells that are infected by the virus. What about cardiovascular diseases, such as atherosclerosis? At Baylor College of Medicine we have a program project supported by the National Heart, Lung, and Blood Institute on gene therapy for cardiovascular disease. In this particular project, we deliver the low-density lipoprotein receptor gene to laboratory rabbits. We have observed a dramatic reduction of serum cholesterol in these animals.

In the future the ultimate challenge will be for us to treat neurological diseases. It would be tremendously exciting to figure out what kind of therapeutic genes we could use to combat Alzheimer disease. Demonstrations of the feasibility of gene therapy will be ample in the 1990s. We envision that gene therapy will have a major effect on medicine and health early in the twenty-first century.

Acknowledgment

Mark Kay conducted the hemophilia gene-therapy experiments, which were supported by NIDDK grant 44080. Gretchen Darlington, Milton Finegold, and Mary Brandt are participating faculty; the collaborators at the University of North Carolina, Chapel Hill, are Dr. Kenneth Brinkhous and his colleagues. I would like to acknowledge my collaborators on the brain tumor project, particularly Dr. David Shine and Dr. Robert Grossman in the Department of Neurosurgery and Dr. Clay Goodman in the Pathology Department at Baylor. Dr. Shu-Hsia Chen, a postdoctoral fellow, performed the study in my lab.


Tissue Engineering: The Union of
Biology and Engineering

ROBERT M. NEREM

Tissue engineering is an emerging part of biotechnology (Langer and Vacanti, 1993; Nerem and Sambanis, 1995). Biotechnology was defined by the Office of Technology Assessment a decade ago as techniques that use living organisms to make or modify products, improve plants or animals, and develop microorganisms for specific purposes. This definition emphasizes making or modifying a product by using living organisms and related types of activities.

The biotechnology industry, for the most part, might be called the genetic engineering industry, an industry based on recombinant DNA technology. Certainly this is an industry that makes products by using living organisms, usually genetically modified living organisms. There is also what I call the tissue-engineering industry; this industry also uses living organisms (i.e., living cells) and thus comes under the heading of biotechnology. This is an area that will become extremely important as we move into the twenty-first century.

A basic goal of tissue engineering is to fabricate living tissue equivalents, that is, biological substitutes to be implanted into the body. Also important is the development of materials that promote the remodeling of tissue or the resurfacing of a nonbiologic material to make it more compatible with the body. A critical issue is the growing of three-dimensional cellular structures. It is quite easy to grow cells in two dimensions in the laboratory but not so easy to grow cells in the three-dimensional structures that are important for any kind of tissue-engineered construct. Tissue engineering is also involved in developing vehicles for introducing genetically manipulated cells into the body (i.e., gene therapy).

Three things have to occur in tissue engineering. First, it is critical to understand structure-function relationships. Second, cellular activity and function can then be controlled not only in individual cells but in collections of cells that have been assembled into tissues or organs. Third, once the ability to control exists, biological substitutes can be developed.

Understanding living organisms at the system level is all important in tissue engineering. Systems include the collection of cells and associated materials that make up the tissue and, ultimately, entire organs.

Applications Of Tissue Engineering

Some applications of tissue engineering are artificial skin, bioartificial organs, blood substitutes, neurological implants, tissue-engineered vascular grafts, and various orthopedic devices (Langer and Vacanti, 1993; Nerem and Sambanis, 1995). An example of the last is tissue-engineered cartilage, which is being developed commercially.

One basic formula for tissue engineering involves taking cultured cells and putting them together with a supporting material. This material could be wholly synthetic or synthetic with an extracellular matrix of the normal adhesive proteins that are involved in the extracellular environment of the cells. When necessary nutrients are included, this becomes a living tissue equivalent. Environmental cues—the signals to which a cell is exposed—also influence living tissue. We now know that these signals are extremely varied and complex. From my perspective in mechanical engineering, I know that the mechanical environment of the cells is as important as the chemical environment, particularly for certain types of tissue and organs. Cells from these tissues and organs are extremely talented; they integrate the totality of their environment, including the mechanical stresses that they sense and that stimulate them.

To build biological substitutes, certain critical questions have to be answered. Should human cells be used? Should cell lines be used? Should cells be genetically modified? Are the cells going to express proteins at the level required for a particular tissue or organ? How are the cells going to respond to physiological stimuli? Will function be maintained long term? Will the materials used be biocompatible? Will the materials be stable long term?

Artificial Skin Products

The Food and Drug Administration (FDA) has not yet approved a tissue-engineered product, but several products, particularly in the area of artificial skin, are in clinical trials and undoubtedly will be approved soon. Skin is a relatively simple tissue, and different approaches have been used to produce it (Bell et al., 1991; Cooper et al., 1991; Green et al., 1979; Yannas, 1992). One involves fibroblasts seeded in a collagen gel. Another uses fibroblasts seeded on a biodegradable material that provides a scaffold for cellular growth. A Boston company does large-scale expansion of epithelial cells, for example, taking a small amount of tissue from someone with extensive burns and expanding that cellular population to the size necessary to cover the burned area. There also are acellular approaches in which materials and an extracellular matrix are used to promote wound healing and cellular growth.

Another example of artificial skin is skin equivalents that are used for toxicity testing. Models of this type are needed to test various products, such as cosmetics, for their toxicity. Having an artificial skin substitute reduces the number of animal experiments required to prove a product before commercialization.

Bioartificial Organs

An important area for tissue engineering is the development of bioartificial organs. These may be of a hybrid nature, involving synthetic materials together with living cells and other natural components.

At the Georgia Institute of Technology (Georgia Tech), one of the tissue-engineering projects, directed by Dr. Thanassis Sambanis, is to develop a bioartificial pancreas as an approach for treating insulin-dependent diabetes. This alternative to insulin injection or other forms of treatment is being pursued in several laboratories and involves implanting living, glucose-responsive, insulin-secreting pancreatic cells to create an artificial pancreas (Colton and Avgoustinatos, 1991; Reach, 1993; Tziampazis and Sambanis, 1994). Immunoprotection or immunoisolation is achieved by encapsulating the cells in a membrane-like biomaterial. The feasibility of using such implants with xenogeneic cells to treat diabetes mellitus has been demonstrated in both small and large animal models (Lanza et al., 1991, 1993). Human trials are now underway (Soon-Shiong et al., 1994).

A key question is cell availability. Isolation of primary beta cells from animal glands is difficult. Primary islets cannot be amplified in culture, but it is possible to engineer glucose-responsive, insulin-secreting cell lines, and this is being done. A line that has been used at Georgia Tech is the βTC3 line, but we plan to use the better lines that are now available (Efrat et al., 1988). Regarding the use of native cells or genetically manipulated cells, we think that engineered cell lines should be used for the bioartificial pancreas.

Figure 7-1 is an illustration of such an encapsulated bioartificial pancreas. This is the system being used at Georgia Tech (Tziampazis and Sambanis, 1995). It has either cells or spheroids of cells inside the membrane, which has a molecular-weight cutoff that provides immunoisolation. The membrane permits insulin to be secreted and the cells inside to sense the glucose environment and thus be responsive to glucose.

FIGURE 7-1 Illustration of immunoisolated insulin-secreting cells or spheroids in alginate/poly-L-lysine/alginate polymer, courtesy of A. Sambanis.

Researchers at Brown University have been using encapsulated cells for cell therapy for nervous system disorders (Bellamkonda and Aebischer, 1994; Tresco et al., 1992). As a technique for physically targeting the delivery of biologically active molecules, this could be considered a form of drug delivery.

Another important application of tissue engineering is the development of a bioartificial liver. I have been a scientific advisor to a group in Minneapolis that is working on this. It is a joint project between the University of Minnesota and a company named Regenerex. This project is important because although 2,500 liver transplants are performed in the United States each year, 27,000 patients go without organ replacement and some die while on the waiting list. In the future a goal would be the development of an artificial liver that involves long-term hepatocyte culture (Dunn et al., 1991). However, in the short term the focus of many efforts is the development of a bridge-to-transplant system.

One such system being developed in Minneapolis uses a hollow-fiber bioreactor in which hepatocytes reside (Shatford et al., 1992). The blood is removed from the patient; passed through this system, where it becomes detoxified; and then perfused back into the body. The cells are inoculated with collagen, which then polymerizes and contracts to form a collagen gel within the hollow fibers in which the cells reside. Within the hollow fibers there is a lumen stream that provides nutrients and waste-product removal. The blood passes over the outside of the hollow fibers. Experiments have been done with small animals, but applying this technology to humans involves the issue of scale-up.

Obviously, applying such a system to a human requires a considerable increase in the number of cells used. Experiments now use rat hepatocytes; should a different species be used for humans? Should the flow characteristics of the bioreactor be different for humans? As the system is scaled up, many things change, including the hydrodynamic environment, mixing characteristics, and nutrient delivery. These issues are important in developing such an extracorporeal, bioartificial liver system as a bridge to transplant.

Vascular System

My own interests have focused on the application of engineering to the vascular system. Much of the research in my laboratory has been in the area of heart disease and the influence of blood flow on vascular biology, including the processes that are important in the development of the initial stages of atherosclerosis. Progression of this disease in the coronary blood vessels results in myocardial ischemia, even a heart attack. The surgical treatment involves bypass surgery, in which areas of obstruction are bypassed with a saphenous vein or a mammary artery. Everyone is familiar with this procedure and knows people who have undergone bypass surgery.

We believe that the mechanical environment of a blood vessel is all-important (Nerem, 1993). Basically, a blood vessel is a conduit for flowing blood. The blood has a pressure associated with it that acts as a mechanical force on the vessel wall. As blood flows over the vessel wall, it also imparts what we call a shear stress, which is a frictional force. Just as air moving over an automobile produces a frictional drag, blood flowing through a vessel produces a frictional force on the vessel wall. The approach of our laboratory at Georgia Tech is first to understand vascular biology; second, to control cellular function within the context of a blood vessel; and third, through tissue engineering, to construct a blood vessel substitute. Throughout this process it is important to understand the role of mechanical forces (i.e., the mechanical environment in which the cells will reside).
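The frictional force described here can be put in rough quantitative terms. For steady laminar flow in a straight tube, the wall shear stress follows the Poiseuille relation tau = 4*mu*Q/(pi*R^3). This formula and the numbers below are not from the chapter; they are standard textbook magnitudes used only to show the scale of the stress the endothelium experiences:

```python
import math

# Hedged sketch: wall shear stress for steady laminar (Poiseuille) flow in a
# straight tube. Formula and values are textbook assumptions, not the
# author's data.
def wall_shear_stress(mu, Q, R):
    """Wall shear (Pa) for viscosity mu (Pa*s), flow Q (m^3/s), radius R (m)."""
    return 4.0 * mu * Q / (math.pi * R ** 3)

mu = 3.5e-3   # blood viscosity, Pa*s (assumed)
Q = 1.0e-6    # volumetric flow, m^3/s, about 1 mL/s (assumed)
R = 1.5e-3    # vessel radius, m (assumed)

tau = wall_shear_stress(mu, Q, R)
print(f"wall shear stress: {tau:.2f} Pa ({tau * 10:.0f} dyn/cm^2)")
```

The result, on the order of 1 Pa (roughly 10 dyn/cm²), is small compared with blood pressure acting on the wall, yet it is this frictional component that the experiments below show the cells sensing and responding to.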

Figure 7-2 shows the morphology of endothelial cells both in static culture and after exposure to a laminar flow and the associated shear stress for 24 hours (Levesque et al., 1989). The response to laminar flow is cell elongation and an alignment of the cell's major axis with the direction of flow. This response is active and involves a reorganization of the cell's cytoskeletal structure. An example is F-actin, a major structural component within the cell: microfilaments relocate from the dense peripheral band pattern characteristic of static culture to stress fiber bundles aligned with the direction of flow. This reorganization is reflected in the mechanical properties of these cells. Thus, there is a major difference in morphology and structure associated with the mechanical environment imposed by the laminar flow. Endothelial cells exposed to flow are much more like what we see in vivo in animals and in human beings, except for regions of low flow or stasis.

FIGURE 7-2 Photomicrographs of cultured bovine aortic endothelial cells grown under controlled conditions (A) and under shear stress (85 dynes/cm²) for 24 hours (flow from right to left). Reprinted with permission from the American Society of Mechanical Engineers.

FIGURE 7-3 Flow stimulates platelet-derived growth factor messenger RNA expression in endothelial cells as shown by Northern blot analysis. Reprinted with permission of the American Physiological Society.

Laminar flow also influences function, for example, the secretion of various biologically active molecules. This is illustrated in Figure 7-3, where a Northern blot for a particular molecule (platelet-derived growth factor B chain) is presented (Mitsumata et al., 1993). Messenger RNA levels are shown for a control experiment and a flow experiment at different times. Gene expression under control conditions and flow conditions changes significantly over time. Thus, the influence of mechanical forces extends to the level of gene expression. These mechanical forces are part of the environmental cues that regulate the expression of genes.

Reconstituting Blood Vessels

Unfortunately, some people do not have native vessels available for use in bypass surgery. Either they have had bypass surgery before or their vessels are not suitable. Thus, an important goal is to develop small-diameter vascular grafts for use in the coronary system. Various approaches have been taken, including the use of synthetic materials. In one approach, a synthetic material is seeded with endothelial cells, the cells that line blood vessels and provide the natural interface between flowing blood and the underlying vessel wall (Zilia et al., 1987). This hybrid vessel then becomes the vascular graft. There has been some initial success with this approach.

Others are pioneering what are called acellular approaches involving the implantation of a material that promotes cell ingrowth. Thus, although the endothelial cells have not been seeded on the material, over a relatively short period of time the cells grow in from neighboring regions of the vasculature.

In my laboratory we are very interested in reconstituting a blood vessel in culture (Ziegler and Nerem, 1994). This work builds on the work of Weinberg and Bell (1986). We are taking smooth muscle cells that constitute the vessel wall, endothelial cells that provide the inner lining, and appropriate extracellular matrix components and putting them together in the form of a blood vessel.

As a first step we are developing a model for our vascular biology experiments that we believe will be a much better simulation of the in vivo environment for cell culture studies (Ziegler et al., 1995). This model involves plating endothelial cells on top of a collagen gel in which smooth muscle cells reside (Figure 7-4). We have just started experiments to look at the effects of flow on such a co-culture system. As we move toward tissue-engineered biological substitutes, it will be important to simulate the real physiologic environment as closely as possible in the cell culture laboratory. We believe that adding flow to the static culture models of vascular cells that have been used in the past is a step in the right direction. However, the endothelial cells inside your body do not reside on a plastic surface; they adhere to a basement membrane and have neighboring smooth muscle cells. In this new model we can look at mechanical force effects (flow effects) on endothelial cells along with their normal neighboring partners, the smooth muscle cells. We are in the process of extending this to a tubular configuration, and our long-term goal is to develop a vascular substitute for bypass surgery. Tissue engineering—whether it involves a blood vessel substitute or some other tissue or organ—requires a great deal of understanding of cell biology and molecular biology and an ability to put cells together as a system to develop tissues and organs.

FIGURE 7-4 Illustration of the endothelial cell–smooth muscle cell co-culture model of the blood vessel wall being used in a laminar flow chamber at Georgia Tech.

New Biomaterials

Biomaterials are an important core technology for tissue engineering (Peppas and Langer, 1994). Some people believe that for the past 25 years, biomaterials have been in their age of innocence. Much work has been done with good intentions, but perhaps with a certain amount of naivete, at least in thinking that there is such a thing as a passive material. Now, however, we are in the age of discovery, in which materials are being developed that interact with cells and tissues specifically, where we can even build certain receptor-like sequences into the surface of the biomaterial to facilitate appropriate interaction (Massia and Hubbell, 1991).

Tissue engineering is bringing microfabrication and nanofabrication technology together with molecular and cell biology (in a combination that can be called the nanotechnology of life). Basically, we are taking nanotechnology in the normal engineering sense and bringing it together with "nano" biology. If we can learn how to bridge from microfabricated systems into cell behavior, including subcellular events, it will be possible to develop prostheses that are much more effectively coupled to a person's normal body. At the Georgia Tech Microelectronics Research Center, surfaces can be built where the scale of an electric field or a magnetic field is considerably less than that of a single cell. This provides the opportunity for an interaction between an electronic system and a cellular system at the single-cell level. There also is the possibility of incorporating controlled-release polymers for drug delivery at the nano scale. As useful as such technologies will be in basic research, they have wide-ranging implications in the coupling of systems within the body to man-made devices.

Current State of Tissue Engineering

Most people think of biotechnology only as the genetic engineering (i.e., recombinant DNA) industry. However, a tissue-engineering industry is emerging, an industry that is 10 to 15 years behind the classical recombinant DNA technology industry but is an important area for future development. In the United States each year there are 20,000 transplants but 2,000,000 implants. There are 100,000 people with transplants, but 10,000,000 people with implants. The need for implantable parts and devices is staggering, and it cannot be met through organ and tissue transplantation. This is an enormous opportunity for the tissue-engineering industry. Today it is estimated that the industry consists of approximately 20 companies with about 3,000 employees and $200 million in research and development expenditures. Within a decade the total commercial sector dollar volume could easily increase tenfold or more, to $2 billion to $5 billion.

There are many other examples of the potential of tissue engineering: work is being done with cartilage, with the expansion of bone marrow stem cells, and in the field of neurology. It is an exciting area that in a sense began with the biological revolution: the ability to culture cells outside the body. This revolution led to many advances in cell and molecular biology and is an important foundation for tissue engineering.

How are tissue engineering and genetic engineering related? To start with, cellular implants may involve cells that have been genetically manipulated. Also, tissue-engineered substitutes may be useful as vehicles for introducing genetically modified cells. Because gene therapy can be viewed as the most sophisticated form of tissue engineering, there is an overlap between these two exciting areas. In building our tissue engineering program at the Georgia Institute of Technology in partnership with Emory University School of Medicine, we have made gene therapy a priority.

Another area of importance to tissue engineering is drug delivery or, more precisely, the delivery of biologically active molecules. There are many reasons for targeting drug delivery, and many groups are beginning to look at systems to do this (Langer, 1990). It would be advantageous to deliver drugs specifically to an organ or a tissue. This would give a therapeutic dose at the desired site while keeping the systemic dose in nontarget compartments low. Such targeted delivery will not only reduce the risk of side effects and tolerance effects, but also facilitate combination drug therapies. Some delivery systems may involve the interaction of receptor molecules characteristic of cells and of drugs, or the physical aspects of delivery. Because what matters is delivery of a drug to the right site at the right dose, not drug efficacy in the petri dish, a delivery system should not be an afterthought. Engineering the design of the drug and of its delivery system should be concurrent.

There has been a cultural revolution in this country in terms of the importance of biology; I think we need a further cultural revolution in terms of the role of engineering in biology. Engineers need to be involved not only on the production side, but also very early in the research and development process, as they are in other areas of science and engineering. How a product is manufactured depends on how the product was developed and thus is based on scientific investigation. Engineers need to be involved across the entire spectrum of activity from basic research all the way to the final product. Only then will one have a seamless commercialization process.

I also think there is going to be yet another cultural revolution. As we learn more and more about biology from an engineering perspective, as we learn how biological systems have been engineered for us, the use of engineering to develop products for nonbiological, nonmedical applications will radically change. Thus, as we move into the twenty-first century, not only is there a role for engineering in biology with all the resulting applications, but biology has the potential for enormously altering engineering.

References

Bell, E., M. Rosenberg, P. Kemp, R. Gay, G. D. Green, N. Muthukumaran, and C. Nolte. 1991. Recipes for reconstituting skin. J. Biomech. Eng. 113:113-119.

Bellamkonda, R., and P. Aebischer. 1994. Review: tissue engineering in the nervous system. Biotechnol. Bioeng. 43:543-554.

Colton, C. K., and E. S. Avgoustinatos. 1991. Bioengineering in development of the hybrid artificial pancreas. J. Biomech. Eng. 113:152-170.

Cooper, M. L., J. F. Hansbrough, R. L. Spielvogel, R. Cohen, R. L. Bartel, and G. Naughton. 1991. In vivo optimization of a living dermal substitute employing cultured human fibroblasts on a biodegradable polyglycolic acid or polyglactin mesh. Biomaterials 12:243-249.

Dunn, J. C. Y., R. G. Tompkins, and M. L. Yarmush. 1991. Long-term in vitro function of adult hepatocytes in a collagen sandwich configuration. Biotechnol. Prog. 7:237-245.

Efrat, S., S. Linde, H. Kofod, D. Spector, M. Delannoy, S. Grant, D. Hanahan, and S. Baekkeskov. 1988. Beta-cell lines derived from transgenic mice expressing a hybrid insulin gene-oncogene. Proc. Natl. Acad. Sci. U.S.A. 85:9037-9041.

Green, H., O. Kehinde, and J. Thomas. 1979. Growth of cultured human epidermal cells into multiple epithelia suitable for grafting. Proc. Natl. Acad. Sci. U.S.A. 76:5665-5668.

Langer, R. 1990. New methods of drug delivery. Science 248:1527-1533.

Langer, R., and J. P. Vacanti. 1993. Tissue engineering. Science 260:920-926.

Lanza, R. P., D. H. Butler, K. M. Borland, J. E. Staruk, D. L. Faustman, B. A. Solomon, T. E. Muller, R. G. Rupp, T. Maki, A. P. Monaco, and W. L. Chick. 1991. Xenotransplantation of canine, bovine, and porcine islets in diabetic rats without immunosuppression. Proc. Natl. Acad. Sci. U.S.A. 88:11100-11104.

Lanza, R. P., P. Lodge, K. M. Borland, M. Carretta, S. J. Sullivan, A. M. Beyer, T. E. Muller, B. A. Solomon, T. Maki, A. P. Monaco, and W. L. Chick. 1993. Transplantation of islet allografts using a diffusion-based biohybrid artificial pancreas: long-term studies in diabetic, pancreatectomized dogs. Transplant. Proc. 25:978-980.

Levesque, M. J., E. A. Sprague, C. J. Schwartz, and R. M. Nerem. 1989. The influence of shear stress on cultured vascular endothelial cells: the stress response of an anchorage-dependent mammalian cell. Biotechnol. Prog. 5:1-8.

Massia, S. P., and J. A. Hubbell. 1991. Human endothelial cell interactions with surface-coupled adhesion peptides on a nonadhesive glass substrate and two polymeric biomaterials. J. Biomed. Mater. Res. 25:223-242.

Mitsumata, M., R. S. Fishel, R. M. Nerem, R. W. Alexander, and B. C. Berk. 1993. Fluid shear stress stimulates platelet-derived growth factor expression in endothelial cells. Am. J. Physiol. 265:H3-H8.

Nerem, R. M. 1993. Hemodynamics and the vascular endothelium. J. Biomech. Eng. 115:510-514.

Nerem, R. M., and A. Sambanis. 1995. Tissue engineering: from biology to biological substitutes. Tissue Eng. 1(1):3-13.

Peppas, N. A., and R. Langer. 1994. New challenges in biomaterials. Science 263:1715-1720.

Reach, G. 1993. Bioartificial pancreas. Diabetic Med. 10:105-109.

Shatford, R. A., S. L. Nyberg, S. J. Meier, J. G. White, W. D. Payne, W.-S. Hu, and F. B. Cerra. 1992. Hepatocyte function in a hollow fiber bioreactor: a potential bioartificial liver. J. Surg. Res. 53:549-551.

Soon-Shiong, P., R. E. Heintz, N. Merideth, Q. X. Yao, Z. Yao, T. Zheng, M. Murphy, M. K. Moloney, M. Schmehl, M. Harris, R. Mendez, and P. A. Sandford. 1994. Insulin independence in a type I diabetic patient after encapsulated islet transplantation. Lancet 343:950-951.

Tresco, P. A., S. R. Winn, and P. Aebischer. 1992. Polymer encapsulated neurotransmitter secreting cells. Potential treatment for Parkinson's disease. ASAIO Trans. 38:17-23.

Tziampazis, E., and A. Sambanis. 1995. Tissue engineering of a bioartificial pancreas: modeling the cell environment and device function. Biotechnol. Prog. 11:115-126.

Weinberg, C. B., and E. Bell. 1986. A blood vessel model constructed from collagen and cultured vascular cells. Science 231:397-399.

Yannas, I. V. 1992. Tissue regeneration by use of collagen-glycosaminoglycan copolymers. Clin. Mater. 9:179-187.

Ziegler, T., R. W. Alexander, and R. M. Nerem. 1995. An endothelial cell–smooth muscle cell co-culture model for use in the investigation of flow effects on vascular biology. Ann. Biomed. Eng. 23:216-225.

Ziegler, T., and R. M. Nerem. 1994. Tissue engineering a blood vessel: regulation of vascular biology by mechanical stresses. J. Cell. Biochem. 56:204-209.

Zilia, P. P., R. D. Fasol, and M. Deutsch, eds. 1987. Endothelialization of vascular grafts. Basel: Karger.


Development of Biopharmaceuticals:
An Engineering Perspective

MICHAEL SHULER

Jurassic Park, the book and the movie, exposed millions of people to a vision of life in a biotechnology company. The company portrayed in the film was a commercial failure for many reasons. The brilliant molecular biologist worked with DNA in a rather abstract manner, without knowing, and without wanting to know, much about the biology of the organisms involved. The project failed because the components, rather than the system, were optimized, and because demands for secrecy meant that no timely advice was obtained from uninvolved outside experts.

The question we will explore is: Could some of these factors, in this fiction, affect real-life biopharmaceutical firms? This presentation is from an engineering perspective, a perspective that is often missing in the decision-making process of many biopharmaceutical firms. It is missing in part because engineers and their employers have defined the role of engineering too narrowly. Engineering has been defined in terms of skill rather than in terms of systems, and a systems perspective is the key to making a product efficiently and, even more importantly, to knowing if a product will have value.

An engineer sees a living cell as a complex chemical plant. It has its own regulatory system and has many built-in redundancies for safety. A living cell has a far more sophisticated control system than has ever been put into a chemical plant designed by humans.

I will begin here with a discussion of therapeutic proteins, the products that have made the biopharmaceutical industry successful, and then discuss other products that are emerging, products dependent on and independent of DNA knowledge. Finally, I will consider recent trends in terms of their implications for education.

Therapeutic Proteins

Simple strategies based on simple linear logic were initially conceived for the development of therapeutic proteins: a gene plus an expression system produced a protein, which in turn became a product. The logic was that the product would attack the disease and this would lead to improved health. This strategy, however, can fail. If you want to make protein, and just make protein, you are in the soybean business. If you want to sell a pharmaceutical product, you must worry about protein quality and biological activity. Is the protein correctly folded? Are appropriate sugars added? Are there multiple species present?

When biotechnology first began, many people thought that simple bacteria or yeast could make all proteins. They did not realize that these organisms lack some of the key processing steps needed for the necessary product quality, at least for some of the protein products. These initial processes were often problematic, in many cases because each component was optimized separately without any thought given to the integration of the components.

The situation has changed a lot over the past 15 years. Much has been learned: projects are developed with integrated teams, multiple host systems are explored, and the choice of the host system is closely linked to product quality. The issue of product quality, however, has not been fully resolved. Product quality is a function of culture conditions and, potentially, of the scale-up conditions used to make it.

Post-Translational Processing

We are familiar with the language of DNA in terms of nucleotides: how they are put together, how that information is transcribed into RNA, and how RNA is translated into proteins (Armstrong, 1989). Proteins are described as linear sequences; post-translational events include folding, glycosylation (putting on sugars), disulfide bond formation (which is dependent on redox potential), and phosphorylation, all of which depend on the physiological environment. In addition, proteins are sometimes altered (clipped) during transport or secretion.

Correct post-translational processing is often key to the therapeutic value of the product (Bialy, 1987; Liu, 1992). Will it have the right immunogenic characteristics? Will it be targeted to the right tissue? Will the body clear it before it has a chance to have a therapeutic effect? Correct post-translational processing depends on the intrinsic capabilities of the cell, which are determined by genetics and culture conditions (Monica et al., 1993).

The physiologic utility of antibodies can depend on how they were produced (Van Brunt, 1990). Immunoglobulin M (IgM) produced in mice in ascites fluid has a half-life of 780 minutes. IgM can also be produced in cell culture. Mouse cells can be used to make hybridomas, which are hybrids of slow-growing, antibody-producing lymphocytes and fast-growing myeloma (cancer) cells. Hybridomas in serum-free medium in an air-lift reactor (a reactor in which air circulates and oxygenates the culture) can produce IgM with a half-life in the body of only 3 minutes, which is not long enough for therapeutic use. The same hybridoma cells in medium containing a small amount of serum in a hollow-fiber reactor, which may protect the cells from shear, produce IgM with a half-life of 18 minutes. This range of half-lives, from 3 to 780 minutes, demonstrates how sensitive the usefulness of the product is to variations in production.
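The practical consequence of these half-lives can be made concrete with a simple clearance calculation. Assuming single-exponential clearance (a simplification; the chapter does not specify the kinetics), the fraction of antibody still circulating after an hour differs enormously across the three cases quoted:

```python
# First-order clearance sketch for the half-lives quoted in the text
# (3, 18, and 780 minutes). Single-exponential clearance is an assumed
# simplification, used only to show how strongly half-life determines
# the fraction of antibody still circulating after a fixed time.
def fraction_remaining(t_min, half_life_min):
    return 0.5 ** (t_min / half_life_min)

for t_half in (3, 18, 780):
    left = fraction_remaining(60, t_half)
    print(f"half-life {t_half:>3} min: {left:.1%} remains after 60 min")
```

Under this assumption the ascites-produced IgM is still almost entirely present after an hour, while the air-lift product is essentially gone, which is why the 3-minute material is unusable therapeutically.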

Use of Mammalian Cells

Bacteria were found to be incapable of performing glycosylation. Yeast can do some simple glycosylation, but not the complex glycosylation that occurs in humans, and yeast have a bad habit of going wild and continuing to add sugars (Hodgson, 1993). The insect cell baculovirus system (Luckow, 1991) is of interest, but it is unclear if that system can produce the complex glycosylation reactions found in mammalian systems. Consequently, many products are now made by mammalian cell cultures.

Chinese hamster ovary cells are used in many cell cultures. These cell cultures are very expensive and difficult to use but have been brought into full commercial production. Their use was approved by the Food and Drug Administration (FDA) despite initial reservations about safety (Ramabhadvan, 1987). These cells are transformed cells; they are immortal and will continue to divide forever. Because all cancer cells are transformed cells, there was some concern about the possibility that a product produced by Chinese hamster cells would induce cancer in humans. Consequently, there are very severe regulatory constraints on the mononucleic acids, for example, that can be present in one of these products.

There also was concern about using mammalian cells because of the presence of endogenous viruses and about using viral vectors that were disabled viruses of primates because of the possibility of their reverting to virulence. Such problems did not occur, but for a long time there was much resistance to using these cells.

Use of the Chinese hamster cells results in a good yield of product, but it takes a long time to achieve that yield. This system is much slower than many other systems, and because the yield is often compromised, this is a very expensive way to produce proteins (Hodgson, 1993).

Problems with Scale-Up

It is impossible to duplicate the conditions in a small reactor when scaling up to a large reactor. Shear and mixing time will be different and the cells will be in a somewhat different environment in the large-scale reactor (Shuler and Kargi, 1992). Potentially a biopharmaceutical firm might conduct Phase I clinical trials with a product that was developed on a modest scale and end up unintentionally conducting larger-scale trials with a different product. The difference would result from the subtle changes in post-translational processing due to heterogeneities in the large-scale system that alter cell physiology. We do not know how to predict when this is going to occur, and a great deal of fundamental work is needed in this area. Often we do not have full control of the quality of the product we are making when we go to a larger scale.
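Why shear and mixing cannot both be preserved can be shown with a standard stirred-tank scaling argument. The correlations and numbers below are textbook assumptions, not from this chapter: under geometric similarity with power per unit volume held constant, the impeller tip speed (a proxy for shear) rises while the stirring rate (and hence mixing) falls.

```python
import math

# Illustrative scale-up sketch (assumed textbook correlations and numbers,
# not data from the chapter): scale a stirred tank 125-fold in volume at
# constant power per unit volume (P/V).
def scale_constant_pv(N1, D1, volume_scale):
    """Return (N2, D2) for geometric scale-up at constant P/V.

    With geometric similarity D ~ V^(1/3) and turbulent power draw
    P ~ N^3 D^5, constant P/V implies N2 = N1 * (D1/D2)**(2/3).
    """
    D2 = D1 * volume_scale ** (1.0 / 3.0)
    N2 = N1 * (D1 / D2) ** (2.0 / 3.0)
    return N2, D2

N1, D1 = 5.0, 0.1                        # bench scale: 5 rev/s, 0.1 m impeller
N2, D2 = scale_constant_pv(N1, D1, 125)  # 125x the volume

tip1, tip2 = math.pi * N1 * D1, math.pi * N2 * D2
print(f"impeller tip speed: {tip1:.2f} -> {tip2:.2f} m/s (shear proxy rises)")
print(f"stirring rate: {N1:.2f} -> {N2:.2f} rev/s (mixing slows)")
```

Holding shear constant instead would sacrifice mixing even further; no single operating point reproduces the small-scale environment, which is the source of the product-quality uncertainty described above.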

Even if we can make the identical product at the larger scale, we still sometimes face problems in recovery and purification. Many of the steps in recovery and purification can alter side groups on the proteins, which increases the number of different forms of the protein that are present in the product. The extent to which post-translational processing affects the therapeutic value of a protein is unclear, but in many cases we know it is important. This is an issue that the FDA has not fully resolved. The primary hurdle remaining in the production of therapeutic proteins is the issue of product quality control.

Small Molecules

The era in which therapeutic proteins and protein vaccines dominate is drawing to a close. Therapeutic proteins will still be important products for many years and new products will be developed, but proteins are very awkward to use as therapeutics. As we become more sophisticated in our knowledge, there will be more emphasis on the use of small molecules.

Our knowledge of DNA can be used as the basis for making small molecules via metabolic, or pathway, engineering (Bailey, 1991). We can insert into a cell a gene that encodes an enzyme. That enzyme can augment the flux of material down a particular metabolic pathway, or it can be used to generate totally new pathways. It is clear that doing this poses a problem of optimization. If I send 100 percent of the material down one particular pathway, the rest of the cell, the rest of the infrastructure, would collapse. It is also clear that sending zero down the pathway does not make any product. So there is an optimum between those two extremes that has to be designed.

People have tried to apply metabolic control theory to get a better sense of the optimal conditions. However, in many such applications, people have not adequately taken into account that the product of a particular pathway may affect, directly or indirectly, the material that feeds into the pathway. These applications have only looked at part of the cell and not the whole cell.

There are some good examples where metabolic engineering may prove fruitful. Polyketide antibiotics and anticancer agents have been produced from Streptomyces (McDaniel et al., 1993). Through this route, we can generate many novel structures that may lead to compounds of significant medicinal use.

Biotechnology has led to ways of designing compounds via chemical synthesis of, for example, oligonucleotides and nucleotide analogs. Antisense technology is based on the production of oligonucleotides. These will be areas in which biopharmaceutical firms will be involved.

Carbohydrate drugs seem poised to enter the market. These tend to mimic some of the sugars that are added to proteins. In many cases, sugars are what is recognized by a receptor protein in the signaling process, and a carbohydrate component could be used to mediate physiological functions. Cell adhesion molecules are being investigated because they have been involved in cancer metastasis, arthritis, and other medically important conditions.

Epigenesis

The products discussed so far are based on DNA knowledge. This knowledge presumes a model where unique genes have unique effects on physiology. That is a paradigm that has worked well in diseases such as hemophilia and has served the biopharmaceutical industry well. However, this paradigm can fail and the concept of genetic determinism can fail.

Ninety-eight percent of diseases have a strong epigenetic factor. The rules governing cellular and physiological regulation are located not in the chromosome, but in the complex interactive nonlinear metabolic networks resulting from multiple gene expression. These epigenetic networks organize cellular and physiological responses to environmental signals (Strohman, 1994). Biopharmaceutical firms must understand the limitations that epigenetic networks can place on the success of single gene products and on gene therapy. For many polygenic complex disorders, such as cancer, the epigenetic network will ultimately control an individual's fate; a single gene will not always control response.

Suggested Citation: "BIOTECHNOLOGY APPLICATIONS TODAY AND TOMORROW." Frederick B. Rudolph, et al. 1996. Biotechnology: Science, Engineering, and Ethical Challenges for the Twenty-First Century. Washington, DC: Joseph Henry Press. doi: 10.17226/4974.

The p53 Gene

We know about the p53 gene and how its mutation is linked to cancer. An analogy can be made with the two-brake car used for driver education. The p53 gene product is one of the brakes that can be put on cell proliferation. A mouse has been constructed in which both p53 alleles are absent, which means that both brakes are missing. You might expect in such a circumstance that the driver education car (the mouse) would crash. Yet, in these experiments the mice were all normal at birth, and early developmental growth was normal. In adulthood many of the mice developed cancer at rates far in excess of those in the control population, but some mice never developed cancer. Why? Because there was an emergency brake: other gene products, and in some sense the whole epigenetic network, were able to functionally replace the p53 gene product. There is a great deal of redundancy built into biological systems. Thus, a genetic analysis showing a mutation in one p53 allele in a single cell does not mean that cancer will develop. So there is a distinction between being able to determine predisposition and being able to predict what will happen. The situation is like that of a car with only an emergency brake: the probability of a crash is higher than when both brakes are working, but a crash is not certain.

The epigenetic system, however, is often neglected in discussions about the results of the Human Genome Project. When you work with bioprocesses, you have to deal with epigenetic responses. When a normal p53 gene is introduced into a p53-defective cell, it may or may not restore normal regulation. It depends on the type of mutation that has occurred in the p53 gene. So a biopharmaceutical firm that embarks blindly on making a p53 product can certainly not be guaranteed success.

Effect of Environment

Another way of looking at this is that many genotypes can give rise to a single phenotypic functional form. The same genotype can give rise to different phenotypes in different environments. Thus, genes are necessary but not necessarily sufficient to determine outcomes. Epigenetic behavior means that the history of the organism becomes important: previous as well as current environments are important.

When you build a bioprocess, you need to be careful how you prepare your inoculum. It has to be done in precisely the same manner every time if you want the same results. The same is true when you treat disease. You have to realize that the patients, even with the same genes, may not be the same if they have been exposed to different environments.

When we encounter very complex interactive systems, it is often important to have mathematical models to help us consider those systems. Such models can help us ask ''what if" questions, identify knowledge gaps, and test alternative mechanisms. They probably can help us to reasonably predict what is going to happen with a large population, although in terms of any individual the predictions are probably not valid.

It is surprising and somewhat disconcerting that the biological community as a whole barely tolerates mathematical models. About 12 years ago, I had one paper rejected by a microbiology journal—a paper in which I used a mathematical model—based on a reviewer's comment that the laws of thermodynamics did not apply to living systems.

There have been failures of biopharmaceutical products caused by an inability to predict how the products would interact with complex systems. Three companies met their Waterloo over sepsis products that had to work in a very complex ecology, involving not only host cells but also the microorganisms that cause sepsis. The inability to predict interactions in complex systems was the principal reason clinical trials did not support the development of those products.

A nucleotide analog (fialuridine) that was given in a recent trial for hepatitis B at the National Institutes of Health (NIH) resulted in the deaths of 5 of the 15 patients (Cimons, 1993). There was unexpected inhibition of replication of liver mitochondria. Because it takes a while for mitochondria to be replaced, taking patients off the drug did not immediately alleviate symptoms.

Fialuridine had previously been given to patients with acquired immune deficiency syndrome (AIDS) but was withdrawn because it did not appear to have much therapeutic effect. At least three other nucleotide analogs are either being used in AIDS patients or are in clinical trials. Here again, the difficulty will be our inability to appreciate the interactions throughout the whole system.

Biological Synthesis

The pharmaceutical industry is interested in switching from chemical to biological synthesis. When you make products via chemical synthesis, you often end up with both right-handed and left-handed molecules. When you use such drugs, you hope that one enantiomer gives the therapeutic effect and the other does nothing; sometimes, however, the other enantiomer causes adverse effects in patients. Because biological synthesis yields products with just one orientation (Stinson, 1993), we are going to see increasing emphasis in pharmaceutical firms on biotransformations using modulated cells or enzymatic synthesis.

Drugs From Plant Products

We are also going to see many new drugs from natural products. These products are essentially empirical: we can get clues about them from their use by traditional societies, which have accumulated a wealth of knowledge about which plants to use and when to harvest them. For example, if warm milk is used to extract the product, that tells you something about the chemical nature of the product. There is tremendous untapped genetic diversity in plants, and perhaps even more in marine systems, particularly marine algae. I have seen estimates of 250,000 to 500,000 for the number of plant species. Many, of course, are disappearing very quickly, before they can be cataloged. Between 5,000 and 40,000 have been tested for pharmaceutical activity (Abelson, 1990; Svoboda, 1989). We have 120 prescription drugs worldwide that are based on extracted plant products. The question is, to what extent are we missing other opportunities?
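The size of the screening gap implied by those estimates can be made concrete with a back-of-envelope calculation (the inputs are the rough estimates cited above, not precise counts):

```python
# Fraction of known plant species that have been screened for
# pharmaceutical activity, using the chapter's own rough estimates.
species_low, species_high = 250_000, 500_000  # estimated number of plant species
tested_low, tested_high = 5_000, 40_000       # species tested for activity

frac_min = tested_low / species_high  # most pessimistic case
frac_max = tested_high / species_low  # most optimistic case
print(f"fraction screened: {frac_min:.0%} to {frac_max:.0%}")  # 1% to 16%
```

Even in the most optimistic case, more than four-fifths of known plant species remain untested.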

Right now at least nine compounds from plants or marine algae are in preclinical trials for both AIDS and cancer (Anonymous, 1992). For pharmaceutical firms to pursue these kinds of products, there needs to be an enabling technology that ensures that this product can be made in quantities useful for testing and ultimately for production. So the supply issue becomes important.

Taxol: A Case Study

Much of the interest in natural products was renewed because of taxol, which comes from the bark of the Pacific yew tree. There was a good deal of publicity about it a few years ago, partly because it appeared to be a very exciting drug, the best anticancer drug that had come along in about 15 years. It has been approved for use against ovarian cancer and is in advanced clinical trials for breast and lung cancers. It is also in clinical trials against several other cancers. The ultimate therapy will probably be a combination therapy with cisplatin or some other chemotherapeutic drug. These clinical trials were greatly impeded by the problem of taxol supply. Until that point, the National Cancer Institute (NCI) had always presumed that if it needed a compound, the compound could be made in quantities sufficient to test it. This was not the case with taxol (Cragg et al., 1993).

Sam Broder, director of NCI, said that taxol was the first but would not be the last product for which supply would be the critical issue. Why was supply a critical issue with taxol? Taxol comes from the bark of the Pacific yew tree. It takes about 18 months to dry the bark before it can be extracted to produce the product, and it takes three 100-year-old trees to generate enough bark to treat one patient. (There is nothing magic about a 100-year-old tree; it just takes many more 2-year-old trees to supply enough bark.) Pacific yew trees are among the slowest-growing trees in North America and are fairly uncommon. Pacific yew forests are also habitat for the spotted owl, which made harvesting the trees controversial among environmentalists. In a sense, environmentalists were pitted against cancer patients.
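To get a feel for the scale of the supply problem, the bark figures above can be extended to a treatment cohort (the patient count here is a hypothetical round number for illustration, not a figure from the text):

```python
# Illustrative scale-up of the taxol bark supply figures.
trees_per_patient = 3   # 100-year-old Pacific yews needed per treated patient
patients = 10_000       # hypothetical annual cohort, for illustration only

trees_needed = trees_per_patient * patients
print(f"{patients:,} patients would consume {trees_needed:,} old-growth trees")
# prints: 10,000 patients would consume 30,000 old-growth trees
```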

So there was a need to look for alternative supplies. The taxol molecule is very complex. Just recently it was announced that total synthesis had been achieved (Nicolaou et al., 1994; Holton et al., 1994a, b). However, the synthesis was achieved by a route that will never be commercially useful. The chemistry is still important; it gives us clues about taxol analogs that may be more easily administered or more effective than taxol.

Producing taxol by genetic engineering is very unlikely. Part of the problem is that NIH has funded a great deal of work on bacteria and on mammalian cells but very little fundamental work on plant cells. We know far less about the biochemistry, genetics, and physiology of plants than we do about E. coli or about mammalian cells. No one knows the pathway for taxol production or what enzymes are involved. We do not know where the genes are located or how they are regulated. So in the short term, genetic engineering is not a likely route for making a product such as taxol or other products from algae or plants.

Another route, however, is being commercially developed now. You can extract a precursor of taxol from common yews and convert that precursor into taxol by a semisynthetic method. Taxol can also be extracted directly from the needles of the ornamental yews, but this is probably not a promising route because of the complexity of the needle extract (e.g., pigment co-elutes with taxol).

Plant Cell Tissue Culture

The final possibility is plant cell tissue culture (Payne et al., 1992; Shuler, 1993). One of the advantages of plant cell tissue culture is that you can be assured of an expandable supply and the quality will be the same in June as it is in January, which is not true with natural plant material. With plant cell tissue culture it is possible to select high-yielding variants much more easily. The product can be much cleaner (fewer contaminants present) than a product from the natural extraction, either from needles or from bark. Finally, novel products can be generated in such systems, such as novel taxanes. The question will be whether such novel taxanes have any useful biological activity.

In my laboratory we have worked on plant cell tissue culture for about 20 years and have been fortunate to be supported by the National Science Foundation (NSF). Our work has been directed toward enabling technology, and we have developed several strategies for producing secondary metabolites from plant cells (Payne et al., 1992). I believe that one important role for government funding is support of enabling technologies. The presence of such technologies makes possible the development of products that might otherwise be abandoned.

Two of my students formed a company that is taking the strategies we developed and trying to apply them to taxol. I am on the board of directors of this company but have no financial interest in it. I am also principal investigator of a consortium grant from NCI to Cornell University that involves another academic institution, two commercial companies (one of which is the one started by my students), and a government laboratory. We are trying to better understand the scientific basis of the production of taxol in tissue culture, but the mixture of private, academic, and government laboratories generates many possibilities for conflict of interest.

With a cell line developed by Ray Ketchum at the U.S. Department of Agriculture, we have been able to achieve at least 20 milligrams of taxol per liter. An article in Science reported a fungus that made taxol (Stierle et al., 1993). The level that we have achieved is nearly one million times higher than the level that was achieved in the fungus, which means that a one-liter reactor could make as much taxol as could a million-liter reactor with the fungus, assuming there were no other improvements in production.
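The magnitude of that comparison can be checked with simple arithmetic; the fungal yield used here is an assumed value of 20 nanograms per liter, chosen only to represent the order of magnitude, since the text gives the comparison as a ratio rather than as an absolute fungal yield:

```python
# Comparing taxol yields: plant cell culture vs. the fungus.
cell_culture_mg_per_L = 20.0  # yield achieved with the plant cell line
fungus_mg_per_L = 20e-6       # assumed ~20 ng/L, order-of-magnitude only

ratio = cell_culture_mg_per_L / fungus_mg_per_L
print(f"plant cell culture yield is ~{ratio:,.0f} times the fungal yield")
```

At that ratio, a 1-liter plant cell reactor would indeed match a roughly 1,000,000-liter fungal reactor, as the text states.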

We have also seen novel taxanes with some cell lines under some culture conditions. We have not yet had a chance to explore their bioactivity. These observations argue that plant cell culture of taxol is a plausible possibility. Commercial development of plant cell tissue culture is on track in meeting all of the milestones that have been set.

It has been very satisfying to see ideas that we worked on for 20 years being put into practice. My colleagues had told me that it was not worth working on plant cell tissue culture because there was no commercial process and probably never would be; although plant cell tissue culture had been commercialized in Japan, it had not been in the United States. The support of NSF for the development of enabling technology was very important.

I find that even though I know the principals in one of the companies very well, there are sometimes awkward situations. I have to remember, for example, whether I heard something while I was a member of the company's board of directors or while I was the director of the R01 NIH grant supporting educational research in the same general area. One is confidential; the other can be discussed freely.

Epigenetic Behavior of Plant Cell Culture

If you do not believe in epigenetic behavior, you should work with plant cell culture. We have one cell line that produces about 10 milligrams of taxol per liter. We have had it go through cycles in which it produced, then did not produce, then produced, and then did not produce again. The genetics had not changed, but cells are sensitive to their history. If you have a certain procedure for subculturing that cell line, you had better be very religious about following it. You had better use the same size flask, the same type of cap, and the same level of inoculum: everything had better be the same. If you are going to work with these cell lines, you have to be very careful and very sensitive to the history of the culture as well as to the environmental cues that you are giving the culture.

Implications For Education

There are other upcoming products in biotechnology. We will be hearing a lot more about tissue engineering and gene therapy. There will be a market for biopharmaceutical firms to produce viral vectors to enhance gene therapy. There may be a market for customized products for treating cancer with a patient's own specific immunological markers. The biopharmaceutical industry will continue to change and its needs will affect our graduates.

Several trends are clear. One trend affecting every major industrial sector is that research and development growth is strongest in small companies. Small companies emphasize flexibility and are able to rapidly move to market. New health care economics will place greater emphasis on efficient production systems and on products with clear therapeutic advantages. Companies that can predict the product winners will succeed; companies unable to anticipate their product's effectiveness in complex systems may fail. Companies with inefficient processes will ultimately pay a stiff penalty.

In such an environment, graduates will be required to understand systems, either to predict effectiveness or to produce products efficiently. Graduates who are expert in a small subsystem but cannot realistically relate that subsystem to the whole process will become as extinct, perhaps, as Jurassic Park dinosaurs.

Finally, breadth will become increasingly important. A perspective from discovery to process to market and an understanding about product usefulness will be needed. However, students still must have depth in at least one subarea of biotechnology; depth cannot be sacrificed to breadth.

In Jurassic Park, John Hammond had a great vision and drove the project, but his vision blinded him to many of the dangers of the project. Henry Wu, who focused totally on the technical molecular level, simply had no vision at all. The advisory board was present, but they knew too little too late, and that may be a recipe for failure for any biopharmaceutical firm.

References

Abelson, P. H. 1990. Medicine from plants. Science 247:513.

Anonymous. 1992. Natural product agents in development by the United States National Cancer Institute. J. Natural Prod. 55:1018-1019.

Armstrong, F. B. 1989. Biochemistry. 3rd ed. New York: Oxford University Press.

Bailey, J. E. 1991. Toward a science of metabolic engineering. Science 252:1668-1675.

Bialy, H. 1987. Recombinant proteins: virtual authenticity. Bio/Technology 5:883-889.

Cimons, M. 1993. Experimental hepatitis drug proves deadly in clinical trial. Am. Soc. Microbiol. News 59:596-597.

Cragg, G. M., S. A. Schepartz, M. Suffness, and M. R. Grever. 1993. The taxol supply crisis. New NCI policies for handling the large-scale production of novel natural product anticancer and anti-HIV agents. J. Nat. Prod. 56:1657-1668.

Holton, R. A., H.-B. Kim, C. Somoza, F. Liang, R. J. Biediger, P. D. Boatman, M. Shindo, C. C. Smith, S. Kim, H. Nadizadeh, Y. Suzuki, C. Tao, P. Vu, S. Tang, P. Zhang, K. K. Murthi, L. N. Gentile, and J. H. Liu. 1994a. First total synthesis of taxol. 2. Completion of the C and D rings. J. Am. Chem. Soc. 116:1599-1600.

Holton, R. A., C. Somoza, H.-B. Kim, F. Liang, R. J. Biediger, P. D. Boatman, M. Shindo, C. C. Smith, S. Kim, H. Nadizadeh, Y. Suzuki, C. Tao, P. Vu, S. Tang, P. Zhang, K. K. Murthi, L. N. Gentile, and J. H. Liu. 1994b. First total synthesis of taxol. 1. Functionalization of the B ring and first total synthesis of taxol. J. Am. Chem. Soc. 116:1597-1598.

Hodgson, J. 1993. Expression systems: a user's guide. Bio/Technology 11:887-893.

Liu, D. T-Y. 1992. Glycoprotein pharmaceuticals: scientific and regulatory considerations, and the U.S. Orphan Drug Act. Trends Biotechnol. 10:114-120.

Luckow, V. A. 1991. Cloning and expression of heterologous genes in insect cells with baculovirus vectors. Pp. 97-152 in Recombinant DNA Technology and Applications. A. Prokop, R. K. Bajpai, and C. S. Ho, eds. New York: McGraw-Hill.

McDaniel, R., S. Ebert-Khosla, D. A. Hopwood, and C. Khosla. 1993. Engineered biosynthesis of novel polyketides. Science 262:1546-1551.

Monica, T. J., C. F. Goochee, and B. L. Maiorella. 1993. Comparative biochemical characterization of a human IgM produced in ascites and in vitro cell culture. Bio/Technology 11:512-515.

Nicolaou, K. C., Z. Yang, J. J. Liu, H. Ueno, P. G. Nantermet, R. K. Guy, C. F. Claiborne, J. Renaud, E. A. Couladouros, K. Paulvannan, and E. J. Sorensen. 1994. Total synthesis of taxol. Nature 367:630-634.

Payne, G. F., V. Bringi, C. L. Prince, and M. L. Shuler. 1992. Plant Cell and Tissue Culture in Liquid Systems. New York: Hanser Publishers.

Ramabhadran, T. V. 1987. Products from genetically engineered mammalian cells: benefits and risks. Trends Biotechnol. 5:175-179.

Shuler, M. L., and F. Kargi. 1992. Bioprocess Engineering: Basic Concepts. Englewood Cliffs, N.J.: Prentice-Hall.

Shuler, M. L. 1993. Plant cell culture: an approach to exploiting the genetic and chemical diversity of higher plants. Pp. 58-67 in Research Opportunities in Biomolecular Engineering: The Interface Between Chemical Engineering and Biology. Washington, D.C.: U.S. Department of Health and Human Services.

Stierle, A., G. Strobel, and D. Stierle. 1993. Taxol and taxane production by Taxomyces andreanae, an endophytic fungus of Pacific yew. Science 260:214-216.

Strohman, R. 1994. Epigenesis: the missing beat in biotechnology? Bio/Technology 12:156-164.

Stinson, S. C. 1993. Chiral drugs. Chem. Eng. News Sept. 27:38-65.

Svoboda, G. H. 1989. Society news. Am. Soc. Pharmacol. Newslett. 25(2):6.

VanBrunt, J. 1990. The importance of glycoform heterogeneity. Bio/Technology 8:995.

Bioremediation:
A Promising Technology

GENE F. PARKIN

In this country over the past 20 to 30 years, we have contaminated our environment with a wide variety of organic and inorganic chemicals. Environmental professionals have been given the charge of cleaning up this environmental contamination, with the general goal of protecting the public health. Bioremediation is one of the emerging technologies that environmental professionals use to attempt to remedy these contamination problems. The objective of this chapter is to discuss bioremediation: to describe how it works, its advantages and limitations, and what we need to know to better apply the technology in the future.

Bioremediation is generally considered to be an emerging technology. Most environmental professionals feel that bioremediation is an attractive alternative, because it offers significant potential for cost-effective, environmentally acceptable treatment of contaminated waters and soils. However, no treatment technology is a panacea, applicable to all situations. This is reasonably well documented in a 1993 report from the National Research Council's Committee on In Situ Bioremediation, which states:

Bioremediation offers significant potential for cost-effective, environmentally acceptable treatment of contaminated waters and soils. However, bioremediation is clouded by controversy over what it does and how well it works, partly because it relies on microorganisms, which cannot be seen, and partly because it has become attractive for "snake oil salesmen" who claim to be able to solve all types of contamination problems. As long as the controversy remains, the full potential of this technology cannot be realized.

The report describes the technology as having some controversy and confusion associated with it. One of the major objectives here is to attempt to remove some of this controversy, to give an idea of when we might apply bioremediation and what we need to know to successfully apply it in the future.

We can start with a general definition of bioremediation as the use of living organisms, or the catalysts that they produce, to bring about the removal or destruction of pollutants that contaminate a wide variety of substances, such as water, wastewater, and soils. We have even applied biological processes to contaminated gas streams. The major focus here will be the use of bacteria in bringing about bioremediation, but other organisms, such as fungi, plants, protozoa, and algae, are also used to facilitate bioremediation.

An Emerging But Not A New Technology

Bioremediation is an emerging technology, but we need to recognize that it is not really a new technology. Environmental engineers (in the past we were called sanitary engineers) have been using biological processes to solve contamination problems for at least a century: anaerobic digestion to treat sludges from wastewater treatment plants in the 1880s, trickling-filter treatment of domestic wastewater in 1894, activated-sludge treatment of domestic wastewater in 1914, and biological treatment of industrial wastewaters in the 1930s. Over 20 years ago (in 1972), the first documented success of in situ bioremediation was reported by Richard Raymond, who is thought by some to be the father of bioremediation. Thus, the use of biological processes to remove or treat pollution is not new, except perhaps for the name bioremediation.

The question then is, what is new? In my opinion, two things are new. First is the chemicals that we are trying to biodegrade: anthropogenic and so-called xenobiotic chemicals. Anthropogenic chemicals enter the environment primarily as a result of human activity; xenobiotic chemicals are "foreign" to natural biota. One might think of it as the difference between degrading something that is relatively degradable, such as sugar, and degrading a chlorinated solvent, such as trichloroethylene.

The second new aspect of the problem is the complexity of the medium or matrix that has become contaminated with these chemicals. Most of this discussion will focus on subsurface contamination—the contamination of our groundwater environment—because that is the area where bioremediation can dramatically affect our ability to clean up these sites. However, these situations are extremely complex and present major engineering challenges.

The Underground Environment

We are interested in bioremediating three segments of the underground environment that can become contaminated: the near-surface soil (the vadose or unsaturated zone, which includes some moisture and some soil gas), the aquifer (the water), and the solid materials. These segments can become contaminated in numerous ways by numerous chemicals. As shown in Figure 9-1, a chemical in area A might be the spill of a light nonaqueous-phase liquid (LNAPL) such as gasoline, which will spread out over the groundwater table when it migrates down through the vadose zone because it is lighter than water. Then, of course, it will at least partially dissolve in the water and move with the groundwater flow, contaminating the water and the soil or the aquifer materials with which the water comes in contact (area B). Aquifers can also become contaminated with dense nonaqueous-phase liquids (DNAPLs), which have a density greater than water so that they sink to the bottom of the groundwater table (area C), dissolve, diffuse, and contaminate the aquifer (area D).

image

FIGURE 9-1 Treatment domains. Adapted from McCarty (1994).

The complexity of cleaning the underground environment has been described by Perry McCarty (1994) of Stanford University as follows: take your hand and put it in a bucket of motor oil and make a fist, and then pull the fist back out of the motor oil. Now the job is to remove all the oil from your fist. You can repeatedly submerge your fist into buckets of clean water and you will remove some but not all the oil. You can submerge your fist into buckets of warm soapy water and remove more of the oil, but not all of the oil. You can blow hot air all over your hand, and maybe remove some additional oil, but you cannot remove all the oil without opening up the fist. Yet, that is our challenge. Bioremediation offers significant potential for attacking these problems.

Ex Situ And In Situ Bioremediation

There are two general types of bioremediation. Ex situ, or aboveground bioremediation (Figure 9-1), involves the design, construction, and operation of an engineered reactor above the ground. Water is pumped out of the ground or soil is dug up, made into a slurry, and then put into a reactor that is located above the ground. Bioremediation in the ground is referred to as in situ bioremediation.

In situ bioremediation will be the major focus here, because this is where there is significant potential for improvement. In situ bioremediation can proceed in two ways. It can be natural (sometimes termed intrinsic bioremediation), which means no action is taken other than to monitor the site. The idea is that if the rate at which the microorganisms are degrading the pollutant is faster than the rate at which the pollutant is leaving the site, the site will cleanse itself. More often, however, we are interested in engineered in situ bioremediation, in which materials are added to the ground to stimulate the growth of organisms. Nutrients such as nitrogen and phosphorus may be needed, and an electron acceptor such as oxygen might be added. Delivering these materials represents a significant engineering challenge, and the approach may even include introducing microorganisms into the subsurface environment.

The scheme shown in Figure 9-2 demonstrates some of the general ways in which we might do this (MacDonald and Rittmann, 1993); many modifications are possible.

image

FIGURE 9-2 A system for treating regions above and below the water table.
Reprinted with permission from MacDonald and Rittmann (1993).

The dotted line represents the groundwater table, and contamination both above and below the water table is treated. For example, nitrogen and phosphorus might be added by using an injection well or an infiltration gallery, which may simply be a perforated pipe in the ground. Groundwater would then be moved in one direction or another, perhaps by using an extraction well. As water and soil gas are removed via this extraction well, aboveground treatment could also be used. Treated groundwater can be reinjected, using groundwater recirculation, to meet a target cleanup standard. In situ bioremediation involves adding materials to the ground and making sure that the microbes come in contact with all the materials necessary for the organisms to grow. Obviously, if an anaerobic, or reduced, bioremediation scheme is desired, hydrogen peroxide would not be added as an oxygen source.

Why Bioremediation?

A few statistics indicate the potential for bioremediation. It has been estimated that more than 200 million tons of hazardous materials are generated in this country each year (Wentz, 1989), and more than 50,000 sites have been identified that need some type of remediation (Gibson and Saylor, 1992). It also has been estimated that 40 million people live within 4 miles of a Superfund site (Gibson and Saylor, 1992). Of the 2 million underground storage tanks storing gasoline alone, as many as 600,000 are leaking or expected to leak soon (Bakst and Devine, 1993). Because 50 percent of the U.S. population gets its drinking water from groundwater, this presents a potentially significant problem. The most recent estimates indicate that cleanup will cost around $1.7 trillion (Gibson and Saylor, 1992). Some are now estimating that bioremediation might be a $500 million per year industry by the year 2000 (MacDonald and Rittmann, 1993). So, there is tremendous potential for the application of bioremediation.
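As a sanity check on the tank statistics above, the implied leak rate is easy to compute (the inputs are the cited estimates, not measured data):

```python
# Share of underground gasoline storage tanks leaking or expected to leak,
# from the estimates cited in the text.
tanks_total = 2_000_000    # estimated underground gasoline storage tanks
tanks_leaking = 600_000    # leaking or expected to leak soon

share = tanks_leaking / tanks_total
print(f"roughly {share:.0%} of tanks are leaking or soon will be")  # roughly 30%
```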

Why might we want to use bioremediation? What drives environmental professionals to want to use this process? Bioremediation is one of the few processes that can actually destroy pollutants, converting organic materials into carbon dioxide, water, chloride, and other minerals. Biological processes may cost less than physical and chemical processes. In situ bioremediation has additional, potentially important advantages over pump-and-treat technology (pumping the water above ground for treatment). Because the organisms are grown near the pollutant, the process may occur more quickly. This is particularly important for hydrophobic compounds, which tend to "stick" to the soil material. If organisms are growing near where the hydrophobic compounds are adsorbed, the equilibrium


between the liquid and the solid can be shifted, and the chemicals can begin to desorb from the solids and be biodegraded. If these processes occur in the ground, there is no need to dewater (remove water from) the aquifer.

Limitations

No treatment process is a panacea, and it is very important to understand the limitations of a given treatment technology; bioremediation has its share. First, it is not universally applicable. Because we are dealing with potentially toxic compounds, there is a danger that the organisms you are trying to grow will be killed. Toxic intermediate compounds may also be produced by these organisms. The classic example is the anaerobic biotransformation of trichloroethylene into vinyl chloride, a known human carcinogen; most people consider vinyl chloride more of a health problem than the parent compound. Finally, in situ chemical reactions or organism growth may clog wells or aquifers.

Figure 9-3 illustrates the feasibility of in situ bioremediation by plotting

FIGURE 9-3 Feasibility of in situ bioremediation. Adapted from Kavanaugh (1994).


the biodegradation rate constant (how fast bacteria degrade the pollutant) versus the hydraulic conductivity (the rate at which water can move through the aquifer materials). If the rate constant is very high, bioremediation is feasible. However, feasibility is also limited by hydraulic conductivity. Obviously, if the water will not move very fast, then neither the organisms nor the nutrients can move through the aquifer, and effective bioremediation will not result. Hydraulic conductivity may be enhanced with new technologies such as pneumatic fracturing, which attempts to break up aquifer materials and create paths for nutrients, organisms, and water to flow. The values given in Figure 9-3 are relative, and a more extensive database is needed so that we can better understand these limitations. Bioremediation works best in homogeneous aquifers and with compounds that are not overly hydrophobic. It is hoped that continued research will enable us to remove some of these limitations.
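This two-axis screen can be sketched as a simple decision rule. The cutoff values below are hypothetical placeholders chosen only to illustrate the logic, not the values from Kavanaugh (1994):

```python
def feasibility(k_bio, k_hyd):
    """Classify a site for in situ bioremediation (illustrative only).

    k_bio: first-order biodegradation rate constant (1/day)
    k_hyd: hydraulic conductivity of the aquifer (cm/s)
    """
    if k_hyd < 1e-5:
        # Water barely moves, so nutrients and organisms cannot be delivered;
        # infeasible unless conductivity is enhanced (e.g., pneumatic fracturing).
        return "infeasible without conductivity enhancement"
    if k_bio >= 0.1:
        return "feasible"
    if k_bio >= 0.01:
        return "marginal"
    return "infeasible"

print(feasibility(0.5, 1e-3))   # permeable sand with a fast degrader
print(feasibility(0.5, 1e-7))   # tight clay: delivery-limited regardless of biology
```

The point of the sketch is that biology alone does not decide the outcome: a fast degrader in a tight clay is still delivery-limited.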

Process Fundamentals

Microbiologists and others doing research in the biological sciences are well aware of what it takes to care for, feed, and grow an organism, but engineers often need to be reminded. A general checklist of requirements for microbial activity is useful in this regard (Table 9-1). Additional details are given by Flathman et al. (1993), Gibson and Saylor (1992), Madsen (1991), McCarty (1993), National Research Council (1993), Norris et al. (1993), Parkin and Calabria (1986), Thomas and Ward (1989), and Zitomer and Speece (1993). A few items are particularly important when trying to facilitate bioremediation. Consider items 1 and 2: organisms need a carbon source and an energy source to grow. This source may be found in the pollutant

TABLE 9-1 Checklist of Requirements for Microbial Activity

1. Carbon source
2. Energy source (electron donor)
3. Terminal electron acceptor
4. Macronutrients (nitrogen and phosphorus)
5. Micronutrients (e.g., trace metals and sulfur)
6. Proper pH
7. Proper temperature
8. Absence or control of toxic materials
9. Adequate contact (bioavailability)
10. Adequate time
11. Desired microbes or genetic machinery


to be removed. However, in some cases a growth substrate needs to be added, which is an engineering challenge and an engineering cost. The terminal electron acceptor condition, which defines the nature of the environment, is very important. For example, if the growth of aerobic bacteria is desired, a source of oxygen must be provided.

Items 4 to 8 are environmental factors necessary for microbial growth. For example, many groundwaters are deficient in the macronutrients nitrogen and phosphorus. Many bioremediation schemes involve adding at least one of these nutrients. Items 9 and 10 are engineering variables. With in situ bioremediation, adequate contact, sometimes called bioavailability, is extremely important.

Essentially, we are trying to bring together three elements: the pollutants to be removed, the organisms, and the nutrients needed by those organisms. Without all three, bioremediation cannot be accomplished. It is a significant engineering challenge to promote adequate contact. Sufficient time is also required; time is usually not a big problem with in situ bioremediation because the retention time in aquifers is on the order of months or years, which is usually enough to bring about these reactions. Finally, there is a need to have the desired microbe or genetic machinery to bring about the desired biological reaction. There is significant potential for future development here.

Compounds, especially xenobiotic compounds, can be biodegraded in two ways: as a primary substrate (electron donor or electron acceptor), where the organism grows by degrading the organic compound, or by co-metabolism, where the organism does not obtain energy for growth from biotransformation of the organic compound. Co-metabolism is important for many xenobiotic compounds. Here, the organism does not get sufficient energy from the compound to grow, either because the concentration is too low or because the organism cannot use the compound as a growth substrate. The compound may still be removed, but a primary growth substrate must be present. This, too, represents a significant engineering challenge.
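The dependence of co-metabolic removal on a primary growth substrate can be illustrated with a toy first-order model; all concentrations and rate constants below are invented for illustration:

```python
import math

def cometabolic_removal(c0, k, primary_substrate, t):
    """Residual xenobiotic concentration after t days of co-metabolism.

    Transformation proceeds only while a primary growth substrate sustains
    the organisms; with none present, the compound simply persists.
    c0: initial concentration; k: pseudo-first-order rate constant (1/day).
    """
    if primary_substrate <= 0:
        return c0  # no growth substrate -> no co-metabolic transformation
    return c0 * math.exp(-k * t)

print(cometabolic_removal(100.0, 0.05, primary_substrate=10.0, t=30))  # declines
print(cometabolic_removal(100.0, 0.05, primary_substrate=0.0, t=30))   # persists at 100.0
```

Supplying and distributing that primary substrate in the subsurface is precisely the engineering challenge noted above.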

Several types of microorganisms are used in bioremediation. The current applications of bioremediation generally make use of indigenous organisms (organisms already present in the groundwater environment). The growth of these organisms must be stimulated by adding nutrients, electron donors, and/or electron acceptors. Two other types of organisms that may be used are acclimated strains, which have been developed from naturally occurring organisms, and genetically engineered microorganisms, in which desirable traits from several strains of organisms are combined so that the organism becomes much more versatile. These organisms must survive and grow in situ to accomplish bioremediation. Therefore, one of our major tasks is to develop robust strains of organisms


that will survive in the environments into which they are introduced. Use of genetically engineered microorganisms is probably years away, more for political than for technical reasons.

Successful Bioremediation

To understand the current status of bioremediation, it is useful to list compounds that have been successfully bioremediated in laboratory-scale studies, pilot-scale studies, field-scale studies, and full-scale applications (Table 9-2). Most of our successes, both ex situ and in situ, have been with the first three categories of compounds: petroleum hydrocarbons; benzene, toluene, ethylbenzene, and xylene compounds (termed BTEX); and the other petroleum-type chemicals like alcohols. We are accumulating additional information and experience with the other chemicals listed.

Data in Table 9-3 and Figures 9-4 and 9-5 describe the current status of the use of bioremediation in the field. The first four technologies listed in Table 9-3 are considered to be established technologies, and it is no surprise that most remedial actions to date have used these technologies. The other listed technologies are considered to be innovative or emerging technologies. As of 1991, bioremediation was being used at approximately 9 percent of Superfund sites.

The Environmental Protection Agency (EPA) has developed a database called the Bioremediation Field Initiative (U.S. EPA, 1993), which is an attempt to catalog the use of bioremediation at not only Superfund sites, but also at other sites. The data in Figure 9-4 show that most bioremediation schemes are in the design phase.

Although bioremediation is most often applied to petroleum hydrocarbons, chlorinated solvent wastes and pesticides are now also being bioremediated. The "other" category in Figure 9-5 includes compounds such as the nitroaromatics (e.g., trinitrotoluene, or TNT).

TABLE 9-2 Partial List of Compounds That Have Been Bioremediated

Petroleum hydrocarbons (gasoline, diesel, jet fuel, oil)

Benzene, toluene, ethylbenzene, and xylene (BTEX) compounds

Alcohols, ketones, esters

Polynuclear aromatic hydrocarbons (simpler ones such as naphthalene)

Creosote

Chlorinated aliphatic hydrocarbons

CFCs

Chlorinated benzenes

Polychlorinated biphenyls

Phenols and chlorinated phenols

Nitroaromatics

Pesticides (EDB, alachlor, atrazine, dinoseb)


TABLE 9-3 Superfund Remedial Actions (Through Fiscal Year 1991)

Technology                      Number of Sites    Percent of Sites

Solidification/stabilization         128                 26
Off-site incineration                 85                 17
On-site incineration                  65                 13
Other established                     10                  2
Soil vapor extraction                 84                 17
Thermal desorption                    28                  6
Ex situ bioremediation                25                  5
In situ bioremediation                20                  4
In situ flushing                      16                  3
Soil washing                          16                  3
Dechlorination                         8                  2
Solvent extraction                     6                  1
In situ vitrification                  3                 <1
Chemical                               1                 <1
Other innovative                       3                 <1
TOTAL                                498                100

Source: Adapted from Kavanaugh (1994).

To understand and apply these processes more successfully in the future, we must demonstrate conclusively that bioremediation (i.e., biological processes) is responsible for the disappearance of these compounds. The National Research Council's Committee on In Situ Bioremediation (1993) offers three tests. First, there must be documented evidence that the contaminant is disappearing in the field. This task is relatively easy, with currently available samplers and analytical methods. Second, it must be shown that organisms at the site, or organisms to be introduced at the site, have demonstrated potential for contaminant biotransformation. One way to do this would be to take soil cores from the site, bring them to the lab, and in carefully controlled experiments demonstrate that bioremediation can occur. Third, and most difficult, is to provide conclusive evidence that bioremediation has actually occurred in the field. Several methods can be used: For example, organism concentration can be measured. If the organism concentration increases at the same time the contaminant decreases, then it is likely that the organisms are doing the biotransformation. The disappearance of the electron acceptor (e.g., oxygen) could be monitored. Oxygen disappearing when the pollutant disappears is circumstantial evidence that biotransformation is responsible for the removal. This type of information is particularly difficult to get in the field with present technology.
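These lines of evidence amount to checking that several monitoring time series move together. A minimal sketch of that cross-check follows; the field data are invented for illustration, and a real analysis would of course use proper statistics rather than a first-versus-last comparison:

```python
def trend(series):
    """Crude trend indicator: +1 rising, -1 falling, 0 flat (last vs. first)."""
    return (series[-1] > series[0]) - (series[-1] < series[0])

def consistent_with_bioremediation(contaminant, biomass, oxygen):
    """Circumstantial test: the contaminant falls while organisms grow and
    the electron acceptor (here, dissolved oxygen) is consumed."""
    return trend(contaminant) < 0 and trend(biomass) > 0 and trend(oxygen) < 0

contaminant = [32.0, 20.0, 9.0, 1.5]   # ppm over successive sampling rounds
biomass = [1e4, 5e4, 2e5, 8e5]         # organisms per gram of soil
oxygen = [8.0, 5.5, 3.0, 1.2]          # mg/L dissolved oxygen
print(consistent_with_bioremediation(contaminant, biomass, oxygen))  # True
```

Even when all three trends agree, the evidence remains circumstantial, which is why the committee's third test is the hardest to satisfy.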

Bioremediation has been most successful with petroleum hydrocarbons,



FIGURE 9-4 Bioremediation: status of operation. Adapted from U.S. EPA (1993).

even in Alaska at cold temperatures. The initial success was achieved by Richard Raymond in Pennsylvania, where there was a leak of 100,000 gallons of gasoline (Raymond et al., 1975). The free product was recovered by physical means and bioremediation was stimulated by adding ammonium sulfate, a phosphorus source, and oxygen via an air


FIGURE 9-5 Bioremediation: wastes being treated. Adapted from U.S. EPA (1993).


sparger. Petroleum hydrocarbons were monitored, and none were detected after 10 months. Data from core samples taken at the site showed that the concentration of organisms in the soil had increased at the same time the hydrocarbons had disappeared, demonstrating that bioremediation had occurred.

A second example is a California site where the soil and groundwater were contaminated with petroleum hydrocarbons (Norris and Dowd, 1993). Soil levels ranged from zero to 1900 ppm for total petroleum hydrocarbons (TPH) and up to 32 ppm for the sum of benzene, toluene, ethylbenzene, and xylene (BTEX) compounds. The groundwater levels for BTEX compounds ranged from near zero to more than 6,000 ppb; the drinking water standard for benzene, for example, is 5 ppb. The groundwater was remediated by using groundwater recirculation; hydrogen peroxide was added as the oxygen source, ammonium chloride as a nitrogen source, and tripolyphosphate as a phosphorus source. Within 10 months the levels of TPH and BTEX in the groundwater and soil were below detection limits. Bioremediation was confirmed by noting that the disappearance of TPH and BTEX coincided with the disappearance of the added microbial nutrient, ammonia, and an increase in the carbon dioxide concentration of the soil gas and the groundwater. The presumption is that the increased carbon dioxide came from the biodegradation of the petroleum hydrocarbons. Additional examples are given by Flathman et al. (1993), National Research Council (1993), and Norris et al. (1993).

To expand current understanding and applications of bioremediation, engineers need answers to several questions. They need to know what is being done, how it is being done, who is doing what, why is it being done, how fast it is being done, whether we can do it faster and better, whether we can control it, whether we can meet cleanup standards, and whether we can predict with reasonable certainty that what we want to happen will happen. It is very important that we be able to reliably predict success.

An example of a recent development that answers some of these questions and has been taken to the field is the anaerobic biotransformation of perchloroethylene (CCl2=CCl2, PCE; also called tetrachloroethene). It has been known for at least 10 years that under anaerobic conditions PCE is reductively dehalogenated, first to trichloroethylene (TCE), then to dichloroethylene (DCE), then to vinyl chloride (a known human carcinogen), and finally, in some cases, to ethene. As each successive chlorine is removed, the reaction rate becomes slower. Thus, these less chlorinated compounds accumulate and appear to be very difficult to biodegrade further. However, aerobic organisms with oxygenase enzyme systems (e.g., methane monooxygenase and toluene dioxygenase) can co-metabolically degrade


these less chlorinated compounds to carbon dioxide, chloride, and other products. These aerobic processes are quite expensive and have their own problems.
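The slowdown at each dechlorination step, and the resulting buildup of intermediates such as vinyl chloride, can be seen in a toy simulation of the chain PCE -> TCE -> DCE -> vinyl chloride -> ethene. The rate constants are invented for illustration, not field values:

```python
def dechlorination(days, dt=0.01):
    """Forward-Euler simulation of sequential first-order dechlorination.

    Returns concentrations of [PCE, TCE, DCE, VC, ethene] after `days`.
    Rate constants shrink at each step, so intermediates accumulate.
    """
    k = [0.10, 0.05, 0.02, 0.01]     # 1/day for PCE, TCE, DCE, VC (illustrative)
    c = [100.0, 0.0, 0.0, 0.0, 0.0]  # initial mass in arbitrary units, all as PCE
    for _ in range(int(days / dt)):
        flux = [k[i] * c[i] * dt for i in range(4)]
        for i in range(4):
            c[i] -= flux[i]          # mass leaves pool i...
            c[i + 1] += flux[i]      # ...and appears in pool i+1, conserving total
    return c

names = ["PCE", "TCE", "DCE", "VC", "ethene"]
print(dict(zip(names, (round(x, 1) for x in dechlorination(60)))))
```

After 60 simulated days the PCE is nearly gone, but the pools with the smallest rate constants, DCE and vinyl chloride, hold most of the mass, which is exactly the accumulation problem described above.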

There may be hope for complete anaerobic treatment. Initial evidence was provided by research conducted at a Superfund site in St. Joseph, Michigan, where there was a TCE plume (McCarty and Wilson, 1992). As the progress of the TCE plume was monitored, TCE decreased, DCE appeared and then began to decrease, vinyl chloride appeared and then began to decrease, and ethene was produced. Researchers at Cornell University investigating the degradation of PCE under anaerobic conditions with and without methane formation found that they could get PCE to degrade to ethene in the laboratory (Freedman and Gossett, 1989; DiStefano et al., 1992). In Europe, researchers developed an enrichment from a contaminated river sediment that could dechlorinate PCE all the way to ethene and then to ethane (de Bruin et al., 1992). An organism was isolated from this sediment that uses PCE or TCE exclusively as an electron acceptor for growth (Holliger et al., 1993). This organism dechlorinates PCE and TCE to DCE, and other organisms in the enrichment apparently convert the DCE into ethane.

These exciting discoveries have been used in the field. At St. Joseph, Michigan, because the conversion of TCE to ethene appears to occur more rapidly than the plume is migrating, the strategy is to allow natural bioremediation to proceed in some TCE plumes. As a result of the work in the Netherlands, a full-scale above-ground facility, in which hydrogen is added to support the growth of the PCE-degrading organisms, is remediating PCE-contaminated waste. In a field-scale study conducted in Victoria, Texas, benzoic acid and sulfate were added to stimulate indigenous organisms that convert PCE to ethene and ethane (Beeman et al., 1993). Side-by-side control plots demonstrated that bioremediation was responsible for the conversion that occurred: the plot that received benzoic acid and sulfate showed conversion of PCE into ethene and ethane, whereas the control plot that received neither showed no PCE removal. Thus, recent developments in this technology are being quickly taken to the field.

Limitations And Future Needs

However, we must be careful not to move too quickly. Many limitations need to be overcome if we are to realize the full potential of bioremediation. It is perhaps useful to categorize the limitations and needs as engineering, basic and applied microbiology, and carefully controlled field studies.

Perhaps the most important engineering challenge is the need for


better site characterization. We need inexpensive, efficient ways to determine the extent of aquifer contamination. We also need better methods of assessing, in the field, whether the potential exists for biotransformation of the compounds there. Techniques based on molecular biology (e.g., gene probes and immunoassays) may be very helpful here. If the contaminants are not bioavailable, they are not going to be bioremediated. We need to understand the effect of the nonaqueous-phase liquids (NAPLs) on the important microorganisms. How close can the organisms live to these liquids? We need better models both for screening and prediction. We need better scale-up methods, particularly in the field. How do we go from laboratory scale to pilot scale and then to field scale? How do we reasonably ensure success? Finally, there is a need for innovative process design: innovative reactor designs for ex situ systems and innovative delivery systems for in situ applications. There is much room for improvement here.

In basic and applied microbiology, there is potential for explosive improvement in bioremediation. Consider, for example, the discovery of new bacterial strains, such as the PER-K23 strain that uses PCE or TCE exclusively as an electron acceptor (Holliger et al., 1993). Such advances may be taken rapidly to the field, as described above. We need better understanding of microbial ecology and population dynamics in the environment and of how they are affected by mixtures of contaminants; rarely is a site contaminated with only one compound. With a better understanding of the pathways of degradation, perhaps we can control these pathways. We also need to better understand gene expression and manipulation in the field, and new tools need to be developed. For example, how does one tell whether genetic machinery is available at a site to bring about the desired reactions? There is significant potential here for probe techniques, for example, gene probes and immunoassays. There is potential for development of genetically engineered microorganisms and a general need for developing robust strains of organisms. Finally, there is even potential for bioremediation of some metals. For example, hexavalent chromium can be reduced to trivalent chromium, which is far less soluble and will precipitate as a solid, making it less mobile. It may be possible to reduce uranium(VI) to uranium(IV), which will precipitate as an oxide.

Finally, there is a need for more carefully controlled field studies. Such data would help address the problem of how to go from the laboratory to the field. The field studies at Stanford University offer a model (Hopkins et al., 1993; Roberts et al., 1990; Semprini et al., 1991). An interdisciplinary approach is needed that includes engineers, microbiologists, microbial ecologists, molecular biologists, hydrogeologists, chemists, etc. We need field studies that include side-by-side comparisons with controls.


These field studies will help generate a more extensive inventory of bioremediation successes and failures. After all, successful engineering is basically the prudent use of these kinds of inventories. There has been some movement in this area. Two examples are the Advanced Applied Technology Demonstration Facility (funded by the Department of Defense) at Rice University and the field-testing initiative that is being coordinated by the University of Michigan at a site in Michigan.

With additional research and experience in the field, bioremediation will no doubt become an even more important treatment technology in the arsenal of technologies used to clean up our contamination problems. Bioremediation offers the promise of cost-effective, environmentally acceptable treatment of contaminated waters, soils, and sediments. There is much interest in and enthusiasm for this emerging biotechnology.

References

Bakst, J. S., and K. Devine. 1993. Bioremediation: environmental regulations and resulting market opportunities. Pp. 11-48 in Bioremediation: Field Experience, P. E. Flathman, D. E. Jerger, and J. H. Exner, eds. Chelsea, Mich.: Lewis.

Beeman, R., S. Shoemaker, J. Howell, E. Salazar, and J. Buttram. 1993. A field evaluation of in situ microbial reductive dehalogenation by the biotransformation of chlorinated ethylenes. Proceedings of the 2nd International Symposium on In situ and On-Site Bioreclamation, San Diego, April 1993.

de Bruin, W. P., M. J. J. Kotterman, M. A. Posthumus, G. Schraa, and A. J. B. Zehnder. 1992. Complete biological reductive transformation of tetrachloroethylene to ethane. Appl. Environ. Microbiol. 58:1996-2000.

DiStefano, T. D., J. M. Gossett, and S. H. Zinder. 1992. Hydrogen as an electron donor for dechlorination of tetrachloroethylene by an anaerobic mixed culture. Appl. Environ. Microbiol. 58:3622-3629.

Flathman, P. E., D. E. Jerger, and J. H. Exner, eds. 1993. Bioremediation: Field Experience. Chelsea, Mich.: Lewis.

Freedman, D. L., and J. M. Gossett. 1989. Biological reductive dechlorination of tetrachloroethylene and trichloroethylene to ethylene under methanogenic conditions. Appl. Environ. Microbiol. 55:2144-2151.

Gibson, D. T., and G. S. Saylor. 1992. Scientific Foundations of Bioremediation: Current Status and Future Needs. Washington, D.C.: American Academy of Microbiology.

Holliger, C., G. Schraa, A. J. M. Stams, and A. J. B. Zehnder. 1993. A highly purified enrichment culture couples the reductive dechlorination of tetrachloroethylene to growth. Appl. Environ. Microbiol. 59:2991-2997.

Hopkins, G. D., L. Semprini, and P. L. McCarty. 1993. Microcosm and in situ field studies of enhanced biotransformation of trichloroethylene by phenol-utilizing microorganisms. Appl. Environ. Microbiol. 59:2277-2285.

Kavanaugh, M. C. 1994. In situ remediation: research needs. Paper presented at AEEP Research Opportunities Conference, Ann Arbor, Michigan, September 20-22, 1994.

MacDonald, J. A., and B. E. Rittmann. 1993. Performance standards for in situ bioremediation. Environ. Sci. Technol. 27:1974-1979.

Madsen, E. L. 1991. Determining in situ bioremediation: facts and challenges. Environ. Sci. Technol. 25:1662-1673.


McCarty, P. L. 1994. In situ remediation. Paper presented at AEEP Research Opportunities Conference, Ann Arbor, Michigan, September 20-22, 1994.

McCarty, P. L. 1993. In situ bioremediation of chlorinated solvents. Curr. Opin. Biotechnol. 4:323-330.

McCarty, P. L., and J. T. Wilson. 1992. Natural anaerobic treatment of a TCE plume, St. Joseph, Michigan, NPL site. Pp. 47-50 in Bioremediation of Hazardous Wastes, EPA/R-92/126. Cincinnati, Ohio: US EPA Center for Environmental Research Information.

National Research Council. 1993. In Situ Bioremediation: When Does It Work? Washington, D.C.: National Academy Press.

Norris, R. D., and K. D. Dowd. 1993. In situ bioremediation of petroleum hydrocarbon contaminated soil and groundwater in a low-permeability aquifer. Pp. 457-474 in Bioremediation: Field Experience, P. E. Flathman, D. E. Jerger, and J. H. Exner, eds. Chelsea, Mich.: Lewis.

Norris, R. D., et al. 1993. Handbook of Bioremediation. Chelsea, Mich.: Lewis.

Parkin, G. F., and C. R. Calabria. 1986. Principles of bioreclamation of contaminated ground water and leachates. Pp. 151-163 in Hazardous and Industrial Solid Waste Testing and Disposal, Sixth Volume, ASTM STP 833, D. Lorenzen, et al., eds. Philadelphia, Pa.: ASTM.

Raymond, R. L., V. W. Jamison, and J. O. Hudson. 1975. Biodegradation of high-octane gasoline in groundwater. Dev. Ind. Microbiol. 16.

Roberts, P. V., G. D. Hopkins, D. M. Mackay, and L. Semprini. 1990. A field evaluation of in-situ biodegradation of chlorinated ethenes: part 1, methodology and field site characterization. Ground Water 28:591-604.

Semprini, L., G. D. Hopkins, P. V. Roberts, and P. L. McCarty. 1991. In situ biotransformation of carbon tetrachloride, freon-113, freon-11, and 1,1,1-TCA under anoxic conditions. Pp. 41-58 in On-Site Bioreclamation, R. E. Hinchee and R. F. Olfenbuttel, eds. Boston: Butterworth-Heinemann.

Semprini, L., P. V. Roberts, G. D. Hopkins, and P. L. McCarty. 1990. A field evaluation of in-situ biodegradation of chlorinated ethenes: part 2, results of biostimulation and biotransformation experiments. Ground Water 28:715-727.

Thomas, J. M., and C. H. Ward. 1989. In situ biorestoration of organic contaminants in the subsurface. Environ. Sci. Technol. 23:760-766.

U.S. Environmental Protection Agency. 1993. Bioremediation in the Field. EPA/540/N-93/001, No. 8. Cincinnati, Ohio: US EPA Center for Environmental Research Information.

Wentz, C. A. 1989. Hazardous Waste Management. New York: McGraw-Hill.

Zitomer, D. H., and R. E. Speece. 1993. Sequential environments for enhanced biotransformation of aqueous contaminants. Environ. Sci. Technol. 27:226-244.

Next Chapter: FROM LABORATORY TO MARKETPLACE: THE CHALLENGES OF TECHNOLOGY TRANSFER