As the 21st century opens, two new technologies hold out the promise of benefits for humankind: biotechnology and nanotechnology. Crop biotechnology promises new and improved foods, e.g. vitamin A-enhanced rice (Oryza sativa); more food at lower cost from highly productive plants; plants that will grow under adverse conditions, such as high salinity or drought; plants that will reduce reliance on potentially poisonous pesticides; or even plants that will produce pharmaceuticals. Human biotechnology promises cures for genetic diseases in somatic cells; diagnoses of diseases, e.g. breast cancer, before they manifest themselves as deadly or irreversible conditions; and perhaps even the prevention of inheritable genetic conditions through genetic engineering of the germ cells.

Biotechnology is well begun; nanotechnology has yet to penetrate public awareness or even university awareness beyond the major research universities. Nanotechnology, the science of the almost vanishingly small, is concerned with chemical products one-billionth of a meter in diameter. Yet, it, too, promises treatments for disease, innovative products to clean water supplies and protect the environment, and ultrasmall electronic switching mechanisms and components.

The early stages of a science or technology are replete with promises of benefits to humankind or the environment. Promised benefits, even soaring visions, motivate researchers to develop technologies, agencies to fund research, firms and venture capitalists to invest, and ultimately the public to accept or even embrace the technologies. Without promises of benefits, the science and technology will not develop. Yet promises can be merely speculative, overblown, mistaken, or nonexistent. For example, the U.S. Food and Drug Administration withdrew approval for Parlodel (chemical name: bromocriptine), a drug for suppressing lactation, because it appeared to cause heart attacks and strokes and its effects were “highly questionable” (Department of Health and Human Services, Food and Drug Administration, 1994).

New technologies inevitably will pose some risks as well—perhaps to human health, e.g. breathing nanoparticles may be as dangerous as breathing tiny air pollutants; to other human endeavors, e.g. crops infected by unwanted genes; or to the environment that sustains human life. How should we think about risks posed by promising new technologies?

One place to begin is with a brief review of the chemical revolution that began during and immediately after World War II; what can one reasonably conclude from this (ongoing) technological experiment? Beyond that, which risks are more and which less acceptable, and what considerations bear on these issues? Where should we locate the acceptability of risks posed by new technologies compared with other risks? What is the context into which the new technologies are introduced? Is it reasonable to suppose that a technology will exacerbate or ameliorate existing problems? What institutional strategies should we adopt to address risks that are less acceptable? The essay that follows focuses on genetically modified (GM) crop technologies; addressing other technologies may require modifying the analysis because they may raise different issues.

THE CHEMICAL REVOLUTION AS CASE STUDY

Without question, the post-World War II chemical revolution has produced substantial benefits for humankind. A significant portion of the gross domestic product in many countries results from the chemical industry and its products. It also has improved the quality of life and produced lifesaving products. It enhanced food production worldwide and helped make the United States a leading exporter of foodstuffs to the rest of the world.

The revolution has not been without costs, however. High-profile human health and environmental harms (and risks of others) are on the other side of the ledger. DDT (1,1,1-trichloro-2,2-bis(p-chlorophenyl)ethane), a “miracle” pesticide that killed malaria-carrying mosquitoes, was found to have substantial adverse ecological and health effects. Freon replaced highly toxic compounds that caused accidental deaths from use in home refrigerators, but this nontoxic and nonflammable substance has been found to destroy ozone in the upper atmosphere. Dioxins, at first contaminants of herbicide and pesticide products (a source since eliminated) and currently contaminants of many industrial processes, are among the most potent mammalian carcinogens known. Polychlorinated biphenyls (PCBs), because of their thermal stability, their resistance to many chemical reactions, and their dielectric properties, were used as hydraulic fluids, lubricants, plasticizers, insulators, and fillers in a variety of other products. PCBs became of concern because of their lipophilic and bioaccumulating properties, which transported them up the food chain, increasing their concentrations and toxicity as they moved to higher organisms. More recently, scientists have discovered that PCBs can be transported far from their original sources by means of evaporation and condensation into colder areas of the Earth, contaminating fish, mammals, and humans to near-toxic levels at these remote locations (Travis and Hester, 1991).

Apart from toxicity properties resulting from direct exposures, many chemical substances have been disposed of in ways that have contaminated soils, groundwater, drinking water, the air, and natural ecosystems. In addition to toxic waste dumps, air and water emissions of potentially toxic substances have created experiments with the environment and public health without scientific understanding of long-term consequences of such exposures.

Of serious concern is our profound ignorance about the effects of most of the substances that are registered for use in commerce. Even more serious is that society at large doesn't even know that we are ignorant! There are about 70,000 substances registered for use in commerce. Add to that another 30,000 metabolites and derivatives of these substances, and the total comes to about 100,000 chemical substances. About 23% of these are polymers and another one-third present little or no exposure, in both cases presenting at worst quite minimal risks. Another 800 to 1,000 substances are added to commerce each year with no or only minimal testing (U.S. Congress, Office of Technology Assessment, 1987).

There are various estimates of what is known about the toxicity properties of these compounds, but none is reassuring. As recently as 1998, there was little or no basic toxicity information in the public record for 75% of the 3,000 chemical substances produced in the highest volume in the United States (Environmental Health Letter, 1998). Beyond this group of 3,000, which represents the most serious concerns, there are another 1,000 to 12,000 substances for which extensive toxicological information would be quite important but was not available as recently as 1995 (U.S. Congress, Office of Technology Assessment, 1995).

A more detailed breakdown of this problem is as follows. In 1984, the U.S. National Research Council (NRC) found that there were 12,860 substances produced in volumes exceeding 1 million pounds per year (78% with no toxicity information available), 13,911 chemicals produced in volumes of less than 1 million pounds (76% with no toxicity data), 8,627 food additives (46% with no toxicity data), 1,815 drugs (25% with no toxicity data), 3,410 cosmetics (56% with no toxicity data), and 3,350 pesticides (36% with no toxicity data) (NRC, 1984). Moreover, even when regulatory agencies have substantial evidence of toxicity, e.g. the carcinogenicity of substances, they have been slow to perform risk assessments and to regulate likely carcinogens (U.S. Congress, Office of Technology Assessment, 1987).

What can we infer from these risks and harms resulting from the post-World War II chemical revolution? If nothing else, this picture should occasion considerable humility about human ingenuity among scientists, technologists, and institutions of social control. Entrepreneurs have been quite innovative in developing products but much less successful in understanding and preventing their risks. Even now, it is not clear how well the downside of this technology is understood, because it appears we have not yet run out of surprises. For example, polybrominated diphenyl ethers (PBDEs), used as fire retardants in furniture, plastics, and many electronic products, appear to possess most of the same bioaccumulative and toxicity properties as PCBs (Hooper and McDonald, 2000), and they are increasing at a rapid rate in the breast milk of women (Fig. 1), with U.S. women at highest risk (Cone, 2003). Are we repeating the PCB mistakes?

Figure 1. Concentrations of flame retardants called PBDEs have been rising exponentially in human beings. Tests of breast milk showed Swedish women were carrying 60 times more of the contaminants in 1997 than in 1972, which means that concentrations in the human population double every few years (source: Karolinska Institutet, Stockholm, as reported in Cone, 2003).
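The Swedish data cited above (a 60-fold increase between 1972 and 1997) imply a doubling time of roughly four years; a minimal sketch of the arithmetic, assuming simple exponential growth:

```python
import math

# Swedish breast-milk data cited above: PBDE concentrations in 1997 were
# 60 times those measured in 1972, a span of 25 years.
fold_increase = 60
years = 1997 - 1972

# Under exponential growth, the doubling time T satisfies
# 2 ** (years / T) == fold_increase, so T = years * ln(2) / ln(fold_increase).
doubling_time = years * math.log(2) / math.log(fold_increase)
print(f"Implied doubling time: {doubling_time:.1f} years")  # about 4.2 years
```

This back-of-the-envelope calculation is consistent with the claim that concentrations double in the human population every few years.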

As scientists develop more subtle understandings of adverse effects and their mechanisms, the substances of concern tend to increase—consider, for example, endocrine disrupters and recent evidence of carcinogenicity mechanisms (Colborn et al., 1996; Melnick, 2003). If some of the hypotheses about the adverse effects of endocrine disrupters are corroborated, scientists may continue to discover ecological and health surprises well into the future.

INSTITUTIONAL CONTROLS

What institutional controls and incentives have led to such extreme ignorance of the substances in question? Broadly speaking, there are two kinds of laws to regulate risks: premarket screening strategies try to provide some assessment of the risks to humans and the environment from products before they enter commerce, whereas postmarket strategies provide for regulation of products after they have been introduced into commerce (and, thus, into the environment) and the public exposed (U.S. Congress, Office of Technology Assessment, 1987). The best postmarket statutes would aim to use surrogates of human health or environmental harm to identify risks before they cause harm. However, the use of these surrogates has been eroded over time by requirements for evidence of actual harm, thus removing the possibility of warnings before damage occurs (Cranor, 2003a).

Premarket screening strategies have the potential advantage of greater safety than postmarket statutes but with the disadvantage of burdening product development with higher costs and slowing the time from discovery of useful products to commercial production. Postmarket strategies place lesser burdens on innovation but pose greater risks to safety. The history of the regulation of manufactured chemicals, their by-products, and pollutants suggests that most were subject to postmarket regulation.

Virtually all postmarket strategies create incentives for firms to develop products while ignoring or inadequately testing for potential adverse effects (or ignoring the by-products or pollutants of production). Firms typically are not legally required to test substances extensively before they enter commerce (minimal data reporting is required under the Toxic Substances Control Act, a premarket notification law). Whether they do so depends upon a calculation of what is in their best self-interest. Would testing help them avoid regulatory action or personal injury suits? If potential adverse effects arise, ordinarily it is a governmental agency that must identify the problem, require needed toxicity data, and initiate regulatory proceedings to reduce exposure or remove the product from the market. While adverse health data and regulations are being developed, the firm can merely play “defense” and argue that there is insufficient scientific information (or too much ignorance and uncertainty) to justify taking regulatory action. Moreover, a firm's failure to do adequate testing in the first place means that it has even more time to benefit from the product while toxicity data are being developed. Similar problems attend personal injury (tort law) suits, with the burden of proof falling on the injured party (Cranor, 2003a).

Scientific procedures and burdens of proof reinforce regulatory burdens of proof. Substances are assumed to have no properties whatever until established by data and theories. Scientific standards of proof and burdens of proof can be interpreted so stringently that it becomes very difficult to satisfy them. Moreover, some scientists assume that substances have no adverse effects until proven otherwise by overwhelming evidence. Even if standards of proof are not so stringently interpreted, conducting the tests, interpreting them, and coming to sufficiently firm conclusions to satisfy scientific advisory panels, regulatory bodies, and appellate courts (that review the regulatory actions) are costly and time consuming (Cranor, 1993). All this analysis can easily result in “regulatory paralysis,” which further encourages firms not to test, to fight the regulations legally, and even to conduct misleading studies likely to yield contrary results (Markowitz and Rosner, 2002).

Substantial ignorance about the toxicity properties of substances is of course the expected result of postmarket regulatory schemes. This problem has now reached such proportions in Europe that the European Union has moved to require testing of 30,000 substances in commerce and to severely restrict 1,500 of the most hazardous substances (Loewenberg, 2003). Premarket statutes would address this issue better, but even here there are problems.

Premarket screening statutes that require substantial testing ensure that there is no, or very little, health and environmental exposure to substances until an agency is satisfied that they do not pose a legally specified level of risk and permits them into production and commerce. With sufficient agency review and approval authority, there is an independent body to assess the quality of health and environmental data and to help assure that a substance does not enter commerce if it presents unreasonable risks to health or the environment.

However, premarket screening laws, such as the U.S. statutes for the approval of new drugs, do not always function well. Sometimes firms deliberately or negligently withhold information from the Food and Drug Administration (but at least this opens them to liability). In 1984, there were no toxicity data for one-fourth to one-third of all drugs and pesticides (both subject to premarket laws; NRC, 1984). A more serious structural problem, however, is that even small double-blind clinical trials are inadequate to identify all adverse effects from longer term exposures in large, biologically heterogeneous populations. Somewhat more than 50% of the drugs approved in the decade from 1976 to 1985 “had serious post-approval risks that went undetected” in the premarket testing period but were discovered when more diverse and larger populations were exposed (Green, 2000). Thus, there is a need for substantial postmarket follow-up by the firm to monitor and report adverse effects from widespread exposure.

ACCEPTABLE AND UNACCEPTABLE RISKS: TRACTORS, CHEMICALS, AND GM CROPS

Any regulatory structure needs to be appropriate to the risks in question and to their acceptability. Users and others exposed to risks from large machines, such as tractors, need much less protection than from risks posed by chemicals. To most people, the risks from tractors or other typical physical threats are more acceptable than the risks posed by many chemicals (Slovic, 1987). What accounts for these differences, and where do GM crops fit in? Are they more similar to chemicals or machines?

Risks from GM crops include, but are not limited to, risks from the movement of genes (e.g. increasing weediness, risk of extinction of local species, and increased pesticide resistance), risks from whole plants (e.g. threats to wild relatives), risks to nontarget organisms (e.g. adversely affecting other plants, beneficial insects, or soil organisms), and risks of resistance evolution (e.g. herbicide-tolerant weeds; NRC, 2002). In addition, there might be short-term and longer term risks to human and animal health from pharmaceutical-producing plants (Ellstrand, 2003; van den Belt, in press).

Risks tend to be more acceptable the easier they are to identify, personally detect, appreciate and avoid (Cranor, 1995). The risks posed by tractors and other large machines tend to be fairly obvious, palpable, easily detected, easily appreciated, and readily avoidable. Of course, tractors can pose risks to third parties and bystanders, but here too such risks tend to be easier to detect, anticipate, and appreciate than are the risks from exposure to chemical substances.

Technologies that have direct personal benefits to those bearing the risks or that are a critical part of one's plan of life also tend to be more acceptable to persons from a generic point of view. We take airplanes although we know there are risks involved. Piloting airplanes is riskier than driving tractors, but pilots embrace such risks because they are important to their life plans. We drink chlorinated water, which may carry a low-level risk of contributing to bladder cancer, but that risk is counterbalanced by the direct disease-preventing and lifesaving benefits of chlorination. Similarly, x-rays taken to try to detect disease and even life-threatening illnesses pose some risks of causing lung or bone cancer, yet such risks are acceptable precisely because of the important benefits accompanying them. Prescription drugs, created because they have beneficial effects, can have downside risks, yet if these are not too extreme and the benefits great enough, the risks associated with them may be acceptable simply because of the direct benefits to the users and the absence of more benign alternatives (Cranor, 1995). More acceptable risks tend to have a close nexus between risk and direct benefit, to be central to a person's life plan, or to have been clearly and voluntarily assumed.

Those exposed to risks from tractors have chosen to use them and have considerable continuing control over whether the risks posed materialize or not. Moreover, tractors for the most part do not pose risks far from their present location, with a few exceptions, e.g. their contributions to global warming and any health risks from engine exhaust. All of these considerations tend to make risks presented by tractors acceptable to those exposed.

By contrast, risks from chemical substances (Fig. 2) and GM crops, for example, tend to be invisible, not easily detected, and liable to remain hidden for a long time. These features make the effects difficult to identify and detect, to protect against, and even to avoid (Cranor, 1995). GM plants tend to lack the close nexus between risks and benefits or fail to be central to the life plans of those who use them, although one needs to distinguish between different groups exposed to such products. It is one thing to be exposed to allergen-carrying foods, but quite another to have access to vitamin A-enhanced foods in an area of the world in which natural sources of vitamin A are in short supply. Farmers and some in the agricultural industry may benefit directly from GM plants, but ordinary consumers typically may not, except perhaps from very marginal reductions in the costs of products (if any at all). Any benefits from GM plants are likely to be quite peripheral to a person's life, with few, minimal, or no welfare benefits and no other major direct benefits to representative persons. Moreover, representative consumers do not participate in the decisions leading to approval of GM plants, and in the United States, because GM foods are not labeled, one cannot make a conscious and well-informed decision to assent to their use and any risks they may present.

Figure 2. Chemicals present in our environment or food supply constitute invisible risks that are not easily detected and might remain hidden for a long time. These features make the effects of such chemicals difficult to identify and detect, to protect against, and even to avoid. Furthermore, the benefits may be uncertain or unknown to the consumer and might also be achieved without chemicals or by using different chemicals. Photo by Tim McCabe of the Agricultural Research Service.

The features of the risks from chemicals and GM crops tend to place such risks toward the more unacceptable end of risks to which persons will be exposed (Cranor, 1995). That is, they will tend to be more unacceptable than other (ordinary) risks of life, such as being a lifeguard, policeman, mountain climber, frequent flying professor, tractor operator, pilot, pedestrian, or consumer of chlorinated drinking water. People engaged in such activities are all at some risk of serious injury or in some cases death, yet the risks they incur are from a generic personal point of view judged to be quite acceptable. From the same point of view, the risks from exposure to GM plants and many chemical exposures would be judged to be comparatively unacceptable.

CONTEXT

Another generic concern for evaluating emerging technologies would be the context into which they are introduced. Is the system or environment into which the technology enters robust and resilient or in poor condition and perhaps vulnerable to adverse perturbations? With regard to GM crops, which could potentially have considerable impact on the farming and natural environment, what is the condition of the world into which they will be introduced? At the most generic level, for example, is the natural environment into which GM plants are introduced more like an unlimited, bounteous frontier that is quite resilient, or more like a limited, confined fish bowl that is already substantially polluted and more susceptible to new insults?

Scientists, largely ecologists, are far from optimistic in their assessment of the current condition of the natural environment. They see the environment as under considerable pressure and continual threat, degraded from previously healthier states, and likely to worsen because of increasing human population pressures. Consider a few highlights (readers may have their own shorter or longer lists; Cranor, 2003b). Comparatively pristine air, water, oceans, and wilderness are vanishing or are on their way to vanishing. In the past 50 years, the world's forested lands have shrunk substantially (Rodgers, 1994). Water tables are falling around the world; humans are over-pumping aquifers in China, India, North Africa, Saudi Arabia, and the United States by about 160 billion tons of water per year. Humans are mining enough water from nonrenewable sources to produce food for 480 million people per year, a pace that is simply unsustainable. The overuse of water is particularly acute in India and China (Brown et al., 2002). Worldwide cropland per person has fallen from 0.24 hectares to 0.12 hectares in the last 50 years and may shrink to 0.08 hectares by 2050 (Brown et al., 2002). About “two thirds of major marine fisheries are fully exploited, overexploited or depleted” (Lubchenco, 1998). About 11% of bird species, 25% of mammal species, and an estimated 34% of all species are vulnerable or in immediate danger of extinction (Brown et al., 2002). Some tropical forests have been turned into “inferior, rapidly degrading pasture” with attendant loss of biodiversity (Rodgers, 1994). And, of course, humans are making a substantial contribution to global warming, which in turn will likely have adverse effects on the environment and on living conditions for humans (McKibbon, 2001).
World population, which has more than doubled in the lifetime of readers older than 40 and which is projected to increase about 50% in the next 50 years, will greatly exacerbate the above problems (Rodgers, 1994).

Given the context into which a new technology will be introduced, what effect is it reasonable to suppose the technology will have? Might it ameliorate, exacerbate, or have more neutral effects on existing problems?

Transgenic plants might successfully address some future shortages of food and of adequate agricultural land (e.g. transgenic plants might be created that would grow on degraded soils) and even the shortage of water (with the development of drought-tolerant plants), although not the underlying problem of a population that may be too large for the world's resources.

With respect to GM crops, the U.S. NRC (2002) has some cautionary notes about the effects of GM plants on ecosystems. The NRC noted that “our collective judgments of environmental impacts” have changed over time and the information necessary to make good judgments about environmental risks from transgenic plants “is like a moving target.” In addition, scientific “understanding of genetics and ecology is still developing.” A “much broader array of phenotypic traits can now potentially be incorporated into plants than was possible two decades ago,” and this is likely to increase dramatically with transgenic plants.

Thus, because scientists do not yet understand well either the impacts of introduced species on ecosystems or genetics, and because it will take considerable time to develop the understanding to fully assess such risks, it appears that regulatory decisions about the introduction of GM crops will be made under substantial uncertainty and in considerable ignorance (NRC, 2002). If such decisions could have quite serious adverse consequences, this argues for making them cautiously, with considerable humility and with a full range of scientific expertise to ensure that they are appropriately protective of ecosystems, species, and human health. Moreover, the NRC assessment of the existing U.S. Department of Agriculture regulatory structure and personnel is quite critical. The agency does not have sufficient external scientific input to review plants proposed for testing or commercialization, and there is insufficient public notice and input into the process, especially for plants proposed for nonregulated status (NRC, 2002).

PULLING IT ALL TOGETHER: HOW SHOULD WE APPROACH GM CROP TECHNOLOGY?

What does the combination of risks, the nature of the technologies, the context of introduction, and the lessons from the case study of chemical technologies suggest about how one should approach transgenic plants?

First, the risks posed by transgenic plants will tend to be toward the unacceptable end of risks compared with many of the more familiar risks of life. Like chemical substances, genetic changes are invisible, undetectable features of plants and are difficult to avoid, unless one is put on notice about their properties (and even that has limited value). Moreover, it is difficult to appreciate any risks they might pose because they are so far from our ordinary experiences and other common risks. For most of us, using or consuming transgenic plants is not central to our life plans.

Second, GM crops have risks that chemical substances tend to lack—they can replicate, propagate, migrate, mutate—and genes can “wander” from plant to plant within related species (Ellstrand, 2001). These features represent a degree of risk not usually present with chemical substances.

Third, the environment into which transgenic plants will be introduced has suffered substantial impacts from previous human and technological advances. Consequently, it may be less resilient than it once was. Moreover, because, according to the NRC, the understanding of both ecosystems and genetics is in its infancy, there are additional reasons to be cautious in introducing transgenic plants with new and untested properties into farming and natural ecosystems. Thus, it seems important to go slowly at the beginning to understand as fully as possible the properties, risks, and possible problems of this new technology. Some risks will be more obvious, some much less so (van den Belt, 2003). The earliest proposed transgenic plants may be well studied and perhaps even those for which the best case can be made (they may have the most obvious benefits) and for which the risks are reasonably well understood. After the initial wave of GM crop proposals, when benefits may be less clear and risks higher, agencies will need to maintain their scrutiny. In particular, agencies will need to review closely the risks from pharmaceutical GM crops (Thompson, 2003).

Fourth, these first three features create a need for reliable trustees to provide protections that individuals cannot; governmental institutions to ensure that the risks individuals find it difficult to appreciate, detect, and protect against are identified and reduced before they materialize into harm to ecosystems and human health. What legal structure would such institutions take to address a new technology such as transgenic plants?

A first step would suggest a cautious approach, aspects of which the United States has tended to adopt. The U.S. Department of Agriculture in effect has a premarket approval law to guide review of transgenic plants; GM crops must be submitted to the agency, reviewed, and ultimately given permission to be planted for research, experimental, or commercial purposes (NRC, 2002). Under a premarket statute, there may be ways to have “tiered responses” to the products as experience is gained with them. For substances for which scientists have very strong reasons to believe the insertion of genes will not pose problems, there might be a lesser form of premarket review. For substances for which there is much less experience, where there are likely to be more subtle and long-term effects, or for which there is not strong confidence that the inserted genes do not pose problems, a more searching review would be appropriate, or, if the risks are too great, their release might need to be prevented (van den Belt, 2003). A standing difficulty of such tiered responses is that there must be good scientific justification for demarcating the products needing greater scrutiny from those that do not. At present, such scientific criteria have not been adequately developed in the U.S. Department of Agriculture, and there appears to be too little oversight of some groups of plants (NRC, 2002; Thompson, 2003).

More important, it seems socially important to avoid being ignorant of risks from such products and losing social control of them, as occurred with chemical products. In addition, as the NRC (2002) noted, genetic technologies pose greater risks of scientists not understanding their properties and ultimate fate in the environment because of limited understanding of both ecosystems and genetics.

A constituent feature of a premarket screening statute should be to make the approval, distribution, and manufacture of the product conditional upon quick removal when problems arise. That is, with the legally sanctioned distribution of a product that exposes the public and ecosystems, I suggest that the social permissibility of its distribution should remain conditional upon continued safety of the product. This is the case under some of the premarket approval statutes for drugs and pesticides (although it is not always as easy as perhaps it should be to withdraw a product when problems arise). As part of this legal structure, agencies should make the rights to public protection greater or of higher priority than private property rights of the manufacturer of the product so that property rights to the product do not make a product's removal so difficult that it is unlikely to occur. This more esoteric point goes to the particular design of the law in question, but it should be comparatively easy for an agency to show that a product no longer satisfies the condition of approval and withdraw it, if we seek to protect ecosystems and our health (Cranor, 2003a).

A further feature of such laws might require the manufacturers and distributors of the products to have a legal obligation to report adverse effects so that an agency, acting on behalf of the public, can be alerted to threats and risks before they become the next Freon, DDT (1,1,1-trichloro-2,2-bis(p-chlorophenyl)ethane), PCB, or PBDE problem. Moreover, agencies will need to follow the guidance of the scientific community in identifying the threats that constitute the basis of product “recall” once products are in commerce.

Finally, the collective effects of individual decisions can be lost in discussions about individual products or substances (Cranor, 2003b). To address these, the NRC (2002) suggests that the United States needs to have a much better environmental and agricultural monitoring system in place than it does at present. Long-term monitoring provides baselines against which to compare future changes in agricultural or natural ecosystems. It also provides information about the impacts of GM crops on ecosystems, other agricultural crops, or human health. In turn, this can be utilized to improve future regulatory decisions. Unfortunately, at present “for most biological resources such long-term data do not exist” (NRC, 2002), and sometimes, what we don't know can hurt us.

None of the above arguments suggests that new technologies should not be pursued and developed. Instead, they suggest that some of the technologies on the horizon pose risks that tend to be less acceptable than many ordinary risks of life and that they will be introduced into an environment that may be less resilient than it once was. Finally, the chemical technology experience suggests that we should approach new technologies with considerable humility and try to ensure that they do not escape social understanding and control as a good many chemical products have. If such cautionary approaches are followed, perhaps we can have the benefits of new technologies without some of the risks and costs that have accompanied other technological revolutions.

Acknowledgments

I am grateful for comments on this paper from Norman Ellstrand.

Footnotes

1 This work was supported by the National Science Foundation (grant no. 99-10952) and by a grant from the University of California Toxic Substances Research and Teaching Program.