This report called for transforming toxicology “from a system based on whole-animal testing to one founded primarily on in vitro methods that evaluate changes in biologic processes using cells, cell lines, or cellular components, preferably of human origin.” In March 2009, the EPA published its own Tox21 agenda, The U.S. Environmental Protection Agency’s Strategic Plan for Evaluating the Toxicity of Chemicals, which asserts that “the explosion of new scientific tools in computational, informational, and molecular sciences offers great promise to ... strengthen toxicity testing and risk assessment approaches.” The concept of adding more mechanistic data to risk assessment isn’t new... Instead, the EPA relies more often on conservative default assumptions about how chemicals affect human beings. “EPA goes by precedent and does things as it did in the past so as to not be arbitrary,” Rhomberg explains. “So, there’s a lot of inertia in the system.” Robert Kavlock, director of the EPA National Center for Computational Toxicology, says the main difference between Tox21 and prior molecular research in toxicology is one of scale... Kavlock emphasizes that with its new strategy the EPA is demonstrating a willingness to take mechanistic data seriously. “Tox21 was produced with input from senior members across all the EPA offices,” he says. “There’s an explicit recognition that we’re in a scientific transition and that the business part of the agency needs to come along with it.” Tox21’s essential premise is that scientists can infer human harm from chemicals on the basis of how they activate toxicity pathways in cells. “Toxicity pathway” refers to a chemically induced chain of events that leads to an adverse effect such as tumor formation, explains Raymond Tice, chief of the NTP Biomolecular Screening Branch... Cell-based assays offer some advantages in this respect...
Unlike animal tests, which are limited by cost and resource constraints to just a few doses, in vitro assays can test chemicals at a broad range of doses that might provide better information about low-dose human effects, scientists say... The whole process requires a leap of faith that perturbations and associated modeling efforts will accurately predict human effects from chemical exposure, Solomon says. “And this is why risk assessors at EPA have such a hard time with this type of data,” she explains. “It’s not easy to extrapolate from [the results of] a cell-based assay to [exposure effects] in a real population of humans... Compared with prioritization, this is a far more challenging and elusive goal... Toxicologists have based human standards on the results of animal tests for more than 50 years... The cancer slope factor, therefore, aims to limit the number of expected cancers in the exposed population to no more than 1 in 1 million people... The fact that animal tests rely on doses far higher than those found in the environment raises difficult questions about their relevance to humans. “I’ve spent nearly forty years as a toxicologist trying to relate high-dose animal studies to low-dose human risk,” says Melvin E... Andersen’s view—backed by the NRC report, he says—is that testing for perturbations of toxicity pathways, leading to the elimination of animal tests, should be a fundamental goal. “EPA and the NTP want to use in vitro results to predict high-dose outcomes in animals,” he says. “But that’s backwards—we need to identify cellular targets and then predict what’s going to happen to people at environmentally relevant concentrations... John Doull, professor emeritus at the University of Kansas Medical Center, gives the example of chemicals that target certain regions in the brain whose toxic effects are reflected elsewhere, perhaps in terms of gait or vision... Cell-based assays might not pick up these metabolic or downstream effects, however.
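The arithmetic behind a cancer slope factor is a simple linear extrapolation: lifetime excess risk is assumed to scale in proportion to chronic daily dose, so the slope factor can be inverted to find the dose that holds risk at the 1-in-1-million level mentioned above. A minimal sketch of that calculation follows; the slope factor value here is hypothetical, chosen only for illustration, not a figure from this article.

```python
# Linear low-dose extrapolation with a cancer slope factor (CSF).
# Model assumption: lifetime excess cancer risk = CSF * chronic daily dose.
# The CSF value below is hypothetical, for illustration only.

def risk_from_dose(slope_factor: float, dose: float) -> float:
    """Lifetime excess cancer risk for a chronic daily dose (mg/kg-day)."""
    return slope_factor * dose

def dose_for_target_risk(slope_factor: float, target_risk: float) -> float:
    """Daily dose (mg/kg-day) that keeps risk at or below the target."""
    return target_risk / slope_factor

csf = 0.5        # hypothetical slope factor, in (mg/kg-day)^-1
target = 1e-6    # the 1-in-1-million acceptable-risk level

allowed = dose_for_target_risk(csf, target)
print(allowed)                      # 2e-06 mg/kg-day
print(risk_from_dose(csf, allowed)) # 1e-06, back at the target risk
```

The linearity assumption is exactly what the high-dose-to-low-dose critics in this article question: the model treats risk observed at high animal doses as directly proportional all the way down to environmental concentrations.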
