These represent the human side of eye-glazing discussions over ''risk assessment'' and ''cost-benefit ratios'' as applied mainly to laws dealing with environmental health and safety. What they amount to is deciding, for example, how much of a particular toxic substance -- and the cost of regulating it -- society can tolerate.

The House of Representatives has passed bills ordering bureaucrats to weigh relative risks and figure economic costs before imposing new government regulations. The Senate is considering a range of measures -- some palatable to the Clinton administration, others that likely would be vetoed.

''Risk assessment, cost-benefit analysis, [scientific] peer review, and regulatory review are among the most important tools we have for protecting the environment,'' Environmental Protection Agency administrator Carol Browner told the Senate Environment and Public Works Committee last week. ''But the issue is how best to use these tools in making ... very difficult decisions.''

Tough choices

Subjective values are as much at stake as objective science here.

''Ensuring the health and safety of the American people and protecting our environment is a matter of making choices,'' says C. Boyden Gray, chairman of a group studying the issue for the Harvard Center for Risk Analysis, in Cambridge, Mass.

Government regulation in the United States now costs about $600 billion a year (roughly $6,000 per household), with environmental protection growing the fastest. Some of the cost reflects bureaucratic budgets, but most of it shows up as higher prices for goods and services affected by regulation.

''Too many regulations impose undue costs, and the regulatory process itself has become too cumbersome, unresponsive, and inefficient,'' says Sen. William Roth (R) of Delaware.

Harder to calculate are the economic benefits of regulation. One study put them at $200 billion a year.

But it is almost impossible to reckon in monetary terms the health and environmental benefits of, say, a sharp reduction in lead emissions, cleaner automobile exhaust, or health-care costs avoided through the government's antismoking campaign. Is it worth an extra $5 million to save a single life? More?

Even harder to quantify are the benefits of protecting nature for future generations.

''How do you get a cost-benefit analysis of a wild and scenic stream?'' Sen. John Chafee (R) of Rhode Island, chairman of the Senate Environment and Public Works Committee, asked recently.

In Senate bills to be considered this week, the ambiguities of applying economics to laws dealing with health, safety, and the environment are apparent.

Sen. Bob Dole (R) of Kansas asserts that federal regulators should ''stop using the most extreme scenarios possible when evaluating risks to human health and safety -- in other words, to use a little common sense when deciding whether it is really likely that anyone ... would be exposed to a particular level of hazard.''

But in describing the bill in a recent newspaper column, Dole spoke of applying ''sound science'' and a ''common-sense test'' that would ''rely on the judgment of agencies'' -- phrases open to wide interpretation.

Scientists themselves realize the limitations of trying to gauge risk.

''Even the best risk assessments contain data gaps and rely on assumptions that can affect the outcome by several orders of magnitude,'' says Margaret Mellon of the Union of Concerned Scientists.

''Actual risk values ... are notoriously difficult to ascertain,'' Edwin Jones of the Lawrence Livermore National Laboratory in California told a congressional panel.

The Harvard group of 15 experts has recommended centralizing the federal risk-assessment process under the authority of the President's Office of Science and Technology Policy, which would set all agency guidelines. It also suggests that some risk-management powers (dealing with water quality and some Superfund toxic waste sites, for example) be delegated to state and local authorities.

But the Harvard group also recognizes the complexity of calculating risks to health and safety and their relative costs. ''When determining what investments in risk reduction are reasonable,'' the group wrote, ''regulators should take into account quantitative estimates of benefit and cost (including the uncertainties in these estimates), intangible or qualitative benefits and costs, the distribution of benefits and costs among citizens, and the values of ordinary citizens....''

The public's take

But determining public values (or using ''a little common sense,'' as Dole says) could present some challenges. Like scientists, the public has mixed views about relative levels of acceptable risk. This is especially true for those living near a toxic chemical plant or nuclear waste dump. But differences of opinion on risk (and therefore how to measure it) exist within the general populace as well.

Most Americans are becoming more concerned about environmental and health risks, reports the Decision Science Research Institute in Eugene, Ore. But perceptions differ according to race, gender, and economic class.

A 1994 study of 1,489 individuals by the Oregon-based private research firm found that white women and all nonwhite persons perceive risks to be much higher than do white men. ''These results suggest that sociopolitical factors such as power, status, and trust are strong determinants of people's perception and acceptance of risk,'' the researchers wrote.

''Risk assessment has values, and you might as well recognize that,'' says Paul Slovic, president of the Decision Science Research Institute and past president of the Society for Risk Analysis. ''It's a very ambiguous science.''