In 1989, when “climate change” had just entered the public lexicon, 63 percent of Americans understood it was a problem. Almost 25 years later, that proportion is actually a bit lower, at 58 percent.

The timeline of this polling defines my career in science. In 1982 I was an undergraduate physics major. In 1989 I was a graduate student. My dream was that, in a quarter-century, I would be a professor of astrophysics, introducing a new generation of students to the powerful yet delicate craft of scientific research.

Much of that dream has come true. Yet instead of sending my students into a world that celebrates the latest that science has to offer, I am delivering them into a society ambivalent, even skeptical, about the fruits of science.

This is not a world the scientists I trained with would recognize. Many of them served on the Manhattan Project. Afterward, they helped create the technologies that drove America’s postwar prosperity. In that era of the mid-20th century, politicians were expected to support science financially but otherwise leave it alone. The disaster of Lysenkoism, in which Communist ideology distorted scientific truth and all but destroyed Soviet biological science, was still a fresh memory.

The triumph of Western science led most of my professors to believe that progress was inevitable. While the bargain between science and political culture was at times challenged — the nuclear power debate of the 1970s, for example — the battles were fought using scientific evidence. Manufacturing doubt remained firmly off-limits.

Today, however, it is politically effective, and socially acceptable, to deny scientific fact.