The age-old maxim of scientists whose work has resulted in deadly or dangerous technologies is that scientists are not to blame; rather, the technologists and politicians who apply their discoveries are morally culpable for the uses of science. As new technologies threaten not just populations but entire species and biospheres, scientists should reassess their moral culpability when researching fields whose impact may be catastrophic. Drawing on real-world examples such as smallpox research and the Australian “mousepox trick”, and considering fictional or future technologies like Kurt Vonnegut’s “ice-nine” from Cat’s Cradle and the “grey goo” scenario in nanotechnology, this paper suggests how ethical principles developed in biomedicine can be adapted for science in general. An “extended moral horizon” may require looking not just at the effects of research on individual human subjects, but also at its effects on humanity as a whole. Moreover, a crude utilitarian calculus can help scientists make moral decisions about which technologies to pursue and disseminate when catastrophes may result. Finally, institutions should be devised to teach these moral principles to scientists and to make moral education a condition of future funding.