Researchers at the University of Maryland, College Park need more than a laptop, a room of standard servers or even a supercomputer dubbed Deepthought to tackle research exploring Saturn's rings, the earliest galaxies, fire dynamics and drug-resistant bacteria.

Instead, the university has launched a $4.2 million machine, Deepthought2, capable of processing 300 trillion calculations each second. University officials expect it to rank among the 10 most powerful supercomputers owned by public universities.

The investment, unveiled Wednesday, aims to let researchers tackle complex questions on the Prince George's County campus rather than rely on supercomputers owned by federal centers or other institutions. That could speed projects and attract new faculty eager to explore big ideas of their own, researchers said.

It comes amid a bit of an arms race for computing power among universities and other institutions, as research increasingly requires more computing power than the average cluster of servers can handle. The machine's predecessor, Deepthought, was launched in 2006, underscoring that any investment in expensive technology comes with the risk of obsolescence in a few years, something university officials said they hope to avoid.

"Deepthought was a good cluster, but it's nowhere near the level we need to be at to be able to compete nationally," said Derek C. Richardson, an astronomy professor who plans to use the new supercomputer to study particles in Saturn's rings and on the surface of asteroids. "When we go to recruit new faculty, they'll see we're serious about computing."

Supercomputers bring together clusters of powerful servers that work in concert on a single problem, in contrast to distributed networks that parcel out independent tasks. Their ability to process immense amounts of data has made them key in fields that depend on mass computation, such as weather forecasting and climate research, quantum mechanics, and the modeling and simulation of complex systems in molecular biology and astronomy.

The university intends to use Deepthought2 in a variety of fields. The technology is a joint investment between its information technology division and the colleges of engineering, behavioral and social sciences, and computer, math and physical sciences.

While it doesn't look spectacular — its computing hardware fills a dozen standard server racks — it is capable of both large-scale numerical computing and some graphics processing, used in simulations and other visual models. University officials planned it that way to make it capable of tackling a broad array of research problems, unlike some more specialized supercomputers, said Fran LoPresti, deputy chief information officer of cyberinfrastructure and research IT.

"High-performance data processing has moved from solving a small number of problems to being increasingly important for many researchers," said Ann Wylie, interim vice president for information technology. "It's no longer a novelty, but something a research university like the University of Maryland should have."

Deepthought2's power, 10 times greater than that of its predecessor, is equivalent to 10,000 laptops working together, with 2,000 times the storage of an average laptop and an internal network 50 times faster than broadband. The system is expected to place on the twice-yearly list of the world's top 500 supercomputers. Tests this month will determine exactly where it ranks, LoPresti said.

Among the most powerful supercomputers at higher education institutions in the United States are systems at the University of Texas, Purdue University, the University of Tennessee, Rensselaer Polytechnic Institute and Indiana University.

But LoPresti and peers at other institutions said what matters is not the ranking but the ability to do the complex research that is becoming central to their fields.

"There's problems you just can't do with anything smaller," said William Gropp, director of the Parallel Computing Institute at the University of Illinois, Urbana-Champaign.

As research demands grow, universities are increasingly justifying investments in supercomputers as competitive tools, said Wayne Pfeiffer, a distinguished scientist at the San Diego Supercomputer Center, part of the University of California, San Diego.

That's because the alternative is applying for funding and time on supercomputers elsewhere.

"Having your own machine means you can make the decision about what science you want to pursue without having to worry about whether you can convince a review panel that you're right about that," Gropp said.

In College Park, decisions on how to prioritize research projects will be made by each of the colleges that partnered on the project, LoPresti said. Researchers from other colleges at the university will be given the opportunity to rent time on the supercomputer.

That rental plan will expand access to the supercomputer's power and will help pay for what is expected to be a Deepthought3 in the next five years or so. In the meantime, Deepthought2 will be upgraded as needed to keep pace with demands. Like personal computers, supercomputers go through a three- to five-year cycle of obsolescence that is driven by ever-broadening research horizons.

"Most of the new exciting research areas such as big data, understanding the brain, bioinformatics and cybersecurity require substantial computing facilities," Joseph JaJa, an electrical and computer engineering professor and member of the university's Institute for Advanced Computer Studies, said in an email. "In fact, without such an infrastructure, it will be very difficult to make a major new breakthrough in any of these emerging research areas."