It is perhaps the most daunting challenge facing experts in climate science and computer science alike: creating a supercomputer that can accurately capture the planet's future in a set of equations and model how the forces of climate change will affect it. The task would require running an immense set of calculations for several weeks, then recalculating them hundreds of times with different variables.

Such machines will need to be more than 100 times faster than today's most powerful supercomputers. Ironically, an effort to better understand the threat of climate change could itself contribute to global warming: built with today's technologies, such a machine, a so-called exascale computer, would consume as much electricity as 200,000 homes and might cost $20 million or more a year to operate.

For that reason, scientists planning these ultrafast machines have been stalled, waiting for yet-to-emerge low-power computing techniques that could significantly cut an exascale computer's energy requirements.

Developing such techniques, however, has been particularly vexing because virtually every aspect of designing faster computers consumes more electricity and generates more heat. Computer engineers now expect the next generation of supercomputers to arrive sometime between 2020 and 2023.

But Krishna Palem, a computer scientist at Rice University, believes he has found a shortcut.

He has been stirring debate among computer architects by arguing that a counterintuitive computer design approach — one that he originally proposed to give smartphones longer battery life — can also be used to build faster and less power-hungry supercomputers.

Dr. Palem says his method offers a simple and straightforward path around the energy bottleneck. By stripping away transistors that are used only to add accuracy, he claims, it is possible to cut the energy cost of each calculation while increasing speed.

His low-power crusade has recently attracted followers among some climate scientists. “Scientific calculations like weather and climate modeling are generally, inherently inexact,” Dr. Palem said. “We’ve shown that using inexact computation techniques need not degrade the quality of the weather-climate simulation.”

Climate models use an immense set of differential equations that simulate the interaction of physics, fluid motion and chemistry. To create the models, scientists divide the world into a three-dimensional grid of cells and solve the equations in each cell.

Current climate models used with supercomputers have cell sizes of about 100 kilometers, representing the climate for that area of Earth’s surface. To more accurately predict the long-term impact of climate change will require shrinking the cell size to just a single kilometer. Such a model would require more than 200 million cells and roughly three weeks to compute one simulation of climate change over a century.
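The scaling pressure behind those numbers can be checked with rough arithmetic. The sketch below considers only the horizontal refinement of a surface grid; real models also add vertical layers and use varied map projections, so the factors here are illustrative, not a description of any particular model.

```python
# Back-of-envelope scaling when a climate grid's horizontal cell
# size shrinks from 100 km to 1 km.

coarse_km = 100.0  # current cell edge length
fine_km = 1.0      # target cell edge length

# Halving the edge length quadruples the number of surface cells,
# so refining by 100x multiplies the horizontal cell count by 100^2.
horizontal_factor = (coarse_km / fine_km) ** 2
print(f"{horizontal_factor:,.0f}x more surface cells")

# Explicit time-stepping schemes must also shrink the time step
# roughly in proportion to the cell size (the CFL stability
# condition), adding another factor of 100 to the total work.
work_factor = horizontal_factor * (coarse_km / fine_km)
print(f"~{work_factor:,.0f}x more arithmetic per simulated year")
```

The million-fold growth in arithmetic is why a century-long simulation at kilometer scale strains even a hypothetical exascale machine.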

Dr. Palem believes his inexact approach is more appropriate for weather and climate modeling because the vast grids of cells that separately calculate local effects like cloud formations, wind, pressure and other variables can be calculated without great accuracy.

“I see it as a necessary tool we need now to move the science forward,” said Tim Palmer, a University of Oxford climate physicist. “We can’t do a lab experiment with the climate. We have to rely on these models which try to encode the complexity of the climate, and today we are constrained by the size of computers.”

Dr. Palem says the technologies used to build current supercomputers will be too costly to create a computer capable of an exaflop — a billion billion calculations per second. Rather, he argues, computing the rate of global warming may be possible with a new kind of computer that would use specialized low-power chips to solve a portion of the problem.

He describes his approach as “inexact” computing. “This is a lower-energy way to compute,” he said.
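In software, one rough analogue of the hardware idea is to carry fewer bits of precision through a calculation. The sketch below is illustrative only, not a description of Dr. Palem's chips: it truncates floating-point values to a reduced number of significand bits, mimicking a cheaper arithmetic unit, and shows that a simple averaging computation of the kind found in a grid-cell update survives the lost precision.

```python
import math

def truncate(x: float, mantissa_bits: int) -> float:
    """Keep only the leading mantissa_bits bits of x's significand,
    mimicking a lower-precision (and lower-energy) arithmetic unit."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)               # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2 ** mantissa_bits
    return math.ldexp(math.floor(m * scale) / scale, e)

# Average many small contributions, as a grid-cell update might.
values = [0.1 * i for i in range(1, 1001)]
exact = sum(values) / len(values)
inexact = sum(truncate(v, 10) for v in values) / len(values)

print(f"exact:   {exact:.4f}")
print(f"inexact: {inexact:.4f}")  # agrees to within a fraction of a percent
```

With 10 significand bits the relative error of each truncation is below one part in a thousand, and the aggregate result stays close to the exact answer, which is the intuition behind applying inexact arithmetic to inherently uncertain simulations.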

The stated goal of the engineers who are trying to design an exascale computer is to stay within a power budget of 30 megawatts. But Andreas Bechtolsheim, a high-performance computer and network designer, noted that based on current technology, that would require a tenfold improvement over today’s most efficient designs.
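The arithmetic behind that tenfold figure is straightforward. Assuming, as a rough illustration, that the 30-megawatt budget must deliver a full exaflop, the required energy efficiency works out as follows.

```python
exaflop = 1e18        # one billion billion calculations per second
budget_watts = 30e6   # the 30-megawatt target power budget

required = exaflop / budget_watts  # calculations per second per watt
print(f"required: {required / 1e9:.1f} gigaflops per watt")

# A tenfold improvement over today's most efficient designs implies
# those designs deliver roughly one-tenth of that figure.
implied_today = required / 10
print(f"implied today: {implied_today / 1e9:.1f} gigaflops per watt")
```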

Dr. Palem has been imploring the computing world to back away from its romance with precision for more than a decade. He has recently found allies among climatologists like Dr. Palmer, who in the journal Nature called on the climate community to mount an international effort to build a machine fast enough to answer basic questions about the rate of global warming.

“High-energy physicists and astronomers have long appreciated that international cooperation is crucial for realizing the infrastructure they need to do cutting-edge science,” he wrote. “It is time to recognize that climate prediction is ‘big science’ of a similar league.”

Dr. Palem’s effort received some help last month when he was awarded a Guggenheim fellowship to support his research on low-energy computing for weather and climate modeling.

Not everyone is convinced his computer architecture ideas will be applicable.

“Inexact computing works well for mobile applications where the consequence of choosing incorrectly is low,” said John Shalf, department head for computer science at the Lawrence Berkeley National Laboratory. “For consequential problems, where inexact results could cause a bridge to be misdesigned, or erroneous conclusions about the mechanics of climate, the inexactness is problematic.”

Dr. Palem and Dr. Palmer are attempting to overcome these objections. Earlier this year, at a technical computing conference in Europe, they presented a paper arguing that power needs can be sharply reduced without compromising the accuracy of the simulation.

Dr. Palmer said the case for the required investment should be self-evident.

“It’s a trivial amount of money when you think of climate impact being in the trillions of dollars,” he said. “It’s actually an existential question. If it’s at one end of the spectrum, we can adjust, but if it’s at the other end of the spectrum, we’re not going to come out of it unless we cut emissions in the next decade.”

A version of this article appears in print on Page D6 of the New York edition with the headline: Shortcut to a Climate-Modeling Solution.