In recent years, more and more Americans have been making the switch to distributed energy sources. The rise of these technologies has been driven by several factors, including the rising cost of power and the falling cost of hardware such as solar panels.

As energy prices continue to climb, companies are increasingly looking to Distributed Energy Resources (DERs) to help reduce their energy costs.

According to an article on Ars Technica,

“DERs include all kinds of hardware that the utility may not necessarily own directly—solar panels, natural gas-fired microturbines, stationary batteries, and alternative cooling. Demand-response schemes, where a grid operator shifts electricity consumer use (usually through incentives) away from high-demand times, are also considered DERs.”

The rise of these technologies, along with the continued implementation of Energy Conservation Measures (ECMs), raises an interesting question: what effects do DERs have on the grid?

The Modern Grid

The problem with our current grid is that it was not built to be a two-way street. The grid we have today is outdated: it was designed to move power in one direction, from centralized plants out to consumers, and it simply does not have the capacity to handle the amount of energy that DERs can feed back into it.

The argument against distributed energy sources such as Combined Heat and Power (CHP) and solar is that they feed power back to the grid whenever the facility they are tied to is consuming less energy than it is producing. This is where net metering comes into play. The question is: where does this power go?
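The net-metering mechanics described above can be sketched with a toy hourly ledger. This is a simplified illustration, assuming classic net metering where exports are credited at the same flat retail rate as imports; the rate and the hourly figures are made up, and real tariffs vary by utility.

```python
RATE = 0.15  # $/kWh, assumed flat retail rate (illustrative)

def net_metering_bill(consumption_kwh, generation_kwh):
    """Bill for one period: hours of net import are charged and hours of
    net export earn a credit at the same retail rate."""
    bill = 0.0
    for used, made in zip(consumption_kwh, generation_kwh):
        net = used - made   # positive -> imported from grid
        bill += net * RATE  # negative net subtracts a credit
    return bill

# Four sample hours: midday solar exceeds the facility's load,
# so those hours feed power back to the grid.
usage = [5.0, 4.0, 3.0, 6.0]  # kWh consumed each hour
solar = [0.0, 6.0, 7.0, 1.0]  # kWh generated each hour
print(round(net_metering_bill(usage, solar), 2))  # -> 0.6
```

In the sample, the two midday hours export 2 kWh and 4 kWh back to the grid, and those credits offset most of the morning and evening imports.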

Theoretically, the energy being generated travels back to the grid, where it can serve other loads. What happens, however, when there is too much resistance on the grid and the generated power lacks the reactive power needed to travel through it?

These problems, along with a slew of others, such as the risk of terror attacks, make Distributed Energy Resources more and more attractive to businesses. This is where the case for microgrids comes into play. Is it better for companies to feed power back to a grid shared by everyone, or is it better to adopt a microgrid approach?

Modernizing the Grid

As more utilities upgrade to smart meters, more data will become available on the real effects of DERs on the grid. With proper energy data, companies can know exactly what is happening inside their facilities, and that information can be shared, allowing researchers to determine what effects DERs have on the grid and what the long-term consequences may be.

Going back to the Ars Technica article, it states that one DOE study found:

“The researchers applied ReMatch to a 10,000-customer sample in California, using real hourly data gleaned from smart meters. The model found that constructing DER infrastructure in a targeted way reduced the Levelized Cost of Electricity (that is, the present value of the resource over its lifetime costs) by nearly 50 percent. This was, the paper states, due to a dramatic reduction in operating costs incurred by the utility.”

If this is indeed the case, then the more DERs are implemented, the lower the infrastructure cost of the grid will be. Combined with wider adoption of ECMs, this could substantially reduce the cost of energy for everyone connected to the grid.
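To make the Levelized Cost of Electricity concrete, here is a minimal sketch of the standard LCOE formula: lifetime costs divided by lifetime energy output, both discounted to present value. The rooftop-solar numbers below (installed cost, maintenance, output, discount rate, lifetime) are hypothetical, chosen only to show the arithmetic, and are unrelated to the ReMatch study quoted above.

```python
def lcoe(capital_cost, annual_om_cost, annual_energy_kwh,
         discount_rate, lifetime_years):
    """LCOE in $/kWh: discounted lifetime costs over
    discounted lifetime energy production."""
    years = range(1, lifetime_years + 1)
    costs = capital_cost + sum(
        annual_om_cost / (1 + discount_rate) ** t for t in years
    )
    energy = sum(
        annual_energy_kwh / (1 + discount_rate) ** t for t in years
    )
    return costs / energy

# Hypothetical system: $15,000 installed, $150/yr maintenance,
# 9,000 kWh/yr output, 5% discount rate, 25-year lifetime.
print(round(lcoe(15000, 150, 9000, 0.05, 25), 3))  # -> 0.135 $/kWh
```

Under this formulation, anything that trims a utility's operating costs (the `annual_om_cost` term) flows directly into a lower LCOE, which is the mechanism the quoted study points to.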