Tag: California Public Utilities Commission

Starting in 2020, all new residential homes in California must be built solar ready. On May 7, the California Energy Commission approved the 2019 Building Energy Code, which includes that provision.

The California Energy Commission approved the 2019 Building Energy Code, which includes the provision that all new homes must be built solar ready, starting in 2020. (Photo by DOE Office of Energy Efficiency and Renewable Energy)

In addition to mandating rooftop solar, the code contains incentives for energy storage and requires new home construction to include advanced energy-efficiency measures. Using 2017 data, ClearView Energy Partners estimates that the mandate could require between 68 and 241 megawatts of annual distributed solar buildout.

Good for consumers, solar, storage industries

The commission stated that the new code is meant to save Californians a net $1.7 billion on energy bills, while advancing the state's efforts to build out renewable energy.

Following the commission’s decision, solar developers such as Sunrun, Vivint Solar and First Solar experienced a surge in stock prices, Bloomberg reported.

The updated codes also allow builders to install smaller solar systems if they integrate storage in a new home, adding another incentive to include energy storage. California has been a leader in incentivizing energy storage. In January, the California Public Utilities Commission moved to allow multiple revenue streams for energy storage, such as spinning reserve services and frequency regulation.

Utilities question policy

The solar industry received a prior boost in January 2016, when the CPUC approved its net metering 2.0 rate design. The state's investor-owned utilities asserted at the time that net metering compensation for distributed generation shifted the costs of system maintenance and infrastructure onto consumers who do not own distributed generation.

ClearView analysts pointed to the distributed solar mandate as a possible opening for utilities to argue that California regulators should reconsider the net metering reform. According to the report ClearView published ahead of the CEC's decision, utilities that opposed the new rate design could claim that mandating distributed solar alters the policy landscape enough to warrant further review of the compensation levels paid for excess generation.

David Reynolds of Energy & Resource Solutions presented a case study on a collaborative effort by California public power utilities to evaluate their programs. The motivation was two state bills that required investor-owned utilities to verify savings from energy-efficiency programs.

As a first step, the utilities worked together to develop an energy efficiency database and reporting tool. A contractor was hired to train the staff, but the training went both ways, since the contractor had never worked with public power providers. The collaborative developed its implementation plan, implemented the plan at their individual utilities and shared lessons learned.

They discovered that there is considerable room for improvement in data collection and tracking systems. Even so, feedback is valuable for program design: it shows which measures work and helps utilities improve program effectiveness.

Economies of scale can reduce the cost of operating programs. Leverage the information that is already out there. Prioritize the program elements to be evaluated and spread the effort across several years.

Evaluation is a quality assurance tool driven by documentation, and it is your best chance of keeping program costs down. It is also simply good program management: it shows you are in charge, instills confidence in the program, and demonstrates that your efforts align with state or national policies.

For the process to work, you must be clear about why you are doing evaluation in the first place. Don't try to evaluate too many aspects of the program. If you can write a clear, concise question, you can set your goal. Don't try to answer too much; design the evaluation to answer a single question.

If you are working with a third-party provider—or even if you are conducting your own evaluation—get your documentation together. That includes the resource portfolio structure, program descriptions, rationale, customer markets, forms, agreements, rules and rebates. Assess your data management system and existing quality assurance procedures up front. Include the evaluation in your program budget.

The plan is the heart of your evaluation framework. It should include a program summary, tracking and reporting, overall priorities, secondary objectives, overall approach, accounting principles and preferred methodology.

The budget will define and limit your evaluation effort. California utilities were spending 1 percent of their operating budgets on evaluation in the early 2000s. The national average now hovers between 3 and 5 percent; some evaluation consultants suggest between 4 and 16 percent. Expect some additional costs for new programs. If your funds are limited, focus an in-depth effort on a few priority programs.
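As a rough illustration of what those percentages imply, the sketch below applies each cited share to a hypothetical annual program spend (the $500,000 figure is invented for the example, not from the source):

```python
# Illustrative only: evaluation set-asides implied by the percentages
# cited in the text, applied to a hypothetical program spend.

def evaluation_budget(program_spend, fraction):
    """Evaluation set-aside as a share of program spend."""
    return program_spend * fraction

spend = 500_000  # hypothetical annual efficiency-program spend, in dollars

for label, pct in [("early-2000s California", 0.01),
                   ("national average, low end", 0.03),
                   ("national average, high end", 0.05),
                   ("consultant high end", 0.16)]:
    print(f"{label}: ${evaluation_budget(spend, pct):,.0f}")
```

Even at the low end, the set-aside is real money, which is why the text advises building it into the program budget from the start.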

Tracking and reporting is expensive. Have a least-cost preference for how you are going to collect data: do data collection in house as much as possible, and move it onto the customer where you can to lower costs. The cost of collection generally falls as you move from utility to third party to customer.

Data should be easy to share. Archive applications so you don't lose data when staff turns over. Analysis is difficult, so make it simple where you can.

The report is a technical document, but you still need to communicate the good news. Be your own advocate. Frame the discussion yourself, so other entities don't do it for you.

Gainesville Regional Utilities had 22 energy-efficiency programs in 2009 and chose four to evaluate: refrigerator recycling, air conditioner upgrades, attic insulation, and duct sealing and repair. The choice may be based on program expense, popularity or political considerations.

Programs that are unaffected by externalities are easier to evaluate. "Measures in, savings out" programs, such as the refrigerator buyback and recycling program or light bulb giveaways, fall into this category.

More complex programs are those where the savings are impacted by environmental factors. Attic insulation, duct sealing and air conditioner upgrades will require more statistical analysis to evaluate. You will have to look at what is driving savings and how market noise impacts customers.

It will be necessary to establish a control group and develop a model of normalized annual consumption, with estimated energy impacts based on statistically adjusted engineering models. Here, engaging a third-party consultant may be worth every penny.
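The normalization step can be sketched in miniature. The snippet below uses invented data and a deliberately simplified method (a single-variable least-squares fit on cooling degree days), not Gainesville's actual model: it fits pre- and post-retrofit consumption against weather, then compares both fits at the same typical weather year, which is the basic idea behind normalized annual consumption:

```python
# A minimal sketch of weather-normalized savings estimation.
# All data here is hypothetical; real evaluations use richer models,
# control groups, and statistically adjusted engineering estimates.

def fit_line(xs, ys):
    """Ordinary least squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Hypothetical monthly cooling degree days (CDD) and kWh for one home,
# before an air-conditioner upgrade...
cdd_pre  = [50, 120, 300, 420, 380, 150]
kwh_pre  = [900, 1100, 1600, 1950, 1820, 1180]

# ...and after the upgrade, under slightly different weather.
cdd_post = [60, 110, 310, 400, 390, 140]
kwh_post = [850, 950, 1350, 1600, 1580, 1000]

a_pre, b_pre = fit_line(cdd_pre, kwh_pre)
a_post, b_post = fit_line(cdd_post, kwh_post)

# Evaluate both fits at the same "typical" weather so the comparison
# reflects the retrofit, not a milder or hotter season.
typical_cdd = sum(cdd_pre) / len(cdd_pre)
nac_pre = a_pre + b_pre * typical_cdd
nac_post = a_post + b_post * typical_cdd

print(f"Weather-normalized savings: {nac_pre - nac_post:.0f} kWh/month")
```

A full evaluation would also net out the control group's change over the same period, so that savings attributable to weather, the economy or other market noise are not credited to the program.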

Gainesville discovered that estimated savings evened out across the program suite, though adding cost variables improved the payback picture. The point, however, is to be able to identify programs that are not performing well and either fix or eliminate them.