The biggest hurdle to hooking up more batteries to the power grid (to store energy) has long been cost — batteries are just too expensive. But another real-world challenge is quietly emerging for solar installer SolarCity as it tries to ramp up sales of batteries, made by electric car maker Tesla, combined with solar panels: the utilities themselves.

SolarCity says that its solar battery pilot program — which it has been working on since early 2012 — has managed to get around 500 customers in California (including Walmart) interested. But to date only a dozen of those customers have battery systems connected to the grid. That’s because California’s three large utilities are slowing down the connection process, requiring a series of applications, charging high fees for connecting batteries (in some cases close to $4,000 per customer), and just taking a very long time to connect the batteries to the grid, says SolarCity.

As a result of the utility barriers, SolarCity has temporarily stopped submitting applications because, as a spokesman tells Businessweek, “We’ve lost faith that these things are actually going to be carried out in any reasonable time.” SolarCity has installed about 100 batteries in customers’ buildings.

SolarCity spokesperson Will Craven clarified the state of the program:

There’s a queue of customers in each IOU territory, and until those queues move, we’re not going to submit $800 application fees for more customers to enter the queue. We fully intend to continue the program and interconnect all customers under contract, it’s just a matter of the utilities helping to facilitate.

Why would utilities stand in the way of batteries combined with solar panels? Well, they enable the customer to essentially go off-grid, generate their own power with solar panels and store that power to be used at night. The Rocky Mountain Institute released a report earlier this year that said that the combination of solar panels and batteries could be so transformational for how Americans use power that large chunks of the U.S. could defect from the grid by 2030, leading to an erosion of customers and revenue for utilities.

SolarCity claims the fees that utilities are charging customers for applications are illegal. PG&E tells the San Francisco Chronicle that it’s not delaying the process, but is taking some eight to ten weeks to inspect the systems before connecting them to the grid. The utility also says it won’t be rushed.

Grid batteries won’t be going away any time soon. The market is just emerging and Tesla plans to build a massive battery factory in the U.S. that could cut the cost of making batteries by 30 percent. Tesla has said some of those batteries will be used for grid sales.

Updated at 2:12pm March 20 to add a comment from SolarCity spokesperson Will Craven.

Over the last 200 years, the world has experienced several waves of innovation, which successful companies learned to navigate. The Industrial Revolution brought machines and factories that powered economies of scale and scope, making a profound impact on society and the culture of the world. With the Internet Revolution we have seen the rise of computing power, information sharing and data networks, fundamentally changing the way we connect (on whatever device).

And now we are at the cusp of another metamorphic change that will spawn new business models, new jobs and new operational efficiencies: The industrial internet, the convergence of contextual data, people and brilliant machines.

Whether we’re discussing the consumer internet or the industrial internet, a new challenge arises with the seemingly endless proliferation of connected devices and intelligent machines: Big data. Consider that 90 percent of the data in the world today was created within the last two years. And according to IDC’s Digital Universe study, between now and 2020 the amount of digital data is expected to double every two years – much of which will be ‘dark’ data that won’t ever be used.

To fully appreciate the potential for this data, it is important to consider how large the global industrial system has become. There are now millions of machines across the world, ranging from simple electric motors to highly advanced CT scanners. There are tens of thousands of vehicle fleets, ranging from the trucks fielded by utilities and governments to the aircraft that transport people and cargo around the world.

Tied to those fleets are thousands of complex networks ranging from power grids to railroad systems, all of which are spinning off millions of bytes of data daily. To put this in context, the information contained in all 33 million books in the Library of Congress equates to roughly 15 terabytes of digital data. Compare that to the almost six terabytes of data a single 500-turbine wind farm generates in one day. And that farm makes up just a tiny component of the industrial internet. If we can harness all of this data, it will help reduce unplanned downtime of these machines, which in turn leads to savings for both companies and consumers.
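A quick back-of-the-envelope check of those numbers makes the scale concrete. The figures below are the article's round estimates, not precise measurements:

```python
# Back-of-the-envelope comparison using the article's round figures.
LIBRARY_OF_CONGRESS_TB = 15.0   # ~33 million books, digitized
WIND_FARM_TB_PER_DAY = 6.0      # one 500-turbine wind farm, per day

# How long until one wind farm outproduces the entire library?
days_to_match = LIBRARY_OF_CONGRESS_TB / WIND_FARM_TB_PER_DAY
print(f"One wind farm matches the Library of Congress in {days_to_match:.1f} days")

# Per-turbine share of that daily output
per_turbine_gb = WIND_FARM_TB_PER_DAY * 1024 / 500
print(f"Roughly {per_turbine_gb:.1f} GB per turbine per day")
```

In other words, a single wind farm outpaces two centuries of accumulated print in under three days.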

Big industry, big data, big analytics

By 2020, the biggest change for industrial businesses will be in how they use new software-based services to manage this influx of critical data.

Companies like Amazon, Facebook and Google understand that existing data platforms – the majority of which were designed to handle traditional back office IT requirements – are not adequate to meet the on-demand needs of their customers. Thus they have developed new architectures (software and hardware) that can manage the ever-growing amount and pace of data-generation. They have learned to manage large volumes of consumer information to provide new services, and in doing so, have fundamentally changed the consumer landscape.

But the industrial sector needs to improve on what the consumer internet started. The next leap forward for industry will be all about more agile development, mastering data science and becoming proficient at repeatable processes. There is a need to create software and a platform equivalent to, if not better than, what current consumer companies have built.

When compared to other sectors (e.g., government, financial services, and retail), industrial data is different, growing at two times the rate of any other big data segment in the next ten years. Its creation and use are faster, safety considerations are more critical, and security environments are more restrictive. Computation requirements are also different. Industrial analytics need to be deployed on machines (sometimes in remote locations) as well as run on massive cloud-based computing environments. The unique and proprietary nature of industrial applications and processes also make it difficult to properly contextualize the data.

As a result, the integration and synchronization of data and analytics, often in real-time, are needed more than in other sectors. Industrial businesses require a big data platform with common software and hardware standards, optimized for these unique characteristics.

Indeed, these requirements are changing how data and information will be used by industrial operators. There are six capabilities that industrial companies must adopt as part of their business strategy to be successful in this new era.

Data collection and aggregation. Industrial companies must collect and aggregate data and information from the widest possible range of industrial devices and software systems, as well as from enterprise and web-based systems. They must be able to integrate and normalize different data types (streaming sensor data vs. transactional enterprise data), different response times (once per ten milliseconds vs. once per day), and different business requirements (real-time process optimization vs. less time-sensitive asset optimization), and reconcile their use at different levels of analysis.

Advanced analytics at the point of need. What’s required is a software-defined machine: the ability for assets to be abstracted into software running in connected virtual environments where analytics are continually tuned to the requirements of specific devices, business processes, and individual roles.

Cloud-agnostic, deployment independence. Industrial companies need a highly flexible deployment architecture that allows them to mix and match technology deployment methods – and avoid vendor lock-in – as their needs and technological options change. For companies that are bound by regulatory requirements, this may mean supporting private cloud deployments; for other companies, it may mean supporting third-party public clouds from various providers.

Extensibility and customizability. Industrial companies need a big data platform that is highly extensible and based on standardized APIs and data models that allow it to adapt to new capabilities, new devices, new data types, and new resources as they become available, while still preserving the capabilities of the legacy systems that continue to impart value.

Orchestration. Industrial companies must support the orchestration of information, machine controls, analytics, and people in order to ensure that the different components of the industrial big data world interoperate effectively. For example, a machine-level analytic that detects and responds locally to an operational anomaly must also be able to set in motion other analyses and actions (e.g., rescheduling flights or moving spare parts) across the network in order to prevent knock-on problems throughout the system. This requires the ability to self-tune and adapt as data, processes, and business models change.

Modern user experience. Industrial companies must deliver the above capabilities within the context of a modern user experience that is no longer bound to a desktop; this includes supporting a wide range of mobile devices and user interaction models, as well as ensuring that the user experience is tailored to the individual's role and requirements at a particular time and location.
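As a rough illustration of the orchestration capability described among the six above, the sketch below shows a machine-level anomaly check that sets downstream actions in motion across the network. All names, thresholds and actions are hypothetical; no real industrial platform's API is implied:

```python
# Minimal sketch of machine-level orchestration: a local anomaly check
# that triggers downstream actions registered by other systems.
# Every name and threshold here is illustrative, not from a real platform.
from dataclasses import dataclass
from typing import Callable

@dataclass
class SensorReading:
    machine_id: str
    vibration_mm_s: float   # vibration velocity from a local sensor

# Downstream actions registered by other systems (scheduling, logistics).
actions: list[Callable[[str], str]] = [
    lambda mid: f"reschedule inspection for {mid}",
    lambda mid: f"stage spare parts near {mid}",
]

def handle_reading(reading: SensorReading, threshold: float = 7.1) -> list[str]:
    """Detect a local anomaly and set network-wide actions in motion."""
    if reading.vibration_mm_s <= threshold:
        return []  # normal operation, nothing to orchestrate
    return [act(reading.machine_id) for act in actions]

print(handle_reading(SensorReading("turbine-42", 9.3)))
```

The point of the sketch is the shape of the flow: detection happens locally at the machine, but the response fans out to other parts of the system.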

Humans and machines are more interconnected than they have ever been before, and data is the common language between us. The rate at which data is created from the intelligent devices and machines we use every day is forcing corporations to shift their business processes and strategies to adapt to this new connected reality. And the ability to manage this data will not only change the landscape for businesses, but will allow us to navigate successfully through this next wave of innovation.

Batteries could do wonders for the power grid — but odds are, the winning technology won't come from the good ol' lithium-ion batteries that power our laptops and that Tesla is using for its electric cars. Startup Aquion Energy thinks the answer to the future of grid batteries will come from an entirely new type of chemistry — one that uses basic materials and is ultra low cost — and on Tuesday the company plans to announce an important partnership with grid heavyweight Siemens that could help Aquion's brand-new grid batteries move into the market more quickly.

Aquion said that Siemens has purchased a shipment of its grid batteries and will be testing those batteries with Siemens' power inverter technology. If Siemens likes the technology — and it works as advertised — the idea is that Siemens could eventually bundle the batteries with its power grid infrastructure and sell them to customers like solar farm developers, though the partnership is just a memorandum of understanding at this point.

It might sound like a routine agreement, and in some respects, it’s the type of relationship that all new technologies need to score. But it’s a big deal for the young startup that was founded in 2007 and which has yet to scale up its manufacturing to a commercial level. That won’t happen until mid-2014, and Aquion is just starting to deliver its final battery products to pilot customers now.

So what’s Aquion’s energy storage innovation? The startup — which is backed by Bill Gates and VCs like Kleiner Perkins and Foundation Capital — is making a low cost, modular grid battery made from basic materials like sodium and water. The battery pairs a carbon anode with a sodium-based cathode, and a water-based electrolyte shuttles ions between the two electrodes during charging and discharging. The technology was developed out of Carnegie Mellon University by founder and chief technology officer Jay Whitacre.

By using basic materials, Aquion is hoping its product is inexpensive enough to disrupt the current grid battery market. Aquion's CEO Scott Pearson told me that several years from now, when the battery has been manufactured at a commercial scale for a while, the price point of the battery could be $300 per kilowatt hour. That's about a third of the cost of some of the more expensive lithium-ion battery grid products currently on the market. Pearson said that even now at the pilot scale, its batteries are "not radically above that" $300 per kWh level.
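Taking those figures at face value — the $300 per kWh target, and "about a third" of the pricier lithium-ion products — the implied gap on a hypothetical one-megawatt-hour installation looks like this (purely illustrative arithmetic, not vendor pricing):

```python
# Illustrative math from the article's figures; not actual vendor pricing.
AQUION_TARGET_PER_KWH = 300
LITHIUM_ION_PER_KWH = 3 * AQUION_TARGET_PER_KWH  # implied by "about a third"

SYSTEM_KWH = 1_000  # a hypothetical 1 MWh installation
print(f"Aquion target:   ${AQUION_TARGET_PER_KWH * SYSTEM_KWH:,}")
print(f"High-end li-ion: ${LITHIUM_ION_PER_KWH * SYSTEM_KWH:,}")
```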

A low-cost and easily installed grid energy storage option could be a game changer. Over the next few years solar and wind farms are going to be built at a record pace in various markets across the globe, and reliable, low-cost batteries will be crucial for banking clean power at night or when the wind dies down. Pearson told me that there are at least a dozen applications for which power companies and utilities could use the batteries, like renewable energy integration, power grid load shifting (helping utilities handle spikes in load) and ancillary services like frequency regulation (the power grid has to maintain a specific frequency).

Aquion managed to raise $20 million back in 2011 and started working on another $35 million earlier this year. Investors in the company include Bill Gates, Bright Capital, Gentry Venture Partners, Kleiner Perkins and Foundation Capital. Kleiner Perkins’ David Wells played a key role in helping incubate the technology and Whitacre and Kleiner sponsored an incubator at Carnegie Mellon for Whitacre to develop the tech.

GigaOM Pro analyst Adam Lesser geeked out with Power Assure CTO Clemens Pfeiffer and came away with three under-the-radar trends that could transform data center energy efficiency. You can read the entire post here, but here are tidbits of the three trends:

Over three years ago the Department of Energy allocated around $4.5 billion in stimulus funds for smart grid projects in the U.S. That included $700 million for demonstration projects that helped install technology like wireless data networks and battery farms for the power grid. With such a large amount handed out over a relatively short period of time, the chances of problems were high. And according to a newly released report from the Department of Energy's Inspector General, the smart grid demonstration program had some management issues.

Those problems included:

Handing out funds for a couple of projects based on estimated rather than actual costs, which resulted in overpayments.

Funding a project that had already received a grant for the exact same work from another DOE program, ARPA-E.

Funding a project that had not handed in proper documents, and had not begun making the energy storage units that it claimed for the award.

The audit looked at just 11 projects, with a total of $279 million in awards, out of the $700 million smart grid demonstration program. In that sample alone the auditors found $12.3 million in questioned costs; they have now recovered $6.6 million of those misspent costs and plan to recover another $5 million. It's probably safe to assume there are similar levels of mismanagement throughout the entire $4.5 billion.
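The report's own numbers make the remaining exposure easy to work out (figures as stated in the audit; the percentage is simply the questioned share of the audited awards):

```python
# Arithmetic on the audit figures quoted above.
questioned = 12.3e6     # questioned costs found in the 11-project sample
recovered = 6.6e6       # already recovered
planned = 5.0e6         # planned for recovery
audited_awards = 279e6  # total awards in the audited sample

outstanding = questioned - recovered - planned
print(f"Still unresolved: ${outstanding / 1e6:.1f}M")
print(f"Questioned share of audited awards: {questioned / audited_awards:.1%}")
```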

The report says:

The Department had not always managed the Program effectively and efficiently. . . . In the absence of significant improvements, the Program is at risk of not meeting its objectives and has an increased risk of fraud, waste and abuse.

As a result the report recommends that the program:

Ensure adequate review of payments made to recipients

Provide training to recipients on proper submission of reimbursement packages

Ensure that recipients contribute their required cost-share from allowable sources

Ensure the elimination of any potential overlapping funding among awards authorized by various Department programs

Contracting officers should resolve the questioned amounts in our report.

The stimulus funds delivered an unprecedented and game-changing amount of federal support for the next generation of power grid technology. It's natural that with such a large amount of money allocated, some mismanagement would happen.

One of the things not addressed in the report is how effective — or not — this type of stimulus program for smart grid projects actually was. Many in the industry back in 2010 complained that the funds actually had a sort of chilling effect on the sector, because the funds took a while to get delivered, which meant utilities waited on these projects and didn't put money into other new ones.

Cities in India have goals to make the power grid more reliable and more efficient in the coming years, particularly to help ease the addition of several more gigawatts of solar power on the Indian grid. But in reality the power grid in many Indian cities is a crippled network that is often hit with rolling blackouts.

Over the next week and a half I’ll be traveling with the Geeks on a Plane tech group (see disclosure) and we’ll be visiting entrepreneurs, tech execs and different sites in Delhi, Mumbai and Bangalore. While we were in Old Delhi on Sunday, I snapped these photos of some of the wild jerry-rigged grid action going on in a few sections of the city. Now that is some craziness. Check out my little photo essay of Old Delhi’s “grid gone wild:”

Despite the seemingly downward short-term trend for cleantech investing, corporations and investors continue to back the green building sector. On Wednesday San Francisco–based energy-efficient building company Project Frog announced that it has raised $22 million from GE, as well as venture capitalists including RockPort Capital and Claremont Creek Ventures.

GE showed its interest in Project Frog back in June, when it revealed the second group of startups that had won funds from its Smart Grid Challenge program. Over the past year through its Smart Grid Challenge project, GE has identified dozens of startups in the building energy, power grid and energy software space and has made a variety of small investments of several million dollars in these firms.

Project Frog designs and makes buildings that are prefab, and it uses control systems and software to make the buildings far more energy-efficient than a traditional building. Because its designs are manufactured off-site and assembled at the construction zone via a building kit, Project Frog says its buildings can also be built far more quickly and at a lower cost than traditional buildings.

Frog says a typical design and construction project can take six months, and on the energy front Frog says its buildings use 25 percent less energy than a standard building. The company targets its building kits and designs for schools, stores, museums, commercial workplaces and health care. GE says it has started building a Project Frog building at its Learning Center in Ossining, New York.

Only 72 percent of China's wind-power sources are connected to its grid — meaning a good deal of wind turbines are spinning without providing usable clean power. Battery maker A123 Systems hopes its first deal in China can help with that problem, and on Tuesday it announced that it will be supplying a 500 kW lithium-ion battery demo system to one of China's top wind makers, Dongfang Electric Corporation.

Dongfang will install the energy storage system at a factory in Hangzhou, Zhejiang Province, China, and will use it to evaluate how to better use energy storage to integrate wind power onto the grid. Wind power is a variable source — it's only available when the wind blows — and it needs energy storage tech to provide the capacity to level out the ups and downs.

In particular, China has a problem with something called low voltage ride through (LVRT). When wind dies down and there are periods of low voltage, the wind systems disconnect from the grid, and then are slow to reconnect. A123 Systems says its energy storage management systems are uniquely designed to help get generation back onto the grid quickly.

Rob Johnson, A123 Systems' VP of energy storage, told me that the pricing of the battery systems is around $1,000 per kilowatt installed. That's not exactly cheap compared to lower-end options like lead-acid batteries. But Johnson says lithium-ion batteries, and specifically A123's, are better suited to integrating wind on the grid, and can provide the short bursts of power and the high cycle life needed.

China could be the largest grid storage market in the world, says Johnson, given its pledge to construct gigawatts of wind and solar power. And A123 Systems hopes its small demo project — its first foray into the country — will turn into a much bigger deal.

In the wake of a slow ramp up of electric vehicles, A123 Systems has increasingly been looking to the power grid for revenues. So far, it has scored several deals with power company AES, including a deal to provide battery storage as grid regulation for a 500 MW power plant in northern Chile.

Earlier this year Aquion Energy, which is developing a battery for the power grid, told us that the company was looking to raise money by the summer for a 500 megawatt-hour factory. Well, here's proof of that: According to a filing, the startup has closed $20 million of a planned $30 million round and has brought on investor Foundation Capital in addition to existing investor Kleiner Perkins Caufield & Byers.

Four-year-old Aquion has developed a modular battery that uses basic, widely available (and edible) materials, like sodium and water. The company says its bulk battery will be low cost compared to the current batteries being piloted on the power grid, and could be as cheap as one-tenth the cost of the alternatives. The Electric Power Research Institute (EPRI) says that grid-scale lead-acid batteries cost around $950-$1,590 per kilowatt, or $2,770-$3,800 per kilowatt-hour, and lithium-ion batteries cost about $1,085-$1,550 per kilowatt, or $4,340-$6,200 per kilowatt-hour.
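For context, applying Aquion's "as cheap as one-tenth" claim to EPRI's per-kilowatt-hour figures gives a rough sense of the implied target price (this is just arithmetic on the numbers above, not a price Aquion has quoted):

```python
# One-tenth of the EPRI per-kWh cost ranges quoted above.
lead_acid_per_kwh = (2770, 3800)
li_ion_per_kwh = (4340, 6200)

for name, (lo, hi) in [("lead acid", lead_acid_per_kwh),
                       ("lithium-ion", li_ion_per_kwh)]:
    print(f"{name}: ${lo:,}-${hi:,}/kWh -> one-tenth: ${lo // 10:,}-${hi // 10:,}/kWh")
```

That puts the implied target in the low hundreds of dollars per kilowatt-hour, well below anything on the grid today.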

Aquion’s goal is to use its grid battery to help utilities integrate more solar and wind power. Solar and wind are variable power sources (when the sun doesn’t shine and the wind doesn’t blow they don’t produce power) and energy storage will be a crucial way to even out those ups and downs for the grid. Aquion also says its battery, which it thinks can last 20-30 years, can withstand a wide range of temperatures without losing storage capacity, meaning installing it onsite at a solar farm wouldn’t affect the battery’s performance.

Founder and chief technology officer Jay Whitacre developed the basic science for Aquion at Carnegie Mellon University. The battery pairs a carbon anode with a sodium-based cathode. Water-based electrolytes shuttle ions between the two electrodes during charging and discharging. Traditional batteries often use solvent-based electrolytes.

The idea was to create a battery out of the most basic elements, one that could be produced at a massive scale for cheap. Aquion Energy business development chief Ted Wiley described the battery to us as "literally edible—every single material in the battery."

The company is now looking to transition from “an early-stage technology development organization into a full-fledged product company,” as Kleiner chairman Ray Lane described it earlier this year. Commercializing the tech will be tough. For example, just selling into a utility-scale energy project can take as long as three years, and the factory will cost an estimated $75 million to $80 million.

Talk about big data: The California Independent System Operator Corporation has installed an 80-foot by 6.5-foot screen in its control room to display real-time power-grid data from thousands of endpoints. The new system is powered by software from Space-Time Insight, which melds real-time geospatial data with Google Maps to give ISO employees access to the data they need in an immediately useful format. It might be the most cutting-edge ISO control room in the world.

The problem that required such an extreme solution is the massive scale at which everything takes place in California. The ISO, for example, manages about 85 percent of California’s power load, totaling more than 286 billion kWh of energy per year across more than 25,000 miles of line. Employees are inundated with data, said Jim McIntosh, an executive director at California ISO, so they needed something that would present it to them in a useful manner.

What Space-Time Insight's software does is present data however it will be most beneficial for that customer's particular needs. It maps data — regardless of its source or age — in-memory for fast access, organized geospatially as well as by time. The result might be the popular map view, or it might be any number of more classical data views such as heat maps, correlation and trending. Additionally, Space-Time's Steve Erlich told me, customers can customize the application as they see fit to create their own visualizations, functions, data workflows or other capabilities.

McIntosh says his team used to get data updates at 4-second intervals, but now it gets updates by the millisecond, presented in an intuitive manner that shows and tells what's going on. He likened the switch to the Space-Time system to moving from x-rays to MRIs. Other ISOs that have toured the control room have been very impressed, he added, because it's likely the most state-of-the-art in North America, if not the world.
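The jump from 4-second to millisecond updates is easy to quantify — assuming a flat one-update-per-millisecond rate, which is a simplification of whatever the real system does:

```python
# Comparing the old and new update rates described above.
old_interval_s = 4.0     # one update every 4 seconds
new_interval_s = 0.001   # one update every millisecond (simplified)

speedup = old_interval_s / new_interval_s
print(f"Temporal resolution improvement: {speedup:,.0f}x")

updates_per_day = 86_400 / new_interval_s  # seconds per day / interval
print(f"Updates per data stream per day: {updates_per_day:,.0f}")
```

A 4,000-fold jump in temporal resolution, multiplied across thousands of endpoints, is exactly the kind of firehose that demands a purpose-built visualization layer.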

The California ISO began using Space-Time in 2008 to combat wildfires in San Diego, but now uses it for a wide variety of tasks. To combat wildfires, McIntosh explained, the ISO is able to see where wildfires are and how they might move potentially hours earlier than previously possible, which lets it manage the power supply accordingly. It can proactively move power flow off of lines that might be affected by fire to protect from a future power loss and other damage that might occur should a fire hit live power lines.

Another use case for which the new system is ideal is making sure California's grid is using the right energy source at the right time. The ISO monitors data from about 4,500 nodes to determine what energy source will be the cheapest for any given location at any given time. This is made all the more difficult by California's strong mandates around using certain percentages of renewable energy. If wind conditions pick up and make wind an ideal source for a certain location, it's the ISO's job — in the names of both economics and compliance — to make sure that wind isn't being wasted.
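A toy version of that per-node choice might look like the sketch below: pick the cheapest source, breaking ties in favor of renewables to reflect the mandate. The prices, source names and tie-breaking rule are all invented for illustration; the ISO's actual dispatch logic is far more involved.

```python
# Toy per-node source selection: cheapest wins, renewables break ties.
# All names and prices are made up; this is not the ISO's real algorithm.
def pick_source(prices: dict[str, float], renewables: set[str]) -> str:
    """Return the lowest-priced source; prefer renewables on a tie."""
    # Sort key: (price, non-renewable flag) -- False sorts before True,
    # so a renewable wins any exact price tie.
    return min(prices, key=lambda s: (prices[s], s not in renewables))

node_prices = {"gas_peaker": 41.5, "wind": 28.0, "hydro": 28.0}
print(pick_source(node_prices, renewables={"wind", "hydro"}))
```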

As for Space-Time Insight, which was formed in 2005 and first got funded in 2008, it's becoming very popular with utilities worldwide, Erlich said, and it's expanding fast. Oil and gas, transportation and other industries that derive value from advanced visualization and real-time analysis are buying its software, and the company just signed up its first new media customer — though who it is and how it's using the Space-Time software remain a secret.