ARPA-E, the Department of Energy’s $770-million-and-counting investment in cutting-edge energy technologies, has mainly focused its funding on renewable energy, biofuels, energy storage and power electronics.

But there’s also a fair share of smart-grid-specific technologies under the ARPA-E umbrella. In October 2011, the agency formed its Green Electricity Network Integration (GENI) program, aimed specifically at technologies to “modernize the way electricity is transmitted in the U.S.” The focus here is on integrating intermittent renewables like wind and solar power, at both utility and distributed scale.

GENI directed $38.9 million to fifteen grant winners -- a fascinating cross-section of startups, big corporations, universities and DOE labs -- working on the hardware side, the software side, and the hardcore power electronics in between, where the line between what’s the grid and what’s the digital network begins to blur.

Amidst the hundreds of grant winners crowding the showroom floor of the 2013 ARPA-E Energy Innovation Summit in suburban Washington, D.C. this week, several GENI project researchers were available to explain just what they’ve created with the money they’ve been given, and what they and the DOE are trying to learn from their efforts. Here’s a sampling of some of the most interesting projects, organized by category into hardware, software, and the intersection of the two:

Doug Speight, strategic innovations project director for Oak Ridge National Laboratory (ORNL), told me that today’s technologies for managing power flow along the grid are either relatively clumsy electromechanical devices or more expensive, cutting-edge technologies, such as flexible AC transmission system (FACTS) devices for distribution grids or power conversion stations for transmission grids. Those newer technologies are “prohibitively expensive,” coming in at about $140 per kilovolt-ampere (kVA), he said.

ORNL’s new “prototype iron-based magnetic amplifier,” on the other hand, replaces expensive superconductive wire with a low-cost magnetic iron core. That could drive costs down to about $4 per kVA while serving the same functions -- and because ORNL’s device sits separate from the power line itself, it won’t disrupt the power flow if it breaks, he added. Partners SPX Transformer Solutions and the University of Tennessee-Knoxville are working with ORNL on testing the device.
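To put those per-kVA figures in perspective, here’s a quick back-of-envelope comparison in Python. The 10-MVA device rating is an invented example; only the $140 and $4 per-kVA numbers come from ORNL’s own comparison above:

```python
# Back-of-envelope cost comparison for a hypothetical 10-MVA power-flow
# controller, using the per-kVA figures quoted by ORNL. The device rating
# is illustrative, not a spec of the actual prototype.

DEVICE_RATING_KVA = 10_000  # 10 MVA, an assumed rating for illustration

COST_PER_KVA = {
    "conventional power electronics": 140.0,  # ~$140/kVA, per ORNL
    "ORNL magnetic amplifier": 4.0,           # ~$4/kVA target
}

for tech, per_kva in COST_PER_KVA.items():
    total = per_kva * DEVICE_RATING_KVA
    print(f"{tech}: ${total:,.0f}")

# At any rating, the iron-core design works out to 35x cheaper per kVA.
```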

Varentec, the Menlo Park, Calif.-based startup, launched its first product, a distribution line sensor and voltage management device, in February. Andrew Dillon, vice president of business development, was at this week’s ARPA-E conference to demonstrate the company’s next device, a 13-kilovolt, 1-megawatt “dynamic power router” that serves similar functions to reduce congestion and direct power flows at the points where two big distribution feeder lines or circuits come together. Varentec is testing a prototype in the lab now, plans to work with partner Georgia Tech to test three more on a “virtual grid” later this year, and aims to launch a pilot with big Southeastern utility Southern Co. around the end of 2013, he said.

Probability-Based Software for Grid Optimization, $2.99 million: This project, led by DOE’s Sandia Labs, is taking a crack at bringing a “new, probability-based formulation” to the market management systems (MMS) that control the buying and selling of real-time energy transactions on the grid. In simple terms, Sandia’s software is adding a ton of new variables and probability equations to the energy trading mix, in hopes of making it more efficient.

The trick is in using a stochastic process, Ross Guttromson, energy storage and transmission analysis manager at Sandia, explained. Today’s MMSes estimate their daily, hourly and real-time energy needs by calculating single-point, average values to forecast daily generation capacity and cost as balanced against load, he said. Sandia’s software, by contrast, applies a stochastic, or probabilistic, process to turn that single value into a range of potential outcomes, each weighted by its probability.

In other words, instead of a single line on a graph representing projected load and generation over the course of the day, Sandia’s software presents a whole lot of lines, each representing a certain probability of a combination of events -- weather, fuel cost, variability in demand, etc. -- that could happen over the course of the day. Crunching all that data is a challenge, but Sandia has it down to where it can install a server cluster to run it for a utility in about a half-hour, Jean-Paul Watson, a member of Sandia’s Discrete Algorithms and Math Department, told me.
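The single-line-versus-many-lines idea can be sketched in a few lines of Python. This is a minimal illustration, not Sandia’s actual formulation: the base load curve, the plus-or-minus 10 percent variability band and the equal scenario weights are all invented for the example.

```python
import random

random.seed(42)

HOURS = 24

# Deterministic approach: a single expected-load curve (MW) -- one line
# on the graph, with daytime demand above the overnight baseline.
point_forecast = [900 + 300 * (8 <= h <= 20) for h in range(HOURS)]

# Stochastic approach: many load scenarios, each assigned a probability weight.
def make_scenario():
    # Perturb the base curve with random weather/demand variability (+/- 10%).
    return [load * random.uniform(0.9, 1.1) for load in point_forecast]

scenarios = [make_scenario() for _ in range(100)]
weights = [1.0 / len(scenarios)] * len(scenarios)  # equally likely, for simplicity

# The operator now plans against the whole distribution rather than one line,
# e.g. the probability-weighted expected load at each hour.
expected = [sum(w * s[h] for w, s in zip(weights, scenarios)) for h in range(HOURS)]
peak_hour = max(range(HOURS), key=lambda h: expected[h])
print(f"Expected peak load: {expected[peak_hour]:.0f} MW at hour {peak_hour}")
```

A real MMS would attach different probabilities to different weather and fuel-cost combinations, which is where the heavy computation comes in.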

The team has simulated a small-scale grid problem featuring three sources of generation (cheap, mid-priced and expensive) and run it through a variety of scenarios, from the business-as-usual to the highly improbable. Overall, it has shown it can save about 4 percent in overall generation costs by better predicting, then matching, demand and supply, Watson said. The next step is to test it at broader scale with its partners, grid operator ISO New England and big MMS vendor Alstom, over the course of this year.
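A toy version of that three-generator setup can be sketched as a simple merit-order dispatch. The capacities and costs below are invented for illustration; Sandia’s software would solve this across many probability-weighted demand scenarios rather than a single demand value:

```python
# Toy version of the cheap/mid-priced/expensive generation mix described
# above: fill demand from the cheapest units first. All capacities and
# costs are invented for illustration.

GENERATORS = [  # (name, capacity_mw, cost_per_mwh)
    ("cheap", 500, 20.0),
    ("mid", 300, 45.0),
    ("expensive", 200, 120.0),
]

def dispatch(demand_mw):
    """Merit-order dispatch: serve demand from cheapest generation first."""
    remaining, total_cost, plan = demand_mw, 0.0, {}
    for name, cap, cost in sorted(GENERATORS, key=lambda g: g[2]):
        mw = min(cap, remaining)
        plan[name] = mw
        total_cost += mw * cost
        remaining -= mw
    if remaining > 0:
        raise ValueError("demand exceeds total capacity")
    return plan, total_cost

plan, cost = dispatch(700)
print(plan, cost)  # the cheap unit runs full out; the mid unit covers the rest
```

The stochastic version would commit units against the whole range of possible demand curves, which is how better prediction translates into the roughly 4 percent cost savings Watson described.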

There’s a lot of heavy math and hardcore computational prowess that goes into predicting something as complicated as the ebb and flow of energy across the grid, and several GENI grant-funded projects on display this week were working on similar challenges. Some examples include the University of Washington and University of Michigan’s Renewable Energy Positioning System project, which has a $1.4 million grant from ARPA-E, and consultancy Charles River Associates’ Decision-Support Software for Grid Operators project, with a $1.3 million grant.

Autonomous, Decentralized Grid Architecture, $2 million: This project is aiming at nothing less than enabling the internet of the grid, according to Georgia Tech professor and project leader Santiago Grijalva. Working alongside partners including Duke Energy, OSIsoft, Verizon and grid operators PJM and Midwest ISO, Georgia Tech is building the digital architecture for an “electricity operating system,” and a decentralized energy scheduling system, for what Grijalva described as the “millions of decision-makers” that will populate the future grid.

These decision-makers, or endpoints, can range in scale from massive power plants and master control systems of grid operators and utilities, down to individual grid sensors, smart meters, home energy gateways, smart appliances, EV chargers, solar inverters and even individual smartphone-enabled customers, he said. Instead of connecting all these systems in the traditional, centralized way, “we’re proposing a decentralized architecture” that allows every endpoint to communicate with every other, he said.

That allows for a “flat” architecture, he said, in which each device can describe itself -- what it does, its capacity for delivering its services, what it needs from the rest of the grid -- in close to real time. That could allow distributed portions of the grid to form their own interactions to manage power quality or availability in congested areas, as well as finer-grained management of the grid as a whole by central operators -- which explains the interest of partners like Duke and PJM.
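To make the self-describing-endpoint idea concrete, here’s a hypothetical sketch in Python. The message fields and device names are assumptions for illustration, not Georgia Tech’s actual schema:

```python
import json

# Sketch of a self-describing endpoint in a flat, decentralized grid
# architecture. Field names and devices are hypothetical examples.

def describe(endpoint_id, kind, capacity_kw, available_kw, needs):
    """Build the advertisement an endpoint would broadcast to its peers."""
    return {
        "id": endpoint_id,
        "kind": kind,                  # what it does
        "capacity_kw": capacity_kw,    # its capacity for delivering services
        "available_kw": available_kw,  # what it can offer right now
        "needs": needs,                # what it needs from the rest of the grid
    }

# A rooftop solar inverter and an EV charger describing themselves:
inverter = describe("inv-17", "solar_inverter", 5.0, 3.2, [])
charger = describe("evse-04", "ev_charger", 7.2, 0.0, ["7.2 kW until 07:00"])

# Peers can match surplus supply against stated needs without waiting on a
# central coordinator -- the "distributed portions of the grid" forming
# their own interactions.
surplus = [e for e in (inverter, charger) if e["available_kw"] > 0]
print(json.dumps(surplus, indent=2))
```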