Digital power control, battery management converge in EVs

Engineers have perfected gasoline propulsion over the past hundred years or so. Now, OEMs and their suppliers have shifted gears, forming alliances and breaking down paradigms in an effort to optimize electric propulsion.

But electric propulsion comes at a cost, in terms of both product development and component complexity, with sophisticated and fault-tolerant vehicle intelligence and power electronics managing tens of kilowatts of power continuously.

Consider the simple task of measuring fuel levels in a conventional gas-powered vehicle. Depending on your automobile, the fuel gauge may be little more than a bimetallic strip driven by a heating coil connected to a sending unit. In an electric vehicle, by contrast, the fuel tank is a high-voltage battery comprising many cells (maybe 100 or more) in series/parallel arrangements. Achieving an accurate state-of-charge (SOC) determination requires precise voltage measurement of each cell (to within a few millivolts over temperature).

That is the job of the battery management system. The BMS is a highly accurate system that reports detailed information on the battery cell voltage, current and temperature to a central processor responsible for calculating the battery SOC (i.e., the auto’s fuel level). Failure to measure the battery accurately doesn’t just misrepresent battery SOC; it also can shorten the battery service life or create an unsafe and potentially catastrophic condition.

To avert that condition, ICs are being developed to such emerging standards as ISO 26262, which ensures reliable operation through hardware built-in-test functions, as well as N+1 redundancy for safety-critical functions like cell over/undervoltage monitoring. If a single cell of a battery stack is forced into a deep discharge condition or is excessively charged, the cell can be permanently damaged, creating the potential for thermal runaway, a self-destructive condition. As such, secondary protection in addition to the primary battery monitoring system is necessary.

More-advanced BMSes synchronize voltage and current measurement as a means for continuously measuring battery impedance, the primary indication of a battery’s state of health (SOH).
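As a rough sketch of that technique, the pack's DC impedance can be estimated from two synchronized voltage/current samples taken across a load step. The function name and sample values below are illustrative assumptions, not from any particular BMS:

```python
def battery_impedance_ohm(v_rest, v_loaded, i_rest, i_loaded):
    """Estimate DC impedance from two synchronized V/I samples
    taken across a load step: Z = -dV / dI."""
    return (v_rest - v_loaded) / (i_loaded - i_rest)

# A 50-mV sag under a 50-A load step implies roughly 1 milliohm;
# a rising impedance over the battery's life indicates degrading SOH.
z = battery_impedance_ohm(3.70, 3.65, 0.0, 50.0)
```

The synchronization matters: if the voltage and current samples are not captured at the same instant, the transient response of the cell corrupts the quotient.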

Figure 1 illustrates the typical cell configurations and the BMS sufficient for measuring a battery’s SOC and SOH. Note that any single cell in the series stack limits the entire battery stack capacity. In other words, if one cell reaches its maximum or minimum voltage before the others, the charge or discharge cycle must be interrupted. Cell-balancing circuitry (illustrated in green in the figure) is used to ensure that all cells are charged and depleted in unison.
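The balancing decision itself can be sketched in a few lines. In a passive scheme, a bleed resistor is switched across any cell whose voltage exceeds the weakest cell's by more than some tolerance; the 5-mV threshold below is an assumed value for illustration only:

```python
BALANCE_TOLERANCE_V = 0.005  # 5 mV; an assumed threshold, not a spec value

def balance_flags(cell_voltages):
    """Return True for each cell whose bleed resistor should engage,
    pulling the stack toward the weakest cell so all cells charge
    and deplete in unison."""
    v_min = min(cell_voltages)
    return [v - v_min > BALANCE_TOLERANCE_V for v in cell_voltages]

# Only the third cell is far enough above the weakest to need bleeding.
flags = balance_flags([3.652, 3.648, 3.661, 3.650])
```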

Battery charger basics

An EV charger is classified by output power/input voltage. Level 1 chargers typically are integrated on board, operate from 95 to 265 Vac and are capable of delivering between 1.5 kW and 3.3 kW. Dedicated Level 2 and Level 3 chargers operating from 240-V/480-V wiring systems are capable of charging at a much higher rate, but only within vehicle battery and connector constraints. SAE J1772, for example, is currently the only certified EV connector standard in North America and limits power to <16.8 kW.

Unlike batteries intended for portable electronics, automotive-grade batteries accommodate a much higher charge current without affecting battery life or nearing thermal runaway. A charger’s C rating is defined as a current into the battery that is proportional to the battery’s capacity measured in ampere-hours. For example, a 1C charger charges a 1-Ah battery at 1 A.
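The C-rate definition reduces to one multiplication, shown here only to make the convention concrete:

```python
def charge_current_a(capacity_ah, c_rate):
    """Charge current implied by a C rating:
    I (amps) = C-rate x capacity (ampere-hours)."""
    return c_rate * capacity_ah

# Per the definition, a 1C charger charges a 1-Ah battery at 1 A;
# the same 1C rate applied to a 66-Ah pack would be 66 A.
i_small = charge_current_a(1.0, 1.0)
i_pack = charge_current_a(66.0, 1.0)
```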

Whereas traditional lithium-ion batteries may be limited to 1C, some automotive batteries can be charged well above that limit, reducing recharge cycle time. In fact, high-power Level 3 chargers (such as those developed by Aker Wade Power Technologies3 and others) operating off 480 V/three-phase are capable of charging an EV battery in about the same time that it takes to fill a gasoline tank.

Note that EV battery capacity typically is rated in kilowatt-hours, which can be loosely related to a battery’s ampere-hour rating by dividing the kilowatt-hour rating by the nominal battery plateau voltage. As a point of reference, the Nissan Leaf integrates a 3.3-kW charger that takes about eight hours to charge its 24-kWh battery from 10 percent to full charge.4
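That arithmetic can be sketched as below. The 360-V nominal pack voltage and 90 percent charger efficiency are assumptions for illustration, not published Leaf figures:

```python
def kwh_to_ah(capacity_kwh, nominal_v):
    """Loosely relate a kWh rating to ampere-hours by dividing
    by the nominal battery plateau voltage."""
    return capacity_kwh * 1000.0 / nominal_v

def charge_hours(capacity_kwh, charger_kw, start_soc=0.1, efficiency=0.9):
    """Rough charge time from a given starting SOC, assuming a
    constant charger power and a fixed conversion efficiency."""
    return capacity_kwh * (1.0 - start_soc) / (charger_kw * efficiency)

# A 24-kWh pack at an assumed 360 V nominal is roughly 67 Ah, and
# charging it from 10 percent on a 3.3-kW charger at an assumed 90
# percent efficiency works out to a little over seven hours, broadly
# consistent with the "about eight hours" cited in the text.
ah = kwh_to_ah(24.0, 360.0)
hours = charge_hours(24.0, 3.3)
```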

Note also that an EV battery’s depth of discharge affects the unit’s life, so such batteries typically retain at least 10 percent of the battery capacity when a charge cycle begins.

Architecting the charger

The on-board charger must comply with strict industry and governmental regulations for electromagnetic compatibility, power factor and UL/IEC safety standards. As with chargers for other lithium chemistries, the EV propulsion battery charger employs a constant-current/constant-voltage (CC/CV) charge algorithm whereby the battery is charged by a programmable current source until it reaches its voltage setpoint. The charger then moves into voltage regulation while monitoring battery current as an indication of charge cycle completion.
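The CC/CV hand-off can be sketched as a small state machine. The setpoint, charge current and termination threshold below are illustrative assumptions rather than values from any specific charger:

```python
V_SET = 4.1          # per-cell voltage setpoint, V (assumed)
I_TERMINATE = 6.6    # end-of-charge current threshold, A (assumed C/10)

def charger_step(mode, v_batt, i_batt):
    """One control iteration of a CC/CV charge cycle.
    Returns the next mode: 'CC', 'CV' or 'DONE'."""
    if mode == "CC" and v_batt >= V_SET:
        return "CV"       # setpoint reached: hand off to voltage regulation
    if mode == "CV" and i_batt <= I_TERMINATE:
        return "DONE"     # tapered current indicates charge completion
    return mode           # otherwise hold the current mode

# Below the setpoint the charger stays in constant current; at the
# setpoint it switches to constant voltage; once the current tapers
# below the threshold, the cycle terminates.
```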

The charging current (power) is negotiated by the BMS, hybrid control module (HCM) and electric-vehicle service equipment, depending on the input-voltage availability, temperature and battery SOC/SOH, as well as other system considerations monitored by the HCM. The importance of the safety and fault tolerance of this control algorithm cannot be overstated.

A suitable power architecture involving interleaved power factor correction (PFC) followed by a phase-shifted full bridge is illustrated in Figure 2. The control feedback parameters are digitized by a microcontroller capable of numerically closing multiple control loops and precisely modulating the high-voltage MOSFET switches.

The centralized and highly intelligent control scheme addresses many issues not easily tackled with analog techniques.

More-advanced microcontrollers integrate a coprocessor (control law accelerator), for accelerated computation of the control loop transfer functions, and multiple high-resolution pulse-width modulators (PWMs) capable of controlling the power switches to within 150 picoseconds. The architecture can dynamically adapt to variations in line and load, log system operating parameter data, and implement predictive and fault-aversion algorithms, while intelligently interfacing with all other vehicle subsystems via a ground-isolated controller area network.

Recent advances in digital power make this approach viable as well as cost-effective, scalable and more appropriate for the high-power, multiphase applications found in EVs.5

Large and expanding libraries of modular software for digital compensation and virtually every power supply topology are freely available for integration by experienced software designers; test reports comparing digital and analog power solutions are also available.6 Consider, for example, the two-phase interleaved PFC function outlined in Figure 2. The PFC boost switches are controlled by PWM1 implementing multimode PFC to generate the compliance voltage of the battery charger.
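As a loose sketch of what such a digital compensation routine computes, a common building block is a two-pole/two-zero (2P2Z) difference equation that updates the duty-cycle command each switching period from the present and past error samples. The coefficients below are placeholders for illustration, not a tuned PFC design:

```python
# 2P2Z compensator: u[n] = B0*e[n] + B1*e[n-1] + B2*e[n-2]
#                          + A1*u[n-1] + A2*u[n-2]
# Coefficients are placeholders; a real design derives them from the
# power stage transfer function and the desired loop crossover.
B = [0.5, -0.9, 0.41]   # numerator (zero) coefficients, assumed
A = [1.7, -0.7]         # denominator (pole) coefficients, assumed

class Compensator2P2Z:
    def __init__(self):
        self.e = [0.0, 0.0]   # previous two error samples
        self.u = [0.0, 0.0]   # previous two duty commands

    def step(self, error):
        """One control iteration: error sample in, duty command out."""
        u = (B[0] * error + B[1] * self.e[0] + B[2] * self.e[1]
             + A[0] * self.u[0] + A[1] * self.u[1])
        self.e = [error, self.e[0]]
        self.u = [u, self.u[0]]
        return u
```

On a control law accelerator this computation runs in a handful of cycles per PWM period, and because the coefficients are software variables, the loop can be retuned on the fly as line and load conditions change.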

The adaptability of such a topology is apparent in Figure 3, in which the digital compensation and phase management blocks are variables under software control. Applying digital techniques also makes the system less susceptible to noise and temperature, while intelligently synchronizing power stages to minimize interference and optimize filter design.