Acceleration and Energy Usage

I started wondering about a five-mile trip down a road with no wind. There would be four stops and four accelerations, each back up to a constant speed of 60 miles per hour. All the stops are exactly the same.

My gut says that I will use more energy if I accelerate hard versus easy, but that is probably because easy acceleration produces a lower overall average speed (and thus less aero loss for that stretch of road).

The real question for me is what the energy usage difference would be if the slow and fast acceleration five mile drives were adjusted so that the average speed was exactly the same. Put differently, it takes X amount of energy to accelerate a given mass to a certain speed. Is there any difference in total energy used to accelerate MS quickly versus slowly all else being equal and, if there is, how much is the difference and why?

1. Hard accelerations cause heat buildup due to internal resistance of the battery/drivetrain and high current draw. This is additional wasted energy. Plus, the cooling system must remove this heat, which uses energy.

2. Hard acceleration has more friction losses in the tires.

3. Probably a small amount, but high accelerations put high loads on the bearings, increasing overall friction in the drivetrain.

4. High accelerations put more load on the reduction gear, increasing frictional losses.

5. It's my understanding that the load on a motor does not appreciably reduce the motor's efficiency itself...so high acceleration vs. low acceleration shouldn't have an appreciable impact on MOTOR efficiency (although it does affect the efficiency of the rest of the drivetrain).

Mine are a little more anecdotal with some unpublished ongoing research as a foundation.

In preparation for a long trip in two days, I have been watching energy usage and driving habits. When I drive to work at my regular "pace" with rather peppy starts (60-80 kW), I am looking at about 348 Wh/mi (which seems to be fairly the norm here in the forums). If I go gentle on the throttle, and keep my starts <40 kW, accelerations <30 kW and maintenance <25 kW, I can keep a good 55 mph and do so at 288 Wh/mi while getting to the local office in 3 extra minutes, or the downtown office in 4-6 extra minutes.

So, whatever the reason, there IS loss (drivetrain, heat, etc.). I did notice from the curb rash thread that some of the tires are no longer lining up with the wheel rash, which also tells me that hard acceleration is making the tire slip on the rim, wasting energy. So (again, anecdotally) there must be several other areas that contribute to the loss that we are not recognizing.

Bottom line, the answer to your question is "yes", there is a difference.

Right. Todd has nailed it. The one other thing is that with hard acceleration, you are spending more time at a higher speed and so there are additional aerodynamic losses. It's the area under the power curve that equates to energy used.

> Right. Todd has nailed it. The one other thing is that with hard acceleration, you are spending more time at a higher speed and so there are additional aerodynamic losses. It's the area under the power curve that equates to energy used.

Right. Champy (can I call you that lolachampcar? hehe) acknowledged this, but was wondering what causes reductions in efficiency assuming he reduced cruise speed for the hard-accelerating car so those drag losses are the same.

> Is there any difference in total energy used to accelerate MS quickly versus slowly all else being equal and, if there is, how much is the difference and why?

From physics, neglecting the subtle effects Todd listed, there is no difference. The energy required to accelerate from speed 1 to speed 2 is the kinetic energy (1/2 mv^2) at speed 2 minus the kinetic energy at speed 1.
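To put a number on that kinetic energy figure, here's a quick sketch for the five-mile scenario above (the ~2,100 kg mass is an assumed, roughly Model S curb weight, not a figure from this thread):

```python
# Kinetic energy needed to reach 60 mph -- the same regardless of how
# quickly you get there. Mass is an assumed ~2100 kg (roughly a Model S).
mass_kg = 2100
v_mps = 60 * 0.44704            # 60 mph in m/s

ke_joules = 0.5 * mass_kg * v_mps ** 2
ke_wh = ke_joules / 3600        # 1 Wh = 3600 J

print(f"KE at 60 mph: {ke_wh:.0f} Wh per acceleration")
print(f"Four stops:   {4 * ke_wh:.0f} Wh total (ideal, no losses)")
```

So the four accelerations in the hypothetical trip cost on the order of 0.8 kWh in pure kinetic energy, before any of the loss mechanisms discussed here are added on top.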

The biggest factor is ohmic losses at higher power. Electric resistance quadruples when the current is double. In other words, if you accelerate at twice the power your losses are 4 times higher. Everything on the electric side (drivetrain/battery) becomes less efficient when accelerating hard.

> The biggest factor is ohmic losses at higher power. Electric resistance quadruples when the current is double. In other words, if you accelerate at twice the power your losses are 4 times higher. Everything on the electric side (drivetrain/battery) becomes less efficient when accelerating hard.

In simple terms when you accelerate hard, a greater percentage of energy used goes to heat instead of propulsion.

The physics never lies - in this case the losses of the racier driving will be higher. Some loss, however, will be chemical too - the heavier driving style takes more from the battery, and then in slowing down the regen will presumably push some of that back. There is a percentage efficiency for the chemistry, with the excess going into heat (and the wasted energy then removed by the thermal jacket).

> In simple terms when you accelerate hard, a greater percentage of energy used goes to heat instead of propulsion.

But how much? Is it significant? Is it noticeable, even?

I would suggest that the difference is not large, if we are talking about accelerating to the same speed.* The idea that "resistance quadruples when the current is double" is nice, but if 99% of the power is going to propulsion the amount lost is not going to make a big difference. More to the point, exactly how long is the system being stressed? 5 seconds, say, of maximum acceleration, then an hour at highway speed, is going to be absolutely positively not a whit different than 10 seconds of slower acceleration then an hour at highway speed. If, however, the comparison is over a minute instead of an hour, it might make a difference.

I would suggest that the amount of friction braking avoided has a much, MUCH larger impact than whether you often perform maximum acceleration.

I have the energy efficiency figures, and propensity to floor it, to back that up.

* The point at which you lose out is if you travel faster then slow down. If you use friction brakes then you really lose out.

> But how much? Is it significant? Is it noticeable, even? ... but if 99% of the power is going to propulsion...

It is noticeable. The manual specifically mentions to avoid it when you try to get good range. If you do it a few times in a row, even with a half-charged battery, the limiter will come on to protect the battery from dropping to a voltage that would be harmful.
The drivetrain efficiency is nowhere near 99%! Even when we leave out the charging losses and look only at the efficiency while driving, we have about 15-20% losses. That's based on rated-range driving, which means an average discharge rate of 0.25C. Hard acceleration is 2C or more. That's at least 8 times higher current! Ohmic losses would be 64 times higher! You see how it becomes significant. It is OK because we usually only accelerate for very short periods of time, but the more you do it, the more impact it has on your range.
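The C-rate scaling above can be sketched numerically. The pack resistance and capacity below are illustrative assumptions only (real values vary with pack, temperature, and state of charge); the point is how the I²R term scales:

```python
# Resistive (I^2 * R) loss vs discharge rate: 8x the current -> 64x the loss.
# Pack resistance and capacity are illustrative assumptions, not measured.
pack_resistance_ohm = 0.08      # assumed lumped internal resistance
capacity_ah = 240               # assumed pack capacity

for c_rate in (0.25, 0.5, 1.0, 2.0):
    current_a = c_rate * capacity_ah
    loss_w = current_a ** 2 * pack_resistance_ohm
    print(f"{c_rate:>4}C -> {current_a:5.0f} A, resistive loss ~{loss_w / 1000:.2f} kW")

# Loss at 2C relative to 0.25C: (2 / 0.25)^2 = 64x
```

Whatever the real resistance is, the 64x ratio between 0.25C cruising and 2C acceleration holds, because it depends only on the current ratio squared.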

Here you can see how it affects battery capacity measured in Ampere-hours. But also note that the curves show the voltage is lower at higher discharge rate which means the actual energy you get is less on top of that.

We've taken our first long trip and have been in LA the past three weeks with our daughter's family and new grandson. We take our granddaughter to and from preschool, going along Wilshire or Olympic in west central LA (Miracle Mile/Mid Wilshire to Westwood), often during rush hour. Lots of stop and go, even though it is only about 7 miles each way (takes about 1/2 hour or more each way). Back in the SF Bay area, we normally use about 350 Wh/mi, but here in LA with the much more aggressive driving and lots and lots of stop and go, we are often above 500 Wh/mi and always above 400.

Fortunately, there are several free charge points in the Beverly Hills public parking lots and we have gone down to the Hawthorne SC to get several deep charges. Our daughter is renting, so we only have a 120V plug and have been getting 3 miles/hr from their home (using the not recommended extension cord which seems to work fine).

Sorry, I didn't mean to suggest that it was; it was an example to show that whatever the relationship, even if it's squared, the effect may still be practically irrelevant. The point as originally stated ("If you accelerate at twice the power your losses are 4 times higher.") is, I think, misleading. The losses are higher only for a very short time, and how large is that loss compared to the much, much larger energy consumed to travel a significant distance?

That's the main point. In normal use the acceleration phase is a tiny fraction of what the car spends its time doing. Sure, if you look at the average energy per unit distance over a very small time, it will matter, but over a minute? Five minutes?

The other thing which seems unclear is the reason why losses are not proportional to speed all the way to zero. There are a number of effects being summed, such that there turns out to be an optimal speed which is well above zero (roughly 25 mph / 40 km/h). Suppose you are accelerating up to that most efficient speed. If you accelerate faster then you spend less time in the less efficient speed zone, so that's an effect in favour of accelerating faster. A small (probably very small) effect, but again, it just doesn't seem clear that the arguments made so far amount to more than waving a wand and saying "this effect is significant and the others aren't".

I'd like to see experimental proof. The experimental evidence I've seen suggests that degree of acceleration is not as large an effect as many other factors. I can reach "ideal" average energy usage per unit distance (something around 160-170 Wh/km, roughly 256-270 Wh/mi) driving in town while accelerating quite significantly. That's in the spring when temperatures are such that I can leave the climate control off. With temperatures 10 to 15 degrees C cooler (18 to 27 F) I see more like 180 to 200 Wh/km (288 to 320 Wh/mi) over the same route. Climate control, or other temperature effects, seem to be a consumer of energy in a different league to whether one accelerates quickly for 3 seconds, once every minute or two. And at highway speeds the difference can easily be 50% between a slow speed and a fast one (say 90 km/h to 120 km/h); I would argue that this effect completely swamps any acceleration effects. i.e. the critical piece of information is that if maximizing range is important, slow down. And if you can handle turning off climate control but leave the windows up, do that too. Also of critical importance: avoid using the brakes. I think that if there's an effect due to acceleration, it is tiny compared to these.

Is it possible to get instrumentation accurate enough to measure total energy consumption to the second, and do repeated trials of something like:

- 0-50 km/h at maximum acceleration, hold 50 km/h to a distance of, say, 500 m
- 0-50 km/h at 1/2 full power, hold 50 km/h to the same distance
- 0-50 km/h at 1/4 full power, hold 50 km/h to the same distance

i.e. remove the effect of regenerative braking from the experiment. I would genuinely like to know the answer, with different top speeds and with different distances.
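Lacking that instrumentation, a toy simulation of the proposed experiment at least shows the expected direction of the effect. Every constant below (mass, drag area, rolling resistance, the ohmic-loss factor) is an illustrative assumption, not a measured Model S value:

```python
# Toy model of the experiment above: accelerate 0->50 km/h at a fixed power,
# then cruise to the 500 m mark. All constants are illustrative assumptions.
def trip_energy_wh(accel_power_w, mass=2100.0, distance=500.0,
                   v_target=50 / 3.6, dt=0.01):
    cd_a = 0.6                  # assumed drag area Cd*A, m^2
    crr = 0.01                  # assumed rolling resistance coefficient
    rho = 1.2                   # air density, kg/m^3
    r_eff = 2e-6                # assumed ohmic-loss factor: loss = r_eff * P^2
    g = 9.81

    x = v = energy_j = 0.0
    while x < distance:
        resist = 0.5 * rho * cd_a * v ** 2 + crr * mass * g   # drag + rolling, N
        if v < v_target:
            # traction-limited at low speed, power-limited above
            drive = min(accel_power_w / max(v, 0.1), mass * g)
            p_wheel = drive * v
            v = min(v + (drive - resist) / mass * dt, v_target)
        else:
            p_wheel = resist * v                              # hold speed
        p_batt = p_wheel + r_eff * p_wheel ** 2               # add I^2R-style loss
        energy_j += p_batt * dt
        x += v * dt
    return energy_j / 3600.0

for p_kw in (80, 40, 20):
    print(f"{p_kw:>3} kW accel: {trip_energy_wh(p_kw * 1000):.1f} Wh over 500 m")
```

Under these assumptions the harder launch does use more energy over the same 500 m, but the gap is a handful of Wh on a trip of roughly 100 Wh, which is consistent with the "real but small" view argued above.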

> Lots of stop and go, even though it is only about 7 miles each way (takes about 1/2 hour or more each way). Back in the SF Bay area, we normally use about 350 Wh/mi, but here in LA with the much more aggressive driving and lots and lots of stop and go, we are often above 500 Wh/mi and always above 400.

7 miles in 1/2 an hour is 14 miles per hour. That's a much less efficient speed than average city speeds of 20-25 miles per hour. (Just search for the range vs average speed graph.) I'm not saying that is the only effect, just that it can explain at least some of the difference. The other factor is that the longer the trip takes, the longer other parasitic loads have an effect. Climate control springs to mind: if you leave the climate control on and it takes longer to get from A to B, then the car will definitely consume more energy for climate control.

Most importantly, though: do you use the car's brakes at all? Every time you use the brakes, the car is converting all kinetic energy into heat in, and wearing of, the brakes. I suspect that that's the biggest culprit.

> From physics, neglecting the subtle effects Todd listed, there is no difference. The energy required to accelerate from speed 1 to speed 2 is the kinetic energy (1/2 mv^2) at speed 2 minus the kinetic energy at speed 1.

> That's the main point. In normal use the acceleration phase is a tiny fraction of what the car spends its time doing. Sure, if you look at the average energy per unit distance over a very small time, it will matter, but over a minute? Five minutes?

That might be your normal use, but if you're driving an expressway where the speed limit is 45 with stop lights, you are spending a good percentage of time accelerating. Slow versus fast acceleration makes a big difference there. If you gun it to get on a freeway and then cruise for an hour, then sure, the extra energy used won't make much difference.

> The biggest factor is ohmic losses at higher power. Electric resistance quadruples when the current is double. In other words, if you accelerate at twice the power your losses are 4 times higher. Everything on the electric side (drivetrain/battery) becomes less efficient when accelerating hard.

I'm sure you meant to say that the power dissipation quadruples when the current is doubled; the resistance stays (roughly) constant, and P = I^2 R.