Hello All,
My question is: When an aftermarket re-manufacturer offers an older era alternator that was 45 amp stock, now claimed to be upgraded to put out 95 amps, what has been done to arrive at the greater output? Are there downsides to reliability from the "upgrade"?
Steve

Within a given frame size, simplistically, increasing the cross-sectional area of the conductor, in this case the stator windings, increases the amperage output correspondingly. This can be done by increasing the number of turns of wire, but more commonly by increasing the gauge of round-section wire or, less commonly, going to 'flat-wound' large-section wire (CAV, Denso etc). 'Re-wiring' the stator, for instance going from a star-wound to a delta-wound arrangement, is another trick manufacturers resort to, as is adding a 'tap' to star-connected stators to create a four-wire rectifier (worth ~10%).

Upping the rotor magnetic field strength, again by increasing the number of turns, or by increasing the field current, as Delco did with some European models by going from a 3 amp to a 7 amp field, also gives greater output, though to a lesser extent, as does reducing the rotor/stator air gap (with obvious production limitations). Increasing the rotor diameter and 'slimming down' the stator width to maintain frame size, as Bosch did with certain 70/80 amp K1s, is also an option.

Drawbacks..........Alternator efficiency is fairly constant across the better contemporary manufacturers, although in all cases it goes down with machine temperature (alternators are most commonly rated 'hot'). Increasing the output therefore creates more 'waste' heat, which affects all components, but the rectifier to the greatest extent. Higher-rated diodes, more efficient heatsinking, and greater cooling flow are required. In this last respect modern internal fan (IV) machines, which cool from the inside out, are better able to deal with this than the older front-fan draw-through types. And as OE outputs grow, jacketed machines plumbed into the cooling system are becoming more common. Increasing field strength likewise means higher-rated parts.
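To put rough numbers on the rectifier heat problem, here's a quick back-of-envelope sketch in Python. The ~0.7 V per diode and the two-diodes-in-the-conduction-path assumption are mine, just ballpark figures, not any manufacturer's spec:

```python
# Rough rectifier heat estimate: in a three-phase bridge, output current
# always passes through two diodes in series, each dropping roughly 0.7 V.
DIODE_DROP_V = 0.7      # assumed forward drop per diode
DIODES_IN_PATH = 2      # series diodes conducting at any instant

def rectifier_watts(output_amps):
    """Approximate heat dissipated in the rectifier at a given output."""
    return output_amps * DIODE_DROP_V * DIODES_IN_PATH

for amps in (45, 95):
    print(f"{amps} A output -> ~{rectifier_watts(amps):.0f} W of rectifier heat")
```

Roughly double the output and you roughly double the heat the rectifier has to shed, which is why the diodes and their heatsinking are the first casualties of an 'upgrade'.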

Also, again within a given frame size, increasing ultimate output usually comes at the expense of lower-speed output. Much like a highly tuned engine, the power curve is more peaky. The 'cut-in speed' (CIS), the speed at which the machine starts charging, is often higher too.

Most often overlooked is driving the alternator: single Vee belts rapidly become overworked when tensioned correctly. Doubles can handle more, but Poly-Vees are really the answer. Also often neglected: enlarging the cabling and keeping it as short as possible.

There's more, but that's the bones of it..........

"If an honest man is wrong, after demonstrating that he is wrong, he either stops being wrong or he stops being honest." Anon

I suppose to round it up a bit more, mention should be made of rotor design, although it's not nowadays a 'tuning' trick, AFAIK......

Rotor 'claw' count affects output, as each claw is an electromagnet. When CAV increased the output of their AC5 they went from a four to a six claw rotor, as did Lucas with their original AC10/11. Most manufacturers have settled on a six claw design (probably more correctly referred to as twelve claw: six north, six south, but not commonly known as such), certainly in the most commonly encountered automotive sizes. Interestingly Delco used a seven claw design, both sides of the pond, for decades, but has now fallen in line with the others.

Not well known is that stator 'timing' has to match claw count. A four claw rotor will not work in a six claw wound stator, etc.

I'll shut up now, lol.


Many thanks, Kevin, but re-reading it I noticed I missed the end off a sentence. Not that it matters much but it should read:

"When CAV increased the output of their AC5 they went from a four to a six claw rotor, as did Lucas with their original AC10/11, but switched to six claws on all subsequent models."

If you need to upgrade, and don't need originality, there are plenty of fully serviceable, easily adaptable, high output machines around up to about 150amp, if you know what to look for. Or small frame, low PMI, ~50amp ones if that's needed. OE quality too, not spurious copies. Mostly Densos, which normally even have the wiring info on the machine sticker, but get the plug too. Just stay away from the PCM controlled ones.

As an aside, it takes just one engine bhp to drive ~35 amps, so anyone thinking of reducing parasitic losses by not running an alternator maybe ought to re-evaluate. It's often the case that you see more actual net engine power, via a higher and more stable voltage, running with one than without.
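For anyone who wants to sanity-check that rule of thumb, here's the arithmetic. The 14 V system voltage and ~65% warm-alternator efficiency are my assumed figures, picked only to show how the ~35 A number falls out:

```python
# Sanity check of the 'one bhp per ~35 A' rule of thumb, assuming a 14 V
# system and a typical warm alternator efficiency of around 65%.
BHP_WATTS = 745.7
SYSTEM_VOLTS = 14.0
EFFICIENCY = 0.65   # assumed; varies with speed and temperature

amps_per_bhp = BHP_WATTS * EFFICIENCY / SYSTEM_VOLTS
print(f"~{amps_per_bhp:.0f} A of alternator output per engine bhp")
```

Comes out around 35 A per bhp with those assumptions, so even a big alternator flat out is only costing you a couple of horsepower.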


There's a lot of snake oil and witchcraft involved with alternators....no different from with ignition systems........or anything else electrical in a car.
An alternator is just a "pump" which pumps electrons. The faster you spin it, the more it flows. If you make it work against a restriction, it makes a voltage. No different really from a fuel pump and regulator. The fuel pump makes the flow, the regulator is a restriction which makes pressure.
Of course there's a limit to how much they can output. The diodes get hot due to the voltage drop (0.7 V-ish) times the amps.

OK, so how are the stock ones rated at whatever amps?
If you don't know that then you don't know how the upgraded ones are rated either.

I'll take a guess that it's some amp level that makes the diodes far too hot, and then it's derated by XX %.
It doesn't mean that you can't get 95A out of a completely stock 55A unit. You just have to spin it fast enough and maybe trick up the regulator a bit so there's no current limiting.

Many years ago I built my own regulator for a Lucas alternator. They're really simple: a power transistor which switches on the current to the rotor, and another transistor and zener diode (voltage sense) which shut down the power transistor as the voltage rises.
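For the curious, the logic of that two-transistor regulator boils down to a bang-bang switch. Here's a toy Python model of the idea; the setpoint and hysteresis band are illustrative values of mine, not Lucas's:

```python
# Toy model of the two-transistor regulator described above: the field is
# switched hard on below the setpoint and hard off above it (bang-bang),
# with a little hysteresis standing in for the zener/transistor thresholds.
SETPOINT_V = 14.4
HYSTERESIS_V = 0.2   # assumed switching band

def field_on(system_volts, currently_on):
    """Return True if the power transistor should feed the rotor."""
    if system_volts >= SETPOINT_V:
        return False             # zener conducts, sense transistor cuts the field
    if system_volts <= SETPOINT_V - HYSTERESIS_V:
        return True              # well below setpoint: field full on
    return currently_on          # inside the band: hold last state

print(field_on(13.8, False))  # True  - charging needed
print(field_on(14.6, True))   # False - over setpoint, field cut
```

In the real circuit the 'hold last state' band is just the transistors' switching thresholds, but the principle is the same.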
You don't ever want more than 14.7 volts, 14.4 is really the limit for the battery.
If you've got a V1 radar detector, they go into meltdown mode at 14.7 V.........I've repaired lots of them due to dodgy charging systems.

An amp rating on an alternator means nothing unless you add to it the voltage and the RPMs.

It's a bit like saying "my ports flow 300CFM" but not stating the test pressure.
Or like saying "my EFI pump flows 200GPH" without stating the input voltage or the output pressure.

The main limiting factor with an alternator is magnetic saturation of the rotor. Look it up on google?
That's so long as the diodes don't cook.
It's also magnetic saturation that limits the output of an ignition coil.

Not the worst analogy, but an alternator creates electrical 'pressure' that restores the battery's charge and converts rotational energy into electrical energy to feed the vehicle loads. Amperage is a function of the existing system voltage and the alternator output voltage; the potential difference governs actual output.
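In other words, charging current is just Ohm's law applied to that potential difference. A quick sketch, with an assumed (purely illustrative) circuit resistance and battery voltages:

```python
# Charging current as a potential-difference problem: the alternator only
# pushes current into the battery to the extent its voltage exceeds the
# battery's, divided by the small resistance of the circuit.
def charge_amps(alt_volts, batt_volts, circuit_ohms=0.02):
    """Crude charging-current estimate; circuit_ohms is an assumed value."""
    return max(0.0, (alt_volts - batt_volts) / circuit_ohms)

print(f"{charge_amps(14.4, 12.6):.0f} A")  # flat-ish battery: big demand
print(f"{charge_amps(14.4, 14.2):.0f} A")  # nearly full: trickle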

The faster you spin it, the more it flows.

While technically true, in the real world no. Most of an alternator's output, depending on the power curve up to 90%, can be achieved within 1/3 of its maximum rotational speed. The curve is king, not speed per se.
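To illustrate the shape of a typical curve (made-up constants of mine, not any real machine's data): output climbs steeply past the cut-in speed and flattens long before maximum rotor speed.

```python
import math

# Illustrative alternator output curve: zero below cut-in, then a steep
# climb that saturates well short of maximum rotational speed.
RATED_AMPS = 95
CUT_IN_RPM = 1200      # assumed cut-in speed
CURVE_RPM = 1650       # assumed curve constant
MAX_RPM = 15000        # assumed maximum rotor speed

def output_amps(rpm):
    if rpm <= CUT_IN_RPM:
        return 0.0
    return RATED_AMPS * (1 - math.exp(-(rpm - CUT_IN_RPM) / CURVE_RPM))

third = output_amps(MAX_RPM / 3)
print(f"At 1/3 max speed: {third:.0f} A ({100 * third / RATED_AMPS:.0f}% of rated)")
```

With those numbers, about 90% of rated output arrives by a third of maximum speed, exactly the 'peaky vs flat curve' point above.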

If you make it work against a restriction, it makes a voltage. No different really from a fuel pump and regulator. The fuel pump makes the flow, the regulator is a restriction which makes pressure.

As demand falls, amperage drops and system voltage rises, the regulator sensing the rising voltage and modulating rotor strength with its clipping pulse. Rotor windings have inductance, so the field doesn't simply collapse but decays slightly, recovers, decays, etc. etc., meaning its strength only fluctuates slightly over a small time span. The shape of the clipping pulse also changes as it regulates, going from a 'saw tooth' to a 'shark fin' profile depending on demand.
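A rough feel for why the field barely fluctuates: the rotor winding's decay time constant dwarfs the regulator's switching period. Both values below are illustrative guesses of mine, not measurements:

```python
import math

# Why field strength only 'fluctuates slightly': winding current decays
# exponentially when the regulator switches off, and the off period is
# short next to the winding's L/R time constant.
TIME_CONSTANT_S = 0.15   # assumed rotor L/R time constant
OFF_TIME_S = 0.005       # assumed regulator off period per cycle

remaining = math.exp(-OFF_TIME_S / TIME_CONSTANT_S)
print(f"Field current left after one off period: {remaining:.1%}")
```

So each clipping pulse only shaves a few percent off the field before the regulator switches back on, hence the gentle sag-and-recover rather than a collapse.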

Of course there's a limit to how much they can output. The diodes get hot due to the voltage drop (0.7 V-ish) times the amps.

It's mostly limited by field and stator saturation, and by cooling efficiency (hence the trend to water-cooled machines). The diode forward drop is small enough not to be a real limiting factor.

OK, so how are the stock ones rated at whatever amps? If you don't know that then you don't know how the upgraded ones are rated either.

Alternator output is mostly rated 'hot' (90 °C springs to mind as a common value, but I'd have to look it up, and it will vary by manufacturer). Again, ultimate output is secondary to the output curve, which most manufacturers will supply.

I'll take a guess that it's some amp level that makes the diodes far too hot, and then it's derated by XX %.

No, it's predominantly rotor/stator saturation, predetermined at the design stage.

It doesn't mean that you can't get 95A out of a completely stock 55A unit. You just have to spin it fast enough and maybe trick up the regulator a bit so there's no current limiting.

Anything's usually possible, it just has to be worth it. You can spin it as fast as you like, until the rotor windings centrifuge or the bearings cry foul, but with diminishing returns. No modern automotive alternator I know of has a 'current limiting regulator', as they're inherently self-limiting via stator saturation. Current limiting regulators went out with dynamos. Dynamos need current limiting because the main power generation component is the armature, and keeping it cool enough to avoid self-incineration is the job of the 'shunt coil' (IIRC) in the regulator/cut-out.

Many years ago I built my own regulator for a Lucas alternator. They're really simple. A power transistor which switches on the current to the rotor. Another transistor and zener diode (voltage sense) which shuts down the power transistor as the voltage rises.
You don't ever want more than 14.7 volts, 14.4 is really the limit for the battery.
If you've got a V1 radar detector, they go into melt down mode at 14.7V.........I've repaired lots of them due to dodgy charging systems.

Most OE analogue regulators used to have three transistors; modern digital ones I'm not sure about, as I have little interest these days, though I'll try to look it up. Anything more than 14.4 V is unnecessary and may affect battery longevity.

Anyway, as I remember it and JMO. Your terminology and my colloquialisms may vary.


Oh, and apologies for the double post, but if you're maybe struggling a bit with visualising all this (not being the brightest spark myself, I know I did), it may help to view an electrical circuit much as you would view a hydraulic one: pressure, in say pounds per square inch (PSI), becomes voltage; flow, in say GPM, becomes amps. Not strictly true, but close enough.
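The analogy even carries through to power, if you like: volts times amps gives watts on one side, PSI times GPM (over the usual 1714 constant) gives hydraulic horsepower on the other. A trivial sketch, with example figures of my own choosing:

```python
# The pressure/flow analogy extended to power: volts x amps = watts,
# just as psi x gpm / 1714 = hydraulic horsepower.
def electrical_watts(volts, amps):
    return volts * amps

def hydraulic_hp(psi, gpm):
    return psi * gpm / 1714   # standard hydraulic horsepower formula

print(electrical_watts(14.0, 95))        # a 95 A alternator at 14 V, in watts
print(f"{hydraulic_hp(2000, 10):.1f}")   # 2000 psi at 10 gpm, in hp
```

Same idea in both worlds: power is pressure times flow.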


Digitally controlled regulators: if you stick one on a test bench and power it up you can sometimes hear a characteristic digital 'whine', and get a flick on the 'scope trace.

The 'smart charge' ones are linked into the PCM/ECU and do things like run up to 17 V after a cold start, or completely kill the charging when sensing the vehicle may be overtaking. Others have to be 'coded in', so it's a trip to the dealer to re-code, or dealer-only full stop, if you need to replace one. You can come to your own decisions as to the efficacy of all this..............


You don't ever want more than 14.7 volts, 14.4 is really the limit for the battery.
If you've got a V1 radar detector, they go into melt down mode at 14.7V.........I've repaired lots of them due to dodgy charging systems.

Not just the aftermarket bits. I've had trouble on hot rods/customs with modern 95 amp alternators frying the older OEM gauges. The designers were working with generators and never expected the gauge mechanism to see 15 volts.

Can you recommend an instrument voltage regulator/limiter to protect the instruments?