If the frequency applied to a transformer is reduced while the voltage is kept the same, the value of the flux will increase. What effect will this have on the performance of the transformer? Please comment on the new kVA rating, the variable and constant losses, the magnetising current, etc.

Author : Ketan - From: India

#1

Sat, January 2nd, 2016 - 12:29

An important parameter that generally should not be allowed to change when operating a transformer at other than its rated frequency is the maximum allowable flux density, Bm. This is straightforward enough: the applied primary voltage and the frequency must change in the same direction and by the same percentage. For example, to properly and safely operate a 120 V, 60 Hz transformer from a 50 Hz power line, the primary voltage must be reduced by the fraction 50/60, yielding 100 V. Although this will, indeed, make the transformer 'happy', there are two side effects: the systems planner must now figure out how to conveniently and efficiently reduce the line voltage and, having done so, how best to accommodate the new secondary voltage (after all, the turns ratio of the transformer has remained unchanged).
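The rule above follows from the transformer EMF equation, V ≈ 4.44 f N A Bm: with the turns N and core area A fixed, holding Bm constant means scaling the applied voltage in direct proportion to frequency. A minimal sketch (the function name is an illustrative assumption, not from the source):

```python
def primary_voltage_for_constant_flux(v_rated, f_rated, f_new):
    """Voltage that keeps Bm at its design value when the supply
    frequency changes, from V ~ 4.44 * f * N * A * Bm with N, A fixed."""
    return v_rated * f_new / f_rated

# The 120 V, 60 Hz transformer moved to a 50 Hz line:
print(primary_voltage_for_constant_flux(120.0, 60.0, 50.0))  # 100.0
```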
It should be realized, too, that the kVA capability is now 5/6 of the 60 Hz rating. Somewhat paradoxically, the operating efficiency of the transformer may show an improvement when operating from 50 Hz because of the lower core losses (see Fig. 3.1). It stands to reason that the converse situation applies when operating a 50 Hz transformer from 60 Hz. Here, the kVA rating may exceed the original 50 Hz rating, although the gain will tend to be less than the 6/5 ratio because of the increased core losses. The primary can now use 6/5 × 120 V, or 144 V.
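Under this constant-flux adaptation, if the load current is held at its rated value, the apparent power scales with the applied voltage and hence with frequency. A rough sketch of both rating changes (the name and figures are illustrative; the real gain at 60 Hz will be somewhat less than the ideal ratio because of the increased core losses noted above):

```python
def kva_at_new_frequency(kva_rated, f_rated, f_new):
    # With the primary voltage rescaled by f_new/f_rated to hold Bm
    # constant and the load current kept at its rated value, S = V * I
    # scales by the same factor. Ideal figure only: core-loss changes
    # erode the apparent gain when moving to the higher frequency.
    return kva_rated * f_new / f_rated

print(kva_at_new_frequency(1.0, 60.0, 50.0))  # 60 Hz unit on 50 Hz: 5/6 kVA
print(kva_at_new_frequency(1.0, 50.0, 60.0))  # 50 Hz unit on 60 Hz: up to 6/5 kVA
```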
In making the adaptations described above to a new operating frequency, care is needed to keep the secondary load current within original limits; a load change may be required.
The case of operating the 50 Hz transformer from a 60 Hz line may, in some situations, be easier to implement if one can accommodate less than the maximum kVA capability. Here, it may be acceptable merely to substitute the new power line, making no change in either the applied primary voltage or the load. We will have violated the postulate of keeping the maximum allowable flux density at the designed level, but the departure is on the safe side. The 50 Hz transformer will simply operate as if it had been designed with a greater than needed safety factor. Operation will be cool and linearity will be good, but the kVA rating will be about 120/144, or 83%, of what it could be at the elevated line voltage.
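A quick check of this direct-substitution case, using the 120 V running example from above (variable names are illustrative assumptions):

```python
f_rated, f_line = 50.0, 60.0
v_rated = 120.0  # primary voltage left unchanged on the 60 Hz line

# Bm scales as V/f, so at the higher frequency the flux density falls
# to 5/6 of the design value -- the "safe side" departure noted above.
flux_density_fraction = (v_rated / f_line) / (v_rated / f_rated)

# Available kVA relative to the fully adapted case (144 V applied):
kva_fraction = v_rated / (v_rated * f_line / f_rated)  # 120/144

print(flux_density_fraction)  # about 0.833
print(kva_fraction)           # about 0.833, i.e. roughly 83%
```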
A nice thing about the transformer substitution described above is that one can legitimately plan for a smaller, less massive transformer for a final prototype of the system. (It is easy enough to observe that appropriately designed high-frequency transformers tend to be physically smaller than their lower-frequency counterparts.)
The preceding discussion pertains to power-line frequencies and sinusoidal waveforms. With square waves and audio frequencies, various factors come into play that tend to make transformer comparisons more nebulous. A few of these can be listed as follows: type of magnetic core material, core fabrication (butt joint, lap joint, tape-wound, powdered material, air gap, etc.), primary inductance, distributed and stray capacitance, lamination thickness, insulation quality of laminate surfaces, skin effect, leakage inductance, harmonic response, and others. Empirical investigation is generally the best practical procedure for such operation.
A special situation arises when dealing with transformers in a variable duty cycle circuit, such as in pulse-width modulation (PWM) switchmode power supplies. Unique in such operation is the presence of a d.c. component in the waveform. This situation is dealt with elsewhere.