The correct answer is: D. core flux density is increased.
When a given transformer is run at its rated voltage but at a reduced frequency, the flux density in the core increases. The peak flux density is proportional to the applied voltage and inversely proportional to the frequency, so holding the voltage at its rated value while lowering the frequency forces the core to carry a higher peak flux density.
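As a rough sketch of the relation behind this statement (the standard transformer EMF equation; the turns count $N$ and core cross-sectional area $A$ are generic symbols, not values given in the question):

$$
E \approx 4.44\, f\, N\, A\, B_{\max}
\quad\Longrightarrow\quad
B_{\max} = \frac{E}{4.44\, f\, N\, A}
$$

With $E$ held at the rated voltage and $N$, $A$ fixed by the construction, $B_{\max}$ varies as $1/f$, so a lower frequency means a higher peak flux density.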
The iron losses in a transformer depend on both the flux density and the frequency: the hysteresis loss varies roughly as the frequency times the flux density raised to the Steinmetz exponent, while the eddy-current loss varies as the square of both. When the flux density rises because the frequency is reduced at constant voltage, the hysteresis loss increases and the eddy-current loss stays roughly constant, so the total iron loss goes up.
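As a sketch of how the two loss components scale (standard Steinmetz-type expressions; $k_h$, $k_e$ and the exponent $n \approx 1.6$–$2$ are material constants assumed here for illustration):

$$
P_h = k_h\, f\, B_{\max}^{\,n}, \qquad P_e = k_e\, f^{2}\, B_{\max}^{2}
$$

Substituting $B_{\max} \propto V/f$ at constant rated voltage gives $P_h \propto V^{n}/f^{\,n-1}$, which grows as $f$ falls, while $P_e \propto V^{2}$ stays roughly constant, so the total iron loss increases.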
The rated core flux density is chosen close to, but below, the knee of the steel's magnetisation curve. If operation pushes the flux density above that value, the core begins to saturate: the magnetising current rises sharply and becomes distorted, the output voltage waveform may be distorted, and the extra losses reduce efficiency and heat the transformer. Running at rated voltage with a reduced frequency is therefore undesirable, because it drives the core flux density above its design value and toward saturation.
Brief explanations of each option:
- Option A: Flux density remains unaffected. Incorrect. The peak flux density is proportional to the applied voltage and inversely proportional to the frequency, so changing the frequency at a fixed (rated) voltage necessarily changes the flux density.
- Option B: Iron losses are reduced. Incorrect. Because the flux density rises when the frequency is reduced at rated voltage, the hysteresis loss increases and the eddy-current loss is roughly unchanged, so the iron losses increase rather than decrease.
- Option C: Core flux density is reduced. Incorrect. Inverse proportionality to frequency means a lower frequency gives a higher flux density, not a lower one.
- Option D: Core flux density is increased. Correct. With the voltage held at its rated value and the frequency reduced, the peak flux density rises in the ratio of rated frequency to operating frequency, pushing the core toward saturation; a small numerical sketch of this follows the list.
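As a small numerical sketch of the scaling above (all values hypothetical: a 50 Hz design with an assumed rated peak flux density of 1.5 T, run at 40 Hz):

```python
# Hypothetical illustration of B_max and iron-loss scaling when a
# transformer designed for 50 Hz is run at rated voltage but 40 Hz.
# All numbers are assumed for the example, not taken from the question.

f_rated = 50.0      # design frequency, Hz
f_run = 40.0        # reduced operating frequency, Hz
B_rated = 1.5       # assumed rated peak flux density, tesla
n = 1.6             # assumed Steinmetz exponent for hysteresis loss

# B_max is proportional to V/f and V is held at its rated value,
# so B_max scales with the ratio f_rated / f_run.
B_run = B_rated * f_rated / f_run

# Hysteresis loss ~ f * B^n and eddy loss ~ f^2 * B^2,
# expressed relative to their values at rated frequency.
hysteresis_ratio = (f_run / f_rated) * (B_run / B_rated) ** n
eddy_ratio = (f_run / f_rated) ** 2 * (B_run / B_rated) ** 2

print(f"Peak flux density rises from {B_rated:.2f} T to {B_run:.2f} T")
print(f"Hysteresis loss scales by about {hysteresis_ratio:.2f}x")
print(f"Eddy-current loss scales by about {eddy_ratio:.2f}x")
```

With these assumed numbers the peak flux density rises to about 1.9 T, above the saturation knee of typical transformer steel, while the hysteresis loss rises by roughly 14% and the eddy-current loss is essentially unchanged; in practice saturation would make the losses and magnetising current even worse than this simple scaling suggests.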