Create Your Own Problem Consider electromagnetic fields produced by high voltage power lines. Construct a problem in which you calculate the intensity of this electromagnetic radiation in W/m2 based on the measured magnetic field strength of the radiation in a home near the power lines. Assume these magnetic field strengths are known to average less than a µT. The intensity is small enough that it is difficult to imagine mechanisms for biological damage due to it. Discuss how much energy may be radiating from a section of power line several hundred meters long and compare this to the power likely to be carried by the lines. An idea of how much power this is can be obtained by calculating the approximate current responsible for µT fields at distances of tens of meters.
Step 1 of 3
In order to formulate a problem involving the intensity of the radiation, the corresponding magnetic field strength must be known. The problem can be formulated as follows.
Evaluate the intensity of the electromagnetic radiation, in W/m2, at a home near high voltage power lines, where the magnetic field strength is measured to be 0.65 µT. Take the permittivity of air as ε0 = 8.85×10⁻¹² m⁻³ kg⁻¹ s⁴ A².
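Treating the field as part of an electromagnetic wave, the intensity follows from I = cB²/(2µ0), which can equivalently be written in terms of the permittivity as I = ε0·c³·B²/2 (since 1/µ0 = ε0c²). A minimal sketch of the calculation, assuming a field at the sub-µT level stated in the problem (0.65 µT is an illustrative value):

```python
import math

# Physical constants (SI units)
eps0 = 8.85e-12      # permittivity of free space, m^-3 kg^-1 s^4 A^2
c = 2.998e8          # speed of light, m/s

def wave_intensity(B):
    """Intensity (W/m^2) of an EM wave with peak magnetic field B (T).

    Uses I = eps0 * c^3 * B^2 / 2, equivalent to I = c * B^2 / (2 * mu0).
    """
    return eps0 * c**3 * B**2 / 2

B = 0.65e-6          # measured magnetic field strength, T (assumed value)
I = wave_intensity(B)
print(f"Intensity: {I:.1f} W/m^2")   # on the order of tens of W/m^2
```

Note that this treats the 60-Hz field as if it were a freely propagating wave; the result is an upper bound, since near a power line most of the field energy is stored in the near field rather than radiated away.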
Step 2 of 3
Whenever a power line system carries large currents, electric and magnetic fields form around the lines, and some power is carried away from the source by radiation. The radiated power depends directly on the current passing through the line and on the length of line over which the fields extend.
For a power line a few hundred meters long, the power lost by radiation is found to be roughly 3% of the total losses in the system. Such radiative losses are therefore usually negligible compared with larger losses, such as those produced by the Joule (resistive heating) effect.
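To estimate the current responsible for µT-scale fields at tens of meters, the line can be modeled as a long straight wire, for which B = µ0·I/(2πr). Inverting for the current and multiplying by a typical transmission voltage gives the power carried by the line. A sketch under assumed values (r = 30 m and a 500 kV line are illustrative choices, not given in the problem):

```python
import math

mu0 = 4 * math.pi * 1e-7   # permeability of free space, T*m/A

def line_current(B, r):
    """Current (A) in a long straight wire producing field B (T) at distance r (m).

    Inverts B = mu0 * I / (2 * pi * r).
    """
    return 2 * math.pi * r * B / mu0

B = 0.65e-6                # sub-µT field at the home, T (assumed value)
r = 30.0                   # distance from the line, m (assumed value)
V = 500e3                  # transmission voltage, V (assumed value)

I_line = line_current(B, r)
P_carried = V * I_line
print(f"Line current:  {I_line:.1f} A")        # roughly 100 A
print(f"Power carried: {P_carried/1e6:.0f} MW")
```

A current of order 100 A at transmission voltages corresponds to tens of megawatts carried by the line, which dwarfs the power radiated by a few-hundred-meter section.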