Overview and important terms

[“The greatest environmental change wrought so far by the human species is the change in the electromagnetic environment of the earth. For billions of years the earth’s electromagnetic environment was virtually “silent” in the range of the electromagnetic spectrum below visible light, and light itself was the most abundant source of electromagnetic energy. Now, in just a few decades, with the explosion of wireless signals from radio and TV broadcasts, radar, military applications, microwave towers, cell phones, and so on, the density of radio waves and microwaves in our environment is many millions of times higher than the natural levels with which all life on earth evolved. In addition, the increased use of electronics by electricity consumers has resulted in a situation where our building wiring systems, intended to carry only 50 or 60 Hz current, now broadcast harmful high frequencies as well. We have created and are living in a blanket of electrosmog never before experienced by living species, without having considered the consequences. What were we thinking?

In order to understand what we should have been thinking about, we need to be familiar with a few terms and features of manmade electromagnetic energy.

Atoms are made up of negatively charged electrons that orbit around a positively charged nucleus. Electrons can jump to higher orbits when they are excited; their return to their original orbit releases energy. Electrons shaken loose and traveling through a wire produce electricity. Light, heat, electricity and nuclear activity are all forms of electromagnetic energy.

Energy moves away from its source in waves, and is classified according to the length of its wave. Utility-provided electricity of 60 Hz (Hertz) has 60 waves per second. Frequencies below 3 kHz (3 kilohertz, or 3 thousand Hertz, 3,000 waves per second) are called Extremely Low Frequency (ELF) and are measured in terms of their electric and magnetic components.

The electric field is related to the voltage in the conductor. Electric fields are present even if no current is flowing. For instance, a plugged-in lamp and the cord to it have an electric field, even if the lamp is not turned on. Electric fields are measured in V/m, or volts per meter.

The magnetic field is generated by the current flowing through a conductor, and it varies in strength with the strength of the current. Magnetic field strength is measured in milliGauss (mG), which is 1/1000 of a Gauss. Another unit of measure for magnetic field strength is the microtesla, µT. One µT equals 10 mG.
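The two magnetic-field units above are related by a fixed factor (1 tesla = 10,000 gauss, so 1 µT = 10 mG). A minimal sketch of that conversion, with hypothetical helper names not taken from the text:

```python
# Magnetic flux density unit conversion.
# 1 tesla = 10,000 gauss, therefore 1 microtesla (uT) = 10 milligauss (mG).
# Function names are illustrative, not from the original text.

def microtesla_to_milligauss(ut: float) -> float:
    """Convert microtesla to milligauss."""
    return ut * 10.0

def milligauss_to_microtesla(mg: float) -> float:
    """Convert milligauss to microtesla."""
    return mg / 10.0

print(microtesla_to_milligauss(0.1))  # 0.1 uT -> 1.0 mG
print(milligauss_to_microtesla(2.0))  # 2.0 mG -> 0.2 uT
```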

The lower the frequency, the longer the wavelength. The higher the frequency, the shorter the wavelength. The shorter the wave, the more energy it carries.
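The frequency–wavelength tradeoff above follows from the standard relation wavelength = c / frequency, where c is the speed of light. A small sketch (the helper name is illustrative, not from the text):

```python
# Wavelength from frequency: lambda = c / f.
C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_m(frequency_hz: float) -> float:
    """Return the free-space wavelength in meters for a given frequency in Hz."""
    return C / frequency_hz

# 60 Hz power-line fields have a wavelength of roughly 5,000 km,
# while 27 MHz radio waves are only about 11 m long.
print(wavelength_m(60.0))   # roughly 5e6 meters
print(wavelength_m(27e6))   # roughly 11 meters
```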

As we move up the EM spectrum from the longer to the shorter wavelengths, we encounter first electrical power transmission, then radio, TV, radar/microwave, radiant heat/visible light, ultraviolet, x-rays and gamma rays. The frequencies at and below that of visible light are known as non-ionizing, and those above light as ionizing. At ionizing frequencies, the particles of radiation contain enough energy to eject electrons from atoms and molecules, leaving them electrically imbalanced, or ionized. Ionized molecules are highly reactive and can damage cells.
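The ionizing/non-ionizing divide described above can be made concrete with the photon-energy relation E = h·f: only when the per-photon energy exceeds an atom's ionization energy (roughly 10 eV or more for common atoms) can the radiation eject electrons. A sketch using standard physical constants (the frequency values chosen for comparison are illustrative):

```python
# Photon energy E = h * f, expressed in electronvolts.
H = 6.62607015e-34    # Planck constant, J*s
EV = 1.602176634e-19  # joules per electronvolt

def photon_energy_ev(frequency_hz: float) -> float:
    """Return the energy of a single photon, in eV, for a given frequency."""
    return H * frequency_hz / EV

# Everything at or below visible light falls far short of the
# ~10+ eV needed to ionize common atoms; X-rays exceed it easily.
for label, f in [("60 Hz power", 60.0),
                 ("27 MHz RF", 27e6),
                 ("2.45 GHz microwave", 2.45e9),
                 ("visible light (~600 THz)", 6e14),
                 ("soft X-ray (~3e16 Hz)", 3e16)]:
    print(f"{label}: {photon_energy_ev(f):.3e} eV")
```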

As technology advanced and we began to use the higher frequencies, it was accidentally discovered that frequencies of about 27 MHz (27 megahertz, or 27 million cycles per second) caused body heating. It was inaccurately concluded that any biological effects not caused by ionization must be caused solely by overheating. Thus the first safety standard set for exposure to manmade electromagnetic energy took only heating into consideration, relying mostly on the work of Herman Schwan, a biophysicist. In the 1950s, Schwan worked for the Defense Department, estimating “safety” according to how much radar microwave energy it took to heat metal balls and containers of salt water, which he believed represented both the size and electrical characteristics of animals and humans.

Operating on the assumption that, with regard to non-ionizing radiation, avoiding heating meant safety, and with heating occurring at 100 mW/cm2, the Air Force applied a “safety factor” of ten and set an initial safety standard of 10 mW/cm2 (10 milliwatts of energy absorbed per square centimeter of tissue) in 1957. Later standard setters, influenced ever more strongly by industry and the military, ignored the emergence of evidence that biological effects were indeed occurring at levels far below 10 mW/cm2, and in 1966 the American National Standards Institute (ANSI) developed ANSI C95.1-1966 at 10 mW/cm2. The rewritten ANSI/IEEE C95.1-1992 set a two-tiered recommendation, one for the general public and one for RF workers, and lowered the limit for some frequencies to 1 mW/cm2, but the standard still presumes only thermal effects, in the face of now monumental evidence to the contrary. The EPA called this standard seriously flawed and specifically cited the failure to recognize nonthermal effects. Nonetheless, it remains in effect.
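The mW/cm2 figures above can be compared with limits quoted in the SI unit W/m2, since 1 mW/cm2 = 10 W/m2. A minimal conversion sketch (the helper name is illustrative):

```python
# Power density unit conversion:
# 1 mW/cm^2 = 1e-3 W / 1e-4 m^2 = 10 W/m^2.

def mw_per_cm2_to_w_per_m2(mw_cm2: float) -> float:
    """Convert a power density from mW/cm^2 to W/m^2."""
    return mw_cm2 * 10.0

print(mw_per_cm2_to_w_per_m2(10.0))  # 10 mW/cm^2 -> 100.0 W/m^2
print(mw_per_cm2_to_w_per_m2(1.0))   # 1 mW/cm^2  -> 10.0 W/m^2
```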

In a July 16, 2002 letter from Norbert Hankin of the EPA’s Center for Science and Risk Assessment, Radiation Protection Division to Janet Newton, President of The EMR Network, Mr. Hankin writes: “The FCC’s current exposure guidelines, as well as those of the Institute of Electrical and Electronics Engineers (IEEE) and the International Commission on Non-Ionizing Radiation Protection, are thermally based, and do not apply to chronic, nonthermal exposure situations.” — Shivani]
