
Why do electromagnetic waves get weaker with distance?

Giuseppe Frisella
1 min read · Feb 5, 2024


The intensity of an electromagnetic wave is the power transmitted per unit area. At great distances most antennas behave as point emitters (Huygens-Fresnel principle), so as the distance increases the emitted power spreads over a spherical surface whose area grows with the square of that distance.

Consequently, the intensity, the ratio of the transmitted power to the area over which it spreads, I = P / (4πr²), is inversely proportional to the square of the distance r.
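To make the relation concrete, here is a minimal Python sketch (the 100 W source power and the distances are just illustrative values) that evaluates I = P / (4πr²) and shows the intensity dropping by a factor of four each time the distance doubles:

```python
import math

def intensity(power_w: float, distance_m: float) -> float:
    """Intensity (W/m^2) of an isotropic point emitter: power spread over a sphere of radius distance_m."""
    return power_w / (4 * math.pi * distance_m ** 2)

# Hypothetical 100 W emitter: doubling the distance quarters the intensity.
for d in (1.0, 2.0, 4.0):
    print(f"{d:4.0f} m  ->  {intensity(100.0, d):.3f} W/m^2")
```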

It should be noted that only under particular conditions, namely plane and coherent radiation, in practice a laser beam, does the intensity remain roughly constant over long distances.

But only up to a point: in the Lunar Laser Ranging experiment, which measures the Earth-Moon distance by illuminating the retroreflectors left behind by the Apollo missions with a laser and precisely timing the round trip, the beam arrives at the Moon with a diameter of about 6 km.
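As a rough order-of-magnitude check, the sketch below estimates the beam diameter at the Moon from the full-angle divergence of the outgoing beam. The ~3.5 arcsecond effective divergence is an assumed value (set mainly by atmospheric turbulence rather than by diffraction alone), chosen only to illustrate how a microradian-scale divergence grows to kilometres over the Earth-Moon distance:

```python
import math

EARTH_MOON_M = 3.844e8                    # mean Earth-Moon distance in metres
ARCSEC_TO_RAD = math.pi / (180 * 3600)    # arcseconds to radians

def beam_diameter(full_angle_divergence_rad: float, distance_m: float) -> float:
    """Approximate beam diameter after propagating distance_m with a small full-angle divergence."""
    return full_angle_divergence_rad * distance_m

# Assumed effective full-angle divergence of ~3.5 arcsec.
divergence = 3.5 * ARCSEC_TO_RAD
print(f"Beam diameter at the Moon: {beam_diameter(divergence, EARTH_MOON_M) / 1e3:.1f} km")
```

With these assumptions the result comes out near 6.5 km, consistent with the figure quoted above: even a highly collimated laser eventually spreads, and its intensity again falls off with distance.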
