Some people stubbornly insist that The Earth is flat. One piece of evidence
against this is that folk at different places in the world see The Sun, at its
highest on a given day, at different angles above the horizon. Apparently some
flat-earthers hand-wave this (and the fact that things vanish over the horizon)
away by saying light curves upwards, at least near the surface of The Earth. As
it happens, round-earthers (like me) believe that light
curves *downwards* near the surface, due to the thicker atmosphere
causing light to slow down (i.e. the greater density of the troposphere gives it
a higher refractive index). This is why sunrise and sunset at the equinoxes are
in fact a little more than twelve hours apart; and day-length at summer solstice
is a little longer than night-length at the winter solstice.
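This effect can be put to rough numbers. The following back-of-envelope sketch is my own, using the standard value of about 34 arc-minutes for atmospheric refraction at the horizon; it estimates the extra daylight at the equator, where The Sun crosses the horizon vertically, so the geometry is simplest:

```python
# Rough arithmetic, not a precise ephemeris: standard atmospheric
# refraction lifts the image of The Sun about 34 arc-minutes when it is
# at the horizon, so each of sunrise and sunset shifts by roughly the
# time The Earth takes to rotate through that angle.
REFRACTION_DEG = 34.0 / 60.0           # ~0.57 degrees at the horizon
MINUTES_PER_DEG = 24.0 * 60.0 / 360.0  # The Earth turns 1 degree in 4 minutes

shift = REFRACTION_DEG * MINUTES_PER_DEG  # minutes gained at each horizon
extra_daylight = 2.0 * shift              # sunrise earlier, sunset later
print(round(extra_daylight, 1))           # about 4.5 minutes
```

At higher latitudes The Sun crosses the horizon at a shallower angle, so the shift is larger there.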

However, the flat-earthers would have us believe that light curves upwards,
not downwards, near the surface of The Earth. This means the speed of light is
greater near the ground than higher up; presumably this is due to some factor
that counteracts the higher refractive index of the denser air near the surface;
but so be it. This at least requires that there *be* some function of
altitude, above The Earth's surface, that gives the speed of light at that
altitude. Call that ({speeds}: c |{altitudes}), with c(h) being the speed of
light at altitude h.
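For the round-earth version of such a function, here is a minimal sketch; the exponential fall-off of refractivity and its scale height are my illustrative assumptions, not part of the argument:

```python
import math

C0 = 299_792_458.0   # speed of light in vacuum, m/s
N0 = 1.000293        # refractive index of air at sea level (roughly)
H = 8500.0           # assumed density scale height of the atmosphere, m

def c(h):
    """Speed of light at altitude h (metres): c(h) = C0 / n(h), where
    the refractivity n(h) - 1 is assumed to scale with air density,
    falling off exponentially with altitude."""
    n = 1.0 + (N0 - 1.0) * math.exp(-h / H)
    return C0 / n

# In this model light is slower in the denser air near the surface:
print(c(0.0) < c(10_000.0))   # True
```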

Consider a ray of light with an end somewhere on The Earth's surface. In the simple case, this comes down from the sky, so its trajectory passes through a definite position at each altitude; and, since its only curvature is vertical, the points on The Earth's surface under it form a line segment. We can use distance along that line segment as a horizontal co-ordinate, call it x, and altitude h as the vertical one, describing the trajectory of the ray as a mapping ({reals}: x |{altitudes}) that tells us how far along the line the point under the ray lies at each altitude h. We can choose x(0) = 0, i.e. we measure distance along the line from the point where the ray reaches the surface, and measure it positive in the direction from which the ray comes, so that we get ({positives}: x |{positive altitudes}).

So the fun part is working out how our trajectory curves as a result of c's variation with altitude. Consider a wave-front whose normal is in the [x, h] plane; let [u, v] be the unit normal; the displacement along the wave-front in this plane is then parallel to [−v, u], so the points on the wave-front close to a given [x(h), h] are at positions [x−e.v, h+e.u] for various small e. The velocity of the wave-front here is c(h+e.u).[u, v] so, a small time t later, the wave-front shall have advanced to (for various small e, here presumed small enough that we can ignore terms in e.e; and for small t also):

- [x, h] +e.[−v, u] +c(h+e.u).t.[u, v]
- = [x, h] +e.[−v, u] +(c(h) +e.u.c'(h)).t.[u, v]
- = [x, h] +c(h).t.[u, v] +e.([−v, u] +u.c'(h).t.[u, v])

so that the normal has changed by an amount parallel to the wave-front, i.e. rotated. The c-scaled normal to the wave-front is the velocity of our ray at each point, so rotates in the same way; in advancing t.c(h) along the ray, its direction [u, v] changes to the unit in the direction [u +u.v.t.c'(h), v −u.u.t.c'(h)] which, for small enough t, is indeed within O(t.t) of a unit. So with p(t) = [x(h), h] and p'(t) = c(h).[u, v], the component of p''(t) perpendicular to the ray is c(h).u.c'(h).[v, −u], where [v, −u] is a unit perpendicular to p'(t); the remaining component of p''(t), along the ray, merely changes the speed as c varies along the trajectory, so it is the perpendicular part that bends the ray.
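As a sanity check on this rotation, the sketch below (with an assumed exponential c(h), which is not part of the argument above) advances two nearby points of a wave-front numerically, each at its own local speed, and compares the tilt of the new normal against the predicted change of −u.c'(h).t in its angle:

```python
import math

C0, K, H = 299_792_458.0, 2.93e-4, 8500.0   # assumed atmosphere model

def c(h):
    # round-earth style profile: light slower in denser low-altitude air
    return C0 / (1.0 + K * math.exp(-h / H))

def c_prime(h, dh=0.01):
    # numerical derivative of c with respect to altitude
    return (c(h + dh) - c(h - dh)) / (2.0 * dh)

# A wave-front with unit normal [u, v] at altitude h0; two points of the
# front sit at [x, h] = e.[-v, u] for e = -1 and +1 (metres), and each
# advances for a time t at its own local speed c(h) along [u, v].
theta0 = 0.3
u, v = math.cos(theta0), math.sin(theta0)
h0, t = 1000.0, 1e-9
points = []
for e in (-1.0, 1.0):
    x, h = -e * v, h0 + e * u
    points.append((x + c(h) * t * u, h + c(h) * t * v))

# New tangent along the front, and the angle of its normal:
tx = points[1][0] - points[0][0]
ty = points[1][1] - points[0][1]
theta_new = math.atan2(-tx, ty)          # normal = tangent rotated -90 deg
predicted = theta0 - u * c_prime(h0) * t
print(abs(theta_new - predicted) < 1e-10)   # True
```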

Now, with x increasing with h, u and v are both positive so c'(h).[v, −u] points downwards if c'(h) is positive (the round-earth model, with light moving faster at higher altitude due to thinner air), upwards if c'(h) is negative (the flat-earth model, with light moving slower at higher altitudes, despite the thinner air).
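The sign argument can also be watched numerically. In this sketch (entirely my own construction: the exponential profiles, scale height and step sizes are illustrative assumptions), a ray launched horizontally is Euler-integrated using the rotation of the ray's direction derived above, once with c increasing with altitude and once with it decreasing:

```python
import math

C0, K, H = 299_792_458.0, 2.93e-4, 8500.0   # assumed atmosphere model

def make_c(sign):
    """Toy speed-of-light profile: sign = +1 makes c(h) increase with
    altitude (round-earth refraction); sign = -1 makes it decrease
    (the flat-earth hypothesis)."""
    def c(h):
        return C0 / (1.0 + sign * K * math.exp(-h / H))
    return c

def trace(c, h0, t_total=1e-4, steps=20_000):
    """Follow a ray launched horizontally from altitude h0: position
    advances at speed c(h) along the unit direction [u, v], which
    rotates at the rate -u.c'(h) per unit time.  Returns the final
    altitude."""
    x, h = 0.0, h0
    u, v = 1.0, 0.0
    dt = t_total / steps
    for _ in range(steps):
        x += c(h) * u * dt
        h += c(h) * v * dt
        dh = 1.0
        c_prime = (c(h + dh) - c(h - dh)) / (2.0 * dh)
        phi = -u * c_prime * dt            # rotation this step
        u, v = (u * math.cos(phi) - v * math.sin(phi),
                u * math.sin(phi) + v * math.cos(phi))
    return h

h_round = trace(make_c(+1), 1000.0)   # c'(h) > 0: ray bends downwards
h_flat = trace(make_c(-1), 1000.0)    # c'(h) < 0: ray bends upwards
print(h_round < 1000.0 < h_flat)      # True
```

Over the roughly thirty kilometres this traces, the ray drops (or rises) by a few metres; not much, but enough to matter when looking at distant objects near the horizon.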

Written by Eddy.