While the rest of the world was worrying (either as a joke or senselessly) about the Earth being eaten up by a black hole, my computer has been processing data for the project; and soon it might even be processing something other than simulations. It's been repeatedly saying things like:

lhcathome Wed 10 Sep 2008 07:01:55 CEST Sending scheduler request to http://lhcathome.com.ch/lhcathome_cgi/cgi
lhcathome Wed 10 Sep 2008 07:01:55 CEST Reason: To fetch work
lhcathome Wed 10 Sep 2008 07:01:55 CEST Requesting 8640 seconds of new work
lhcathome Wed 10 Sep 2008 07:02:00 CEST Scheduler request succeeded
lhcathome Wed 10 Sep 2008 07:02:00 CEST No work from project
lhcathome Wed 10 Sep 2008 07:02:00 CEST Deferring scheduler requests for 2 hours, 9 minutes and 7 seconds

for some time now, with occasional interludes like:

lhcathome Wed 10 Sep 2008 09:39:00 CEST Sending scheduler request to http://lhcathome.com.ch/lhcathome_cgi/cgi
lhcathome Wed 10 Sep 2008 09:39:00 CEST Reason: To fetch work
lhcathome Wed 10 Sep 2008 09:39:00 CEST Requesting 8640 seconds of new work
lhcathome Wed 10 Sep 2008 09:39:35 CEST Scheduler request succeeded
lhcathome Wed 10 Sep 2008 09:39:35 CEST Started download of file w3_lhc_symmetric-q2_3__41__s__64.31_59.32__16_18__5__75_1_sixvf_boinc24097.zip

but they're not real data yet; however, less than an hour after that last work-load, CERN (conseil européen pour la recherche nucléaire) fired up the beam and sent some particles round their big loop. It'll be a while before they're collecting gibibytes of data per second from collisions, but the infrastructure to process that data is already in place; thousands of computers all round the world are signed up to make their spare processor cycles available via the LHC@home project. My slow old computer's been lending a hand since March 2008: but, now that lots of folk are signing up to join in, its position in the rankings of contributions is plummeting – which is a good thing, as it means lots of folk with newer, faster machines are pitching in.

Update (2010/April): at the end of 2008 my slow old
machine died, so I replaced it with a shiny new
64-bit quad-core AMD Phenom™ 9950. For each of the four CPUs,
Linux's `/proc/cpuinfo` reports 5211 bogomips while the BOINC manager
(which looks after running LHC@home and (now) SETI@home) reports 1999 floating
point MIPS (Whetstone) and 7769 integer MIPS (Dhrystone). However, I don't see
much in the way of traffic from LHC@home any more – they seem to have more
volunteers available than tasks that need performing.

There's been some fretting about the possibility that the LHC might make black holes, so it seems worth pointing out why this isn't something to worry about, essentially because of reasons of scale. The black holes everyone's heard about are the big ones that arise from the deaths of big stars and the vast concentrations of mass near the centres of galaxies; so, understandably, folk tend to think about big scary monsters that can swallow whole solar systems without even burping. However, the speculation that the LHC might produce black holes relates to an altogether different (quantum mechanical) mechanism for black holes to come into being, which depends on the hypothesis that it's possible to have really tiny microscopic black holes.

The theory of black holes is a branch of general relativity; and one of the things to remember about it is that the solutions to the field equations are entirely scalable. A black hole arises anywhere that a mass, M, lies entirely within a distance R = 2.G.M/c/c of some point, where G is Newton's gravitational constant and c is the speed of light; the ratio R/M = 2.G/c/c = 1.4847e-27 m/kg, which is extremely tiny. All the same, the threshold radius is simply proportional to the mass, so the field equations of gravity allow black holes of arbitrary mass, big or small.
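That ratio is easy to check for yourself. Here's a quick sketch using round-number values for G and c:

```python
# Verify the Schwarzschild radius-to-mass ratio R/M = 2.G/c/c quoted above.
G = 6.674e-11   # Newton's gravitational constant, m^3/(kg s^2)
c = 2.998e8     # speed of light, m/s

ratio = 2 * G / c**2          # metres of horizon radius per kilogram
print(ratio)                  # ~1.485e-27 m/kg

# Scalability in action: a solar-mass object (~2e30 kg) would need to fit
# within the familiar ~3 km horizon to be a black hole.
print(ratio * 2e30)           # ~3e3 m
```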

So, when the theory suggests that the LHC might make a black hole, it's going to be one of a size commensurate with the masses and energies of the particles being crashed into one another. The highest energy collisions I've heard described for the LHC are planned experiments with lead ions, colliding with a combined energy of 1148 TeV (one electron-Volt, eV, is the energy a single electron (or proton) picks up or loses when moving across a potential difference of one Volt; and the quantifier tera, or T, is just a factor of a million million). That's got a big number in it (1148 million million) but the unit of measurement, eV, is rather tiny – especially when we divide it by the square of the speed of light to work out the mass associated with that much energy. In fact, the mass of even an electron is equivalent to over half a million eV. When we multiply that big number by that tiny unit of energy, we get less than a fifth of a milli-Joule. One whole Joule is enough energy to lift a kg by a height of barely more than ten centimetres (or one pound by not quite nine inches) at Earth's surface; a milli-Joule is only enough to lift ten grammes by just over a centimetre (or an ounce through not quite one seventh of an inch); and we're dealing with less than a fifth of that.
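The conversion from that scary-sounding energy to everyday units takes only a couple of lines (using the standard value of one eV in joules):

```python
# 1148 TeV, the quoted combined energy of a lead-ion collision, in joules.
eV = 1.602e-19                 # one electron-volt in joules
E = 1148e12 * eV               # tera = 1e12
print(E)                       # ~1.84e-4 J: under a fifth of a millijoule

# How high would that lift ten grammes at Earth's surface?
g = 9.81                       # gravitational acceleration, m/s^2
print(E / (0.010 * g))         # ~1.9e-3 m, i.e. about two millimetres
```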

Furthermore, now that you have some idea of how tiny that amount of energy is in familiar terms, I should remind you that it's the associated mass that matters, and for that we have to divide it by the square of the speed of light. That gives us a mass of just over 2 atto-grammes (the quantifier atto means one millionth of one millionth of one millionth, or 1e-18). On atomic scales, that's a pretty respectable amount – it's the mass of nearly six thousand lead atoms – but it's tiny by familiar standards and just about undetectable when it comes to actually having an appreciable gravitational influence on the things around it. I once did an extremely sensitive experiment to measure the gravitational influence of some masses of several kilogrammes, across a few centimetres; it was just about possible, but the effect is very small. When you scale the mass down by a factor of around a thousand million million million, from kilogrammes to atto-grammes, the effect becomes utterly imperceptible.
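The mass figure, and the comparison with lead atoms, can be checked directly (the lead-208 atomic mass is a standard round-number value):

```python
# Mass equivalent of 1148 TeV: m = E / c^2.
eV = 1.602e-19                 # joules per electron-volt
c = 2.998e8                    # speed of light, m/s
E = 1148e12 * eV               # collision energy in joules

m = E / c**2                   # kg
print(m * 1e21)                # in atto-grammes (1 ag = 1e-21 kg): ~2.05

# Compare with the mass of a lead-208 atom (~207.2 u, u = 1.661e-27 kg):
lead = 207.2 * 1.661e-27
print(m / lead)                # ~5.9e3, i.e. nearly six thousand lead atoms
```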

Still, gravitational force scales inversely with the square of the separation of particles, so as long as something gets within a thousandth of a millionth of the separation I was measuring, we'll see a similar scale of tiny deflection: a particle coming within a fraction of a nanometre of our hypothetical black hole might just deviate almost perceptibly from totally ignoring its presence, and something within not many picometres might actually notice it. Atomic nuclei are actually smaller than that, although they tend to be several dozen picometres apart, so we'd better check whether the hole might be able to capture a nucleus, or part of one, in the tiny proportion (less than twenty millionths) of its time that it'll typically spend anywhere near one. To assess that, we have to look at the size of the black hole: multiplying its mass by 2.G/c/c gives about 3e-48 metres, five factors of a million shorter than the length scales of nuclei. If it gets close to a nucleus it might conceivably cause the nucleus to accelerate violently and swing off course, but it can't capture anything that isn't within a modest multiple of its radius. If I put seventy ordinary dice in a bucket, shake it up well and pour them out on a smooth, level floor, I've a better chance of them all coming up six than that hypothetical black hole has, when it encounters a nucleus, of capturing it. But hey, that's only statistics – can't it just wait? Surely it's bound to end up catching the occasional nucleus.
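That size calculation is the same ratio as before, applied to our two atto-grammes:

```python
# Horizon radius of the hypothetical ~2 ag black hole: R = 2.G.M/c/c.
G = 6.674e-11                  # m^3/(kg s^2)
c = 2.998e8                    # m/s
m = 2.05e-21                   # kg, the mass worked out above

R = 2 * G * m / c**2
print(R)                       # ~3.0e-48 m

# A nucleus is roughly a femtometre (1e-15 m) or so across:
print(1e-15 / R)               # the hole is ~3e32 times smaller than that
```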

Well, in fact, it doesn't even have much chance if it could afford to wait around to get lucky (if it passed through as many nuclei as there are in the whole of planet Earth it'd still have less than a one in a milliard chance of capturing even one of them) but that turns out to be irrelevant because it doesn't have the time to wait around for anything. The other thing I have to tell you about black holes, you see, is that they evaporate.

Black holes are, well, black; consequently, like any other black body, they
radiate energy with a characteristic spectrum that's determined by their
temperature; if a black hole's temperature is the same as that of a piece of
coal glowing red in the embers of a fire, then that black hole is also glowing
red. Now, if black holes all had a temperature of absolute zero, that'd be
irrelevant; but Stephen Hawking showed that, when quantum mechanics and gravity
meet in the neighbourhood of a black hole's event horizon, it really does
(rather counter-intuitively) radiate energy – and even matter. Strictly speaking,
the matter and energy aren't emitted by the black hole itself: they're emitted
by the space near it, always as one half of a virtual particle pair, the other
half of which fell into the black hole, carrying with it an energy *debt*
(the virtual particles that arise in quantum mechanics are weird that way) big
enough to let its partner escape the black hole's clutches – and the black
hole pays back that debt by losing energy and getting smaller.

The details are beyond me, but the net effect is that the temperature of a black hole is proportional to the inverse of its mass: roughly speaking, a black hole emits a photon of wavelength comparable to its size roughly once in each time interval during which light can travel that distance. For big black holes, those are long wavelength photons with very little energy and it emits them rarely; for little black holes, on the other hand, the photons it emits have short wavelengths and it emits them frequently. Little black holes are hot, which means they radiate away energy fast – and energy is mass, which they don't have so much of, so they get lighter and consequently hotter and consequently radiate away even faster. In short, small black holes explode. A black hole with a mass of a hundred tonnes takes a fraction of a second to evaporate; a black hole with a mass of one tonne takes less than a microsecond to evaporate. You don't want to be around either of those, because even the little one is an explosion more than twenty times as devastating as the biggest nuke even our mighty superpowers were rash enough to make. Smaller black holes evaporate even faster but, fortunately, are less devastating – because they can at most deliver as much energy as corresponds to all of their mass. Our hypothetical LHC-produced black hole only has two atto-grammes of mass to convert to energy: it essentially does this instantaneously, liberating all 1148 TeV of its energy. Which, as I explained above, is barely enough to lift two grammes through a centimetre.
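Those lifetimes come out of the standard semiclassical estimate for the evaporation time, t = 5120.π.G².M³/(ℏ.c⁴) – an order-of-magnitude formula that ignores corrections for the number of particle species the hole can radiate into. A rough sketch:

```python
import math

# Semiclassical Hawking evaporation time, t = 5120*pi*G^2*M^3/(hbar*c^4).
# Order-of-magnitude only: particle-species corrections are ignored.
G = 6.674e-11      # m^3/(kg s^2)
c = 2.998e8        # m/s
hbar = 1.055e-34   # reduced Planck constant, J s

def evaporation_time(M):
    """Lifetime, in seconds, of a black hole of mass M kilograms."""
    return 5120 * math.pi * G**2 * M**3 / (hbar * c**4)

print(evaporation_time(1e5))      # a hundred tonnes: ~0.08 s
print(evaporation_time(1e3))      # one tonne: ~8e-8 s, under a microsecond
print(evaporation_time(2.05e-21)) # our 2 ag hole: ~7e-79 s – instantly
```

The cubic dependence on mass is why the two-attogramme case is so absurdly brief: each factor of a thousand off the mass knocks nine factors of ten off the lifetime.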

So our hypothetical black hole, if the LHC ever produces one, would have
almost no chance of ever capturing any more mass even if it gets to stick around
indefinitely: which it doesn't, because it'll evaporate essentially instantly.
The reason the LHC physicists are interested in the possibility of this is that
what, exactly, it evaporates *into* is subject to far fewer constraints
than the usual reactions between nuclear particles, so it might produce
something we've never seen before among its decay products.