The Book of Nothing
DEEP CONNECTIONS
“I love cosmology: there’s something uplifting about viewing the entire universe as a single object with a certain shape. What entity, short of God, could be nobler or worthier of man’s attention than the cosmos itself? Forget about interest rates, forget about war and murder, let’s talk about space.”
Rudy Rucker21
The first person to suggest that the cosmological constant might be linked to the rest of physics was the Belgian astronomer and Catholic priest Georges Lemaître. Lemaître was one of the first scientists to take the idea of the expanding Universe seriously as a problem of physics. He realised that if the Universe was expanding it must have been hotter and denser in the past: matter would be transformed into heat radiation if cosmic events were traced far enough back in time.
Lemaître rather liked Einstein’s lambda force and found several new solutions of Einstein’s equations in which it featured. He was persuaded that it needed to be present in Einstein’s theory but, unlike Einstein, who tried to forget about it, and some other astronomers, who assumed that even if it existed it was negligible, he wanted to reinterpret it. Lemaître realised22 that although Einstein had added the lambda force to the geometrical side of his equations, it was possible to shift it across to the matter and energy side of the equation
{geometry} = {distribution of mass and energy} − {Λ energy},
and reinterpret it as a contribution to the material content of the Universe,
{geometry} = {distribution of mass and energy − Λ mass and Λ energy}.
If you do this then you have to accept that the Universe always contained a strange fluid whose pressure is equal to minus its energy density. A negative pressure is just a tension, which is not unusual, but the lambda tension is as negative as it could possibly be, and this means that the gravitational effect it exerts is repulsive.23
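Why a maximal tension repels can be seen in a line of schematic algebra (this is the standard textbook combination, not an equation from the text): in general relativity the effective gravitating density of a fluid is its density plus three times its pressure divided by the square of the speed of light, and for the lambda fluid the pressure is minus the energy density:

```latex
% Effective gravitating density of a fluid in general relativity (schematic):
%   rho_grav = rho + 3p/c^2
% For the lambda fluid, p_Lambda = -rho_Lambda c^2, so
\rho_{\mathrm{grav}}
  = \rho_{\Lambda} + \frac{3p_{\Lambda}}{c^{2}}
  = \rho_{\Lambda} - 3\rho_{\Lambda}
  = -2\rho_{\Lambda} < 0 .
```

Because the effective gravitating density comes out negative, the lambda fluid pushes outwards instead of pulling inwards.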
Lemaître’s insight was very important because he saw that by interpreting the cosmological constant in this way it might be possible to understand how it originated by studying the behaviour of matter at very high energies. If those investigations could identify a form of matter which existed with this unusual relation between its pressure and its energy density then it would be possible to link our understanding of gravity and the geometry of the Universe to other areas of physics. It was also important for the astronomical concept of the vacuum. If we ignored the possibility of Einstein’s cosmological constant then it appeared that there could exist vacuum universes devoid of any ordinary matter. But if the cosmological constant is really a form of matter that is always present then there really are no true vacuum universes. The ethereal lambda energy is always there, acting on everything but remaining unaffected by the motion and presence of other matter.
Unfortunately, no one seems to have taken any notice of Lemaître’s remark even though it was published in the foremost American science journal of the day. The early nuclear and elementary-particle physicists never found anything in their theories of matter that looked compellingly like the lambda stress. Its image amongst cosmologists ebbed and flowed. The Second World War intervened and changed the direction of physics towards nuclear processes and radio waves. Soon after it ended the interest of cosmologists was captured by the novel steady-state theory of the Universe first proposed by Fred Hoyle, Hermann Bondi and Thomas Gold. Like Friedmann’s universes the steady-state universe expanded, but its density did not diminish with time. In fact, none of its gross properties changed with time. This steadiness was achieved by means of a hypothetical ‘creation’ process that produced new matter everywhere at a rate that exactly counterbalanced the dilution due to expansion. The rate required is imperceptibly small, just a few atoms appearing in each cubic metre every ten billion years. In contrast to the Big Bang24 models, the steady-state theory had no apparent beginning when everything came into being at once. Its creation was continual.
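The quoted creation rate can be checked with a rough estimate. To hold the density constant while volumes grow as the cube of the expansion scale, matter must appear at roughly three times the Hubble rate times the existing density. The numerical inputs below are illustrative assumptions of my own, not values from the text:

```python
# Back-of-envelope check of the steady-state creation rate:
#   rate = 3 * H * n,
# where n is the number density of atoms and H the Hubble expansion rate
# (the factor 3 because volume grows as the cube of the scale factor).

SECONDS_PER_YEAR = 3.156e7

H = 1.0 / (14e9 * SECONDS_PER_YEAR)  # assumed Hubble rate ~ 1 / (14 billion years)
n = 1.0                              # assumed mean density: ~1 atom per cubic metre

rate = 3.0 * H * n                   # atoms created per cubic metre per second
per_ten_billion_years = rate * 10e9 * SECONDS_PER_YEAR
print(per_ten_billion_years)         # of order a few atoms per m^3 per 10 Gyr
```

With these assumed inputs the answer comes out at roughly two atoms per cubic metre every ten billion years, consistent with the "few atoms" figure in the text.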
At first, it looked as if this cosmological theory required a new theory of gravity to supersede Einstein’s. It needed to include a new ‘creation field’ that could generate the steady trickle of new atoms and radiation needed to maintain the constant density of the Universe. In 1951, William McCrea,25 a British astrophysicist, showed that nothing so radical was required. The creation field could be added into Einstein’s equations as an extra source of energy and mass. And when it was, it looked just like the lambda term. No continual creation was needed.
Sadly for its enthusiastic inventors, the steady-state universe was soon consigned to the history books. It was a good scientific theory because it made very definite predictions: the Universe should look, on average, the same at all epochs. This made it extremely vulnerable to observational test. In the late 1950s, astronomers started to amass evidence that the Universe was not in a steady state. The population of galaxies of different sorts changed significantly over time. Quasars were discovered to populate the Universe more densely in the past than today. Finally, in 1965, the remnant heat radiation from a hot past Big Bang state was detected by radio astronomers and modern cosmology was born.
During the mid-1960s, when the first quasars were discovered with redshifts clustered around a single value, it was proposed that a large enough lambda stress might have been able to slow the expansion of the Universe temporarily in the past when it was about a third of its present extent. This could have led to a build-up of quasar formation close to this epoch. However, this idea faded away as more and more quasars were found with larger redshifts and it began to be appreciated how the apparent confinement of their redshifts to lie below a particular value was an artefact of the methods used to search for them.
Since that time observational astronomers have been searching for definitive evidence to determine whether the Universe is expanding fast enough to continue expanding for ever or whether it will one day reverse into contraction and head for a big crunch. If a lambda force exists that is large enough to dominate the attractive force of gravity over very large extragalactic distances, then it should affect the expansion of the Universe in the way shown in Figure 6.5. The most distant clusters of galaxies should be accelerating away from one another rather than continually decelerating as they expand.
Figure 6.5 The effect of lambda on the expansion of the Universe. When it becomes larger than the inverse square force of gravity it causes the expansion of the Universe to switch from deceleration to acceleration.
The search for this tell-tale cosmic acceleration needs ways to measure the distances to faraway stars and galaxies. By looking at the change in the pattern of light colours coming from these objects we can easily determine how fast they are receding from us. This can now be done to an accuracy of a few parts in a million. But it is not so easy to figure out how far away they are. The basic method is to exploit the fact that the apparent brightness of a light source falls off as the inverse square of its distance from you, just like the effect of gravity. So if you had a collection of identical 100-watt light bulbs located at different distances from you in the dark, then their apparent brightnesses would allow you to determine their distances, assuming there is no intervening obscuration. If you didn't know the intrinsic brightness of the bulbs, but knew that they were all the same, then by comparing their apparent brightnesses you could deduce their relative distances: nine times fainter means three times further away.
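The "nine times fainter means three times further away" rule is just the inverse-square law run backwards. A minimal sketch (the function name is my own, for illustration):

```python
import math

def relative_distance(brightness_ratio):
    """How many times further away a source is, given how many times
    fainter it appears than an identical reference source.

    Inverse-square law: apparent brightness ~ 1 / distance**2,
    so distance ratio = sqrt(brightness ratio)."""
    return math.sqrt(brightness_ratio)

print(relative_distance(9))   # nine times fainter -> 3.0 times further away
print(relative_distance(4))   # four times fainter -> 2.0 times further away
```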
This is just what astronomers would like to be able to do. The trouble is Nature does not sprinkle the Universe with well-labelled identical light bulbs. How can we be sure that we are looking at a population of light sources that have the same intrinsic brightnesses so that we can use their apparent brightnesses to tell us their relative distances?
Astronomers try to locate populations of objects which are easily identifiable and which have very well-defined intrinsic properties. The archetypal example was that of variable stars which possessed a pattern of change in their brightness that was known theoretically to be linked to their intrinsic brightness in a simple way. Measure the varying light cycle, deduce the intrinsic brightness, measure the apparent brightness, deduce the distance away, measure the spectral light shift, deduce the speed of recession and voilà, you can trace the increase of speed with distance and see the expansion of the Universe, as Edwin Hubble first did in 1929 to confirm Friedmann’s prediction from Einstein’s theory that the Universe is expanding, as shown in Figure 6.6.
Unfortunately, these variable stars cannot be seen at great distances and ever since Hubble’s work the biggest problem of observational astronomy has been determining distances accurately. It is the twentieth-century analogue of John Harrison’s27 eighteenth-century quest to measure time accurately so that longitude could be determined precisely at sea. Until quite recently, it made any attempt to map out the expansion of the Universe over the largest extragalactic dimensions too inaccurate to use as evidence for or against the existence of Einstein’s lambda force. We could not say for sure whether or not the present-day expansion of the Universe is accelerating. Absence of evidence was taken as evidence of absence – and in any case it seemed to require a huge coincidence if the lambda force just started to accelerate the Universe at the epoch when human astronomers were appearing on the cosmic scene. Moreover, this would require lambda to have a fantastically small value. Better, cosmologists argued, to assume that it is really zero and keep on looking for a good reason why.
Figure 6.6 Hubble’s Law:26 the increase of the speed of recession of distant sources of light versus their distance from us.
In the last year things have changed dramatically. The Hubble Space Telescope (HST) has revolutionised observational astronomy and, from its vantage point above the twinkling distortions of the Earth’s atmosphere, it is now possible to see further than ever before. Telescopes on the ground have also advanced to achieve sensitivities undreamt of in Hubble’s day. New electronic technologies have replaced the old photographic film with light recorders that are fifty times more sensitive at catching light than film. By combining the capability of ground-based telescopes to survey large parts of the sky and the HST’s ability to see well-targeted, small, faint sources of light with exquisite clarity, a new measure of distance has been found.
Observers use powerful ground-based telescopes to monitor nearly a hundred pieces of the night sky, each containing about a thousand galaxies, at the time of the New Moon, when the sky is particularly dark. They return three weeks later and image the same fields of galaxies, looking for stars that have brightened dramatically in the meantime. They are looking for faraway supernovae: exploding stars at the ends of their life cycles. With this level of sky coverage, they will typically catch about twenty-five supernovae as they are brightening. Having found them, they follow up their search with detailed observations of the subsequent variation of the supernova light, watching the increase in the brightness to maximum and the ensuing fall-off back down to the level prior to explosion, as shown in Figure 6.7. Here, the wide sky coverage of ground-based telescopes can be augmented by the HST’s ability to see faint light and colours.
The detailed mapping of the light variation of the supernovae enables the astronomers to check that these distant supernovae have the same light signature as ones nearby that are well understood. This family resemblance enables the observers to determine the relative distances of the distant supernovae with respect to the nearby ones from their apparent peak brightnesses, because their intrinsic brightnesses are roughly the same. Thus a powerful new method of determining the distances to the supernovae is added to the usual Doppler shift measurements of their spectra from which their speeds of recession are found. This gives a new and improved version of Hubble’s law of expansion out to very great distances.
Figure 6.7 A supernova light-curve: the variation in the observed brightness of a supernova, showing the characteristic rise to a maximum and the gradual fall back to the level prior to the explosion.
The result of these observations of forty distant supernovae, made by two separate international teams of astronomers combining ground-based telescopes with the Hubble Space Telescope, is strong evidence that the expansion of the Universe is accelerating. The striking feature of the observations is that they require the existence of the cosmological constant, or lambda force. The probability that these observations could be accounted for by an expanding universe that is not accelerating is less than one in a hundred. The contribution of the vacuum energy to the expansion of the Universe is most likely28 to be fifty per cent more than that of all the ordinary matter in the Universe.
The variation of the redshifting of light with distance for these sources cannot be made to agree with the pattern predicted if lambda does not exist. The only escapes from lambda are a mistake in the observations or the presence of an undetected astronomical process that biases them, changing the apparent brightnesses of the supernovae so that they are not true indicators of distance, as assumed. These last two possibilities are still very real ones and the observers are probing every avenue to check where possible errors might have crept in. One worry is that the assumption that the distant supernovae are intrinsically the same as those we observe nearby is wrong.29 Perhaps, when the light began its journey to our telescopes from these distant supernovae, there were other varieties of exploding star which are no longer in evidence. After all, when we look at very distant objects in the Universe we are seeing them as they were billions of years ago, when the light first left them en route to our telescopes. In that distant past the Universe was a rather denser place, filled with embryonic galaxies, and perhaps rather different from how it appears today. So far, none of these possibilities has withstood detailed cross-checking.
If these possible sources of error can be excluded and the existing observations continue to be confirmed in detail by different teams of astronomers using different ways of analysing different data, as is so far the case, then they are telling us something very dramatic and unexpected: the expansion of the Universe is currently controlled by the lambda stress and it is accelerating. The implications of such a state of affairs for our understanding of the vacuum and its possible role in mediating deep connections between the nature of gravity and the other forces of Nature are very great.
So far we have seen what the astronomers thought about lambda and its possible role as the ubiquitous vacuum energy that Lemaître suggested. During the last seventy years, the study of the subatomic world has gathered pace and focus. It, too, has been in search of the vacuum and its simplest possible contents. The discovery of a cosmic vacuum energy by astronomical telescopes turns out to have profound implications for that search too, and it is to this thread of the story that our attention now turns. It will start in the inner space of elementary particles and bring us, unexpectedly, full circle back to the outer space of stars and galaxies in our quest to understand the vacuum and its properties.
“There is an element of tragedy in the life-story of the ether. First we had luminiferous ether. Its freely given services as midwife and nurse to the wave theory of light and to the concept of field were of incalculable value to science. But after its charges had grown to man’s estate it was ruthlessly, even joyfully, cast aside, its faith betrayed and its last days embittered by ridicule and ignominy. Now that it is gone it still remains unsung. Let us here give it a decent burial, and on its tombstone let us inscribe a few appropriate lines:
Then we had the electromagnetic ether.
And now we haven’t e(i)ther.”
Banesh Hoffmann1
IT’S A SMALL WORLD AFTER ALL
“This [quantum] theory reminds me a little of the system of delusions of an exceedingly intelligent paranoiac, concocted of incoherent elements of thoughts.”
Albert Einstein2
One of the greatest truths about the character of the physical universe, which has come increasingly into the spotlight during the past twenty-five years, is the unity of its laws and patterns of change. Once upon a time it would have been suspected that the nature of the most elementary particles of matter had little to do with the shapes and sizes of the greatest clusters of galaxies in the astronomical realm. Likewise, few would have believed that a study of the largest structures in the Universe would be able to shed light upon the smallest. Yet, today, the study of the smallest particles of matter is inextricably linked to the quest for a cosmological understanding of the Universe and the organisation of matter within it. The reason is simple. The discovery that the Universe is expanding means that its past was hotter and denser than its present. As we retrace its history back to the first minutes, we encounter a cosmic environment of ever-increasing energy and temperature which ultimately reduces all the familiar forms of matter – atoms, ions and molecules – to their simplest and smallest ingredients. The number and nature of the most elementary particles of matter will thus play a key role in determining the quantities and qualities of the different forms of matter that survive the childhood of the Universe.
This cosmic link between the large and the small also features in the fate of the vacuum. We have just seen how the theory of gravity that Einstein created can be used to describe the overall evolution of the physical universe. In practice we choose a mathematically simple universe that is a very good approximation to the structure of the real one that we see through our telescopes. At first, we have seen how it was that Einstein’s theory reinforced his expulsion of the ether from the vocabulary of physics by providing a natural mathematical description of universes which are completely devoid of mass and energy – ‘vacuum’ universes. No ether was necessary even if electrical and magnetic fields were introduced to curve space. Yet there was to be a sting in the tail of this new theory. It permitted a new force field to exist in Nature, counteracting or reinforcing the effects of gravity in a completely unsuspected way, increasing with distance so that it could be negligible in its terrestrial effects yet overwhelming on the cosmic scale of the Universe’s expansion. This ubiquitous ‘lambda’ force allows itself to be interpreted as a new cosmic energy field: one that is omnipresent, preventing the realisation of nothing. But, if such a vacuum-buster exists, where does it come from and how is it linked to the properties of ordinary matter? Astronomers like Lemaître and McCrea posed these questions but did not answer them. They hoped that the world of subatomic physics would enable a link to be forged with the vacuum energy of Einstein.