
The road to the space coast (looking back on the ideas that led to Euclid)


On Florida’s Space Coast, the Euclid satellite is undergoing the final preparations for launch on a Falcon 9 rocket next Saturday, July 1st at 11:12 EDT. Although the Euclid mission was approved by ESA in 2011, the origins of the project date back more than a decade before that, starting with the realisation that the expansion of the Universe is accelerating.

In cinema, great discoveries are usually accompanied by the leading lights throwing their hands in the air and exclaiming, “This changes everything!” But in real life, scientists are cautious, and the first reaction to any new discovery is usually: is there a mistake? Is the data right? Did we miss anything? You need to find the right balance between double-checking endlessly (and getting scooped by your competitors) and rushing into print with something that is wrong. At the end of the 1990s, measurements of distant supernovae suggesting that the expansion of the Universe was accelerating were initially greeted with scepticism.

Conceptually, what those measurements were saying was simple: the further away an object is, the faster it is receding from us. Edwin Hubble’s early observations of galaxies demonstrated that there was a straight-line relationship between an object’s distance and its recession velocity. The simplest explanation (although one that scientists took a while to accept) was that the Universe was expanding.
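That straight-line law is just v = H₀ × d. A minimal sketch, using an illustrative modern value of H₀ ≈ 70 km/s/Mpc (the precise value is, of course, still debated):

```python
H0 = 70.0  # Hubble constant in km/s per megaparsec (illustrative value)

def recession_velocity(distance_mpc):
    """Hubble's law: recession velocity grows linearly with distance."""
    return H0 * distance_mpc

# A galaxy 100 Mpc away recedes at 7000 km/s; one twice as far,
# twice as fast -- the straight-line law.
print(recession_velocity(100.0))  # 7000.0
```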

Over the next few decades, researchers embarked on a long quest to find different classes of objects for which they could estimate distances. Supernovae were among the best: it turned out that if you could measure how the brightness of a supernova changed with time, you could estimate its distance. You could then compare how the distance depended on redshift, which you could measure with a spectrograph. Wide-field cameras on large telescopes allowed astronomers to find supernovae further and further away, and by the end of the 1990s, samples were large enough to detect the first tiny deviations from Hubble’s simple straight-line law. The expansion was accelerating. The physical process behind the acceleration was codified as “Lambda”, or “dark energy”.
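The size of the deviation can be reproduced with a few lines of numerical integration. The sketch below (assuming a flat universe and the same illustrative H₀ = 70 km/s/Mpc) compares the luminosity distance to a supernova at redshift z = 0.5 in a Lambda-dominated universe with that in a matter-only universe; Lambda puts the supernova roughly 20% further away, so it appears fainter than a no-Lambda extrapolation predicts.

```python
import math

C = 299792.458  # speed of light, km/s
H0 = 70.0       # Hubble constant, km/s/Mpc (illustrative value)

def luminosity_distance(z, omega_m, omega_lambda, steps=1000):
    """Luminosity distance in Mpc for a flat universe:
    D_L = (1 + z) * (c/H0) * integral of dz'/E(z'),
    with E(z) = sqrt(Omega_m (1+z)^3 + Omega_Lambda),
    integrated with the trapezoidal rule."""
    def E(zp):
        return math.sqrt(omega_m * (1 + zp) ** 3 + omega_lambda)
    dz = z / steps
    integral = sum(
        0.5 * (1.0 / E(i * dz) + 1.0 / E((i + 1) * dz)) * dz
        for i in range(steps)
    )
    return (1 + z) * (C / H0) * integral

# A supernova at z = 0.5 in a Lambda-dominated universe ...
d_lambda = luminosity_distance(0.5, 0.3, 0.7)
# ... versus a matter-only (no Lambda) universe
d_matter = luminosity_distance(0.5, 1.0, 0.0)

# The ratio is how much fainter (in distance) the supernova looks
print(f"{d_lambda:.0f} Mpc vs {d_matter:.0f} Mpc, "
      f"ratio {d_lambda / d_matter:.2f}")
```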

First measurements of distant supernovae from two teams. The most distant measurements are above the straight-line measurements by ~20%.

But those points on the right-hand side of the graphs which deviated from Hubble’s straight-line law had big error bars. Everyone knew that supernovae were fickle creatures in any case, subject to a variety of poorly understood physical mechanisms that could mimic the effect that the observers were reporting.

Initially, there was a lot of resistance to this idea of an accelerating Universe, and to dark energy. Nobody wanted Lambda. Not the theorists, because there were no good theoretical explanations for Lambda. Not the simulators, because Lambda unnecessarily complicated their simulations. And not even the observers, because it meant that every piece of code used to estimate the physical properties of distant galaxies had to be re-written (a lot of boring work). Meanwhile, the supernova measurements became more robust, and the existence of Lambda became harder and harder to avoid. But what was it? Large samples of supernovae were hard to come by; what other techniques could be used to discover what Lambda really was? Soon, measurements of the cosmic microwave background indicated that Lambda was indeed the preferred model, but because the acceleration only happens at relatively recent epochs in the Universe, microwave background observations have limited utility here.

Meanwhile, several key instrumental developments were taking place. At the Canada France Hawaii Telescope and other observatories, wide-field cameras with electronic detectors — charge-coupled devices, or CCDs — were being perfected. These instruments enabled astronomers for the first time to survey wide swathes of the sky and measure the positions, shapes and colours of tens of thousands of objects. At the same time, at least two groups were testing the first wide-field spectrographs for the world’s largest telescopes. Fed by targets selected from the new wide-field survey cameras, these instruments allowed the determination of the precise distances and physical properties of tens of thousands of galaxies. This quickly led to many new discoveries about how galaxies form and evolve. But these new instruments would also allow us to return to the still-unsolved nature of the cosmic acceleration, using a variety of new techniques which were first tested with these deep, wide-field surveys.

In the 1980s, observations of galaxy clusters with CCD cameras led to the discovery of the first gravitational arcs. These are images of distant galaxies which are, incredibly, magnified and distorted as their light passes close to the cluster. The deflection of light by mass is one of the key predictions of Einstein’s theory of general relativity. The grossly distorted images can only be explained if a large part of the mass of the cluster is concealed in invisible or ‘dark’ matter. However, in current models of galaxy formation, the observed growth of structures in the Universe can only be explained if this dark matter is distributed throughout the Universe and not only in the centres of galaxy clusters. This also means that even the shapes of the galaxies of the ‘cosmic wallpaper’ throughout the night sky should be very slightly correlated, as light rays from these distant objects pass close to dark matter everywhere in the Universe. The effect would be tiny, but it should be detectable.

Simulation of the passage of light rays through the Universe, passing close to dark matter (S. Colombi, IAP).

Around the world, several teams raced to measure this effect in new wide-field survey camera data. The challenges were significant: the tiny effect required a rigorous control of every source of instrumental error and detailed knowledge of the telescope optics. But by the early 2000s, a few groups had measured the “correlated shapes” of faint galaxies. They also showed that this measurement could be used to constrain how rapidly structures grow in the Universe. At the same time, other groups, using the first wide-field spectroscopic surveys, found that measurements of galaxy clustering could be used to independently constrain the parameters of the cosmological model.
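The statistical trick behind those measurements can be illustrated with a toy model: each galaxy’s intrinsic ellipticity is large and random (around 0.3), while the coherent lensing shear is tiny (around 1%), so averaging over enough galaxies beats the noise down until the signal emerges. A minimal sketch, with purely illustrative numbers:

```python
import random

random.seed(42)

N_GALAXIES = 100_000
TRUE_SHEAR = 0.01          # tiny coherent lensing signal
INTRINSIC_SCATTER = 0.3    # large random intrinsic ellipticity

# Each observed ellipticity = random intrinsic shape + coherent shear
observed = [random.gauss(0.0, INTRINSIC_SCATTER) + TRUE_SHEAR
            for _ in range(N_GALAXIES)]

# Averaging N galaxies shrinks the noise by sqrt(N):
# 0.3 / sqrt(100000) ~ 0.001, small enough to reveal a 1% signal
estimate = sum(observed) / N_GALAXIES
print(f"recovered shear = {estimate:.4f} (true value {TRUE_SHEAR})")
```

This is also why the control of instrumental errors mattered so much: any spurious coherent distortion of galaxy shapes, at even the per-mille level, would masquerade as a lensing signal.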

Halfway through the first decade of the 21st century, it was becoming clear that galaxy clustering combined with gravitational lensing could be an excellent way to probe the nature of the acceleration. Neither method was easy: one required very precise measurements of galaxy shapes, which was very hard to do with ground-based surveys suffering from atmospheric blurring; the other required spectroscopic measurements of hundreds of thousands of galaxies. And both techniques seemed highly complementary to supernova measurements.

In 2006, the report from a group of scientists from Europe’s large observatories and space agencies charted a way forward to understand the origin of the acceleration. Clearly, what was needed was a space mission to provide wide-field high-resolution imaging over the whole sky to measure the shapes, coupled with an extensive spectroscopic survey. Both these ideas were submitted as proposals for two satellites: one would provide the spectroscopic survey (SPACE) and the other would provide the high-resolution imaging (Dune). The committee, finding both concepts compelling, asked the two teams to work together to design a single mission, which would become Euclid. In 2012, the mission was formally approved.

Euclid in the clean room at Thales Alenia

Euclid aims to make the most precise measurement ever of the geometry of the Universe and to derive the most stringent possible constraints on the parameters of the cosmological model. Euclid uses two methods: galaxy clustering with the spectrograph and imager NISP (sensitive to dark energy) and gravitational lensing with the imager VIS (sensitive to dark matter). Euclid’s origins in ground-based surveys make it unique. Euclid aims to survey the whole extragalactic sky over six years. But unlike in ground-based surveys, no changes can be made to the instrument after launch. After launch, Euclid will travel to the remote L2 orbit, one of the best places in the solar system for astronomy, to begin a detailed instrument checkout and prepare for the survey.

I have been involved in the team which will process VIS images for more than a decade. The next few weeks will be exciting and stressful in equal measure. VIS is the “Leica Monochrom” of satellite cameras: there is only one broad filter. The images will be in black-and-white. It will (mostly) not make images deeper than Hubble or James Webb: Euclid’s telescope mirror is relatively modest (there are some Euclid deep fields, but that is another story). But to measure shapes precisely enough to detect dark matter, every aspect of the processing must be rigorously controlled.

VIS images will cover tens of thousands of square degrees. Over the next few years, our view of the Universe will dramatically snap into high resolution. That, I am certain, will reveal wonders. Those images will be one of the great legacies of Euclid, together with a much deeper understanding of the cosmological model underpinning the Universe that will come from them and the data from NISP.

This Thursday, I’ll be travelling to Florida to see for myself as Euclid starts its journey to L2. Then, with many of my colleagues, I’ll be waiting anxiously for our first glimpse of the wide-field, high-resolution Universe, which will arrive a few weeks later.

52 photographs (2018) #24: In Zollverein


After Bonn, I spent a few days visiting friends who live near Essen, in the Ruhr valley. Here one can find the remains of the massive Zollverein industrial complex, once the largest coal mine in Europe. The main coal-washing hall has been transformed into an enormous museum telling the history of the Ruhr valley dating right back to prehistoric times. It is a beautiful and encyclopaedic museum.

So in the west, this is the story of our age: factories transformed into cultural artefacts. In the museum, countless photographs testified to the terrible conditions at the factory. And it’s not just photographs: one can see embalmed lungs shrivelled up by emphysema. Of course, such places still exist elsewhere in the world, and an epic film from Wang Bing a few years ago shows the hard life to be had in Chinese steel mills.

But today, the control room on the main floor at the Zollverein coal-washing plant has become a cafe.

In Zollverein

Today, Zollverein is quiet, the machines are stilled, and the dials are all at zero (as you can see from my previous post).

Next: Copenhagen.

A few thoughts on “Sapiens: A Brief History of Humankind ” (Yuval Harari)


I recently discovered Yuval Harari’s book, “Sapiens”, which was first published in 2014. It is an ambitious book, attempting as it does to summarise the whole history of the human race in a few hundred pages. It’s obvious that in this kind of enterprise there are going to be some oversimplifications and sweeping generalisations, and that’s certainly what happens. Starting the book, I also thought that his aim was simply to describe the history of the human animal, but his ambitions are much larger than that. His book is also a history of human society and civilisation. Harari has stated that he is strongly inspired by Jared Diamond, and Diamond’s influence is visible at least to the extent that both authors agree that no question, no matter how large, is beyond rational enquiry.

Harari attempts to explain how Homo sapiens has become so successful and now completely dominates planet Earth. He mentions the “Dunbar number”, the number of people a person can know and trust: it is around 150. Beyond that, there has to be some other way in which people can bind together into groups. Trust is a fundamental part of our societies (a point also made in Bruce Schneier’s books). For Harari, this trust comes from a series of shared beliefs. His point is that they are just that, beliefs, with for the most part no basis in reality. For him, almost all of the constructs at the foundations of our society are shared beliefs. For Harari, liberal humanism is just as much a religion as, say, Christianity. He goes further. What drives us as a species? One answer is that we are driven by the shared belief systems of our society, or simply the pursuit of happiness. Our consensual illusion. He suggests that a future study of history should examine in detail how happy people were in past times, but at the same time reminds us that this is, of course, a completely arbitrary and subjective measurement. He leans heavily on Buddhist philosophy as a way out of this dead end, in particular the notion that, well, you must become aware of your feelings in order to surpass them. Well.

Rationally, it is hard to disagree with this. However, the discussion does show the hole you can dig yourself into if you decide that humans are intrinsically not very different from other species on the planet (apart from a few important cognitive innovations, which Harari explains very well), or that the search for knowledge and belief in “progress” are also partially delusional. It seems to me that this line of thinking has led to one of the predominant problems of our time: a lack of belief in human agency, and the idea that there is nothing much worth saving in our culture. Until we can change that, I don’t see how we can decide where we, as a species, want to go.

Rationality, Loach, Trump, Science


A few years ago I read John Ralston Saul’s excellent “Voltaire’s Bastards”. The thesis of this book is that in the West we have fallen under the control of vast rational systems which have no underlying morality of their own. These systems allow our society to function, but they operate outside any moral system. The link between justice and reason has been cut, and governments use rationality as a means to justify their actions.

I couldn’t help thinking about this book after we went to see Ken Loach’s excellent new film, “I, Daniel Blake”. The eponymous Daniel Blake is an honest tradesman who loses his job after a heart attack at work. Although he has a serious medical condition, his honesty leads to him falling onto the wrong side of the benefits system, and although his doctors strongly believe otherwise, he is declared “fit to work”. But he isn’t fit to work, not really. To get his benefits, he must look for work, but he is unable to accept anything he finds, because of his medical condition.

He patiently explains these contradictions to anyone who will listen in the benefits office, but to no avail. What struck me most is the constant refrain from various council employees (in a strong Geordie accent): “It’s not against you, like, it’s what we have to do”. It is not us, it is the system. The refrain of the last hundred years. It’s not a big leap to go from there to the results of the British referendum and the American elections. The most striking aspect of these two events is the complete disregard of any opinion of “experts”. There are certainly a large number of reasons for that, but one that seems relevant here is how disconnected many people have become from the vast systems that have become enormously important for our lives and well-being, and which just don’t care what we think.

Perhaps we could extend this thought a little further? Science has become ever more incremental in the last few years. Part of the problem is that any new theory of the Universe must also explain the last few hundred years of observations as well as any new ones. Each minuscule advance now requires an enormous amount of work. And these advances take place inside enormous systems which have been calibrated extremely finely to succeed. It is the old problem: you cannot build anything expensive and complicated unless you are certain it will work, but in that case how can you ever be certain of discovering something new? And behind that there is a system of thousands of people somehow trying to work together, in a system that cares nothing for the people inside it.

The conclusion: I do not want to suggest that rationality is a bad thing. Of course it is not! But we must find a way to reconnect rationality and reason to a sense of social justice. And as for science? That is for another post.