F2: From the Big Bang to the Universe of the Future
Cosmology is the study of the Universe in its largest dimensions and of its evolution. In the successful big bang model, the early Universe is hot and dense and cools down as it expands. The question of the status of the original singularity (the so-called big bang) from which space emerged is a central one. It may be studied through the gravitational waves produced immediately after the big bang, for example during the explosive phase of expansion, known as inflation, that immediately follows it.
The study of the expansion of the Universe has recently revealed an unexpected acceleration in the most recent stages of its evolution, which is attributed to a new form of energy known as dark energy. Understanding the nature of dark energy, and hence the fate of the Universe, is another fundamental question.
Although these scientific questions are specific to the understanding of the early Universe and its subsequent evolution, the methods used have far-reaching applications. For example, in the context of the Space Campus, teams from IPGP and APC realized that they use similar methods to analyse seismic data on the Moon's surface and fluctuations in the Cosmic Microwave Background (CMB): in both cases, fluctuations on a sphere. Cosmology also requires processing ever larger amounts of data. For example, the ten-year observations by the LSST telescope (performing large sky surveys to understand the properties of dark energy) will require a database of 60 petabytes of raw data. Processing such a vast amount of data is a challenge that will place the field at the forefront of massive data processing.
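The shared "fluctuations on a sphere" formalism can be illustrated with a toy angular power spectrum estimate. The following is a minimal numpy sketch (the input spectrum and the synthetic harmonic coefficients are purely illustrative, not real CMB or lunar data): given spherical-harmonic coefficients a_lm, the spectrum C_ell is estimated by averaging |a_lm|^2 over the 2*ell+1 values of m.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic harmonic coefficients a_lm drawn from a known input spectrum
# C_ell ~ 1/(ell*(ell+1)) (an illustrative shape only)
ells = np.arange(2, 256)
cl_in = 1.0 / (ells * (ells + 1.0))
alm = [rng.normal(0.0, np.sqrt(c), size=2 * ell + 1)
       for ell, c in zip(ells, cl_in)]

def estimate_cl(alm_by_ell):
    """Estimate C_ell = <|a_lm|^2> by averaging over the 2*ell+1 values of m."""
    return np.array([np.mean(np.abs(a) ** 2) for a in alm_by_ell])

cl_hat = estimate_cl(alm)
# Cosmic variance limits the accuracy at low ell: the fractional scatter
# of this estimator is ~ sqrt(2 / (2*ell + 1))
frac_err = np.abs(cl_hat - cl_in) / cl_in
```

The same averaging-over-a-sphere logic applies whether the field is a temperature map of the CMB or a map of seismic surface motion, which is what makes the cross-disciplinary method transfer natural.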
We have identified the following axes:
1. Support of the Paris Centre for Cosmological Physics
All the fundamental questions listed above are addressed by the Paris Centre for Cosmological Physics (PCCP: http://www.pariscosmo.fr). We stressed earlier that the goals of the PCCP are very similar to those put forward by the LabEx, although the PCCP is more focused thematically while drawing on a larger laboratory base. The LabEx will support the Centre by providing one postdoctoral position (PCCP fellow) every year, as well as by financing visits of scientists with high international visibility through a special UnivEarthS-PCCP programme.
2. B-mode polarization of the Cosmic Microwave Background
The measurement of B polarization modes of the Cosmic Microwave Background (CMB) may provide a direct probe of the primordial gravitational waves produced during the inflationary epoch. Measuring precisely the polarization of the CMB is thus the next exciting frontier. Its characterization will be further improved by the Planck satellite mission (launched May 14, 2009). The weakness of the B-mode signal requires the development of highly sensitive experiments with exquisite control of systematic errors. Most of the experiments or projects dedicated to this quest are based on the well-known direct imaging technology. While imagers measure maps of the CMB, interferometers directly measure Fourier components of the Stokes parameters and are thus expected to be less sensitive to systematic effects. Unfortunately, the classical heterodyne interferometry concept may have reached its limits in terms of scale and sensitivity. Bolometric interferometry, however, could combine the advantages of interferometry in terms of the handling of systematic effects with those of direct detectors in terms of sensitivity.
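The statement that an interferometer directly measures Fourier components of the sky can be made concrete with a toy one-dimensional model (a hedged numpy sketch: the Gaussian "source", the baseline values, and the single Stokes parameter are illustrative; a real instrument works in two dimensions):

```python
import numpy as np

# Toy 1-D "sky": samples of one Stokes parameter along an angular coordinate
n = 512
x = np.linspace(-0.5, 0.5, n, endpoint=False)   # angle, arbitrary units
sky = np.exp(-0.5 * (x / 0.05) ** 2)            # a single Gaussian "source"

def visibility(sky, x, u):
    """An ideal baseline u measures one Fourier component of the sky signal."""
    dx = x[1] - x[0]
    return np.sum(sky * np.exp(-2j * np.pi * u * x)) * dx

v0 = abs(visibility(sky, x, 0.0))   # zero baseline: total flux
v5 = abs(visibility(sky, x, 5.0))   # longer baseline: smaller angular scales
```

An imager would have to reconstruct this information from pixel maps, whereas each baseline delivers one Fourier mode directly, which is why baseline-dependent systematics are easier to isolate in interferometry.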
Although many experiments have already been proposed in the USA (ground-based and balloon-borne), only one project is emerging in Europe: the QUBIC programme, supported by a French-Italian-US-UK-Irish collaboration (http://www.qubic.org). The APC CMB team, including its experimental laboratory, is leading this research effort with particular interest in:
- Conception and design of the QUBIC instrument
- Data analysis and simulations
- Development of the detection chain:
  - Bolometer arrays based on Transition Edge Sensor (TES) technology with multiplexed readout.
  - Kinetic Inductance Detectors (KIDs), a new path towards large detector arrays: this detection technique uses the variation of the kinetic inductance of a superconductor when it absorbs a photon flux. Their advantages are the following: (i) they are relatively simple to fabricate, (ii) the readout electronics is inherently multiplexed, allowing a large number of detectors (of the order of 1000 or more) to be read out on a single wire, and (iii) the intrinsic sensitivity could theoretically be very high. We propose to couple these new detectors with the current developments made for TESs. Within 2 years, a demonstrator of a few hundred KIDs will be realized, fully compatible with the QUBIC requirements. The number of detectors will be further increased to reach a few thousand KIDs within 10 years.
- Realization of receiver horns based on platelet technology.
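The frequency-multiplexed readout that allows ~1000 KIDs on a single line can be sketched with a simplified model (illustrative numpy code, not the QUBIC readout; the quality factor, the 2 GHz band, and the 2 MHz spacing are assumed round numbers): each KID appears as a notch in the feedline transmission, and absorbed power shifts one resonance, which a comparison of frequency sweeps localizes.

```python
import numpy as np

def transmission(freqs, f0s, q=5e4):
    """Power transmission of one feedline carrying many KIDs: each resonator
    produces an idealized Lorentzian notch at its resonance frequency f0."""
    s21 = np.ones_like(freqs)
    for f0 in f0s:
        x = 2 * q * (freqs - f0) / f0
        s21 *= np.abs(1 - 1 / (1 + 1j * x))
    return s21

# 100 resonators, 2 MHz apart around 2 GHz, all read out on a single line
f0s = 2.0e9 + 2.0e6 * np.arange(100)
freqs = np.linspace(1.99e9, 2.21e9, 20001)

dark = transmission(freqs, f0s)

# Absorbed photons increase the kinetic inductance of detector 42,
# shifting its resonance downward
f0s_lit = f0s.copy()
f0s_lit[42] -= 50e3
lit = transmission(freqs, f0s_lit)

hit = int(np.argmax(np.abs(lit - dark)))          # frequency bin that changed
which = int(np.argmin(np.abs(freqs[hit] - f0s)))  # recovers detector index 42
```

Because each detector owns its own frequency slot, a single pair of coaxial lines serves the whole array; this is the property that makes KID arrays scale to thousands of pixels.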
The goal is to achieve, during the coming 4 years, a complete design of the final instrument; this means developing the entire detection chain and choosing between the 2 possible solutions we are working on (KID- and TES-based arrays). We will also build a prototype of the first module within 2 years in order to perform first observations of the sky and consequently tune and improve the behaviour of this first module. As developed in section 6, we intend to focus the LabEx contributions on the KID development and on the conception and design of the instrument, which is now well established but still requires a significant effort on the data analysis aspects. The description of the concept is in the process of being published and is available on the arXiv: ‘QUBIC: The QU Bolometric Interferometer for Cosmology’, E. Battistelli et al., arXiv:1010.0645, submitted to Astroparticle Physics.
3. Understanding the nature of Dark Energy
The other sub-work package has a different timeline: it concerns the analysis of data from experiments probing the nature of dark energy. These are based on large-scale surveys, which require storing and analysing massive amounts of data. The François Arago Centre, together with the IN2P3 computing centre in Lyon, will play a significant role in this challenging task. The work proposed will first identify the exact needs for data storage and processing (2011-2014), and then participate in setting up an international centre for dark energy, as already planned in the US by the LSST collaboration (2015-2020).
Progress in this field is expected both on the theoretical and observational sides.
On the theoretical side, alternative models of dark energy are examined along with possible large-distance modifications of gravity. Both directions should be followed, given that dark energy is currently known only through its gravitational effects. Hence, the observations from which its existence is inferred could instead be explained by changes in the gravitational laws at cosmological distances. There are plenty of models which replace a simple cosmological constant by a new, more or less exotic, dark content of the Universe, but no fully consistent model of large-distance modification of gravity is known.
The APC and LUTh theory groups are involved in both directions, with special expertise in the study of large-distance modifications of gravity at APC. New proposals along this line have recently been made by APC theorists and are currently under examination.
From an observational point of view, understanding dark energy requires a more accurate characterization of its properties, together with tests of standard general relativity. Two complementary families of measurements should be pursued, since together they make it possible to differentiate modified gravity from sensu stricto dark energy: measurements of the expansion rate of the Universe and its evolution, and measurements of the growth of structures, whose rate is slowed down by dark energy.
For each of these measurements, several complementary techniques should also be used: the exquisite precision required of these observations demands careful control of degeneracies and systematic effects, which affect different probes in different ways.
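The two families of measurements can be made concrete with a toy computation (a minimal numpy sketch, assuming a flat wCDM background with standard GR growth and smooth dark energy; the parameter values om = 0.3 and w = -1 are illustrative): the dimensionless expansion rate E(z) = H(z)/H0 probes the background, while the linear growth factor D(a) probes structure formation, and modified gravity would change the second even at fixed expansion history.

```python
import numpy as np

def E(z, om=0.3, w=-1.0):
    """Dimensionless expansion rate H(z)/H0 in a flat wCDM background."""
    return np.sqrt(om * (1 + z) ** 3 + (1 - om) * (1 + z) ** (3 * (1 + w)))

def growth(a_obs, om=0.3, w=-1.0, n=20000):
    """Linear growth factor D(a), normalized to D(a=1) = 1, from the standard
    GR result: D(a) proportional to E(a) * integral_0^a da' / (a' E(a'))**3."""
    def d_unnorm(a_end):
        ap = np.linspace(1e-4, a_end, n)
        f = 1.0 / (ap * E(1.0 / ap - 1.0, om, w)) ** 3
        integral = np.sum((f[1:] + f[:-1]) * np.diff(ap)) / 2.0  # trapezoid rule
        return E(1.0 / a_end - 1.0, om, w) * integral
    return d_unnorm(a_obs) / d_unnorm(1.0)

# At z = 1 (scale factor a = 0.5) with the fiducial parameters:
ez = E(1.0)        # expansion rate, ~1.76 * H0
dz = growth(0.5)   # growth factor, ~0.61: dark energy suppresses growth
```

Surveys constrain E(z) through distances (supernovae, BAO) and D(a) through clustering and lensing; comparing the two against this kind of GR prediction is precisely the modified-gravity test described above.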
APC is involved in several short and long term observational projects using several dark energy characterization techniques, through its wide field astronomy group. It is also participating in the Planck project (see above) and will be able to use CMB data in correlation with other wide field surveys.
In the longer term, APC is focusing on cosmic magnification, an alternative to the more common shear measurement for exploiting gravitational lensing: this technique is very well adapted to the depth of future surveys. The preparation of future analyses is currently carried out on SDSS data, and APC plans to apply this technique to LSST and Euclid data. LSST (Large Synoptic Survey Telescope) is a proposed telescope with an 8.4 m diameter main mirror, which should obtain its first scientific images in 2016. It will be located on top of Cerro Pachón in Chile, which already houses the Gemini South telescope. The camera has a mosaic of 200 4k x 4k CCDs, totaling 3.2 billion pixels, with a field of view of ten square degrees. This project, led by Anthony Tyson from the University of California, Davis, has been ranked by the Astronomy and Astrophysics Decadal Survey “New Worlds and New Horizons in Astronomy and Astrophysics” as its top priority for the next large ground-based astronomical facility. Euclid, a space-based project, is proposed by a European consortium (led by Alexandre Refregier from CEA/IRFU/SAp) which is currently responding to the ESA call for a medium-class mission. If selected, it will be the perfect instrument to measure gravitational lensing and baryon acoustic oscillations (BAO).
Both experiments should be in operation around 2019. APC is already strongly involved in their preparation: camera control software and photometric calibration for LSST and, most importantly, data processing for both LSST and the Euclid ground segment. As a space project, Euclid relies on ground-based data to fully exploit its science: LSST data will be an important asset in Euclid’s science exploitation, and APC will be in charge of merging LSST and Euclid data. Since LSST data will represent tens of petabytes, the data processing will rely heavily on computing resources and staff at CC-IN2P3 and the François Arago Centre (FACe).
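The scale of the data challenge can be checked with simple arithmetic (a back-of-envelope sketch: 60 PB is the 10-year raw-data volume quoted earlier for LSST, and downtime, calibration overheads, and non-uniform cadence are ignored):

```python
# 60 PB of raw data accumulated over a 10-year survey (figure quoted above)
PB = 10 ** 15            # petabyte, SI bytes
TB = 10 ** 12            # terabyte, SI bytes

total_bytes = 60 * PB
days = 10 * 365
per_day_tb = total_bytes / (days * TB)   # average raw-data rate, ~16 TB/day
```

Sustaining an average of roughly 16 TB of raw data per day, every day for a decade, is what motivates the involvement of dedicated centres such as CC-IN2P3 and the FACe rather than laboratory-scale computing.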