The sky-averaged, or global, background of redshifted 21-cm radiation is expected to be a rich source of information on the history of reheating and reionization of the intergalactic medium. However, measuring the signal is technically challenging: one must extract the small, frequency-dependent signal from under the much brighter, spectrally smooth foregrounds. Traditional approaches to measuring the global signal have used single-antenna systems, in which one must calibrate out frequency-dependent structure in the overall system gain (due, e.g., to internal reflections) as well as remove the noise bias that arises from auto-correlating the output of a single amplifier. This has motivated several proposals to measure the global background with interferometric setups, in which the signal appears in cross-correlation and additional calibration techniques become available. In this paper, we focus on the general principles that drive the sensitivity of any interferometric setup to the global signal. In particular, we prove that this sensitivity is directly related to two characteristics of the setup: the cross-talk between the readout channels (i.e., the signal picked up at one antenna when the other is driven) and the correlated noise due to thermal fluctuations of lossy elements (e.g., absorbers or the ground) radiating into both channels. Thus, in an interferometric setup, one cannot suppress cross-talk and correlated thermal noise without reducing sensitivity to the global signal by the same factor; instead, the challenge is to characterize these effects and their frequency dependence. We illustrate our general theorem with explicit calculations for toy setups consisting of two short-dipole antennas, both in free space and above a perfectly reflecting ground surface.