Channel: Instrumentation and Methods – Vox Charta

Calibration of the EDGES Receiver to Observe the Global 21-cm Signature from the Epoch of Reionization


The EDGES experiment strives to detect the sky-average brightness temperature from the $21$-cm line emitted during the Epoch of Reionization (EoR) in the redshift range $14 \gtrsim z \gtrsim 6$. To probe this signal, EDGES conducts single-antenna measurements in the frequency range $\sim 100-200$ MHz from the Murchison Radio-astronomy Observatory in Western Australia. In this paper we describe the current strategy for calibration of the EDGES instrument and, in particular, of its receiver. The calibration involves measuring accurately modeled passive and active noise sources connected to the receiver input in place of the antenna. We model relevant uncertainties that arise during receiver calibration and propagate them to the calibrated antenna temperature using a Monte Carlo approach. Calibration effects are isolated by assuming that the sky foregrounds and the antenna beam are perfectly known. We find that if five polynomial terms are used to account for calibration systematics, most of the calibration measurements conducted for EDGES produce residuals of $1$ mK or less at $95\%$ confidence. The largest residuals are due to uncertainty in the antenna and receiver reflection coefficients, at levels below $20$ mK when observing a low-foreground region. These residuals could be reduced by restricting the band to a smaller frequency range motivated by tighter reionization priors. They could also be reduced by 1) improving the accuracy of reflection measurements, especially their phase, 2) reducing the frequency dependence of the antenna reflection phase, and 3) improving the impedance match at the antenna-receiver interface.
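The Monte Carlo propagation described above can be illustrated with a minimal sketch. The following is not the EDGES pipeline: the foreground spectrum, reflection coefficient, uncertainty levels, and the first-order coupling of reflection error into calibrated temperature are all illustrative assumptions. It only shows the structure of the method: perturb a measured quantity within its assumed uncertainty, propagate to antenna temperature, remove a five-term polynomial (as in the paper), and record the residual.

```python
import numpy as np

rng = np.random.default_rng(0)
freq = np.linspace(100.0, 200.0, 256)           # MHz, EDGES band
t_sky = 300.0 * (freq / 150.0) ** -2.5          # toy power-law foreground (K)

# Toy antenna reflection coefficient and hypothetical 1-sigma measurement
# uncertainties on its magnitude and phase (values are illustrative only).
gamma_ant = 0.05 * np.exp(2j * np.pi * freq / 50.0)
sigma_mag, sigma_phase = 1e-4, np.deg2rad(0.1)

n_trials = 1000
residual_rms = np.empty(n_trials)
for i in range(n_trials):
    # Perturb the reflection measurement within its assumed uncertainty.
    mag = np.abs(gamma_ant) + sigma_mag * rng.standard_normal(freq.size)
    phase = np.angle(gamma_ant) + sigma_phase * rng.standard_normal(freq.size)
    gamma_meas = mag * np.exp(1j * phase)
    # Assumed first-order coupling of the reflection error into the
    # calibrated antenna temperature (illustrative, not the EDGES model).
    t_err = t_sky * np.real(gamma_meas - gamma_ant)
    # Remove a 5-term polynomial and keep the residual RMS, in K.
    coeffs = np.polyfit(freq, t_err, 4)
    residual_rms[i] = np.std(t_err - np.polyval(coeffs, freq))

print(f"95th-percentile residual: {1e3 * np.percentile(residual_rms, 95):.2f} mK")
```

Repeating this over many random draws yields the distribution of residuals from which a $95\%$ confidence level, analogous to the one quoted in the abstract, can be read off.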

