A network of detectors is, therefore, essential for source reconstruction. Network observation is not only powerful in identifying a source in the sky, but independent observation of the same source in several detectors adds to the detection confidence, especially since the noise background in the first generation of interferometers is not well understood and is plagued by nonstationarity and non-Gaussianity.
The availability of a network of detectors offers two different methods by which the data can be combined.
One can either first bring the data sets together, combine them in a certain way, and then apply the appropriate filter to the network data and integrate the signal coherently (coherent detection [283, 89, 160, 43]), or first analyze the data from each detector separately by applying the relevant filters and then look for coincidences in the multi-dimensional space of intrinsic (masses of the component stars, their spins, etc.) and extrinsic (arrival times, a constant phase, source location, etc.) parameters (coincidence detection [206, 208, 160, 44, 355, 2, 6, 7, 8]).
A recent comparison of coherent analysis vis-a-vis coincidence analysis under the assumption that the background noise is Gaussian and stationary has concluded that coherent analysis, as one might expect, is far better than coincidence analysis [265]. These authors also explore, to a limited extent, the effect of nonstationary noise and reach essentially the same conclusion.
At the outset, coherent analysis sounds like a good idea, since in a network of $N$ similar detectors the visibility of a signal improves by a factor of $\sqrt{N}$ over that of a single detector. One can take advantage of this enhancement in SNR either to lower the false alarm rate by increasing the detection threshold, while maintaining the same detection efficiency, or to improve the detection efficiency at a given false alarm rate.
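To make this scaling concrete, the following Python sketch (ours, not from the original text) combines single-detector matched-filter SNRs in quadrature, which is the ideal coherent gain for identical, co-aligned detectors with stationary Gaussian noise; all function names, the flat toy noise spectrum and the template are illustrative assumptions.

```python
import numpy as np

def matched_filter_snr(data, template, psd, df):
    """Single-detector matched-filter SNR for a known frequency-domain template,
    using the usual inner product <a|b> = 4 Re sum[ a conj(b) / S_n(f) ] df."""
    inner = lambda a, b: 4.0 * np.real(np.sum(a * np.conj(b) / psd)) * df
    return inner(data, template) / np.sqrt(inner(template, template))

def coherent_network_snr(snrs):
    """Ideal coherent SNR for identical, co-aligned detectors: the quadrature sum
    of the individual SNRs, i.e. a ~sqrt(N) gain for a common signal."""
    return np.sqrt(np.sum(np.square(snrs)))

# Toy frequency-domain setup shared by three identical detectors
rng = np.random.default_rng(2)
df, nbins = 0.25, 4096
psd = np.full(nbins, 1e-2)                                 # flat (white) noise PSD
template = np.exp(1j * rng.uniform(0, 2 * np.pi, nbins))   # toy unit-amplitude template
signal = 0.01 * template                                   # single-detector SNR ~ 6.4

snrs = []
for _ in range(3):
    # Gaussian noise normalized so that the matched-filter SNR has unit variance
    noise = (rng.standard_normal(nbins) + 1j * rng.standard_normal(nbins)) \
            * np.sqrt(psd / (4.0 * df))
    snrs.append(matched_filter_snr(signal + noise, template, psd, df))

print(snrs, coherent_network_snr(snrs))   # network SNR ~ sqrt(3) x single-detector SNR
```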
However, there are two reasons why current data-analysis pipelines prefer coincidence analysis to coherent analysis. Firstly, since the detector noise is neither Gaussian nor stationary, coincidence analysis can reduce the background rate far more effectively than one might otherwise expect. Secondly, coherent analysis is computationally far more expensive than coincidence analysis, and it is presently not practicable to employ it.
Coincidence analysis is indeed a very powerful method for vetoing spurious events. One can associate with each event in a given detector an ellipsoid, whose location and orientation depend on where in the parameter space and when the event was found, with the SNR used to fix the size of the ellipsoid [316]. In effect, one associates with each event a 'sphere of influence' in the multi-dimensional space of masses, spins, arrival times, etc., and imposes the stringent demand that the ellipsoids associated with events from different detectors overlap in order to claim a detection. Since random triggers from a network of detectors are less likely to be consistent with one another in this way, this method serves as a very powerful veto.
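A simplified sketch of such a test is given below (ours, not from the original). For illustration it assumes both triggers share a single parameter-space metric $g$ defining their ellipsoids, in which case the overlap test reduces to comparing the metric distance between the trigger centres with the sum of the ellipsoid sizes; real pipelines use per-trigger metrics and a full ellipsoid-overlap test, and all names and numbers here are hypothetical.

```python
import numpy as np

def ellipsoids_touch(x1, x2, g, r1, r2):
    """Approximate coincidence test between two triggers.

    x1, x2 : trigger coordinates (masses, spins, arrival time, ...) as arrays
    g      : positive-definite parameter-space metric defining each trigger's
             ellipsoid of influence, (x - x0)^T g (x - x0) <= r^2
    r1, r2 : ellipsoid sizes; in practice these would shrink with increasing SNR

    With a single shared metric both ellipsoids are rescalings of the same shape,
    so they intersect exactly when the metric distance between the centres is at
    most r1 + r2.
    """
    dx = np.asarray(x1) - np.asarray(x2)
    distance = np.sqrt(dx @ g @ dx)
    return distance <= r1 + r2

# Toy example in (chirp mass [Msun], arrival time [s]) with a diagonal metric
g = np.diag([1.0 / 0.01**2, 1.0 / 0.005**2])   # hypothetical parameter scales
print(ellipsoids_touch([1.21, 0.000], [1.22, 0.003], g, r1=1.0, r2=1.0))
```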
It is probably not possible to infer beforehand which method might be more effective in detecting a source, as this might depend on the nature of the detector noise, on how the detection statistic is constructed, etc. An optimal approach might be a suitable combination of both methods: for instance, a coherent follow-up of a coincidence analysis (as is currently done by searches for compact binaries within the LSC), or applying coincidence criteria to candidate events from a coherent search.
Coherent addition of data improves the visibility of the signal, but ‘coherent subtraction’ of the data in a detector network should lead to data products that are devoid of gravitational wave signals. This leads us naturally to the introduction of the null stream veto.
Data from a network of detectors, when suitably shifted in time and combined linearly with coefficients that depend on the source location, will yield a time series that, in the ideal case, will be entirely devoid of the gravitational-wave signal. Such a combination is called a null stream. For instance, for a set of three misaligned detectors, each measuring a data stream $x_A(t)$, $A = 1, 2, 3$, the combination $n(t) = \sum_A c_A\, x_A(t + \tau_A)$, where the coefficients $c_A$ are functions of the antenna responses $F_+^A$ and $F_\times^A$, and the $\tau_A$, $A = 1, 2, 3$, are time delays that depend on the source location and the location of each antenna, is a null stream. If the $x_A(t)$, $A = 1, 2, 3$, contain a gravitational-wave signal from an astronomical source, then $n(t)$ will not contain the signature of this source. In contrast, if the detector outputs and the null stream $n(t)$ both contain the signature of a candidate event, then that is an indication that one of the detectors has a glitch.
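As a minimal illustration (ours, not from the original text), the Python sketch below builds a three-detector null combination by choosing coefficients orthogonal to both antenna-pattern vectors, assuming the time delays $\tau_A$ have already been applied; the antenna-pattern values and signal are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical antenna-pattern values F_+^A, F_x^A for three misaligned detectors
# (normally computed from detector orientations and the assumed sky position).
F_plus  = np.array([ 0.30, -0.45,  0.62])
F_cross = np.array([ 0.55,  0.10, -0.35])

# Null-stream coefficients: c must be orthogonal to both F_+ and F_x, so for
# three detectors it is (up to normalization) their cross product.
c = np.cross(F_plus, F_cross)

# Simulated polarizations and detector data, already time-shifted to a common
# reference (i.e. the delays tau_A have been applied).
t = np.linspace(0.0, 1.0, 4096)
h_plus  = 1e-2 * np.sin(2.0 * np.pi * 60.0 * t)
h_cross = 1e-2 * np.cos(2.0 * np.pi * 60.0 * t)
noise = 1e-3 * rng.standard_normal((3, t.size))
x = F_plus[:, None] * h_plus + F_cross[:, None] * h_cross + noise

# The null stream n(t) = sum_A c_A x_A(t) cancels the common signal exactly,
# leaving only a linear combination of the detector noises.
n = c @ x
print(np.std(x[0]), np.std(n))   # the signal dominates x[0], but not n
```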
The existence and usefulness of a null stream was first pointed out by Gürsel and Tinto [186]. Wen and Schutz [390] proposed implementing it in LSC data analysis as a veto, and this has been taken up now by several search groups.
Stochastic background sources and their detection is discussed in more detail in Section 8. Here we will
briefly mention the problem in the context of detector networks. As mentioned in Section 3.6, the universe
might be filled with stochastic gravitational waves that were either generated in the primeval universe or by
a population of background sources. For point sources, although each source in a population
might not be individually detectable, they could collectively produce a confusion background via
a random superposition of the waves from that population. Since the waves are random in
nature, it is not possible to use the techniques described in Sections 4.7.1, 4.7.2 and 5.1 to
detect a stochastic background. However, we might use the noisy stochastic signal in one of the
detectors as a “matched filter” for the data in another detector [362, 163, 30, 93]. In other words,
it should be possible to detect a stochastic background by cross-correlating the data from a
pair of detectors; the common gravitational-wave background will grow in time more rapidly
than the random backgrounds in the two instruments, thereby facilitating the detection of the
background.
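A toy Python sketch of this idea (ours, not from the original) with white instrumental noise and a common stochastic component: the mean of the cross-correlation grows linearly with the number of samples, while its fluctuation grows only as the square root, which is what eventually makes the common background detectable.

```python
import numpy as np

rng = np.random.default_rng(1)

n_samples = 2**20
gw_rms, noise_rms = 0.05, 1.0          # common background well below the noise

# Common stochastic "signal" plus independent instrumental noise in each detector
h  = gw_rms * rng.standard_normal(n_samples)
x1 = h + noise_rms * rng.standard_normal(n_samples)
x2 = h + noise_rms * rng.standard_normal(n_samples)

# Cross-correlation statistic: its mean grows like n_samples * gw_rms^2, while
# its fluctuation grows only like sqrt(n_samples) * noise_rms^2.
cc = np.sum(x1 * x2)
expected = n_samples * gw_rms**2
fluctuation = np.sqrt(n_samples) * noise_rms**2
print(cc, expected, fluctuation)        # cc should be comparable to `expected`
```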
If two instruments with identical spectral noise density $S_h(f)$ are cross-correlated over a bandwidth $\Delta f$ for a total time $T$, the spectral noise density of the output is reduced by a factor of $\sqrt{T\,\Delta f}$. Since the noise amplitude is proportional to the square root of the spectral density, the amplitude of a signal that can be detected by cross-correlation improves only with the fourth root of the observing time. This should be compared with the square-root improvement that matched filtering gives.
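In equation form (a restatement of the argument above, with $S_h$ the common one-sided noise spectral density):
$$ S_{\rm eff}(f) \simeq \frac{S_h(f)}{\sqrt{T\,\Delta f}}\,, \qquad h_{\rm min} \propto \sqrt{S_{\rm eff}} \propto \left(T\,\Delta f\right)^{-1/4}, $$
whereas matched filtering of a known waveform gives $h_{\rm min} \propto T^{-1/2}$.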
The cross-correlation technique works well when the two detectors are situated close to one another. When they are separated, only those waves whose wavelength is larger than, or comparable to, the distance between the two detectors, or which arrive from a direction perpendicular to the separation between the detectors, contribute coherently to the cross-correlation statistic. Since the instrumental noise rises rapidly at lower frequencies, detectors that are farther apart are less useful for cross-correlation. However, very nearby detectors (as in the case of the two LIGO detectors sharing the same vacuum system at Hanford) suffer from common background noise from the control system and the environment, making it rather difficult to ascertain whether any excess correlated noise is due to a stochastic background of gravitational waves.
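As a rough numerical illustration (ours; the separation value is only approximate), the requirement that the wavelength exceed roughly twice the detector separation translates into a characteristic frequency below which a widely separated pair still correlates coherently:

```python
# Rough illustration: frequency below which a detector pair separated by a
# distance d can still correlate a stochastic background coherently, taking
# the condition lambda >~ 2 d, i.e. f <~ c / (2 d).
C = 299_792_458.0  # speed of light, m/s

def coherence_cutoff_frequency(separation_m):
    return C / (2.0 * separation_m)

# Approximate LIGO Hanford-Livingston separation (~3000 km)
print(coherence_cutoff_frequency(3.0e6))   # ~50 Hz
```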