Paradoxically, signal detection is concerned with fitting models to
supposedly random series, in a manner similar to mathematical proof by *reductio ad absurdum* of the antithesis. That is, the hypothesis
(antithesis) *H*_{o} is made, that the observed series *X*^{(o)} has
properties of a pure noise series, *N*^{(o)}. Then, a model is fitted
and series *X*^{(m)} and *X*^{(r)} are obtained. If the quality of the
model fits to the observations *X*^{(o)} does not significantly differ
from the quality of a fit to pure noise *N*^{(o)}, then *H*_{o} is true
and we say that *X*^{(o)} contains no signal but noise. In the
opposite case of model fitting *X*^{(o)} significantly better than
*N*^{(o)}, we reject *H*_{o} and say that the model signal was detected
in *X*^{(o)}. The difference is significant (at some level) if it is
not likely (at this level) to occur between two different realizations
of the noise *N*^{(o)}.
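The procedure above can be sketched numerically. In this minimal sketch the sinusoidal model, the trial frequency, and the Monte Carlo estimate of the noise distribution of the fit-quality statistic are all illustrative assumptions, not prescriptions from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_quality(x, t, freq):
    """Least-squares fit of a sinusoid at a trial frequency; returns the
    fraction of variance explained by the model (higher = better fit)."""
    A = np.column_stack([np.sin(2 * np.pi * freq * t),
                         np.cos(2 * np.pi * freq * t)])
    coef, *_ = np.linalg.lstsq(A, x, rcond=None)
    resid = x - A @ coef
    return 1.0 - resid.var() / x.var()

t = np.arange(200) / 200.0
# Observed series X^(o): a sinusoidal signal buried in gaussian noise.
x_obs = np.sin(2 * np.pi * 7 * t) + rng.normal(0.0, 1.0, t.size)

# Distribution of the statistic under H_o: fit the same model to pure noise.
s_noise = np.array([fit_quality(rng.normal(0.0, 1.0, t.size), t, 7)
                    for _ in range(500)])
s_obs = fit_quality(x_obs, t, 7)

# p: probability that pure noise fits at least as well as the observations.
p = np.mean(s_noise >= s_obs)
```

If `p` is small, the fit to `x_obs` is significantly better than a fit to pure noise and `H_o` is rejected.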

The quality of the fit is evaluated using a function *S* of the series
*X*^{(o)}, *X*^{(m)}, and *X*^{(r)}. A function of random variables,
such as
*S*(*X*^{(o)}), is a random variable itself and is called a *statistic*. A random variable *S* is characterized by its probability
distribution function. Following *H*_{o}, we use the distribution of *S* for the pure-noise series *N*^{(o)}, *N*^{(m)}, and *N*^{(r)},
denoted *p*_{N}(*S*) or simply *p*(*S*). More precisely, we shall use the *cumulative probability distribution* function, which for a given
critical value of the statistic, *S* = *S*_{o}, supplies the probability
*p*(*S*_{o}) for the observed *S* to fall on one side of *S*_{o}.
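As an example of a statistic with a known cumulative distribution (an assumption chosen for illustration, not taken from the text): the sum of squares of *n* independent unit-variance gaussian variables follows a chi-square distribution with *n* degrees of freedom, so the tail probability *p*(*S*_{o}) can be read off directly:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 50
noise = rng.normal(0.0, 1.0, n)

# Example statistic: the sum of squares of n unit-variance gaussian variables.
# Under H_o its distribution p(S) is chi-square with n degrees of freedom.
S = np.sum(noise ** 2)

# Cumulative probability of a value at least this large (upper tail of p(S)).
p_upper = stats.chi2.sf(S, df=n)
```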

The observed value of the statistic and its probability distribution,
*S*(*X*^{(o)}) and *p*(*S*) respectively, are used to obtain the
probability *p*(*S*(*X*^{(o)})) of *H*_{o} being true. That is, if *p* turns out to be
small, *H*_{o} is improbable and *X*^{(o)} has no properties
of *N*^{(o)}. Then we say that the model signal has been detected at
the confidence level 1 − *p*. The smaller *p* is, the more
convincing (significant) the detection. The special realization of
a random series which consists of independent variables with a common
(gaussian) distribution is called (gaussian) white noise. We assume
here that the noise *N*^{(o)} is white noise. Note that in the signal-detection
process, the frequency and the lag *l* are considered
independent variables and do not count as parameters.
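The white-noise assumption can be checked in a short sketch (the sample size and lag are illustrative): for gaussian white noise, the sample autocorrelation at any nonzero lag scatters around zero with a standard deviation of roughly 1/√n:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4096
# Gaussian white noise: independent variables with a common distribution.
white = rng.normal(0.0, 1.0, n)

# Sample autocorrelation at a nonzero lag; for white noise it should be
# close to zero, within a few multiples of 1/sqrt(n).
lag = 5
r = np.corrcoef(white[:-lag], white[lag:])[0, 1]
```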

Summarizing, the basis for determining the properties of an
observed time series is a test statistic, *S*, with known probability
distribution for (white) noise, *p*(*S*).

Let *N*^{(o)} consist of *n*_{o} random variables and let a given model
have *n*_{m} parameters. Then the modeled series *N*^{(m)} corresponds
to a combination of *n*_{m} random variables and the residual series
*N*^{(r)} corresponds to a combination of
*n*_{r}=*n*_{o}-*n*_{m} random
variables. The proof rests on the observation that orthogonal
transformations convert vectors of independent variables into vectors
of independent variables. Let us consider an approximately linear
model with matrix *M*, so that *N*^{(m)} = *M* *P*,
where *P* is a vector of *n*_{m} parameters. Then *N*^{(m)} spans a
vector space with no more than *n*_{m} orthogonal vectors (dimensions).
The numbers *n*_{o}, *n*_{m} and *n*_{r} are called the numbers of degrees
of freedom of the observations, the model fit, and the residuals,
respectively.
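The degrees-of-freedom bookkeeping can be illustrated numerically; the model matrix and sample sizes below are assumptions for the sketch. For unit-variance white noise, the expected sum of squares of *N*^{(m)} equals *n*_{m} and that of *N*^{(r)} equals *n*_{r} = *n*_{o} − *n*_{m}:

```python
import numpy as np

rng = np.random.default_rng(3)
n_o, n_m = 200, 3
M = rng.normal(size=(n_o, n_m))  # illustrative linear model matrix

ss_model, ss_resid = [], []
for _ in range(2000):
    noise = rng.normal(0.0, 1.0, n_o)            # N^(o)
    coef, *_ = np.linalg.lstsq(M, noise, rcond=None)
    n_model = M @ coef                           # N^(m): projection onto model
    n_resid = noise - n_model                    # N^(r): residuals
    ss_model.append(np.sum(n_model ** 2))
    ss_resid.append(np.sum(n_resid ** 2))

# For unit-variance noise, E[sum N^(m)^2] = n_m and
# E[sum N^(r)^2] = n_o - n_m (the degrees of freedom).
mean_model = np.mean(ss_model)
mean_resid = np.mean(ss_resid)
```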