The Wiener-like filtering in the wavelet space
Let us consider a measured wavelet coefficient w_{i} at the scale i.
We assume that its value, at a given scale and a given position,
results from a noisy process: w_{i} follows a Gaussian distribution with
mathematical expectation W_{i} and standard deviation B_{i}:

\[ P(w_i/W_i) = \frac{1}{\sqrt{2\pi}\, B_i}\, e^{-\frac{(w_i - W_i)^2}{2 B_i^2}} \]   (14.76)
Now, we assume that the set of expected coefficients W_{i} for a given
scale also follows a Gaussian distribution, with a null mean and a
standard deviation S_{i}:

\[ P(W_i) = \frac{1}{\sqrt{2\pi}\, S_i}\, e^{-\frac{W_i^2}{2 S_i^2}} \]   (14.77)
The null mean value results from the zero-mean property of the wavelet function:

\[ \int \psi(x)\, dx = 0 \]   (14.78)
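This zero-mean property can be illustrated numerically. The sketch below uses a B3-spline "à trous" wavelet plane as an example transform (my assumption, not necessarily the transform used in this chapter): the coefficients of any signal average to zero because the equivalent wavelet filter itself has zero sum.

```python
import numpy as np

# One smoothing pass with the B3-spline kernel [1, 4, 6, 4, 1] / 16
# (periodic borders); the wavelet plane is signal minus smoothed signal.
def smooth_b3(signal):
    taps = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
    return sum(w * np.roll(signal, k - 2) for k, w in enumerate(taps))

rng = np.random.default_rng(0)
c0 = rng.standard_normal(4096)
w0 = c0 - smooth_b3(c0)  # first wavelet plane

# The equivalent wavelet filter (delta - h) sums to zero, so the
# coefficients average to zero up to floating-point rounding.
print(abs(w0.mean()))
```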
We want to get an estimate of W_{i} knowing w_{i}. Bayes' theorem gives:

\[ P(W_i/w_i) = \frac{P(w_i/W_i)\, P(W_i)}{P(w_i)} \]   (14.79)
We get:

\[ P(W_i/w_i) \propto \exp\!\left( -\frac{(W_i - \alpha_i w_i)^2}{2 \beta_i^2} \right) \]   (14.80)

where:

\[ \alpha_i = \frac{S_i^2}{S_i^2 + B_i^2} \]   (14.81)

The probability P(W_{i}/w_{i}) thus follows a Gaussian distribution with a mean:

\[ m_i = \alpha_i w_i = \frac{S_i^2}{S_i^2 + B_i^2}\, w_i \]   (14.82)

and a variance:

\[ \beta_i^2 = \frac{S_i^2 B_i^2}{S_i^2 + B_i^2} \]   (14.83)
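For completeness, the posterior mean and variance can be checked by completing the square in the exponent of the product of the two Gaussians entering Bayes' theorem (a standard manipulation, written here in the notation above):

```latex
-\frac{(w_i - W_i)^2}{2 B_i^2} - \frac{W_i^2}{2 S_i^2}
  = -\frac{S_i^2 + B_i^2}{2 S_i^2 B_i^2}
    \left( W_i - \frac{S_i^2}{S_i^2 + B_i^2}\, w_i \right)^2 + C(w_i)
```

where C(w_i) does not depend on W_{i}; reading off the quadratic form gives a Gaussian in W_{i} with mean S_i^2 w_i / (S_i^2 + B_i^2) and variance S_i^2 B_i^2 / (S_i^2 + B_i^2).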
The mathematical expectation of W_{i} is \alpha_{i} w_{i}, with
\alpha_{i} = S_{i}^{2} / (S_{i}^{2} + B_{i}^{2}).
With a simple multiplication of the coefficients by the constant \alpha_{i},
we get a linear filter. The algorithm is:
 1. Compute the wavelet transform of the data. We get w_{i}.
 2. Estimate the standard deviation B_{0} of the noise in the first plane
from the histogram of w_{0}. As we process oversampled images, the
values of the wavelet image corresponding to the first scale (w_{0})
are due mainly to the noise. The histogram shows a Gaussian peak
around 0. We compute the standard deviation of this Gaussian function
with a sigma clipping, rejecting pixels where the signal could be
significant.
 3. Set i to 0.
 4. Estimate the standard deviation B_{i} of the noise from B_{0}. This
is done by studying the variation of the noise between two scales,
under the hypothesis of white Gaussian noise.
 5. Compute S_{i}^{2} = s_{i}^{2} - B_{i}^{2}, where s_{i}^{2} is the
variance of w_{i}.
 6. Compute \alpha_{i} = S_{i}^{2} / (S_{i}^{2} + B_{i}^{2}).
 7. Compute W_{i} = \alpha_{i} w_{i}.
 8. Set i = i + 1 and go to step 4.
 9. Reconstruct the picture from the W_{i}.
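The steps above can be sketched in Python. The transform choice, the number of scales, and the 3-sigma clipping threshold below are my assumptions for illustration (an undecimated "à trous" starlet transform with the B3-spline kernel), not details fixed by the text:

```python
import numpy as np

def _smooth(c, step, axis):
    # One separable 'a trous' smoothing pass: B3-spline kernel
    # [1, 4, 6, 4, 1] / 16 with holes of size `step`, periodic borders.
    taps = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
    return sum(w * np.roll(c, (k - 2) * step, axis=axis)
               for k, w in enumerate(taps))

def starlet(image, n_scales):
    # Step 1: undecimated wavelet transform. Returns the wavelet
    # planes w_0 .. w_{n-1} and the final smoothed plane.
    c = np.asarray(image, dtype=float)
    planes = []
    for j in range(n_scales):
        s = _smooth(_smooth(c, 2 ** j, 0), 2 ** j, 1)
        planes.append(c - s)
        c = s
    return planes, c

def sigma_clipped_std(values, k=3.0, n_iter=5):
    # Step 2: std of the Gaussian noise peak, iteratively rejecting
    # pixels where the signal could be significant.
    v = np.ravel(values)
    for _ in range(n_iter):
        v = v[np.abs(v - v.mean()) < k * v.std()]
    return v.std()

def wiener_wavelet_filter(image, n_scales=4):
    planes, smooth = starlet(image, n_scales)
    b0 = sigma_clipped_std(planes[0])                     # step 2
    # Step 4: propagate B_0 to B_i using the per-scale response of
    # the transform to simulated unit white Gaussian noise.
    noise_planes, _ = starlet(
        np.random.default_rng(0).standard_normal(image.shape), n_scales)
    stds = [p.std() for p in noise_planes]
    filtered = []
    for w, s in zip(planes, stds):                        # steps 5-8
        b2 = (b0 * s / stds[0]) ** 2                      # B_i^2
        s2 = max(w.var() - b2, 0.0)                       # S_i^2
        alpha = s2 / (s2 + b2)                            # alpha_i
        filtered.append(alpha * w)                        # W_i
    return sum(filtered) + smooth                         # step 9
```

The reconstruction in step 9 is a plain sum because the à trous transform used here is additive (the planes and the smoothed image sum back to the input exactly); a different wavelet transform would need its own inverse.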
Petra Nass
1999-06-15