Wiener-Khinchin relation

The Wiener-Khinchin Theorem

So far, we have only asserted that the sum of waves with random phases generates a time-stationary Gaussian signal. We now have to check this. It is convenient to start with a signal going from $0$ to $T$, and only later take the limit $T\rightarrow \infty$. The usual theory of Fourier series tells us that we can write

 

 

\begin{displaymath}E(t)\equiv \sum_n \left(a_n\cos \omega_n t + b_n\sin\omega_n t\right)\end{displaymath}

\begin{displaymath}\equiv \sum_n r_n \cos(\omega_n t+\varphi_n)\end{displaymath}



 

 

where

\begin{displaymath}\omega_n=\frac{2\pi n}{T},\qquad r_n=\sqrt{a_n^2+b_n^2},\qquad \tan\varphi_n=-b_n/a_n.\end{displaymath}



 

 

Notice that the frequencies come in multiples of the ``fundamental'' $2\pi/T$, which is very small since $T$ is large, and hence they form a closely spaced set. We can now compute the autocorrelation

 

 

\begin{displaymath}C(\tau)=\langle E(t)E(t+\tau)\rangle = \Big\langle \sum_n r_n \cos(\omega_n t+\varphi_n) \sum_m r_m \cos(\omega_m(t+\tau)+\varphi_m)\Big\rangle .\end{displaymath}



 

 

The averaging on the right hand side has to be carried out by letting each of the phases $\varphi_k$ vary independently from $0$ to $2\pi$. When we do this, only terms with $m=n$ can survive, and we get

 

 

\begin{displaymath}C(\tau)=\sum_n\frac{1}{2}r_n^2\cos \omega_n\tau .\end{displaymath}
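To see why only the $m=n$ terms survive, note the elementary average over independent, uniformly distributed phases (a short check using the product-to-sum identity for cosines):

\begin{displaymath}\langle \cos(\omega_n t+\varphi_n)\cos(\omega_m(t+\tau)+\varphi_m)\rangle = \frac{1}{2}\cos(\omega_n\tau)\,\delta_{nm} .\end{displaymath}

For $m\neq n$ the two independent phases average out separately, while for $m=n$ the product-to-sum identity leaves $\frac{1}{2}\cos\omega_n\tau$ plus a term oscillating as $\cos(2\omega_n t+\omega_n\tau+2\varphi_n)$, which averages to zero over $\varphi_n$.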

Putting $\tau$ equal to zero, we get the variance

 

 

\begin{displaymath}C(0)=\langle E(t)^2 \rangle = \sum_n\frac{1}{2}r_n^2 .\end{displaymath}
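As a quick numerical sanity check (a minimal sketch in Python/NumPy; the amplitudes $r_n$, grid sizes, and trial count below are arbitrary choices for the demonstration, not anything prescribed by the text), one can synthesize such a random-phase signal and compare its empirical variance with $\sum_n r_n^2/2$:

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
T, N = 100.0, 200                     # signal duration and number of Fourier modes
n = np.arange(1, N + 1)
omega = 2 * np.pi * n / T             # harmonics of the fundamental 2*pi/T
r = 1.0 / np.sqrt(n)                  # arbitrary amplitude choice

t = np.linspace(0, T, 4096, endpoint=False)
var_est = 0.0
for _ in range(500):                  # average over an "ensemble" of phase draws
    phi = rng.uniform(0, 2 * np.pi, N)
    E = (r[:, None] * np.cos(omega[:, None] * t + phi[:, None])).sum(axis=0)
    var_est += E.var() / 500

print("empirical <E^2>:", var_est)
print("sum r_n^2 / 2  :", 0.5 * (r ** 2).sum())   # should agree closely
\end{verbatim}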



 

 

We note that the autocorrelation is independent of $t$ and hence we have checked time stationarity, at least for this statistical property. We now have to face the limit $T\rightarrow \infty$. The number of frequencies in a given range $\Delta \omega$ blows up as

 

 

\begin{displaymath}\frac{\Delta\omega}{(2\pi/T)}=\frac{T\Delta\omega}{2\pi}.\end{displaymath}



 

 

Clearly, the $r^2_n$ have to scale inversely with $T$ if statistical quantities like $C(\tau)$ are to have a well-defined $T\rightarrow \infty$ behaviour. Further, since the number of $r_n$'s even in a small interval $\Delta \omega$ blows up, what is important is their combined effect rather than the behaviour of any individual one. All this motivates the definition

 

 

\begin{displaymath}\sum_{\omega < \omega_n < \omega+\Delta\omega}\frac{r^2_n}{2}=2S(\omega)\Delta\omega\end{displaymath}



 

 

as $T\rightarrow\infty$. Physically, $2S(\omega)\Delta\omega$ is the contribution to the variance $\langle E^2(t)\rangle$ from the interval $\omega$ to $\omega+\Delta\omega$; hence the term ``power spectrum'' for $S(\omega)$. Our basic result for the autocorrelation now reads

 

 

\begin{displaymath}C(\tau)=\int^\infty_0 2S(\omega)\cos\omega\tau \,d\omega = \int^{+\infty}_{-\infty}S(\omega)e^{-i\omega\tau}\,d\omega\end{displaymath}



 

 

if we define $S(-\omega)=S(\omega)$.

This is the ``Wiener-Khinchin theorem'', stating that the autocorrelation function is the Fourier transform of the power spectrum. It can also be written with the frequency measured in cycles (rather than radians) per second, denoted by $\nu$:

 

 

\begin{displaymath}C(\tau)=\int^\infty_0 2P(\nu)\cos (2\pi\nu\tau) \,d\nu = \int^{+\infty}_{-\infty}P(\nu)e^{-2\pi i\nu\tau}\,d\nu\end{displaymath}



 

 

and as before, $P(-\nu)=P(\nu)$.
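For reference, the change of variable $\omega=2\pi\nu$ in the previous integral shows how the two spectral densities are related:

\begin{displaymath}P(\nu)=2\pi S(2\pi\nu).\end{displaymath}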

In this particular case of the autocorrelation, we did not use independence of the $\varphi$'s. Thus the theorem is valid even for a non-Gaussian random process (for which different $\varphi$'s are not independent). Notice also that we could have averaged over $t$ instead of over all the $\varphi$'s and we would have obtained the same result, viz. that contributions are nonzero only when we multiply a given frequency with itself. One could even argue that the operation of integrating over the $\varphi$'s is summing over a fictitious collection (i.e. an ``ensemble'') of signals, while integrating over $t$ and dividing by $T$ is closer to what we do in practice. The idea that the ensemble average can be realised by the more practical time average is called ``ergodicity'' and, like everything else here, needs a better proof than we have given it. A rigorous treatment would in fact start by worrying about the existence of a well-defined $T\rightarrow \infty$ limit for all statistical quantities, not just the autocorrelation. This is called ``proving the existence of the random process''.

The autocorrelation $C(\tau)$ and the power spectrum $S(\omega)$ could in principle be measured in two different kinds of experiments. In the time domain, one could record samples of the voltage and calculate averages of lagged products to get $C$. In the frequency domain one would pass the signal through a filter admitting a narrow band of frequencies around $\omega$, and measure the average power that gets through.
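These two routes can be imitated numerically. The sketch below (Python/NumPy; the toy band-limited signal and all sizes are illustrative assumptions) estimates $C(\tau)$ from averaged lagged products and, independently, estimates the power spectrum from an averaged periodogram, then checks that the inverse transform of the latter reproduces the former:

\begin{verbatim}
import numpy as np
from numpy.fft import rfft, irfft

rng = np.random.default_rng(1)
nseg, nfft, nlag = 400, 1024, 64
acf_time = np.zeros(nlag)            # time-domain estimate of C(tau)
psd = np.zeros(nfft // 2 + 1)        # frequency-domain estimate of the spectrum

for _ in range(nseg):
    # toy band-limited Gaussian signal: white noise restricted to a band
    spec = rfft(rng.standard_normal(nfft))
    spec[:nfft // 8] = 0.0
    spec[nfft // 4:] = 0.0
    x = irfft(spec)
    for k in range(nlag):            # averages of lagged products
        acf_time[k] += (x[:nfft - k] * x[k:]).mean() / nseg
    psd += np.abs(rfft(x)) ** 2 / nseg

# Wiener-Khinchin: the inverse transform of the power spectrum gives C(tau)
acf_freq = irfft(psd)[:nlag] / nfft
print("time domain :", acf_time[:3])
print("freq domain :", acf_freq[:3])   # the two should agree closely
\end{verbatim}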

A simple but instructive application of the Wiener-Khinchin theorem is to a power spectrum which is constant, $P(\nu)=K$ (a ``flat band''), between $\nu_0 -B/2$ and $\nu_0+B/2$. A simple calculation shows that

\begin{displaymath}C(\tau)~=~2KB \left(\cos(2\pi \nu_0 \tau)\right)\left(\frac{\sin(\pi B \tau)}{\pi B \tau}\right).\end{displaymath}
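For completeness, the calculation is just the cosine integral over the band:

\begin{displaymath}C(\tau)=2K\int_{\nu_0-B/2}^{\nu_0+B/2}\cos(2\pi\nu\tau)\,d\nu = \frac{K}{\pi\tau}\Big[\sin(2\pi\nu\tau)\Big]_{\nu_0-B/2}^{\nu_0+B/2} = 2KB\cos(2\pi\nu_0\tau)\,\frac{\sin(\pi B\tau)}{\pi B\tau},\end{displaymath}

using $\sin X-\sin Y=2\cos\frac{X+Y}{2}\sin\frac{X-Y}{2}$ in the last step.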



 

 

The first factor $2KB$ is the value at $\tau = 0$: the total power to radio astronomers, the variance to statisticians. The second factor is an oscillation at the centre frequency. This is easily understood: if the bandwidth $B$ is very small compared to $\nu_0$, the third factor would be close to unity for values of $\tau$ extending over, say, $1/4B$, which is still many cycles of the centre frequency. This approaches the limiting case of a single sinusoidal wave, whose autocorrelation is sinusoidal. The third, sinc-function factor describes ``bandwidth decorrelation'', which occurs when $\tau$ becomes comparable to or larger than $1/B$.

Another important case, in some ways opposite to the preceding one, occurs when $\nu_0=B/2$, so that the band extends from $0$ to $B$. This is a so-called ``baseband''. In this case, the autocorrelation is proportional to a sinc function of $2\pi B \tau$. Now the correlation between a pair of voltages measured at an interval of $1/2B$, or any multiple (except zero!) thereof, is zero, a special property of our flat band. In this case, we see very clearly that a set of samples measured at this interval of $1/2B$, the so-called ``Nyquist sampling interval'', would actually be statistically independent, since correlations between any pair vanish (this would be clearer after going through Section 1.8). Clearly, this is the minimum number of measurements which would have to be made to reproduce the signal, since if we missed one of them the others would give us no clue about it. As we will now see, it is also the maximum number for this bandwidth!
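A tiny check of this special property (Python/NumPy; the values of $K$ and $B$ are arbitrary examples): the flat-baseband autocorrelation $C(\tau)=2KB\,\sin(2\pi B\tau)/(2\pi B\tau)$ vanishes at every nonzero multiple of the Nyquist interval $1/2B$:

\begin{verbatim}
import numpy as np

K, B = 1.0, 4.0                      # example flat baseband from 0 to B (Hz)

def C(tau):
    # np.sinc(x) = sin(pi x)/(pi x), so this is 2KB sin(2 pi B tau)/(2 pi B tau)
    return 2 * K * B * np.sinc(2 * B * tau)

taus = np.arange(6) / (2 * B)        # multiples of the Nyquist interval 1/2B
print(C(taus))                       # [8. 0. 0. 0. 0. 0.]: samples are uncorrelated
\end{verbatim}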

Courtesy: gmrt.ncra.tifr.res.in/gmrt_hpage/Users/doc/WEBLF/LFRA/node7.html

The Wiener–Khinchin theorem (also known as the Wiener–Khintchine theorem and sometimes as the Wiener–Khinchin–Einstein theorem or the Khinchin–Kolmogorov theorem) states that the power spectral density of a wide-sense-stationary random process is the Fourier transform of the corresponding autocorrelation function.[1][2][3]

Continuous case:

\begin{displaymath}S_{xx}(f)=\int_{-\infty}^\infty r_{xx}(\tau)e^{-j2\pi f\tau} \,d\tau\end{displaymath}

where

\begin{displaymath}r_{xx}(\tau) = \operatorname{E}\big[\, x(t)x^*(t-\tau) \,\big]\end{displaymath}

is the autocorrelation function defined in terms of statistical expectation, and where $S_{xx}(f)$ is the power spectral density of the function $x(t)$. Note that the autocorrelation function is defined in terms of the expected value of a product, and that the Fourier transform of $x(t)$ does not exist, in general, because stationary random functions are not square integrable.

The asterisk denotes complex conjugate, and can be omitted if the random process is real-valued.

Discrete case:

\begin{displaymath}S_{xx}(f)=\sum_{k=-\infty}^\infty r_{xx}[k]e^{-j2\pi k f}\end{displaymath}

where

\begin{displaymath}r_{xx}[k] = \operatorname{E}\big[\, x[n] x^*[n-k] \,\big]\end{displaymath}

and where $S_{xx}(f)$ is the power spectral density of the function with discrete values $x[n]$. Because $x[n]$ is a sampled, discrete-time sequence, its spectral density is periodic in the frequency domain.
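As a concrete illustration of the discrete case (a minimal sketch; the MA(1) process and its parameters are an assumed example, not taken from the text above), the sum over lags can be checked against sample estimates:

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(3)
e = rng.standard_normal(200_000)
x = e[1:] + 0.5 * e[:-1]             # MA(1): x[n] = e[n] + 0.5 e[n-1]

# sample r_xx[k] = E[x[n] x[n-k]]; theory: r[0] = 1.25, r[1] = 0.5, r[k>=2] = 0
r = [np.mean(x[k:] * x[:len(x) - k]) for k in range(3)]

f = 0.2                              # any frequency, in cycles per sample
S_from_r = r[0] + 2 * r[1] * np.cos(2 * np.pi * f)   # real, even autocorrelation
S_theory = 1.25 + np.cos(2 * np.pi * f)
print(S_from_r, S_theory)            # should agree to a few decimal places
\end{verbatim}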

Application

The theorem is useful for analyzing linear time-invariant (LTI) systems when the inputs and outputs are not square integrable, so their Fourier transforms do not exist. A corollary is that the Fourier transform of the autocorrelation function of the output of an LTI system is equal to the product of the Fourier transform of the autocorrelation function of the input and the squared magnitude of the Fourier transform of the system impulse response. This works even when the Fourier transforms of the input and output signals do not exist because these signals are not square integrable, so the system inputs and outputs cannot be directly related by the Fourier transform of the impulse response.

Since the Fourier transform of the autocorrelation function of a signal is the power spectrum of the signal, this corollary is equivalent to saying that the power spectrum of the output is equal to the power spectrum of the input times the power transfer function.
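A numerical rendering of this statement (Python/NumPy; the FIR filter and segment sizes are illustrative assumptions): filter white noise through an LTI system and compare the averaged output periodogram with $|H(f)|^2$ times the averaged input periodogram:

\begin{verbatim}
import numpy as np
from numpy.fft import rfft

rng = np.random.default_rng(2)
h = np.array([0.25, 0.5, 0.25])      # example FIR impulse response
nseg, nfft = 500, 512
s_in = np.zeros(nfft // 2 + 1)
s_out = np.zeros(nfft // 2 + 1)

for _ in range(nseg):
    x = rng.standard_normal(nfft + len(h) - 1)
    y = np.convolve(x, h, mode="valid")          # LTI filtering, len(y) == nfft
    s_in += np.abs(rfft(x[:nfft])) ** 2 / nseg
    s_out += np.abs(rfft(y)) ** 2 / nseg

H = rfft(h, n=nfft)                  # filter frequency response on the same grid
print("measured S_out[:4]:", s_out[:4])
print("|H|^2 * S_in[:4]  :", (np.abs(H) ** 2 * s_in)[:4])
\end{verbatim}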

This corollary is used in the parametric method for power spectrum estimation.

Discrepancy of definition

By the definitions involving infinite integrals in the articles on spectral density and autocorrelation, the Wiener–Khintchine theorem is a simple Fourier-transform pair, trivially provable for any square integrable function, i.e. for functions whose Fourier transforms exist. More usefully, and historically, the theorem applies to wide-sense-stationary random processes, signals whose Fourier transforms do not exist, using the definition of the autocorrelation function in terms of expected value rather than an infinite integral. This trivialization of the Wiener–Khintchine theorem is commonplace in modern technical literature.

Courtesy: en.wikipedia.org/wiki/Wiener%E2%80%93Khinchin_theorem

