Stationary Process
Introduction
From the definition of a random process, we know that all random processes are composed of random variables, each at its own unique point in time. Because of this, random processes have all the properties of random variables, such as mean, correlation, and variance. When dealing with groups of signals or sequences, it will be important for us to be able to show whether or not these statistical properties hold true for the entire random process. To do this, the concept of stationary processes has been developed. The general definition of a stationary process is:
 Definition 1: stationary process

a random process where all of its statistical properties do not vary with time
Processes whose statistical properties do change are referred to as nonstationary.
Understanding the basic idea of stationarity will help you to be able to follow the more concrete and mathematical definition to follow. Also, we will look at various levels of stationarity used to describe the various types of stationarity characteristics a random process can have.
Distribution and Density Functions
In order to properly define what it means to be stationary from a mathematical standpoint, one needs to be somewhat familiar with the concepts of distribution and density functions. If you can remember your statistics then feel free to skip this section!
Recall that when dealing with a single random variable, the probability distribution function is a simple tool used to identify the probability that our observed random variable will be less than or equal to a given number. More precisely, let X be our random variable, and let x be our given value; from this we can define the distribution function as
F_{x}(x)=Pr[X≤x]
(1)
This same idea can be applied to instances where we have multiple random variables as well. There may be situations where we want to look at the probability of events X and Y both occurring. For example, below is a second-order joint distribution function.
F_{x}(x,y)=Pr[X≤x,Y≤y]
(2)
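As a quick numerical sketch of Equations 1 and 2, the distribution function F_{x}(x)=Pr[X≤x] can be estimated as the fraction of observed samples at or below x. The standard normal random variable used here is an assumed example, not one fixed by the text.

```python
import numpy as np

# Estimate F_x(x) = Pr[X <= x] empirically from samples of an assumed
# standard normal random variable X.
rng = np.random.default_rng(0)
samples = rng.standard_normal(100_000)

def empirical_cdf(samples, x):
    """Fraction of observed samples that are <= x: an estimate of F_x(x)."""
    return np.mean(samples <= x)

print(empirical_cdf(samples, 0.0))  # ~0.5 for a zero-mean symmetric density
```

The same idea extends to the joint case of Equation 2 by counting the fraction of sample pairs satisfying both inequalities at once.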
While the distribution function provides us with a full view of our variable's or process's probability, it is not always the most useful for calculations. Often we will want to look at its derivative, the probability density function (pdf). We define the pdf as
f_{x}(x)=(d/dx)F_{x}(x)
(3)
f_{x}(x)dx=Pr[x<X≤x+dx]
(4)
Equation 4 reveals some of the physical significance of the density function. This equation tells us that the probability that our random variable falls within a given interval can be approximated by f_{x}(x)dx. From the pdf, we can now use our knowledge of integrals to evaluate probabilities from the above approximation. Again, we can also define a joint density function that includes multiple random variables, just as was done for the distribution function. The density function is used for a variety of calculations, such as finding the expected value or proving a random variable is stationary, to name a few.
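A minimal numerical sketch of the integral view of Equation 4: the probability that X falls in an interval is the integral of its pdf over that interval. The standard normal pdf below is an assumed example density.

```python
import numpy as np

# Pr[a < X <= b] as the integral of the pdf over (a, b], approximated
# by a Riemann sum. The normal pdf is an illustrative assumption.
def normal_pdf(x):
    return np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)

# Approximate the integral of the pdf over [-1, 1].
x = np.linspace(-1.0, 1.0, 10_001)
prob = np.sum(normal_pdf(x)) * (x[1] - x[0])
print(prob)  # ~0.6827, the familiar one-sigma probability for a normal
```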
note:
The above examples explain the distribution and density functions in terms of a single random variable, X. When we are dealing with signals and random processes, remember that we will have a set of random variables, where a different random variable occurs at each time instant of the random process, X(t_{k}). In other words, the distribution and density functions will also need to take into account the choice of time.
Stationarity
Below we will now look at a more in depth and mathematical definition of a stationary process. As was mentioned previously, various levels of stationarity exist and we will look at the most common types.
First-Order Stationary Process
A random process is classified as first-order stationary if its first-order probability density function remains the same regardless of any shift in its time origin. If we let x_{t1} represent a given value at time t_{1}, then we define a first-order stationary process as one that satisfies the following equation:
f_{x}(x_{t1})=f_{x}(x_{t1+τ})
(5)
The physical significance of this equation is that our density function, f_{x}(x_{t1}), is completely independent of t_{1} and thus any time shift, τ.
The most important result of this statement, and the identifying characteristic of any first-order stationary process, is the fact that the mean is a constant, independent of any time shift. Below we show the results for a random process, X, that is a discrete-time signal, x[n].
X̄ = m_{x}[n] = E[x[n]] = constant (independent of n)
(6)
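Equation 6 can be checked numerically: for a first-order stationary process, the ensemble mean E[x[n]] does not depend on n. White Gaussian noise is used below as an assumed example of such a process.

```python
import numpy as np

# Estimate the ensemble mean of white Gaussian noise (an assumed
# first-order stationary process) at two different time indices.
rng = np.random.default_rng(1)
realizations = rng.standard_normal((50_000, 100))  # 50,000 realizations, 100 time samples

mean_at_n10 = realizations[:, 10].mean()  # sample mean across the ensemble at n = 10
mean_at_n90 = realizations[:, 90].mean()  # sample mean across the ensemble at n = 90
print(mean_at_n10, mean_at_n90)  # both estimates are close to the constant mean, 0
```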
Second-Order and Strict-Sense Stationary Process
A random process is classified as second-order stationary if its second-order probability density function does not vary over any time shift applied to both values. In other words, for values x_{t1} and x_{t2}, the following must be equal for an arbitrary time shift τ:
f_{x}(x_{t1},x_{t2})=f_{x}(x_{t1+τ},x_{t2+τ})
(7)
From this equation we see that the absolute time does not affect our functions; rather, they depend only on the time difference between the two variables. Looked at another way, this equation can be described as
Pr[X(t_{1})≤x_{1},X(t_{2})≤x_{2}]=Pr[X(t_{1}+τ)≤x_{1},X(t_{2}+τ)≤x_{2}]
(8)
These random processes are often referred to as strict-sense stationary (SSS) when all of the distribution functions of the process are unchanged regardless of the time shift applied to them.
For a second-order stationary process, we need to look at the autocorrelation function to see its most important property. Since we have already stated that a second-order stationary process depends only on the time difference, all of these types of processes have the following property:
R_{xx}(t,t+τ)=E[X(t)X(t+τ)]=R_{xx}(τ)
(9)
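Equation 9 can also be verified numerically: the ensemble autocorrelation E[X(t)X(t+τ)] of a second-order stationary process depends only on the lag τ, not on the absolute time t. White noise is again the assumed stationary example here.

```python
import numpy as np

# Estimate E[X(t) X(t + tau)] across an ensemble of white-noise
# realizations and compare the same lag at two different absolute times.
rng = np.random.default_rng(2)
X = rng.standard_normal((100_000, 64))  # ensemble: one realization per row

def acorr(X, t, tau):
    """Ensemble estimate of E[X(t) X(t + tau)]."""
    return np.mean(X[:, t] * X[:, t + tau])

# Same lag tau at two different absolute times gives nearly identical values.
print(acorr(X, 5, 3), acorr(X, 40, 3))  # both near R_xx(3) = 0 for white noise
print(acorr(X, 5, 0), acorr(X, 40, 0))  # both near R_xx(0) = 1 for white noise
```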
Wide-Sense Stationary Process
As you begin to work with random processes, it will become evident that the strict requirements of an SSS process are more than is often necessary in order to adequately approximate our calculations on random processes. We define a final type of stationarity, referred to as wide-sense stationary (WSS), with slightly more relaxed requirements that are still sufficient to provide us with adequate results. In order to be WSS, a random process only needs to meet the following two requirements.
 X̄=E[x[n]]=constant
 E[X(t)X(t+τ)]=R_{xx}(τ)
Note that a secondorder (or SSS) stationary process will always be WSS; however, the reverse will not always hold true.
courtesy:cnx.org/content/m10684/latest/
Definition
Formally, let X_{t} be a stochastic process and let F_{X}(x_{t_{1}+τ},…,x_{t_{k}+τ}) represent the cumulative distribution function of the joint distribution of X_{t} at times t_{1}+τ,…,t_{k}+τ. Then, X_{t} is said to be stationary if, for all k, for all τ, and for all t_{1},…,t_{k},
F_{X}(x_{t_{1}+τ},…,x_{t_{k}+τ})=F_{X}(x_{t_{1}},…,x_{t_{k}}).
Law of large numbers
A stationary sequence of random variables can be written as X_{t+1} = TX_{t}, where T is a measure-preserving operator on some probability space;^{[1]} thus a law of large numbers in the form of the Birkhoff–Khinchin theorem applies.
Examples
As an example, white noise is stationary. However, the sound of a cymbal crashing is not stationary because the acoustic power of the crash (and hence its variance) diminishes with time.
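The cymbal-crash example can be sketched numerically: noise under a decaying envelope is nonstationary precisely because its variance shrinks with time. The exponential envelope and its decay constant below are illustrative assumptions, not a model of a real cymbal.

```python
import numpy as np

# Noise under a decaying exponential envelope: a crude stand-in for a
# cymbal crash. Its variance depends on time, so it is nonstationary.
rng = np.random.default_rng(3)
n = np.arange(2000)
envelope = np.exp(-n / 400.0)  # assumed decay constant, purely illustrative
crash = envelope * rng.standard_normal((2000, 2000))  # 2000 realizations

var_early = crash[:, :100].var()   # variance near the onset of the "crash"
var_late = crash[:, -100:].var()   # variance near the tail
print(var_early, var_late)  # the late variance is dramatically smaller
```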
An example of a discrete-time stationary process where the sample space is also discrete (so that the random variable may take one of N possible values) is a Bernoulli scheme. Other examples of a discrete-time stationary process with a continuous sample space include autoregressive and moving average processes, which are both subsets of the autoregressive moving average model.
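An autoregressive process of this kind is easy to simulate. The sketch below uses an AR(1) recursion x[n] = a·x[n−1] + w[n] with |a| < 1 (the coefficient a = 0.9 is an arbitrary illustrative choice); after the initial transient, its variance settles at the stationary value 1/(1−a²).

```python
import numpy as np

# Simulate an AR(1) process driven by unit-variance white noise and
# compare its long-run variance to the theoretical stationary value.
rng = np.random.default_rng(4)
a = 0.9  # assumed AR coefficient, |a| < 1 for stationarity
w = rng.standard_normal(200_000)
x = np.zeros_like(w)
for n in range(1, len(w)):
    x[n] = a * x[n - 1] + w[n]

# Stationary variance of an AR(1) process is 1 / (1 - a^2).
print(x[1000:].var(), 1.0 / (1 - a ** 2))
```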
If one also assumes divisibility (independent increments) and continuity, one obtains a Lévy process.
Weaker forms of stationarity
Weak or widesense stationarity
A weaker form of stationarity commonly employed in signal processing is known as weak-sense stationarity, wide-sense stationarity (WSS) or covariance stationarity. WSS random processes only require that the 1st and 2nd moments do not vary with respect to time. Any strictly stationary process which has a mean and a covariance is also WSS.
So, a continuous-time random process x(t) which is WSS has the following restrictions on its mean function,
E[x(t)] = m_{x}(t) = m_{x}(t+τ) for all τ,
and autocorrelation function,
E[x(t_{1})x(t_{2})] = R_{x}(t_{1},t_{2}) = R_{x}(t_{1}+τ,t_{2}+τ) for all τ.
The first property implies that the mean function m_{x}(t) must be constant. The second property implies that the correlation function depends only on the difference between t_{1} and t_{2} and only needs to be indexed by one variable rather than two. Thus, instead of writing R_{x}(t_{1},t_{2}), we usually abbreviate the notation and write R_{x}(τ), where τ = t_{1} − t_{2}. This also implies that the autocovariance depends only on τ = t_{1} − t_{2}, since
C_{x}(t_{1},t_{2}) = R_{x}(t_{1},t_{2}) − m_{x}(t_{1})m_{x}(t_{2}) = R_{x}(τ) − m_{x}^{2} = C_{x}(τ).
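The autocovariance relation C_{x}(τ) = R_{x}(τ) − m_{x}² can be checked numerically. White noise shifted to an assumed nonzero mean m = 2 serves as the example WSS process here.

```python
import numpy as np

# Verify C_x(tau) = R_x(tau) - m^2 on white noise with an assumed
# nonzero mean m = 2 (so the two functions visibly differ).
rng = np.random.default_rng(5)
m = 2.0
X = m + rng.standard_normal((100_000, 32))  # ensemble of realizations

t, tau = 10, 4
R = np.mean(X[:, t] * X[:, t + tau])               # autocorrelation estimate R_x(tau)
C = np.mean((X[:, t] - m) * (X[:, t + tau] - m))   # autocovariance estimate C_x(tau)
print(R - m ** 2, C)  # the two agree (both near 0 for this white example)
```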
When processing WSS random signals with linear, timeinvariant (LTI) filters, it is helpful to think of the correlation function as a linear operator. Since it is a circulant operator (depends only on the difference between the two arguments), its eigenfunctions are the Fourier complex exponentials. Additionally, since the eigenfunctions of LTI operators are also complex exponentials, LTI processing of WSS random signals is highly tractable—all computations can be performed in the frequency domain. Thus, the WSS assumption is widely employed in signal processing algorithms.
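The frequency-domain tractability described above can be sketched numerically: pass WSS white noise through a simple LTI filter (a 4-tap moving average, an illustrative choice) and check that the output power spectrum follows |H(f)|² times the flat input spectrum.

```python
import numpy as np

# LTI filtering of WSS white noise, checked in the frequency domain:
# the output PSD should match |H(f)|^2 times the (unit) input PSD.
rng = np.random.default_rng(6)
h = np.ones(4) / 4.0                     # LTI filter: 4-tap moving average
w = rng.standard_normal(1 << 16)         # WSS white noise, unit variance
y = np.convolve(w, h, mode="same")       # LTI processing of the WSS input

# Averaged periodogram of the output: a crude PSD estimate.
segments = y.reshape(256, 256)
psd = np.mean(np.abs(np.fft.rfft(segments, axis=1)) ** 2, axis=0) / 256

H = np.fft.rfft(h, 256)                  # filter frequency response
expected = np.abs(H) ** 2                # predicted output PSD (input PSD = 1)
print(np.mean(np.abs(psd - expected)))   # small average estimation error
```

Averaging many segment periodograms reduces the estimator's variance; a production implementation would typically use a windowed method such as Welch's instead of raw contiguous segments.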
Secondorder stationarity
The case of secondorder stationarity arises when the requirements of strict stationarity are only applied to pairs of random variables from the timeseries. The definition of second order stationarity can be generalised to Nth order (for finite N) and strict stationary means stationary of all orders.
A process is second-order stationary if the first- and second-order density functions satisfy
f_{X}(x_{1};t_{1}) = f_{X}(x_{1};t_{1}+Δ),
f_{X}(x_{1},x_{2};t_{1},t_{2}) = f_{X}(x_{1},x_{2};t_{1}+Δ,t_{2}+Δ),
for all t_{1}, t_{2}, and Δ. Such a process will be wide-sense stationary if the mean and correlation functions are finite. A process can be wide-sense stationary without being second-order stationary.
Other terminology
The terminology used for types of stationarity other than strict stationarity can be rather mixed. Some examples follow.

 Priestley^{[2]}^{[3]} uses stationary up to order m if conditions similar to those given here for wide-sense stationarity apply relating to moments up to order m. Thus wide-sense stationarity would be equivalent to "stationary to order 2", which is different from the definition of second-order stationarity given here.
courtesy:en.wikipedia.org/wiki/Stationary_process