So, if you recollect, we have looked at the definition of random processes. There are two ways of looking at a random process. One is to look upon it as a collection of waveforms, any particular waveform occurring with a certain probability depending on the probability distribution of the underlying probability space; there is a mapping from omega to a set of functions of time. The other way is to think of it as a sequence of random variables, in which some index carries the functional dependency: the index could be time t, it could be some spatial variable, or any other kind of variable.
So, you have a sequence of random variables occurring one after another with certain probability distributions, and that constitutes the random process. The latter approach is convenient from the point of view of characterizing the random process in terms of probability distribution functions. The only thing is, a complete characterization is a very elaborate affair; we discussed that it is very, very difficult to completely characterize an arbitrary random process, because you need to characterize the process at every time instant individually, at every pair of time instants jointly, and so on for every triplet and every quadruple of points that you select. Therefore, an infinite number of distribution functions, including joint distribution functions of various orders, is needed to completely characterize the process. To simplify the characterization, we make some assumptions about the process. Of course, they must be valid if you are going to use them, and for many processes of interest they are. The typical assumption we made is the stationarity assumption: we defined a stationary process as one whose probability distribution functions, whose statistical characterization, is invariant to time shifts. These are the things we covered last time. We also looked at the definitions of certain attributes of a random process, mainly the mean value function and the autocorrelation function. So, we will start from there.
So, look at the autocorrelation function. It is a second-order characterization. Why do we say second-order characterization? Because it requires the joint density function of the two random variables x(t1) and x(t2). So, we call it

R_x(t1, t2) = ∫∫ x1 x2 f(x1, x2; t1, t2) dx1 dx2.

Take the values of the random process at the time instants t1 and t2, denote them by x1 and x2; of course, each of these values could lie anywhere between minus infinity and plus infinity. Multiply the two, and weight the product by the joint density function of these two random variables. Here f(x1, x2; t1, t2) is one notation by which we can denote the joint density function of the random variables obtained by sampling the random process x(t) at the time instants t1 and t2.

In general, there will be a time dependency on t1 and t2 separately. Yes, there will be such a dependency if the process is not stationary. If the process is stationary, the time dependency is only through the difference t2 − t1, not through t1 and t2 separately. But in general this joint density function depends on both time instants t1 and t2, and therefore the autocorrelation function also depends on the various time instants of the random process. So, this is the general definition. For stationary processes, this will be a function only of the difference; you could write it as R_x(t2 − t1), or as R_x(τ), where the variable τ is defined to be t2 − t1. This is, I think, where we stopped last time. So, now let us look at a few things.
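The double integral above is just an ensemble average, which suggests a direct numerical estimate: generate many sample functions, multiply the values at t1 and t2, and average across the ensemble. A minimal sketch, using an illustrative smoothed-noise process of my own choosing (not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative ensemble (my choice, not from the lecture): white Gaussian
# noise smoothed by a two-tap average, so neighbouring samples correlate.
n_waveforms, n_samples = 20000, 100
noise = rng.standard_normal((n_waveforms, n_samples + 1))
x = (noise[:, 1:] + noise[:, :-1]) / 2.0   # each row is one sample function

def autocorr(x, t1, t2):
    """Estimate R_x(t1, t2) = E[x(t1) x(t2)] by averaging over the ensemble."""
    return np.mean(x[:, t1] * x[:, t2])

# For this (stationary) process the estimate depends only on t2 - t1:
print(autocorr(x, 10, 10), autocorr(x, 10, 11), autocorr(x, 10, 12))
```

For this process the true values are R_x(0) = 0.5, R_x(±1) = 0.25, and 0 for larger lags, so the printed estimates should be close to 0.5, 0.25, and 0.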
Let us look at the value of R_x(τ) for τ equal to 0. What does that mean; what is this equal to? You can think of this quantity in terms of expectation: R_x(t1, t2) is equal to the expected value of x(t1) x(t2). The integral is nothing but the expected value, the average value, of the product of the two random variables obtained by sampling the process at the time instants t1 and t2. So, if you choose t1 equal to t2, that will give you τ equal to 0, and you could say that

R_x(0) = E[x²(t)]

for any arbitrary time instant t. So, the value of the autocorrelation function of a stationary process at τ equal to 0 is going to be equal to the mean square value of the process; it is a constant. Suppose the process were not stationary; would this be a constant? No. In fact, we will not write R_x(0) in that case; we will write R_x(t, t), or R_x(t1, t1), because it depends on the value of t. So, equivalently, for a non-stationary process the corresponding relation will be R_x(t, t), representing t1 = t2 = t, and this will be equal to the expected value of x²(t). That is correct, but it is going to be a function of time; it is not going to be constant. In the stationary case it is not going to be a function of time; it is going to be a constant. Is that clear? So, this is the relation for a non-stationary process, and that is the relation for a stationary process.

It is clear from the definition of the autocorrelation function, at least for real-valued processes, that if I interchange t1 and t2, that is, if I write x(t2) first and x(t1) later, it would not make any difference. We are just multiplying the same two numbers, so the average value will be the same. What does that mean? It means that R_x(τ) will be equal to R_x(−τ). So, this is an important property of the autocorrelation function for real-valued processes.

Student: What is the difference between R_x(t1, t2) and R_x(τ)?

They are the same for stationary processes: for a stationary process, R_x(t1, t2) becomes R_x(t2 − t1). But in general, when it is not a stationary process, we have to work with R_x(t1, t2); if it is a stationary process, we work with R_x(τ) in its place.

Student: (inaudible)

That is right, because the value of the autocorrelation function, as we have discussed, depends only on the separation of the two time instants, not on the time instants themselves.

Student: So, should R_x(0) not be used?

No, no, no; please understand. What is R_x(t1, t2)? It is the expected value of x(t1) x(t2). Writing R_x(0) means choosing t1 and t2 to be the same time instant; τ = 0 merely denotes that value. R_x(τ) is a function of τ, and at τ equal to 0 it becomes the mean square value of the process; that is the point. Look at the basic definition.
So, this is true for real-valued processes. (Sometimes I will simply use the word "process" when I mean "random process".) In general, you can also see, and of course we can prove it more formally, which I will leave to yourself, that the magnitude of R_x(τ) cannot be more than its value at τ equal to 0, is it not? For much the same reason, we can expect maximum correlation to occur when the two random variables are the same. When you sample at two different time instants, the two random variables can have correlation, but obviously that will be less than the correlation that a random variable has with itself, which is perfect correlation. So, one can show mathematically, and it is very easy to check, that the magnitude of the autocorrelation function for a lag τ, as the variable is sometimes called, the lag between the two time instants t1 and t2, or rather their difference, is always less than or equal to the value of the autocorrelation function at τ equal to 0:

|R_x(τ)| ≤ R_x(0).

Please understand that R_x(τ) is not a random quantity, in case there is any kind of confusion in your mind. R_x(τ) is a purely deterministic function. Basically, look at this integral: even though x(t) is a random process, R_x(τ), R_x(t, t), and R_x(t1, t2) are deterministic functions, properties which describe the second-order behaviour of the random process x(t). The autocorrelation tells you the behaviour of the random variables with respect to each other on the average, by looking at the average value of their product: sampling the random process at different time instants, multiplying these random variables, and looking at the average value of the product. It is some kind of a characterization of the process. It is a fixed value for a given process, even though the random process x(t) is random in nature. The values that you will see are random in nature, but there is nothing random in R_x(t1, t2), or R_x(t, t), or R_x(0), or R_x(τ); the autocorrelation function is a deterministic function. So, these are some of the properties for a stationary process.

I will just mention one more point here and leave it at that. If the process x(t) is a complex-valued process, for example, if you are working with a narrowband process and looking at its complex envelope, then it will be a complex-valued process. In that case the definition is slightly modified: the autocorrelation function is defined as

R_x(t1, t2) = E[x(t1) x*(t2)],

with a conjugate on the second term; we will use that one later.
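Both properties stated above, the symmetry R_x(τ) = R_x(−τ) and the bound |R_x(τ)| ≤ R_x(0), are easy to check numerically. A hedged sketch with an illustrative first-order recursive process of my own choosing (not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative stationary process (my choice): x[n] = a*x[n-1] + w[n],
# i.e. white noise passed through a one-pole recursion.
a, n_waveforms, n_samples = 0.8, 20000, 300
w = rng.standard_normal((n_waveforms, n_samples))
x = np.zeros((n_waveforms, n_samples))
for n in range(1, n_samples):
    x[:, n] = a * x[:, n - 1] + w[:, n]

def R(tau, t=200):
    """Ensemble estimate of R_x(tau) = E[x(t) x(t + tau)]."""
    return np.mean(x[:, t] * x[:, t + tau])

Rvals = {tau: R(tau) for tau in range(-20, 21)}
sym_err = max(abs(Rvals[tau] - Rvals[-tau]) for tau in range(21))
peak_ok = all(abs(Rvals[tau]) <= Rvals[0] for tau in Rvals)
print(sym_err, peak_ok)
```

The symmetry error should be near zero (up to Monte Carlo noise), and the peak of |R_x| should indeed sit at τ = 0.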
Assuming a stationary process considerably simplifies our characterization of the process, as we discussed earlier. In many, many instances in real life, when you are working with random signals or random processes, we do not really need to worry about distributions beyond second order. Typically we are interested in characterizing the process in terms of the individual values that you might see at various time instants, or in terms of the joint behaviour of a pair of values; we rarely need to look at more than two values at a time. This essentially means that we do not need to worry about characterization beyond second order in many, many applications in practice. So, if that is the case, whether or not the process exhibits stationarity in the strict sense is not the real question; all we really want is that the process be second-order stationary. Second-order stationarity means that the first-order density function of x(t) should be independent of t, and that the second-order joint density function of x(t1) and x(t2) should depend only on t1 − t2. So, if shifting the time origin leaves these two distribution functions unchanged, we say the process is second-order stationary, rather than strict-sense stationary.

In reality, we do not even need to worry about second-order stationarity as engineers. It is sufficient for us if the first two moments of the process exhibit these properties. If that happens, we are working with a process which is said to be stationary in the wide sense. So, we define the notion of wide-sense stationarity, which strict-sense stationarity of course implies as a special case. Wide-sense stationarity is defined as follows: if the mean value function and the variance function (which of course is redundant, because the autocorrelation yields the variance as a special case) are independent of time, and if the autocorrelation function is a function only of t2 − t1, that is, a function only of the difference variable τ, then the process x(t) is said to be wide-sense stationary, with the usual notation WSS.

So, what am I saying? We are not asking even the first- and second-order density functions to be invariant; we are only constraining the mean value function and the autocorrelation function. You know what the mean value function is; how is it defined? μ_x(t) is simply equal to the expected value of x(t): looking at the random variable at time t and taking its average value. What is this going to be equal to?

μ_x(t) = E[x(t)] = ∫ x f(x; t) dx,

which will in general depend on t. Now, if it is a first-order stationary process, the density f(x; t) does not depend on t, and therefore it is obvious that μ_x is constant. But our definition of a wide-sense stationary process does not even require the density to be independent of t; if the mean itself is independent of t, that is enough for us. This is the first moment associated with the process x(t); that is one condition.

What is the variance? The variance σ_x²(t) is the expected value of (x(t) − x̄(t))². Again, this depends only on the first-order density function: that is the definition of variance, is it not, the expected value of (x − x̄)², that is, the integral of (x − x̄)² against the density function associated with x(t). Again, if the density is independent of t, then this will be independent of t; but in WSS we say directly that this should be independent of t. So, the second moment is also independent of t. And finally, what we are saying is that the autocorrelation function R_x(t1, t2), as defined earlier, should be a function only of τ. Of course, the variance function is related to the autocorrelation function, so that property will actually be automatically implied; this is something that you can check. So, if these three conditions, actually two conditions, are satisfied, namely that the mean value function is independent of time, and that the autocorrelation function depends only on the time difference between the two instants t1 and t2, then the process is wide-sense stationary. Many times we are quite happy if the process is wide-sense stationary, because many times you do not need to worry about the distribution functions at all; you only need to work with the first- and second-order moment functions, namely the mean value function and the autocorrelation function.
The mean value function is a first-order moment function, because the power of x that we are raising to is 1 when we take the moment; it is a first moment. The autocorrelation function, like the variance function, is a second-order function, a second-order moment function. And many times it is sufficient to work with these first two moments; they contain most of the physical information we are generally interested in when looking at processes. So, we have introduced the concept of a strict-sense stationary process; a process which is stationary up to second order or third order would be a special case of that. But the wide-sense stationary process is something altogether much more tolerant of non-stationarity: a process may be non-stationary in the strict sense and still be stationary in the wide sense.

So, the point is that if a process is strict-sense stationary (SSS stands for strict-sense stationary), it obviously implies that it is wide-sense stationary, is it not? That should be obvious: SSS implies the WSS property. The reverse will happen only in very, very special cases; WSS does not, in general, imply strict-sense stationarity, because WSS is a statement only in terms of moments, and there too only of the first two orders. Therefore, we cannot say that all the density functions, the complete statistical characterization, will be independent of time. But in very, very special cases WSS does imply SSS; for example, for the class of processes which we call Gaussian processes. That is very, very special.
To give a clear example of a process which is not strict-sense stationary and yet is wide-sense stationary, I will just give you one simple example. I leave it to you to work out fully yourself, but we can do part of it quickly here itself. Suppose I generate a random process in the following way, one of many ways of generating a random process: we define a random process which is random by virtue of its dependence on a parameter, even though each waveform is a definite function of time:

x(t) = A cos(ω0 t + θ).

It is a cosine function of time, but there are parameters in the cosine function: the amplitude A, the frequency ω0, and the phase θ. Let me make one of them, the phase, random; that is, the value of θ is something that varies randomly from one function to another function. If you look upon this as an ensemble of functions, every function in this collection is a cosine function, but with a different phase, and the value of the phase is governed by some distribution. So, inasmuch as you have a collection of functions, each occurring with a certain probability, it is a random process. Is that clear? A process becomes random if any one or more of its parameters is a random variable. This one becomes random because we are assuming that θ is a random variable, let us say with a uniform distribution. That is, if I say it is uniformly distributed between 0 and 2π, then the density function of θ will be

f(θ) = 1/2π for 0 ≤ θ < 2π, and 0 elsewhere.

This is the density function of the random variable θ that we will assume.
Next, let us look at the mean value function; we want to see whether this function is a function of time or not. Mind you, we are not looking at the density function of x(t) here. But sometimes we can work around that: even without knowing the first-order density function of x(t), we can compute this quantity. This is the important point; we can compute it even without knowing the density function of x(t), because we know how to compute the expected value of a function of a random variable. You can think of x(t), at any given time t, as a function of the random variable θ, and compute its expected value on that basis. So, it will be

μ_x(t) = E[A cos(ω0 t + θ)] = ∫ from 0 to 2π of A cos(ω0 t + θ) (1/2π) dθ:

a function of θ multiplied by the density function of θ, integrated over the range of θ. That is one way of computing this expectation. Let me recollect what we discussed earlier: the expected value of any function of a random variable X requires simply E[f(X)] = ∫ f(x) f_X(x) dx, is it not? I am using that result here: X here is our random variable θ, and f is this cosine function. We multiply by the density function of θ and integrate over θ. Now, what will be the value of this integral? Zero, independent of time. So, the value of this is equal to 0.

Next, look at the autocorrelation function; look at E[x(t1) x(t2)], or, let us say more conveniently, let us write it as E[x(t) x(t + τ)]: take t1 to be some arbitrary time instant t, and t2 to be t + τ, so that the difference between them is equal to τ. So, this will be equal to an integral from 0 to 2π again; only the function is different now. How? The function is now the product; everything else is the same:

R_x(t, t + τ) = ∫ from 0 to 2π of A² cos(ω0 t + θ) cos(ω0 (t + τ) + θ) (1/2π) dθ.

And a simple evaluation of this integral will show you that this is equal to

R_x(τ) = (A²/2) cos(ω0 τ).
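The two results just derived, zero mean and R_x(τ) = (A²/2) cos(ω0 τ), are easy to confirm by Monte Carlo: draw many phases and evaluate the ensemble averages at an arbitrary t. A minimal sketch (the values A = 2 and a 5 Hz frequency are my own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)

# The lecture's random-phase cosine: x(t) = A cos(w0*t + theta), with
# theta uniform on [0, 2*pi).  A and w0 are arbitrary illustrative values.
A, w0 = 2.0, 2 * np.pi * 5.0
theta = rng.uniform(0.0, 2 * np.pi, size=200_000)   # one phase per sample function

def x(t):
    """Value of every sample function at time t."""
    return A * np.cos(w0 * t + theta)

t, tau = 0.37, 0.11                     # arbitrary time instant and lag
mean_est = np.mean(x(t))                # ensemble mean: ~0 for any t
R_est = np.mean(x(t) * x(t + tau))      # ~(A**2/2)*cos(w0*tau), independent of t
print(mean_est, R_est, (A**2 / 2) * np.cos(w0 * tau))
```

Re-running with a different t leaves both estimates essentially unchanged, which is exactly the wide-sense stationarity being claimed.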
Student: (a question about the joint density function)

No, no, no joint density function is needed; I am just looking at this product as a function of θ, again using the same definition, because I am not going through the path of first finding the joint density function of x(t1) and x(t2). That would be one way, but it is very complicated in this case, and it is not necessary. In this case it is much easier to look upon this simply as a function of the random variable θ: take this product function, multiply by the density function of θ, and integrate over θ. So, you can see that after this integration it is not going to depend on t; it depends only on τ.

Student: (question)

Please repeat your question. No, no, each waveform does depend on time; θ is a parameter of the function. Depending on the value of θ you have a different time function, and these different time functions are all cosines of the same frequency and amplitude but different phase. So, basically, the collection you are looking at is a collection of shifted cosine waveforms, and so on and so forth; it is an infinite collection, one member for every different value of θ: the same shape shifted here, the same shape shifted there, the same shape shifted again. This is the way to interpret the mean value function: if I pick up some time instant t and ask what average value I am likely to see across the ensemble, the answer is 0. If I pick up two time instants t1 and t2 which are separated by some interval τ, the average value of the product of these two random variables is (A²/2) cos(ω0 τ); it does not depend on the locations of t1 and t2, only on their separation. This is what we have demonstrated. Is it clear?

So, here is an example of a process which is, obviously, not strict-sense stationary and yet is wide-sense stationary. Is it obvious that it is not strict-sense stationary? It is not obvious; I am assuming too much when I say it is obvious, so think about it. It is not obvious, but it is possible to argue it very easily. Look at, let us say, one time instant like this. It takes a little bit of thinking, but it is possible to argue that the first-order density function of this process is not constant with time. Actually, it is one of the problems in the book; please work through that problem very carefully and you will arrive at this argument. But even though the density function itself is a function of time, the first two moments, the mean value function E[x(t)] and the autocorrelation function, satisfy the required properties of a wide-sense stationary process. So it is not a strict-sense stationary process, and yet it is a wide-sense stationary process; that is all we need for wide-sense stationarity. So, please do that as an exercise.
Now, let us take stock of what we have discussed so far. We have said that if we are working with a random process, then for all practical purposes, as engineers, most of the time it is sufficient for us to characterize it with two kinds of functions. The mean value function gives us an idea of the average behaviour of the various time functions which constitute the process; that is one property. The second is the autocorrelation function, which tells us, if I look at two random variables separated by an interval of τ seconds, what the average value of the product of the two random variables will be. So, rather than trying to specify the density function and the joint density function, which carry much more information, many times, as engineers, we are sufficiently happy with these two pieces of information, namely the mean value function and the autocorrelation function. In some cases we need to go into more detail, but many times this is good enough. So, for all purposes, the function μ_x(t), which is going to be a constant for a wide-sense stationary process, and the function R_x(t1, t2), which is going to be a function only of τ for wide-sense stationary processes, are good enough for us; we do not need to worry about the density functions in many, many cases.

Now, as electrical engineers, we are used to describing things in the time domain as well as in the frequency domain. When we talk about a time-domain function, we immediately ask ourselves: what is its spectral-domain description; what is its spectrum, is it not? You have worked out an elaborate theory, the Fourier transform, for obtaining that description. You will have a similar interest here: whether the signal is a deterministic signal or a random signal, you can ask about its properties in the time domain as well as in the frequency domain. So, we would like to see whether it is possible to characterize a random process in the frequency domain.
Now, let me just discuss a few basic concepts, or conceptual difficulties, associated with this, and then give you the important results in this connection. Let us look at the difficulty. Say you are defining a random process as a collection of waveforms; that is the viewpoint you are taking. Now, when we take a Fourier transform, what is the Fourier transform? Let us say you have a function x(t); you multiply it by e^(−j2πft) and integrate from minus infinity to infinity. That is the usual definition you work with. What are the assumptions in this? The assumption is that x(t) is an energy signal; it is absolutely integrable. Remember the Dirichlet conditions specified along with the definition of the Fourier transform.

So, one requirement is that x(t) should be absolutely integrable, whereas it may well be a power signal. When you are working with random processes, we do not know whether we will be working with energy signals or power signals; that is one issue, one difficulty we need to deal with. Some functions may be power signals, some may be energy signals. But that is a problem we also faced in the context of deterministic signals, and we typically got around it by introducing impulse functions in the frequency domain, if you recollect. That is one problem. The second problem is conceptually more difficult. Earlier, you had a very well-defined waveform, a very well-defined mathematical function of time whose Fourier transform you were taking, for example e^(−αt) or cos(ω0 t). But when we say x(t) is a random process, you do not know what waveform you are working with; it is one of an infinite number of waveforms, and for every one of them for which the Fourier transform exists, you have a different transform, and therefore you could have a different kind of spectrum. So, does it make sense to talk about the Fourier transform, or the spectrum, in the normal sense? No, it does not. However, what does make sense is to ask, on the average, what is the energy distribution as a function of frequency, or what is the power distribution as a function of frequency, depending on whether we are working with energy signals or power signals. So, the important frequency-domain concept for random processes is not the usual spectrum, which is just the Fourier transform of a function, but an averaged kind of function, which is known as the power spectral density function. Let us first define this power spectral density function conceptually. Let us say we have a random process x(t).
What we will do first is convert this into an energy signal by truncating the process. Take a random waveform and just look at it between −T and +T. This waveform is one sample function of the process; as you know, a random process is an infinite collection of such sample functions, and this is some arbitrarily selected sample function from that collection. So, you pick one, truncate it between −T and +T, and denote the resulting process x_T(t): x_T(t) is generated from x(t) by keeping it in the interval −T to +T and making it 0 outside this interval. This artificial construction ensures that I have converted even a power signal into an energy signal. Is it clear?

Because it now has finite energy, the Fourier transform of x_T(t) can be defined. Again, this is a random quantity, and I do not want to work with the Fourier transform by itself. First of all, I square this function, because I am interested in the power or energy; I am not interested in the individual function values by themselves. Squaring is a measure of the energy at various time instants; not literally, of course, but as an approximation, as we discussed at the beginning of the course. So, let me define the Fourier transform of x_T(t), written X_T(f); I am using the capital letter, and please remember I have picked one sample function. I pick one sample function, and that sample function is now an energy signal whose Fourier transform I am taking. This Fourier transform will exist now, whatever the particular function may be; whichever one it is does not matter, because I have converted it into an energy signal, so the Fourier transform will exist for all of them.
So, I take the Fourier transform; this will now become a function of frequency. Take the magnitude square of that; this will be the magnitude square, or the energy as a function of frequency, of one sample function. If I want to look at the average properties, what shall I do? Take the expected value of this. And if I want to convert this energy function (it is an energy function, is it not?) into a power function, what shall I do? Divide by 2T, because the duration of my function is 2T. And if I want to look at the original, untruncated function, take the limit as T tends to infinity. So, this motivates my definition of the power spectral density function of a random process, S_x(f). I denote it by S_x(f):

S_x(f) = limit as T → ∞ of (1/2T) E[ |X_T(f)|² ],

the limit, as T tends to infinity, of 1/2T times the expected value of the magnitude square of the Fourier transform of x_T(t). Look at this carefully; I have gone through the argument leading to this, and if you have a doubt, please ask your questions. That is the formal definition of the power spectral density function. It is a measure of the average distribution of power, for a given random process x(t), over frequency: how is the power distributed among the different frequency components? Do you agree; do you have any questions? So, you can take that as the definition.

Now, without going through the details of the result, I would just like to state a result which you should know of, a very important result in the characterization of random processes. It relates the power spectral density function to the autocorrelation function. When we talk about the density function of a process, you must ask: are we talking about the probability density function or the power spectral density function? Here it is the power spectral density function, and note that it is, once again, a second-order moment. Do you agree with that?
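The limiting definition just given can be sketched numerically: truncate many sample functions, Fourier transform each one, magnitude-square, divide by the duration, and average over the ensemble. A minimal sketch, using an illustrative white-noise process of my own choosing, whose power spectral density should come out flat:

```python
import numpy as np

rng = np.random.default_rng(3)

# Numerical version of S_x(f) = lim (1/2T) E[|X_T(f)|^2]: truncate,
# Fourier transform, magnitude-square, divide by duration, average.
# Illustrative process (my choice): unit-variance white noise.
n_waveforms, n_samples = 2000, 512          # n_samples plays the role of 2T
x = rng.standard_normal((n_waveforms, n_samples))

X_T = np.fft.rfft(x, axis=1)                # Fourier transform of each truncation
S_est = (np.abs(X_T) ** 2 / n_samples).mean(axis=0)
print(S_est.min(), S_est.max())             # flat, close to 1 at every frequency
```

The finite n_samples stands in for the finite 2T of the definition; increasing it (and the ensemble size) tightens the approximation to the limit.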
Because you are taking a random process in this I just not enough scoring in that in
the time domain in the limit. Now, we are any
way squaring it up right and you taking the away value of the x squared in a function
of frequency right. So, it is a second order
movement just like the auto correlation function was the second order movement right. So, if
you look at this way it must be natural to expect the least 2 second order movements
must be some more related is not it? We are saying in the auto correlation function is
a second order movement description of the random process x t. And now we are saying
similarly that the less function at the just define the function is also a second order
movement
characterization, in some sense; the only thing is that this is the frequency-domain
characterization, whereas that was the time-domain characterization. Therefore, it sounds
logical that these two second-order moment descriptions should be related to each other,
right? And that is the important result I am talking about. There is a very simple way, not
exactly very simple, but it is possible, to show that these two
things, the second-order moment description in the time delay, which is the autocorrelation
function, and the second-order moment description in the frequency domain, which is
the power spectral density function, are essentially a Fourier transform pair. So
this is a result very similar to what you are used to for deterministic
signals; the only difference is that there we take the Fourier transform of the signal directly,
whereas here we are taking the Fourier transform of the autocorrelation
function. This result goes by the name of the Wiener-Khinchin
theorem, which essentially states that the power spectral
density function S_X(f) of a process X(t) and the autocorrelation function
R_X(tau) of the process are Fourier transforms of each other; either one determines
the other. And therefore, inasmuch as the mean value function and
the autocorrelation function are a complete second-order
characterization of a random process, similarly the mean value function
and the power spectral density function are a complete second-order characterization
of the random process. Let me now finally define one more concept, and then
next time we will essentially concentrate on the concepts that we are specifically
going to need in our treatment of communication systems.
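Before moving on to that final concept, here is a quick numerical check (my own illustration, not the lecturer's) of the Fourier-pair relation just stated. For a finite sampled record, the periodogram is exactly the DFT of the circular sample autocorrelation; this is the discrete shadow of the Wiener-Khinchin theorem. The white-noise record below is an arbitrary assumed example:

```python
# Sketch of the Wiener-Khinchin relation in discrete form: the periodogram
# |X[k]|^2 / n equals the DFT of the circular sample autocorrelation, so the
# spectral estimate and the autocorrelation carry the same second-order
# information about the record.
import numpy as np

rng = np.random.default_rng(1)
n = 256
x = rng.normal(size=n)                    # one assumed sample record

# Circular sample autocorrelation r[m] = (1/n) * sum_t x[t] * x[(t+m) mod n]
r = np.array([np.mean(x * np.roll(x, -m)) for m in range(n)])

psd_from_r = np.fft.fft(r).real               # Fourier transform of autocorrelation
periodogram = np.abs(np.fft.fft(x)) ** 2 / n  # direct spectral estimate

# The two should agree to machine precision.
print(np.max(np.abs(psd_from_r - periodogram)))
```

The identity is deterministic, realization by realization; taking expectations on both sides is what turns it into the Wiener-Khinchin statement about R_X(tau) and S_X(f).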
The final concept is the concept of ergodicity. I will just mention it here; we will have to
come back to a fuller discussion of this concept, but we need to know it right away.
The concept is as follows: we say that a random process is ergodic if
its statistical averages are equal to, or can be replaced with, its time
averages. That is, the statistical average of any quantity is equal to the
corresponding time average computed over any given sample function. Now, this is a very
peculiar concept, but a very useful one, because without it, it would be
very difficult to work with random processes in practice, particularly when it comes to
measurement of the properties of a random process. Let us look very quickly at a
motivation for this property before we talk about the property itself.
The motivation is as follows. As I said, our concept of a random process is, whichever way
you look at it, basically a collection of an infinite number of waveforms, and you do not know
which waveform you are going to see. Suppose you perform the
experiment: you look at the random process as it occurs, and you display the waveform
that you see, some arbitrary waveform which you cannot predict.
Now, at any time, in typical situations, I will see just one such waveform from this
infinite collection. Suppose I want to measure average
properties; let us say I want to measure the mean value, or
measure the autocorrelation function, and suppose I do not know the density functions.
How will I find out these quantities? It is very difficult, because I have only one sample
function available in front of me. I have just the one sample function which I have observed;
I do not know what the other members of this infinite
family are. On the other hand, this property, if it is valid for a given random process, helps
us to find these properties from just the single
sample function that we might have observed. So what we are saying is this: even if I cannot
average across the whole family, because
I do not have the whole family with me, if I just look at the average across time of
one sample function of the family, it is good enough; I get the same value. Processes
which exhibit such a property are called ergodic processes. Fortunately for us, many physical
processes which we work with are ergodic in nature.
And therefore, many times we can obtain the autocorrelation function by looking at
the time-averaged autocorrelation of a given sample function, or the mean value
function by just looking at the time average of a given sample function, just like
you do for any deterministic signal. So what is the time average? It is simply this:
take lim_{T -> infinity} (1/2T) * integral from -T to T of x(t) dt; that limit is the
time average. This does
not involve the density function of X; you are taking a sample function and integrating
it from -T to T, which adds up all the values that you see over
the time interval, and the limit gives the time average. What is the corresponding
statistical average? It is E[X(t)] = integral of x * p_X(x) dx; that is the corresponding
statistical average. So what we are saying is that this time average is equal to this
statistical average, and similarly for the autocorrelation function and for
any other kind of average. If this kind of relation holds for all time averages and the
corresponding statistical averages, the process is said to be ergodic.
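The equality of time and ensemble averages can be illustrated with a small simulation. The process below, a random-phase cosine, is my own assumed example (a standard one that is ergodic in the mean), not one given in the lecture:

```python
# Sketch: for the random-phase cosine X(t) = cos(2*pi*f0*t + Theta), with
# Theta ~ Uniform[0, 2*pi), the time average (1/2T) * integral_{-T}^{T} x(t) dt
# of ONE sample function matches the ensemble average E[X(t)] = 0.
import numpy as np

rng = np.random.default_rng(2)
f0, dt = 5.0, 1e-3
t = np.arange(-10.0, 10.0, dt)            # the interval (-T, T) with T = 10

theta = rng.uniform(0, 2 * np.pi)         # one draw of omega -> one waveform
x = np.cos(2 * np.pi * f0 * t + theta)

time_avg = x.mean()                       # discrete stand-in for the time integral

# Ensemble average at a fixed instant t0 = 0.3, averaging over many theta draws
thetas = rng.uniform(0, 2 * np.pi, 100_000)
ensemble_avg = np.cos(2 * np.pi * f0 * 0.3 + thetas).mean()

print(time_avg, ensemble_avg)             # both close to zero
```

The point of the simulation is exactly the lecture's point: the time average used only the single observed waveform, yet it agrees with the average taken across the whole family.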
One final statement in this connection; I know we are short of time, so just this one
final statement, because it completes the picture. If I denote by this space the class of all
random processes, then the class of wide-sense stationary
processes is a subclass of it, satisfying those two conditions which I
mentioned: the mean value function is independent of time, and the autocorrelation function
depends only on the time difference. The class of strict-sense stationary processes is a
further subset of that, and the class of ergodic processes is the smallest subset; that is
the point I want to make to complete this
discussion. So, if a process is ergodic, it will also be strict-sense stationary, it will
also be wide-sense stationary, and so on.
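As a last illustration (again my own example, not the lecturer's), the two wide-sense stationarity conditions can be checked by ensemble averaging the same assumed random-phase cosine: the mean should not depend on t, and E[X(t1) X(t2)] should depend only on the difference tau = t2 - t1:

```python
# Sketch: verifying the two WSS conditions numerically for the random-phase
# cosine X(t) = cos(2*pi*f0*t + Theta), Theta ~ Uniform[0, 2*pi).
import numpy as np

rng = np.random.default_rng(3)
f0, n_draws = 2.0, 200_000
thetas = rng.uniform(0, 2 * np.pi, n_draws)

def X(t):
    """All n_draws realizations of the process evaluated at time t."""
    return np.cos(2 * np.pi * f0 * t + thetas)

# Condition 1: the mean value function is independent of t (here, ~0 everywhere)
means = [X(t).mean() for t in (0.0, 0.37, 1.9)]

# Condition 2: the autocorrelation depends only on tau = t2 - t1.
# Theory gives R_X(tau) = 0.5 * cos(2*pi*f0*tau) for this process.
tau = 0.1
r_a = (X(0.0) * X(0.0 + tau)).mean()      # pair (0.0, 0.1)
r_b = (X(0.8) * X(0.8 + tau)).mean()      # pair (0.8, 0.9), same tau

print(means, r_a, r_b)
```

Both estimated correlations sit near the theoretical 0.5 * cos(2*pi*f0*tau), and they agree with each other even though the absolute time instants differ, which is precisely the wide-sense stationarity claim.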
Thank you.