Lecture - 31: Random Processes


We will continue with our discussion on random processes and see whether we can wind it up today — most of it, at least. But first, let me quickly recollect what we did last time. We discussed the basic concept of a random process, and also the concepts that simplify the characterization of a random process through assumptions such as stationarity and wide sense stationarity. With that background, let us continue the discussion now.

From now onwards, we will assume that we are working with processes which are at least wide sense stationary. They may or may not be strict sense stationary, but we will assume that they are at least wide sense stationary. Please recollect that, as far as wide sense stationarity is concerned, you only need to worry about two properties, namely the mean value function and the autocorrelation function.

The mean value function must be a constant, independent of time, and the autocorrelation function must be a function only of the lag variable t1 − t2, rather than of t1 and t2 separately. These two functions together constitute the complete second order characterization of a wide sense stationary process — not a complete characterization, mind you, but a complete second order characterization. Let us now look at the properties of the autocorrelation function in some more detail.

From now onwards, we will denote the autocorrelation function by R(τ): R_X(τ), R_Y(τ), and so on. I am just going to mention some of its properties, because they are very simple to prove — they more or less follow directly from the definition of the autocorrelation function, or need at most a little bit of manipulation. I am sure you already know them, so I will not spend too much time; I will just state them.

First, for a real valued process X(t), the autocorrelation function is real valued; if X(t) is a complex valued process, the autocorrelation function can be complex valued.

Second, for a real valued process it is an even function. We have already seen this last time: if I change the order of t1 and t2 — multiplying X(t2) by X(t1) instead of X(t1) by X(t2) — it makes no difference to the average value of the product. So whether you write t1 − t2 or t2 − t1, it is the same thing, and hence it is an even function: if I replace τ with −τ, the value stays the same. For complex valued processes, this even symmetry changes into conjugate symmetry, so the autocorrelation function will be conjugate symmetric: R_X(−τ) = R_X*(τ). That is because a conjugation operation is involved in the product when you form the autocorrelation function of a complex valued random process.

Conjugate symmetry is also sometimes called Hermitian symmetry. So that was property number 1 and property number 2. Property number 3 is that the autocorrelation function is maximum at τ = 0. Property number 4: if the random process X(t) happens to be periodic, the corresponding autocorrelation function will also be periodic. I have only stated these four properties rather than proving them; I am just enumerating them for the sake of completeness, so that you are familiar with them.

Property 5 is important, very important: the Fourier transform of the autocorrelation function, as we have seen, is nothing but the so-called power spectral density function. And you know how the power spectral density function is defined: as the expected value of the magnitude squared of the Fourier transform of X(t), suitably normalized. Since we are taking the expected value of a squared magnitude, such an expected value will always be non-negative. Therefore, the Fourier transform of the autocorrelation function of any random process will always be non-negative: it is a real valued, non-negative quantity. Unlike the Fourier transform of an arbitrary function, which in general can be complex valued, the Fourier transform of an autocorrelation function is always real valued and non-negative — non-negative, strictly speaking, for all frequencies. This gives us one of the important tests for checking whether or not a given function can be the autocorrelation function of some process: take its Fourier transform, and only if the transform is non-negative for all frequencies can it be the autocorrelation function of some process. Is the motivation for this property understood by everyone?

It comes from the definition of the power spectral density function, which we defined as the expected value of a squared magnitude — and which must therefore be non-negative — together with the fact, which we have already seen, that the power spectral density function and the autocorrelation function are Fourier transform pairs. That is the Wiener–Khinchin theorem.

Consider also the value R(0), which we said is the maximum — or, for a periodic autocorrelation function, a maximum that repeats itself at periodic intervals with period T, whatever the period is; but in general R(0) is the maximum. Its value has a physical significance: R(0) = E[X²(t)], the mean square value of the process at time t, and since we are considering wide sense stationary processes, this value is the same for all time instants (for a zero mean process it equals the variance σ²). We sometimes also call R(0) the total average power. Why? Because, if you remember, R(τ) is the inverse Fourier transform of the power spectral density function. So if I put τ = 0, what do you get? R(0) = ∫ S_X(f) df — the area under the power spectral density function — and that is the total average power.

The last property I would like to mention here concerns the limiting value of R(τ) as τ tends to either plus or minus infinity. This tells you the general nature of the autocorrelation function — what kind of function an autocorrelation function is.

In general, as we approach infinity on either side, positive or negative, the value of the autocorrelation function tends to μ_X², the square of the mean value: R_X(τ) → μ_X² as τ → ±∞. And if the process happens to be zero mean, this limiting value on either side becomes 0. I am not proving this — it is very simple to prove, and even simpler to appreciate. Suppose it is a zero mean process, just for the sake of argument. What we are saying is that the autocorrelation function of a zero mean process tends to 0 as τ tends to infinity. Does this make intuitive sense? Ask yourself: what does the value of R_X(τ) represent, for a given value of τ?

It represents the correlation between two random variables obtained by sampling the random process at two time instants separated by τ seconds, does it not? What this property says is that the larger you make the separation, the smaller the correlation between the samples, and the correlation becomes 0 as τ tends to infinity. That is, if the separation between them becomes very large, there is every reason to expect that the sample values will have absolutely no correlation with each other. So this makes a lot of sense: for a truly random process it should happen that way. Samples that are close together have a larger correlation — that is why R(0) is the maximum — and then the function starts to decay, decaying to 0 on either side. That is the typical form.

Yes, please? [Student:] Sir, what if the random process is a periodic function?

If it is a periodic function, then this will not be true; this property is not valid for periodic autocorrelation functions. It is valid only for aperiodic autocorrelation functions. So basically, you are right.

The typical autocorrelation function, then, looks like this: it peaks at the origin, decays to zero as τ tends to plus or minus infinity, and is even symmetric for a real valued process, and so on.
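Before moving on, here is a small numerical sketch (with made-up parameters, not taken from the lecture) that estimates R_X(τ) for a simple wide sense stationary process — a first-order autoregressive process — and checks the properties above: the maximum at τ = 0, the non-negative spectrum, and the decay of R_X(τ) toward μ_X² at large lags.

```python
import numpy as np
from scipy import signal

# Minimal sketch (assumed parameters): check autocorrelation properties on a
# first-order autoregressive process, which is WSS in steady state.
rng = np.random.default_rng(0)
N, a, mu = 200_000, 0.9, 1.5
w = rng.standard_normal(N)
x = signal.lfilter([1.0], [1.0, -a], w) + mu   # x[n] = a*x[n-1] + w[n], plus a mean

def R_hat(x, k):
    """Time-average estimate of R_X(k) = E[X(n) X(n+k)] (even in k by construction)."""
    return np.mean(x[:len(x) - k] * x[k:])

R = np.array([R_hat(x, k) for k in range(300)])
print("maximum at lag 0:", np.argmax(R) == 0)            # property 3
print("R(299) =", R[-1].round(2), "vs mu^2 =", mu**2)    # decay toward mean squared
psd = np.abs(np.fft.rfft(x - x.mean()))**2 / N           # periodogram of the fluctuation
print("PSD estimate non-negative:", psd.min() >= 0)      # property 5
```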

At this point, having defined the basic concepts which characterize a random process, it is now possible for us to look at a very important specific process, which we shall be using to model noise in communication systems. This particular random process is known by the name of the white noise process. Before I take it up, I would just like to re-emphasize the fact that the autocorrelation function and the power spectral density function are equivalent second order descriptions of a random process, are they not? Because they are Fourier transform pairs, if you are given one, you know the other. So they are both second order descriptions. Secondly, these descriptions are independent of what kind of density function the process has, because we are only looking at the first two moments: the first order moment, which is the mean function, and the second order moment, which is the autocorrelation function.

You could have random processes with different density functions — different joint density functions — that have the same second order properties. So keep that at the back of your mind: when we discuss only second order properties, we are ignoring the detailed density function properties; we are looking only at these two gross properties. The detailed properties are not known to us, or at least we are not talking about them. Let me now come back to the white noise process.

The white noise process is one for which — the definition is really very simple — the power spectral density function is a constant for all frequencies. The value is arbitrary; the conventional notation for the constant is N0/2. So for all values of the frequency f, the power spectral density function is the constant S_X(f) = N0/2. Plotted as a function of f, the power spectral density is flat, and you can see where the name comes from: all frequency components are present in equal measure. The analogy is with white light, in which a large number of frequency components together constitute the white color; that is where the name comes from.

is where the name comes from. If the process is, if the power spectral density

function is not flat, then if you want to call

it we can call it a colored random process, the colored noise process, because it is not,

so it is just as against wide, so it is not flat

we call it is sometimes we just loose the quality

colored noise process and they are various kinds of colors the one thing have, but usually

the simply quality color tend to noise process. Now, what will be the auto correlation

What is R_X(τ) for a white noise process? It will be a delta function: R_X(τ) = (N0/2) δ(τ), because the two form a Fourier transform pair — the Fourier transform of this impulse is the flat spectrum, and the inverse Fourier transform of the flat spectrum is this impulse. So the autocorrelation function, plotted against τ, is an impulse of strength N0/2 located at τ = 0. This N0/2 is more or less the standard notation used in communication theory.

The value N0/2 is called the two sided power spectral density. Inasmuch as negative frequencies are only abstract entities, the corresponding one sided spectral density, denoted N0, is twice this value. The idea is to fold in the negative axis as well: if you talk only in terms of positive frequencies, the total power in the vicinity of a frequency f is actually the sum of two areas, one at f and one at −f. Remember that every real signal with a frequency component at f also has one at −f. So, really speaking, if you want to compute the power contained in a band from f to f + Δf, you must add the two areas: (N0/2)Δf + (N0/2)Δf, which becomes N0 Δf. That is why N0 is sometimes called the single sided power spectral density of the white noise process, while N0/2 is the double sided power spectral density. These are just some general terms used in the field.
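As a small numerical sketch (a discrete-time stand-in, with the assumed value N0/2 = 1), we can see both signatures of white noise at once: an impulse-like autocorrelation and a flat power spectral density estimate.

```python
import numpy as np

# Minimal sketch (assumed N0/2 = 1): discrete-time white noise has a flat PSD
# and an (approximately) impulsive autocorrelation.
rng = np.random.default_rng(1)
N = 100_000
x = rng.standard_normal(N)                 # unit-variance white sequence

# Autocorrelation estimate: R(0) large, R(k) ~ 0 for k != 0
R = np.array([np.mean(x[:N - k] * x[k:]) for k in range(6)])
print("R(0..5) =", np.round(R, 3))         # e.g. [1.0, ~0, ~0, ~0, ~0, ~0]

# Averaged periodogram: flat across frequency (two-sided level N0/2 = 1)
segs = x.reshape(100, 1000)
psd = np.mean(np.abs(np.fft.rfft(segs, axis=1))**2, axis=0) / 1000
print("PSD mean/std across frequency:", psd.mean().round(3), psd.std().round(3))
```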

Now, this is a very convenient model for many physical random processes that we come across in communication theory. But remember, it is only a model. Why? Because such a process cannot physically exist. Can you see that? Can you give a reason why it cannot physically exist? Look at it in a slightly different way: what is the total power of this process?

Infinite — and no physical process can have infinite power; the area under this flat power spectral density is infinite. Alternatively, look at it in the autocorrelation domain: what does the impulse say? It says that, apart from the fact that X(t) is correlated with itself, if you move even slightly along the time axis — if you look at two samples which are very, very close to each other, but with separation not exactly equal to 0 — they will be uncorrelated. For such behavior to occur, the process must have infinite power; otherwise the nearby samples would be correlated.

Otherwise, the process would have some degree of smoothness, and even a small, minute degree of smoothness — which is bound to be present in any time function we actually meet — would make nearby samples correlated. So infinite power, or such an autocorrelation function, is an idealization which will never occur in practice. In spite of this, it is convenient to model many physical processes in communication theory with this model, because they more or less satisfy the broad characteristics that the model assumes. For example, the kind of noise that we deal with in communications, thermal noise, has a very large spectrum — a very wide spectrum — and it is more or less flat over that spectrum. It does not really extend to infinity, but it extends very wide, much wider than the bandwidth of the signals we work with. So for all practical purposes we can model it as white noise, even though in practice it is not truly white in the true sense of the word.

[Student:] Sir, you are saying most noise — thermal noise — can be modeled as white noise?

Mostly, yes — thermal noise, shot noise and various other kinds of noise: many of them can be modeled as white. But there are situations where you have to deal with a non-white model. It does not mean that in real life you will always work only with white noise; there are situations where you have to look at the actual spectrum of the noise, which in some situations is not white. At this point, I think it is important for us to understand one additional set of relations, which we typically need to work with.

Many a time, you will be faced with analyzing systems with random inputs. After all, our purpose in building this background is to have the necessary tools to analyze the performance of communication systems in the presence of noise, and, if you remember whatever we have discussed so far, one of the major components of any communication system is filters of various kinds. We do a lot of filtering here and there. So we would like to know: if a certain random process is the input to a filter, what kind of random process appears at the output of that filter? Let us look at these relationships — that is, what happens when you transmit a random process through a linear system, a linear filter. So we have a linear system with impulse response h(t); the input is a random process X(t), and the output is some random process Y(t).

Inasmuch as we are concentrating only on wide sense stationary processes, we will assume that X(t) is a wide sense stationary process. We will then try to characterize Y(t) in terms of its first two moments. In fact, it can be shown that Y(t) will also be wide sense stationary — maybe it is obvious, but even if it is not, it can be shown.

So, given the mean value function and the autocorrelation function of the input process, we would like to find the mean value function and the autocorrelation function of the output process. That is our concern. Let us look at the mean value function first. How is Y(t) related to X(t)? Through the convolution relation — that is the starting point. So let us compute the expected value of Y(t), which is the mean value function of Y(t).

First let me express Y(t) through the convolution: Y(t) = ∫ h(τ) X(t − τ) dτ. Then I take the expected value of this integral. Here h(τ) is a deterministic function; the randomness is only in X. Expectation is a linear operator, and, under certain conditions, you can justify interchanging the two integral operators — the integral corresponding to the expectation operation and this convolution integral. So I can carry the expectation inside the integral and write the expected value of the product: since h(τ) is not random, it stays as it is, and the averaging operates only on the process X. We are using the linearity property of the expectation operator. And what is E[X(t − τ)]? By definition, it is the mean value function evaluated at t − τ: μ_X(t − τ). Since we are assuming wide sense stationarity, μ_X is a constant, so it comes out of the integral: E[Y(t)] = μ_X ∫ h(τ) dτ. So the mean value at the output is the mean value of the input multiplied by the area under the impulse response — which shows that if the input process is stationary up to first order, the output process will also be stationary up to first order.

Because this is not a function of time: E[X(t)] was not a function of time, and so E[Y(t)] is also not a function of time. It is a fixed value, some number. And what is that number? It is E[X] multiplied by the area under the impulse response. Can you express this in terms of a frequency domain quantity?

This is equal to?

So μ_Y = μ_X · H(0), is it not? Because the area under h is nothing but H(0): H(f) = ∫ h(τ) e^(−j2πfτ) dτ, and if you put f = 0 you get exactly that area. And this is very appealing, because you can think of the mean value of the input as the DC component of the input process. It is the zero frequency component; the mean value can be considered the zero frequency component.
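To collect the steps in one place (a summary of what we just did, using the definitions above):

```latex
\mu_Y = E[Y(t)] = E\!\left[\int_{-\infty}^{\infty} h(\tau)\,X(t-\tau)\,d\tau\right]
      = \int_{-\infty}^{\infty} h(\tau)\,E[X(t-\tau)]\,d\tau
      = \mu_X \int_{-\infty}^{\infty} h(\tau)\,d\tau
      = \mu_X\,H(0),
\qquad
H(f) = \int_{-\infty}^{\infty} h(\tau)\,e^{-j 2\pi f \tau}\,d\tau .
```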

So what does it say? The mean value of the output is the mean value of the input multiplied by the response of the system at DC — the DC response of the system. That is the first relation, a very simple basic relation: if you know the frequency response, in particular the frequency response at f = 0, that is how the mean values are related. So this relationship is very simple. The next one is not that simple, but it does not matter; it is still not very complicated.

So let us look at the autocorrelation function. We want to find R_Y(t, u), which by definition is the expected value of Y(t) Y(u). You might have noticed that I have not written it as a function of t − u, because I am not sure yet that the output process is stationary. I know that the input process is stationary, but we have to see whether the output process turns out to be stationary or not; so we start by writing it like this.

If R_Y(t, u) turns out to be a function of t − u only, then the output is a stationary process. Now just substitute the values of Y(t) and Y(u); this is an entirely mechanical process. Use a dummy variable τ1 for the first integral, Y(t) = ∫ h(τ1) X(t − τ1) dτ1, and a dummy variable τ2 for the second, Y(u) = ∫ h(τ2) X(u − τ2) dτ2, and take the expected value of the product. We define separate dummy variables to distinguish the two integrals.

We can again take the expectation operator inside the double integral, using linearity, and let it act on the product of the two random entities: E[X(t − τ1) X(u − τ2)]. Of course, certain conditions are required for this interchange to be valid; we will assume those conditions hold. The basic conditions are that h(τ) should correspond to a stable system and X(t) should be a finite mean, finite power process. These are the conditions generally required; we will not go into the details. This is purely mechanical manipulation, and you should not have any difficulty with it. And what is this expectation? It is the autocorrelation function — let me write the general expression — R_X(t − τ1, u − τ2). If we assume a stationary process, it will be a function only of the difference of these two arguments.

(Refer Slide Time: 31:24)

So, for a wide sense stationary X(t), let us define τ = t − u. Can you see that the result is a function of t − u only? The expectation is a function of (t − τ1) − (u − τ2). Therefore, instead of writing R_Y(t, u), I can now write R_Y(τ), because it depends only on t − u. Rewriting the relation: R_Y(τ) = ∫∫ h(τ1) h(τ2) R_X(τ − τ1 + τ2) dτ1 dτ2 — take the difference of the two arguments and it becomes τ − τ1 + τ2; you can check this on the side. As you can see, τ1 and τ2 disappear after the two integrations have been done.
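Compactly, the manipulation we just carried out reads:

```latex
R_Y(t,u) = E[Y(t)\,Y(u)]
         = \iint h(\tau_1)\,h(\tau_2)\,E[X(t-\tau_1)\,X(u-\tau_2)]\,d\tau_1\,d\tau_2
         = \iint h(\tau_1)\,h(\tau_2)\,R_X(\tau-\tau_1+\tau_2)\,d\tau_1\,d\tau_2,
\qquad \tau = t-u .
```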

So you are left with a function of τ alone: R_Y is a function of τ only, which implies that Y(t) is also WSS. So if you pass a wide sense stationary random process through a linear time invariant filter, the output is wide sense stationary.

It is because the filter is linear and time invariant that the output process is wide sense stationary. If the filter were a time variant filter, then even though the input process is wide sense stationary, the output process need not be wide sense stationary, because the filter impulse response keeps varying with time. So for the output process to be wide sense stationary, not only should the input process be wide sense stationary, the filter through which you pass it should also be time invariant.

So time invariance is the key, as we have seen. Now, the relationship we have obtained looks complicated, but it is actually a very simple relationship. I will not go through the steps; I will give it as an exercise for you to complete, because it is a repetition of the same steps carried out so far, and it will help you see the relationship in a slightly better light. Instead of directly relating the output autocorrelation function to the input autocorrelation function, go through a two step process: first relate the cross correlation between the input and the output to the autocorrelation function of the input. That is, consider the expected value of Y(t) times X(u).

Then you have to deal with only one integral rather than two, is it not? I should have defined the concept of a cross correlation function, which I have not done, so let me complete that. Just as you have the autocorrelation function of a process, R_X(t1, t2) = E[X(t1) X(t2)], I can define a cross correlation function between any two processes X and Y in the same manner.

What do we do? Sample the random process X(t) at the time instant t1 and sample the random process Y(t) at the time instant t2; we now have two random variables, X(t1) and Y(t2). Take their correlation. That is the definition of the cross correlation function of two random processes: R_XY(t1, t2) = E[X(t1) Y(t2)]. So basically what I am suggesting is: first find, for this picture, the cross correlation between these two processes, the input process and the output process.

Then find the output autocorrelation function in terms of this cross correlation. If you go through this two step process — and please verify the results I am about to state — the whole relationship becomes much more meaningful. So what you have to verify is the following. Show that R_YX(τ) = R_X(τ) * h(−τ): one simple relation. And of course, unlike the autocorrelation function, the cross correlation function does not have the even symmetry property.

Because we now have two different processes: if you change the order of X(t) and Y(t), you get a different function. There will be some relationship, but it will not be an even symmetry relationship. For example, if I change the order — first take X and then take Y — then the relationship is R_XY(τ) = R_X(τ) * h(τ). So the two are equivalent. The general relationship between R_YX(τ) and R_XY(τ) is this:

R_YX(τ) = R_XY(−τ). You see the difference: we are not saying R_YX(τ) = R_YX(−τ); it is R_XY(−τ). This is more or less obvious — again, just look at the definitions and it follows. So this is one relationship, and what does it say? The cross correlation between the input and the output is the autocorrelation function of the input convolved either with the impulse response directly, or with the mirror image of the impulse response about the origin, depending on which cross correlation you are taking.

In the second step, we show that R_YY(τ) — or R_Y(τ), as we usually write it for short — can be written as R_YX(τ) * h(τ). How will you show this? Start with the expected value of Y(t) Y(t + τ), which is the definition of R_Y(τ), and substitute only for Y(t + τ) in terms of X, keeping Y(t) as it is. Go through the same process and you will end up here. Now combine this relation with the previous one.

What do you get if you combine the two relations? R_Y(τ) = R_X(τ) * h(−τ) * h(τ): I am substituting R_X(τ) * h(−τ) for R_YX(τ) and then convolving with h(τ). And that is precisely the relationship we derived before. So now you can look at that relationship in a much simpler manner: it is only a double convolution — a double convolution of the autocorrelation function of the input, with h(−τ) first and h(τ) later, or in the other order, because convolution is a commutative operation. So this relationship, which looked rather complicated, is actually this double convolution, which is much more appealing.

No, no — what I am saying is: when I substitute for Y(t + τ), one X and one Y remain inside the expectation, and that is what gives us R_YX, is it not? Now, this, therefore, is the relationship between the input autocorrelation function and the output autocorrelation function. Incidentally, these relationships are very useful. Let us look at one of them, for example, which can be proved very easily — I am sure you will be able to prove it in exactly one step.
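For reference, here are the relations to be verified, written in one place (assuming the convention R_AB(τ) = E[A(t) B(t + τ)], under which all four statements are mutually consistent):

```latex
R_{YX}(\tau) = R_X(\tau) * h(-\tau), \qquad
R_{XY}(\tau) = R_X(\tau) * h(\tau), \qquad
R_{YX}(\tau) = R_{XY}(-\tau),
```
```latex
R_Y(\tau) = R_{YX}(\tau) * h(\tau) = R_X(\tau) * h(\tau) * h(-\tau).
```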

Let us look at this relationship — R_XY(τ) = R_X(τ) * h(τ) — it is extremely useful, and I will discuss its application very briefly. This is the term you get if you cross correlate the input and output random processes: this convolution. Suppose I choose the input process to be a white noise process. What will be the corresponding input autocorrelation function? A delta function — some constant times δ(τ). What will be the cross correlation of the input and the output now?

It will be simply proportional to h(τ). So what is the result? If I feed a white noise process to the input of a linear time invariant filter, the cross correlation between the input white noise and the output noise — which will not be white, in general — is proportional to the impulse response of the system. So this gives me a method of finding the impulse response of an unknown system experimentally.

I feed white noise at the input of the system, look at the output random process, cross correlate the two processes, and the result of the cross correlation identifies the impulse response which was otherwise not known to me — or, let us say, not known to you. So this is a very useful relationship. Where can I use it in communication systems?

[Student:] The channel.

The channel — the typical unknown thing in a communication system is the channel through which the signal is passed. Many physical communication systems actually use this method to identify the channel impulse response. For example, the GSM standard used in mobile telephony transmits voice — information — in the form of packets, and every packet has, in the middle of it, a training sequence, which is essentially a kind of pseudo noise sequence: a pseudo noise "white" sequence, whose power spectrum is approximately flat. When you look at the portion of the receiver output corresponding to this training sequence, you can obtain, through cross correlation between what was transmitted and what was received, an estimate of the channel impulse response; and that estimate is then used to further process the signal, to recover your signal nicely. So this relationship is extremely useful, as I said; we will discuss it further if necessary.
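Here is a small simulation of the identification idea (the filter and lengths are made-up choices, not from the lecture): feed white noise through an "unknown" FIR filter, cross correlate input and output, and compare the result with the true impulse response.

```python
import numpy as np

# Sketch of channel identification by cross-correlation (assumed toy filter).
rng = np.random.default_rng(2)
h = np.array([0.5, 1.0, -0.3, 0.1])        # "unknown" impulse response
N = 200_000
x = rng.standard_normal(N)                 # white probing noise, R_x(k) = delta(k)
y = np.convolve(x, h)[:N]                  # output of the unknown system

# R_xy(k) = E[x(n) y(n+k)]; for unit-variance white input this equals h(k)
h_est = np.array([np.mean(x[:N - k] * y[k:N]) for k in range(len(h))])
print("true h:     ", h)
print("estimated h:", np.round(h_est, 3))  # should match h closely
```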

Now, finally, this relationship implies a corresponding relationship in the frequency domain. Let us look at that: just take the Fourier transform of both sides. What is the Fourier transform of R_Y(τ)? The power spectral density S_Y(f). This one is S_X(f), and the convolutions in the time domain become multiplications in the frequency domain. So what is the product? S_X(f) times the Fourier transform of h(τ), which is H(f), times the Fourier transform of h(−τ), which is H*(f) for a real impulse response. And H(f) · H*(f) = |H(f)|². This is the corresponding — very simple and repeatedly appearing — frequency domain relationship: S_Y(f) = S_X(f) |H(f)|². That is, the output power spectral density function is obtained by multiplying the input power spectral density function by the magnitude squared of the transfer function. And this makes sense: because a power spectral density function is a non-negative function, it must be multiplied by a non-negative quantity to ensure the result is always non-negative. That makes a lot of sense. Any questions so far? No.
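And a quick numerical check of S_Y(f) = |H(f)|² S_X(f) — a sketch with assumed parameters, using SciPy's Welch estimator on both input and output so that scaling conventions cancel in the ratio:

```python
import numpy as np
from scipy import signal

# Sketch: verify S_Y(f) = |H(f)|^2 S_X(f) for white input through an FIR filter.
rng = np.random.default_rng(3)
N = 400_000
x = rng.standard_normal(N)                 # (approximately) white input
h = np.array([0.5, 1.0, -0.3, 0.1])
y = signal.lfilter(h, 1.0, x)

f, Sx = signal.welch(x, nperseg=4096)      # input PSD estimate
_, Sy = signal.welch(y, nperseg=4096)      # output PSD estimate
_, H = signal.freqz(h, worN=f, fs=1.0)     # H(f) at the same frequencies

err = np.abs(Sy / Sx - np.abs(H)**2) / np.abs(H)**2
print("median relative error:", float(np.median(err)))  # typically a few percent
```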

Finally, let me spend some time on density functions. So far, I have been discussing purely in terms of wide sense stationarity and have ignored the concept of the density function, because I was looking only at a second order characterization of the process. But it is useful to know — as I mentioned to you before — that for many physical processes the density function happens to be Gaussian. So we need to look at such processes in some more detail. In fact, most of the time, for the kind of noise process we will deal with, we will model the second order properties by saying that it is a white process — that is, the power spectral density function is flat — but we will not leave it at that; we will also say something about the density function. We say it is not just white noise, it is white Gaussian noise. Now we are not only talking about a second order property; we are giving more detailed information, namely which density function is associated with the process: it is Gaussian.

The notion we would like to understand, then, is that of a Gaussian process. When we say a process is Gaussian, what does it mean? We know what it means for a random variable X to be Gaussian: its density function is the familiar Gaussian density. So the question is, what do we mean by saying that X(t) is a Gaussian process? To understand this, let me start by recapping one very important property of Gaussian random variables.

Suppose X1, X2, ..., Xn are all Gaussian random variables (jointly so — say, independent Gaussians, to be safe). If I construct a linear combination of them, Y = Σ g_i X_i, with i going from 1 to n and arbitrary coefficients g_i, then Y will be Gaussian; we know this. Now let me say this in a slightly different way. We define a set of random variables X1, X2, ..., Xn to be jointly Gaussian — this is the definition now — if every linear combination of them, for an arbitrary choice of the coefficients g_i, results in a random variable that is Gaussian. This will form the basis for the definition of a Gaussian random process, which we take up next.
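As a tiny numerical illustration of this property (an assumed example, not from the lecture): take jointly Gaussian variables — here, independent Gaussians — form an arbitrary linear combination, and check that the result has Gaussian statistics.

```python
import numpy as np

# Sketch: an arbitrary linear combination of jointly Gaussian variables is Gaussian.
rng = np.random.default_rng(4)
n, trials = 5, 200_000
g = rng.uniform(-2, 2, size=n)             # arbitrary coefficients g_i
X = rng.standard_normal((trials, n))       # X_1..X_n jointly Gaussian (independent)
Y = X @ g                                  # Y = sum_i g_i X_i

# A Gaussian has skewness 0 and excess kurtosis 0; check the sample moments.
Yz = (Y - Y.mean()) / Y.std()
print("skewness ~ 0:", np.mean(Yz**3).round(3))
print("excess kurtosis ~ 0:", (np.mean(Yz**4) - 3).round(3))
print("Var[Y] =", Y.var().round(3), "vs sum g_i^2 =", (g @ g).round(3))
```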

Thank you very much.
