In the last class we discussed the symmetric random walk and how, by scaling it in time and space, we can make it more and more zigzag so that it approaches what is called a Brownian motion. A Brownian motion is a continuous stochastic process which exhibits the properties of a symmetric random walk. For example, the motion of a pollen grain in water would look like this, though with far more zigzagging than I can draw; the motion of a molecule in a gas chamber is similar. So Brownian motion encapsulates a lot of phenomena. The best way to remember it is this: Brownian motion is the continuous analog of the symmetric random walk. In a continuous setting, it behaves the way a symmetric random walk behaves in the discrete setting. So, as usual, I will have the probability space
which is written down here. A stochastic process W_t (some people write W with a lower index t; the notation is up to you, it does not matter) is called a Brownian motion if the following hold. First, W_0 = 0: whatever the scenario omega, W_0(omega) is always 0. Remember that W_0 is itself a random variable; here it is identically the zero function. Second, given any increasing time points 0 = t_0 < t_1 < ... < t_n, the increments W_{t_1} - W_{t_0}, W_{t_2} - W_{t_1}, ..., W_{t_n} - W_{t_{n-1}} are independent random variables, just like the increments you have seen for the symmetric random walk. An increment W_{t_{i+1}} - W_{t_i} tells you how much the function value has changed between the two endpoints; it is a broad measure of the zigzagging that has taken place over that interval. Third, each increment follows a normal distribution: W_{t_{i+1}} - W_{t_i} is normal with mean 0 and variance t_{i+1} - t_i, for every i from 0 to n - 1. You already know from the symmetric random walk that the expectation of an increment is 0 and its variance is the difference between the two time endpoints; that the increments are actually normally distributed can be proved through the Central Limit Theorem, which we will not do here because it would take too much time in a short and compact course. A process satisfying these properties, for every finite set of increasing time points, is called a Brownian motion.

Does a stock price follow Brownian motion? Not truly, though it looks like one. A Brownian motion can take negative values, just as a symmetric random walk can, but a stock price can never be negative: once a stock price hits 0, that stock goes out of the market as a zero-value stock. So stock prices are not modeled directly by Brownian motion.
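Before moving on, here is a small simulation sketch of the definition above (my own illustration, not from the lecture): a Brownian path on [0, T] is built from independent N(0, dt) increments, and the endpoint W_T = W_T - W_0 should have mean 0 and variance T.

```python
import random
import math

def brownian_path(n_steps, T=1.0, rng=None):
    """Sample W at times k*T/n_steps from independent N(0, dt) increments."""
    rng = rng or random.Random()
    dt = T / n_steps
    w = [0.0]  # W_0 = 0 in every scenario
    for _ in range(n_steps):
        w.append(w[-1] + rng.gauss(0.0, math.sqrt(dt)))  # increment ~ N(0, dt)
    return w

# Empirical check: W_T = W_T - W_0 should have mean 0 and variance T.
rng = random.Random(42)
n_paths, T = 5000, 1.0
endpoints = [brownian_path(50, T, rng)[-1] for _ in range(n_paths)]
mean = sum(endpoints) / n_paths
var = sum((x - mean) ** 2 for x in endpoints) / n_paths
print(round(mean, 1), round(var, 1))  # mean near 0, variance near T = 1
```

The step count and sample size here are just large enough for the empirical statistics to settle near their theoretical values.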
Brownian motion, but actually these stock prices actually were modeled by Bachelier
in 1900, who was the student of Henry Poincare using Brownian motion and showing that the
pricing of such commodities in the stock market, of various instruments in the stock market
can be achieved by solving the heat equation. So, he linked the processing the financial
markets to partial differential equations. And of course, his idea was lost and now later
on he has become a very famous name. But the interesting part is that, the way
stock market or movements of stock prices are modeled that is called a geometric Brownian
motion and then that is built using, that will always give you a nonnegative thing,
which is built using a Brownian motion and you can understand and taking the trick that,
if you want to get non-negativity always use the exponential function. So, that trick is being played in this area
pretty often. So, now we will introduce what is called,
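As a sketch of that trick (assuming the standard geometric Brownian motion form S_t = S_0 exp((mu - sigma^2/2) t + sigma W_t); the parameter values below are purely illustrative): the exponential guarantees the simulated price never goes negative, even though the driving Brownian motion does.

```python
import random
import math

def gbm_path(s0, mu, sigma, n_steps, T=1.0, rng=None):
    """Geometric Brownian motion: S_t = s0 * exp((mu - sigma^2/2) t + sigma W_t)."""
    rng = rng or random.Random()
    dt = T / n_steps
    w, prices = 0.0, [s0]
    for k in range(1, n_steps + 1):
        w += rng.gauss(0.0, math.sqrt(dt))  # the driving Brownian motion W_t
        t = k * dt
        prices.append(s0 * math.exp((mu - 0.5 * sigma**2) * t + sigma * w))
    return prices

prices = gbm_path(s0=100.0, mu=0.05, sigma=0.4, n_steps=250, rng=random.Random(0))
print(min(prices) > 0)  # exp(...) > 0, so the price never goes negative
```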
Now, how do I define a filtration associated with a Brownian motion? A filtration {F_t} is a collection of sigma-algebras such that, number one, whenever t is strictly bigger than s, F_s must be contained in F_t: you gain information as time evolves. Number two, W_t must be adapted to the filtration; the Brownian motion is adapted to {F_t}. Number three, for u > t, the increment W_u - W_t is independent of F_t: once you move beyond t, F_t does not carry any information about that increment. One natural choice of F_t is the smallest sigma-algebra generated by the stochastic process up to time t: take all the random variables W_s for s up to the given time and take the sigma-algebra they generate. You could have larger filtrations too, but that is essentially the idea. Once we know that we can have such a
filtration defined like this, we can prove that the Brownian motion is a martingale. Take s less than or equal to t; you will find the trick is almost the same as before. Break up W_t as (W_t - W_s) + W_s. Now, W_s is completely determined once F_s is known, so you take out what is known: E[W_s | F_s] = W_s (think of W_s as W_s times 1; the conditional expectation of the constant random variable 1 is 1, because the constant does not depend on F_s). The other piece, W_t - W_s, is independent of F_s by the third property of the filtration, so its conditional expectation is just the plain expectation, which is 0. Hence E[W_t | F_s] = E[W_t - W_s | F_s] + E[W_s | F_s] = 0 + W_s = W_s, and that proves that Brownian motion is a martingale.
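A quick numerical sanity check of the martingale property (my own illustration; the value W_s = 0.8 below is an assumed state, not from the lecture): conditional on W_s, the average of W_t over many continuations should come back to W_s, because the increment W_t - W_s averages to 0.

```python
import random
import math

rng = random.Random(7)
s, t = 1.0, 3.0
w_s = 0.8  # suppose the path has reached W_s = 0.8 (an assumed value)

# Simulate many continuations: W_t = W_s + (W_t - W_s), increment ~ N(0, t - s)
n = 100000
avg_w_t = sum(w_s + rng.gauss(0.0, math.sqrt(t - s)) for _ in range(n)) / n
print(round(avg_w_t, 1))  # close to W_s = 0.8
```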
There are several other martingales associated with the Brownian motion. One that we are going to write down now is very, very important in finance, specifically in the calculation of risk-neutral probabilities: it is called the exponential martingale. The exponential martingale is defined as Z_t = exp(sigma W_t - (1/2) sigma^2 t), where sigma is a positive number and exp means e raised to the power of the given quantity. Of course, Z_t is itself a random variable, because you are exponentiating a random variable. Note that if you have a filtration associated with W_t, then Z_t is adapted to that same filtration, because knowledge of Z_t depends only on knowledge of W_t. The question is whether Z_t is a martingale; one has to prove it. We start in a similar fashion, though this proof is not as straightforward as the previous
proof, though similar types of approaches will be used; let us just go and do it. The sigma that you see here we will finally interpret as the volatility of the stock price movement. It captures the randomness of the movement: it gives you a feel for the zigzagging of the path, how fast the prices are going up and coming down. Now let F_t be the filtration associated with the
Brownian motion W_t. Write sigma W_t = sigma W_s + sigma (W_t - W_s); that is, I have added and subtracted sigma W_s. Observe that the term (1/2) sigma^2 t is a nonrandom part; it is not a random variable. So inside the conditional expectation we have a product of two random variables, say X times Y, and in this sort of situation there are some deeper questions. If you go back to where we wrote down the rules of conditional expectation, the product has to be integrable if the definition of conditional expectation is to be maintained. Of course, for that you first need to know what the integral (the expectation) of such random variables means, and whether such random variables are integrable at all. That question we are not going to answer fully right now; later on we will see that these random variables are indeed integrable, because they come out to be solutions of certain equations. And if you have a Brownian motion, can you integrate it just like any other random variable; can you find the expectation of a Brownian motion? The answer is yes: the expectation of W_t is 0. Can you find the expectation of sigma W_t - (1/2) sigma^2 t? Yes: since E[sigma W_t] = 0, the expectation is -(1/2) sigma^2 t. There are certain little technical issues I am not getting into, but what is meant by the integrability
of a random variable? Integrability means that the expectation is finite. Here the exponent sigma W_t - (1/2) sigma^2 t has finite mean, namely -(1/2) sigma^2 t, since E[sigma W_t] = 0 and the other term is just a constant. More importantly, W_t is normal, and the exponential of a normal random variable has finite expectation (this expectation is its moment generating function, which is finite for a normal); so the exponential is an integrable random variable. That is why it is meaningful to apply the fact that I can take out what is known. At time s, the factor exp(sigma W_s - (1/2) sigma^2 t) is known to me, because it depends only on the evolution of W up to the time s; so I take this part out of the conditional expectation. It is very important to get certain technicalities clear before you move on, so that you are actually applying the results correctly; that is an important thing one needs to learn as one goes on doing more mathematics. Sometimes we can do some hand waving, but not always. Now, once I have taken that part out, remember that the other
factor, exp(sigma (W_t - W_s)), is independent of F_s, which means its conditional expectation given F_s is just its plain expectation; I just need to calculate that. Those who know some probability will recognize it: W_t - W_s is normal with mean 0 and variance t - s, and E[exp(sigma (W_t - W_s))] is nothing but the moment generating function of a normal distribution. I have not spoken about moment generating functions in this discussion, but you can forget that term and compute this expectation directly; it is a simple integration which will come as homework and appear in your assignments, so I am just writing down the answer: E[exp(sigma (W_t - W_s))] = exp((1/2) sigma^2 (t - s)). Once you write that down, the final answer is the following: E[Z_t | F_s] = exp(sigma W_s - (1/2) sigma^2 t) * exp((1/2) sigma^2 (t - s)) = exp(sigma W_s - (1/2) sigma^2 s), which is nothing but Z_s. So Z_t is indeed a martingale. There is another martingale which is helpful in finance: the process W_t^2 - t. Proving that it is a martingale will also go as an exercise in your homework assignments.
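Two quick Monte Carlo sanity checks of these results (my own sketch, not from the lecture): a martingale keeps its initial expectation, so the average of exp(sigma W_t - sigma^2 t / 2) should stay near Z_0 = 1, and the average of W_t^2 - t should stay near 0.

```python
import random
import math

rng = random.Random(1)
sigma, t = 0.5, 2.0
n = 200000

ws = [rng.gauss(0.0, math.sqrt(t)) for _ in range(n)]  # samples of W_t ~ N(0, t)

# Exponential martingale: E[exp(sigma W_t - sigma^2 t / 2)] = 1
avg_z = sum(math.exp(sigma * w - 0.5 * sigma**2 * t) for w in ws) / n
# Quadratic martingale: E[W_t^2 - t] = 0
avg_q = sum(w * w - t for w in ws) / n

print(round(avg_z, 1), round(avg_q, 1))  # both close to their initial values 1 and 0
```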
Now we will talk about how to compute the joint probability of a Brownian motion at certain given time points. Take W_t at any t; what is the distribution of this random variable? W_t can always be written as W_t - W_0, and this has the normal distribution N(0, t). So if you want to know whether, at a given time t, W_t lies between some points a and b, it is obvious you can just use this basic fact. The real question is: suppose I have n time points, all greater than 0 (at time 0 I know where the process is); how do I find the joint probability? Let me look at the simplest case first.
This is how you compute what are called the transition probability densities; let us see. Suppose you are given the information that, under the given scenario omega, W_{t_1}(omega) = x_1 (say with x_1 lying between a_1 and b_1). What I now want to calculate is the probability that W_{t_2} lies between a_2 and b_2, given that W_{t_1} = x_1. But notice: if I fix W_{t_1} = x_1, then the relevant random variable is W_{t_2} - x_1, since x_1 is known to me as what happened at time t_1. So for the scenario omega I know what happened up to t_1: the process is in the state x_1, and I ask for the probability that the next state lies between a_2 and b_2. It is like computing transition
probabilities. Look at the new random variable Z = W_{t_2} - W_{t_1}: it is also normal, with mean 0 and variance t_2 - t_1, and W_{t_2} = x_1 + Z. So, under the given information, W_{t_2} follows the normal distribution N(x_1, t_2 - t_1), because the expectation of Z is 0. Once you know this fact, we apply the same idea of conditional probability that we had learnt: P(A | B) = P(A ∩ B) / P(B), so you can always write P(A ∩ B) = P(A | B) P(B). That is exactly what we are going to do here: first we compute the conditional density, and then, using it, we write the joint probability. The conditional density of W_{t_2} given W_{t_1} = x_1 is the density of N(x_1, t_2 - t_1) in the variable x_2, that is, (1 / sqrt(2 pi (t_2 - t_1))) exp(-(x_2 - x_1)^2 / (2 (t_2 - t_1))). Now, if I want to write the joint probability,
this conditional density is exactly what I need. I could write the conditional density function in a compact form right away, but let us set up the general form first and then come back to it. The joint probability will be written in terms of what are called transition densities. In general: what is the density function associated with the move from the current state y to the state x in time t? Say my current state at time 0 is y, and in time t I move to the state x; what is the density function associated with that particular transition? This is called the transition density function (or conditional density), and it is given by p(t, x, y) = (1 / sqrt(2 pi t)) exp(-(x - y)^2 / (2 t)). This is my conditional density function.
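A small sketch of this transition density (the values of t and y below are illustrative; the Riemann sum is just a quick numerical check that the density integrates to 1 over a wide range):

```python
import math

def transition_density(t, x, y):
    """Density of moving from state y to state x in time t: N(y, t) evaluated at x."""
    return math.exp(-(x - y) ** 2 / (2.0 * t)) / math.sqrt(2.0 * math.pi * t)

# Numerical check: for fixed y and t, p(t, x, y) integrates to 1 over x.
t, y = 0.7, 0.3
xs = [-10.0 + 20.0 * k / 4000 for k in range(4001)]
dx = xs[1] - xs[0]
total = sum(transition_density(t, x, y) for x in xs) * dx
print(round(total, 3))  # ≈ 1.0
```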
I will keep it as an exercise for you to write this down for the general case of n time points. How can you write it? Look: at time 0 my state is 0 (W_0 is always 0); I come to the state x_1 at time t_1; given that I am in state x_1 at t_1, within the time span t_2 - t_1 I come to the state x_2. This is essentially a Markov process; Brownian motion is also a Markov process, for those who know about Markov processes. What we are computing is nothing but a transition probability: it tells you, given that the state is x_1, what is the probability that the next state lies between a_2 and b_2. Because everything is continuous here, you cannot ask for the probability that W_{t_2} equals x_2 exactly, because that would be 0; you just have to ask what lies in between. So the joint density is the product of the transition densities, p(t_1, x_1, 0) times p(t_2 - t_1, x_2, x_1), and integrating this over the region of interest gives me the joint probability, which is motivated by this very basic idea of conditional probability.
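To tie it together, here is a sketch (with illustrative endpoints and times) comparing the double integral of the product of transition densities with a direct Monte Carlo estimate of P(a_1 < W_{t_1} < b_1, a_2 < W_{t_2} < b_2):

```python
import math
import random

def p(t, x, y):
    """Transition density: density of going from state y to state x in time t."""
    return math.exp(-(x - y) ** 2 / (2.0 * t)) / math.sqrt(2.0 * math.pi * t)

t1, t2 = 1.0, 2.0
a1, b1, a2, b2 = -0.5, 0.5, 0.0, 1.0

# Double Riemann sum of the joint density p(t1, x1, 0) * p(t2 - t1, x2, x1)
n = 200
dx1, dx2 = (b1 - a1) / n, (b2 - a2) / n
integral = sum(
    p(t1, a1 + (i + 0.5) * dx1, 0.0)
    * p(t2 - t1, a2 + (j + 0.5) * dx2, a1 + (i + 0.5) * dx1)
    for i in range(n) for j in range(n)
) * dx1 * dx2

# Monte Carlo: simulate (W_t1, W_t2) from independent normal increments
rng = random.Random(3)
m = 200000
hits = 0
for _ in range(m):
    w1 = rng.gauss(0.0, math.sqrt(t1))
    w2 = w1 + rng.gauss(0.0, math.sqrt(t2 - t1))
    hits += (a1 < w1 < b1) and (a2 < w2 < b2)
mc = hits / m
print(round(integral, 2), round(mc, 2))  # the two estimates should agree
```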
Thank you very much.