Brownian Motion (Wiener process)

Captions
So we're going to cover an introduction to a topic called Brownian motion, which in a lot of the financial textbooks is kind of the most important process — the Wiener process. But first I just want to go over some definitions.

A process is an event that evolves over time, intending to achieve a goal. Processes generally start at time equals zero and end at time equals capital T — that's the typical notation — and during that time, events at various points along the way may have an effect on the eventual outcome. For example, a baseball game: a baseball game happens over time, and the innings are individual events that get added up to complete the process. In a baseball game the previous innings' scores get added to the current inning's score to get the next total, but a process doesn't have to look at the past. So a process is a very vague concept — it's just something that evolves over time and is eventually trying to achieve a goal.

A stochastic process is a process which can be described by a change in some random variable over time, where time is either discrete or continuous. An example of a stochastic process: you may have seen questions in probability theory where you're trying to predict today's weather, and you put a weight on each of the last three days to say whether it will rain or not rain today — hopefully you've had a question like that. The weather rains or doesn't rain and just keeps evolving over time, and you want to know, at a certain point in time, what's the chance it will be raining or not raining. The process relies on a random variable to figure out what its current value is. So that's a pretty big concept, a stochastic process.

Okay, now in probability there's a famous, almost clichéd problem called a random walk. A random walk is a stochastic process that starts off with a score of zero, and then at fixed points in time — with the same interval between each of these discrete events — there's a probability p that your score will increase by one and a probability 1 − p that your score will decrease by one. This event happens T times, and then the walk is finished.

So, for example, what's the expected value? You're starting off at zero, and then T times you will, with probability p, go up by one and, with probability 1 − p, go down by one; whatever that adds up to is where you expect to end. If there was a 50% chance you go up by one and a 50% chance you go down by one, your expected gain each time an event occurs would be 0 — you start off at 0 and you're expected to end at 0. If there was a 75% chance you go up by one and a 25% chance you go down by one, then your expected gain each step would be one half, so each step you expect to gain a half. So where do you expect to be after capital T steps? At T over 2 — if you did this 20 times, you'd expect to end at 10, something like that. And again, random walks happen at discrete intervals — it might be once every day or once every second, but the amount of time between events is the same.
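The expected-value arithmetic above, written out (using p for the up-probability and S_T for the final score after T steps — notation assumed here rather than taken from the slides):

E[\text{one step}] = p\cdot(+1) + (1 - p)\cdot(-1) = 2p - 1, \qquad E[S_T] = T\,(2p - 1)

So p = 1/2 gives an expected gain of 0 per step, while p = 3/4 gives 2(3/4) − 1 = 1/2 per step, hence an expected final position of T/2, which is 10 when T = 20.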
Okay, again, we're still on definitions. A process is something that evolves over time; a Markov process is a particular type of stochastic process where only the present value of the variable is relevant for predicting the future. An example of something that's not a Markov process is one of those weather problems where the weather for the last three days predicts the weather for tomorrow. That's not a Markov process, because in a Markov process all you have to see is your current state — today's weather — to be able to predict tomorrow; you don't have to go back into the past to figure out where you currently are. So in a Markov process the history of the variable, the way the present value emerged from the past, is irrelevant.

Now, a martingale process — hopefully we're near the end of going over definitions — a martingale process is a stochastic process where, at any point in time t, the expected final value equals whatever the current value is. Let's say, for example, you had a random walk that's supposed to run for 20 time intervals, and after 14 time intervals the current value is 4; then the final value is expected to also be 4. What this is basically saying is that you could randomly pick any point in time and say, from that point until the end, I expect to see no change. What would that mean about the probability of the events that are coming up? The upcoming events have an expected gain of 0 — they're not expected to go up, not expected to go down. For example, if you had a martingale process like a random walk, you start at 0 and you go up or down, and let's pretend you eventually get to minus 4. If you then ask where you expect it to be when the process eventually ends, having hit minus 4 at some intermediate point, and your answer is, well, I expect it to end at minus 4 — I don't expect a positive or negative change from this point on — that would be a martingale process. The notation is basically saying: at a point t along the way the current value is x, so where do you expect it to be when it finishes? You expect it to be x, because whatever happens in the future has no expected gain or loss. And all martingales by nature are Markovian — again, Markovian means you don't need to see the past of how it got to x; all you need to know is that it's currently at x, and it's expected to stay there.

Okay, so this is something I just did in MATLAB — a tool we can use to generate or simulate the things we're doing. I started with a value of zero, and then at each time interval — I did 100 time intervals — I simulated a coin being tossed that had a 50% chance of being heads and a 50% chance of being tails. If a heads came up I incremented my score by 1, and if a tails came up I decremented my score by 1, and I let it run for a hundred time intervals; and each time I re-ran the walk I changed the color of the random walk.
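A minimal sketch of the kind of MATLAB experiment described here — the variable names (T, nPaths) and the choice of five paths are illustrative, not taken from the lecture:

    % Simulate several 100-step fair-coin random walks and plot their sample paths,
    % one automatically-cycled color per walk, roughly as described in the lecture.
    T = 100;                                 % number of coin flips per walk
    nPaths = 5;                              % number of walks to draw
    steps = 2*(rand(T, nPaths) > 0.5) - 1;   % each entry is +1 (heads) or -1 (tails), each with probability 1/2
    S = [zeros(1, nPaths); cumsum(steps)];   % running scores; every walk starts at 0
    plot(0:T, S)                             % one line per column, i.e. per sample path
    xlabel('coin flip'), ylabel('score')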
So, for example, the blue random walk went like this — it must have been at about 2 or 3 — then it went along for a long span and eventually ended up down here. And this yellow random walk, say it started around 1 and went along here: here I had a nice streak of heads, heads, heads, then maybe a tails, then heads.

Now, a random walk is by nature a martingale. At this point in time it looks like my score is at about 8 or 9. If you are asked where you expect me to be when I hit time equals 100, and right now, at time equals 60, after 60 coin flips, I'm at about plus 8, where would you expect me to be at time equals 100? If the only piece of information I'm giving you is that at 60 I was at plus 8, you'd expect me to stay at plus 8. I may go up, I may go down, but the chance I go up equals the chance I go down, so you should expect me to stay at 8.

What you might see in the textbook is the term "filtration." It basically says the definition of a martingale is a process where, based on its current value and its current filtration, you expect the future value to be exactly the same as the current value. What we mean by a filtration is this: suppose I let this walk run to time equals 60, and at time equals 60 the value is 8, but suppose I also told you that at time equals 40 the value was 3 and at time equals 20 the value was 4. So I give you three pieces of information — the value at time 20, the value at time 40, and the value at time 60 — and with all three of those pieces of information I ask what you expect it to be at 100. What thinking would you do? If it's a martingale, the value at 20 and the value at 40 really don't matter; it's the most recent piece of news that counts. You're basically filtering out the pieces of news that aren't needed and taking just the most recent one: okay, you're telling me at time 20 the value was 4, at time 40 the value was 3, and at time 60 the value was 8, and you're asking me to predict its value at 100. I'm going to ignore 20 and 40, take the most recent piece of news, and say — since this is a martingale, its expected value going into any future time is expected not to change — if at time 60 the value was 8, I expect it to finish at 8. That's a martingale.
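The filtration statement above in symbols, with X_t for the walk's value at time t and F_t for the information available up to time t (standard notation, assumed here):

E[X_{100} \mid \mathcal{F}_{60}] = X_{60}, \qquad \text{so} \qquad E[X_{100} \mid X_{20} = 4,\ X_{40} = 3,\ X_{60} = 8] = 8.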
Okay, so this is just a set of what they call sample paths of a random walk. A random walk is a probabilistic thing — it's a random process; each time, you alter the score by whatever the random event was and keep a running total. Each one of these lines is called a sample path of the random walk. You'll see that a lot in the textbook; they'll go over sample paths.

Okay, so now a formal definition of a Brownian motion — and what I'm going to do in a couple of slides is take a random walk and convert it into a Brownian motion. The formal definition: a Brownian motion is a stochastic process — all these things we're defining tonight are processes; they happen over time — and it's popularly denoted by the letter W. You'd think you'd use B, but some books use B for Brownian and some use W for Wiener. A process that runs over time from zero off to infinity is a standard Brownian motion if, number one, it starts at zero. Just like the random walk — we could have made it start anywhere, but we decided to start it at zero. Number two, it has continuous sample paths. The random walk doesn't have continuous sample paths; it has discrete points where the value jumps — and actually, in the MATLAB diagram, it drew a line at an angle to get from one point to the other; it really should have just put dots at each of the points and let your mind connect them. Brownian motion, though, has continuous sample paths. And number three, it has independent, normally distributed increments. The random walk had a uniformly distributed increment — a 50% chance of going up by one or a 50% chance of going down by one — whereas here, to go from one value to the next, we run a normally distributed random variable, see its result, and add that to the current score. So the random walk starts with a score of zero, has a random event occur with a 50% chance of going up and a 50% chance of going down, and adds that outcome to the current score, repeating until the end of time; in this case we take our current score, which starts at zero, run a normally distributed random variable, add that result to the current score, and keep doing that until we get to capital T. And unlike a random walk, where the time between random events is discretely scheduled, these happen instantaneously — when one happens, the next one happens instantaneously after it. If you remember the central limit theorem, adding together a bunch of variables with pretty much any distribution gives you something normally distributed.

Okay, so this ends up being kind of the formal definition, and shortly we'll talk about how to build one from a random walk. A bit of history you'll see a lot in the textbooks and in this field: Brownian motion was named after Brown, a botanist — botanists are the people who study plants. There was a botanist maybe 200 or 250 years ago who was studying how pollen that falls on a lake moves around the lake — its next location is its current location plus a normally distributed variable. He was studying that, and that's why a lot of the older textbooks refer to this type of process as a Brownian motion. Then there was a famous mathematician by the name of Wiener — I think he finished college when he was 11 and a PhD at Harvard when he was about 17, I forget the exact details — who made big contributions to mathematics, and so this is sometimes referred to as a Wiener process. Brownian motion and Wiener process, you'll see in the literature, end up for the most part meaning the same thing, but a Wiener process is basically a process characterized by three facts: like a Brownian motion, it starts at zero; it has almost surely continuous sample paths, whereas the Brownian motion definition said it had to have continuous sample paths; and for two points in time, time t and time s, the gap between those two values is normally distributed with a mean of 0 and a variance of t minus s. This kind of opens the door to bridging the Brownian motion and that random walk, saying the two can be somewhat similar if the gaps between the events get really, really small and there are many, many gaps. As the size of the gaps goes to 0, a random walk starts to become a Brownian motion, and a Wiener process could sit somewhere between the two.
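Collecting the defining properties in the usual notation (W(t) for the process, s and t two times with s < t — notation the lecture also uses):

W(0) = 0, \qquad t \mapsto W(t) \text{ is (almost surely) continuous}, \qquad W(t) - W(s) \sim N(0,\, t - s),

with increments over non-overlapping time intervals independent of one another.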
But as far as our textbook is concerned — and as far as this field is concerned — Brownian motions and Wiener processes can be considered the same thing. That's actually why a lot of the definitions say a Brownian motion is a process W(t): the W comes from the name Wiener, so you might see the textbook describe a Brownian motion and use the letter W to symbolize it. And if we want, we could have many-dimensional Brownian motions — a vector of Brownian motions — and it's an n-dimensional Brownian motion if each of the W's, each component process, is a standard Brownian motion and they are independent of each other. That's a definition that will just pop up later on.

Okay, so now what I'm going to try to do is convert the random walk into a Brownian motion. Suppose we take a random walk, and instead of doing what we did originally — a span of time capital T broken up into small, discrete time intervals — we make two changes. First, instead of the next event, the thing we add to our current score, being a uniformly distributed variable (50% chance plus one, 50% chance minus one), I'm going to say it's a normally distributed next event with a mean of zero. So the first change to the random walk is that the thing we're adding to our current score is normally distributed, not uniformly distributed. The second thing I want to do is take the time interval we add or subtract over and divide it up into n equal parts, for some number n. Obviously, we know from continuous mathematics that we're eventually going to let n go off to infinity and see what happens — that's the way calculus works.

So if we take what used to be a single increment over time T and divide it into n intervals, each of our increments R_i adds the square root of T/n times a normally distributed variable to our score, and the total increment S equals the sum of all of these individual increments. (I probably have those two lines the other way around on the slide — this one is the individual increments and this one is their sum.) The expected value of each increment is zero, and since this is a martingale, the expected value of the sum is zero as well. The expected value of R_i squared — since R_i is the square root of T/n times a normally distributed variable — is T/n. So the expected value of each increment is 0, but the expected value of each increment squared is T/n.

And now we're going to start heading in the direction of calculating the variance. This is, of course, a Wiener process or Brownian motion — it's a martingale, so at any point in time the expected increase is 0 — but we want the variance, and the variance is the expected value of S squared minus the square of the expected value of S.
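In symbols, with Z_i independent standard normal variables and the time span T cut into n slices (notation assumed here for the quantities on the slide):

R_i = \sqrt{T/n}\; Z_i, \qquad E[R_i] = 0, \qquad E[R_i^2] = T/n, \qquad S = \sum_{i=1}^{n} R_i, \qquad E[S] = 0.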
The value of S squared, up to some point i, is all of the increments from 1 to i multiplied by that same sum, and then we take the expected value of that. This is going to end up having a lot of terms: r1 times r1, plus r1 times r2, plus r1 times r3, going through that whole row; then r2 times r1, r2 times r2, r2 times r3, and so on. Now that we've sliced the random walk into n parts, if each of these increments is independent of the others, then there is no correlation between any pair whose indices i and j are not equal. The expected value of two uncorrelated random variables that both have expected value zero, multiplied together, is also zero. So for any combination of i and j, as long as it's not the same index, the expected value of that product is zero. That means a lot of the terms in this expected value of S squared end up being zero; the only ones that survive are r1 squared, and so on down to r_i squared — i terms, each with expected value T/n. So if we run all the way up to n, the expected value of S squared ends up being n times T/n, which is T.

The expected value of S, with the square on the outside, ends up being zero, because it's a martingale, so its expected value is zero. The variance is therefore T minus zero, which is T. So that's the variance of this process — and all we've done so far is take a random walk, and instead of a set of fixed intervals with one result each, we cut those intervals up into n equal-sized pieces and had each piece add a normally distributed variable to the score. Its expected value, because it's a martingale, is 0, and its variance ends up being T.
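The cross-term argument, written out with the same notation as above:

E[S^2] = \sum_{i=1}^{n}\sum_{j=1}^{n} E[R_i R_j] = \sum_{i \ne j} E[R_i]\,E[R_j] + \sum_{i=1}^{n} E[R_i^2] = 0 + n \cdot \frac{T}{n} = T,

\mathrm{Var}(S) = E[S^2] - (E[S])^2 = T - 0 = T.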
So now what we'd like to do — the next step — is let n go to infinity. If we take a random walk, cut each of the time intervals into n equal slices, and then let n go to infinity, this becomes a walk where every event happens immediately after the previous one, and we keep taking these incredibly small, normally distributed increments or decrements and adding them to our running score. As n goes off to infinity we end up with a different function, and that different function is a Brownian motion. We basically used limiting mathematics to go from a discrete scenario to a continuous one, and this function, which we'll now call X of t, is the Brownian motion that corresponds to the random walk sliced up with n going to infinity.

Its expected value — once again, it's a martingale — is zero: a normally distributed variable has a bell curve, so we're always adding something with expected value zero to our current score, and we expect our final score to stay the same. And the expected value of X squared ends up, as we said before, being t, which gives us the variance. So note that a Brownian motion is Markovian, meaning all you need to know to get its next value is its current value and what we're adding or subtracting — we don't need the history of how it got there. Now that we've let n go to infinity, it's a continuous process; it is a martingale, because from wherever it currently is it's expected not to change; and it is normally distributed with a mean of 0 and a variance of t.

Okay, so where are we really going with this? This is kind of the last thing we'll talk about today — what's the point? Suppose we have a differential equation of this form: dX, the rate at which some variable X changes, equals a dt plus b times dW, where W is what we're now calling a Brownian motion or Wiener process, and a and b are constants. Now, look at the dX = a dt part — pretend the b dW part isn't there, so it's just dX = a dt. What would that integrate to, if you were solving that differential equation? It would turn into X equals some constant — some initial value of X — plus a times t, where t in this case is time. So this is basically saying we start with some initial value and then, depending on the time, keep adding that many a's to the score. What are we effectively doing? We're taking some initial value and, assuming a is positive — in finance it will be positive, though in general it could be positive or negative — time causes the value to go up. This would be kind of like a current stock price, and a would be maybe an interest rate on it, something that goes up over time. We're saying the current price of something equals its initial price plus, as time goes on, an inflation by a. If we wanted a to represent a rate, we'd have to take whatever X0 is and multiply it by that rate — so a would end up being the rate times X0.

For example, say a stock price is currently $100 and we want to model where it will go in the future, and suppose it's just going to be its current value plus interest — so it's not really a stock; it's more like putting money in the bank. The current price equals the initial price plus the initial price times some rate times time. If it was initially $100 and the rate was 2%, you take 2% times 100, which is 2, so every time unit — time units could be years — every year you get an extra $2: it would be $100, plus 2, plus 2, plus 2, and so on. So we can start using an equation like this to model the value of something that grows over time. [Question: so X0 is the initial value?] Yes — X0 is the initial value, and X is the current value.
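The drift-only piece, worked out using the lecture's $100 and 2% example (X_0 for the initial value):

dX = a\,dt \;\Longrightarrow\; X(t) = X_0 + a\,t, \qquad a = r\,X_0 = 0.02 \times 100 = 2, \qquad X(t) = 100 + 2t,

i.e. an extra two dollars per year.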
So this is basically a modeling equation: this is what we initially started with, and then time causes us to add some value to it. Actually, a could be negative, which would make the value go down — which could be the case in a present-value/future-value situation where something is being discounted. But what we're going to do is model the price of something by saying it's whatever it initially started as, and — assuming things go up with respect to interest — this part of the equation is the interest part: the more time passes, the more it goes up. So we'd let a, or whatever we put in front of the dt, be something that models the interest based on whatever the current price is.

Now, this equation alone, without the b dW piece, would model an amount of money growing at a fixed rate with absolutely no volatility — just like putting money in the bank and getting guaranteed interest. What if we bought stock instead, which may go up, may go down, but has an expected rate of increase? When you buy a stock you expect it to go up at least with the interest rate — it might go much higher, it might go much lower, but its expected value should at least compete with interest; if you're a risk-neutral-minded person, it has to at least compete with interest. So when we model the stock, we say the rate of change of its price has two components. And — as I said before — this is not the only model for a stock, but it is the most commonly used one. Number one, there's something that models the interest rate, something that grows with time. The second component is a variable part that could go up or could go down: it's going to be a Wiener process, which starts at zero and might go up, might go down, multiplied by a magnitude, and we add that to the constant rate going up, to take a guess at where we'd be at some point in the future.

So this is the model we'll be using. The two components: the first is exactly interest, and the second is a Wiener process that moves up and down as time goes on, multiplied by a factor b. What would the factor b be in the real world? The first piece is the fixed component and the second is the variable part — we might win, we might lose — so we move along an interest line and then go up or down around that line based on a Wiener process. Why put a magnitude on it? One stock might vary a little, another might vary a lot, and that magnitude is just a bigger number for a highly variable stock. A stock like Google might have a big number for b, and a stock that doesn't change much — a soda company like Coca-Cola — might have a very low b. The Wiener process itself is always the same thing: it starts at zero and keeps adding instantaneous increments of a normally distributed variable, all of the same magnitude. So if we want to make one stock seem more volatile than another, we make b bigger, and if interest rates go up, we make a larger.
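A rough sketch of simulating this dX = a dt + b dW model; the specific numbers (X0 = 100, a = 2, b = 5, one year in 252 steps) are made-up illustrations, not values from the lecture:

    % Euler-style simulation of dX = a*dt + b*dW: a fixed drift of a per unit time
    % plus a Wiener process scaled by the volatility magnitude b.
    X0 = 100;  a = 2;  b = 5;            % hypothetical starting price, drift, and volatility magnitude
    T = 1;  n = 252;  dt = T/n;          % one year split into 252 steps
    dW = sqrt(dt) * randn(n, 1);         % Brownian increments: mean 0, variance dt
    X = [X0; X0 + cumsum(a*dt + b*dW)];  % running path of the modeled price
    plot(0:dt:T, X)
    xlabel('time (years)'), ylabel('X(t)')

A larger b makes the path wander more widely around the tilted interest line; b = 0 collapses it back to the straight money-in-the-bank line.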
So the constant a here controls the interest-rate component of our prediction, and the Wiener process multiplied by some magnitude gives randomness to the stock, with b making that randomness bigger or smaller. What we're going to do is take those two components, add them together, and say this is what we'll use to model the behavior of a stock. The interest-rate part will come from whatever the current interest rates are — we assume today's interest rate will still hold a year from now, since we're trying to figure out what the price will be a year from now — and we'll assume the volatility of the stock in the past will match its volatility in the future. That's not always true, but it's the only thing we can do: we take the past volatility and guess it will be the same in the future.

What we can now do, using something like MATLAB, is plug in all these numbers — the current interest rate and the past volatility of a stock — and say, here's what the stock might do, and let it run, like I did with those random walks: let a bunch of sample paths run, and they'll be random. Maybe next class I'll do a Brownian-motion version of it, but suppose that, instead of a bunch of random walks, those were a bunch of Brownian motions. If we added an interest rate, we'd be taking all of those paths and slightly tilting them up, so they'd be going up on an angle rather than staying flat like a martingale. Then, depending on the volatility of the stock, the possible results would cover a wide range, or with less volatility they'd be closer to the middle — but still angling up at the interest rate.

Then, if we wanted to price a stock option on this stock — say a call at a given strike, meaning from that number and up we get a profit, and from that number down we get nothing — we could run this experiment thousands of times, average the payoff above the strike price, say that's what we expect to collect in the future, and then discount by interest back to today. That's what we think the stock option would be worth. If we use the equation we had before to model the stock correctly, run the experiment thousands of times, and take a weighted average, we can estimate what a stock option would be worth today.

So this is the groundwork of the method used by Black and Scholes — and Merton, whose name sometimes shows up — who came up with using this model: take the price of the stock to be its current price with constant interest added to it, plus a Wiener process whose magnitude is the volatility, estimated from the past to predict the future, and use that to calculate an expected value for the option price. That's what we'll talk about next class. If you want to read ahead, it's Black–Scholes — S-c-h-o-l-e-s — and Merton, and the Nobel Prize was won for this calculation: how to calculate the price of a call option.
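A sketch of the Monte Carlo procedure described above, under the same dX = a dt + b dW model; the strike, rate, and other numbers are hypothetical, and this illustrates the simulation idea rather than the Black–Scholes formula itself:

    % Run the drift-plus-Wiener model to the horizon many times, average the
    % call payoff above the strike, and discount that average back to today.
    X0 = 100;  a = 2;  b = 5;                    % same hypothetical model parameters as before
    K = 105;  r = 0.02;  T = 1;                  % strike, interest rate, time to expiry
    nRuns = 100000;                              % number of simulated experiments
    XT = X0 + a*T + b*sqrt(T)*randn(nRuns, 1);   % terminal values, since W(T) ~ N(0, T)
    payoff = max(XT - K, 0);                     % a call pays only the amount above the strike
    price = exp(-r*T) * mean(payoff)             % discounted average payoff = estimated option value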
Info
Channel: profbillbyrne
Views: 126,860
Rating: 4.8894792 out of 5
Keywords: Bill, Byrne, Monmouth, University, financial, mathematics
Id: BVYVeaPojY4
Length: 39min 9sec (2349 seconds)
Published: Sun Nov 13 2011