Markov Chains & Transition Matrices

Captions
This video is my second video on Markov chains. If you haven't seen the first video on Markov chains, check out the link down in the description. In this video we're going to use some formalisms of linear algebra, the ideas of matrices and vectors and how to multiply them, to make the kinds of manipulations we need when studying a Markov chain much, much simpler. Indeed, the introduction video ended with a computational difficulty that we're going to solve in this particular video.

So here's the idea. Imagine I have a transition diagram; that's what this is. A transition diagram lists the possible states of some system, so in this example the system might be in state A or state B. Then there's a bunch of arrows between those states, and the number on each arrow describes a probability. For example, the 0.75 on the arrow that goes from state A back to state A represents that there's a 75% probability that state A in one iteration will still be state A in the next.

The first thing I'll do is introduce some notation. I'll write s₀ for what is referred to as the initial state of the system, and it's a vector: it's got a 1 and a 0. What does that mean? The top component of the vector refers to state A and the bottom component to state B. If you had more states in your system, say A, B, and C, then you'd have three components in your vector. The fact that it says 1 and 0 means there's a 100% probability that the system starts in state A and a 0% probability that it starts in state B. I just asserted this, but it could have been something different; perhaps your initial state would have two different probabilities that add up to one. So this s₀ vector is just a little bit of formalism to encode the initial state of the system. Indeed, I've highlighted state A in pink in my transition diagram to illustrate that I'm starting there.
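The state-vector idea above can be sketched in a few lines of code. This is a minimal NumPy illustration; the encoding (index 0 for state A, index 1 for state B) is my own convention for the example, not notation from the video:

```python
import numpy as np

# Encode the initial state s0: index 0 = state A, index 1 = state B.
# (This indexing convention is an assumption for illustration.)
s0 = np.array([1.0, 0.0])  # 100% probability of starting in state A

# A valid state vector holds probabilities, so its entries must sum to 1.
assert np.isclose(s0.sum(), 1.0)
print(s0)
```

A different starting distribution, say `np.array([0.5, 0.5])`, would work just as well, as long as the entries sum to 1.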
Next I want to go to s₁, the state after one iteration. That is, if I start at s₀, my initial state, and go one iteration into the future, I am then at s₁. Likewise, in s₁ the top number, 0.75, still refers to state A and the bottom number, 0.25, to state B, so the 0.75 in s₁ says there's a 75% chance of the system being back in state A after one iteration.

The question then becomes: how can I manipulate s₀ to get s₁? In this case we were able to figure out s₁ just by looking at the diagram, but how do I go from s₁ to s₂, the state after two iterations, or from s₂ to s₁₀₀? Is there a process that allows me to do those types of manipulations efficiently? Indeed, the answer is yes, and we're going to use the notion of a matrix.

Here's how this works. In the transition diagram there are four numbers, so let me forget the rest of the diagram for a moment and just pull those four numbers together. This is called a matrix, an array of numbers. The way I think about it is that I imagine the columns are labelled A and B, and likewise the rows. So the 0.75 represents: if I start in state A, I will end up in state A. The 0.4 is in the second column, so it says: if I start in state B, I will end up in state A. This type of matrix is called a transition matrix, and we often just write it as P.

Now here is the magic of matrix algebra: the formula to go from s₀ to s₁ is just multiplication by the matrix P. In other words, s₁ = P s₀. Writing that out with the actual information in this case, s₁ was the vector (0.75, 0.25), P was that matrix, and s₀ was (1, 0).
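The one-step formula s₁ = P s₀ can be checked with a short NumPy sketch (the library choice and variable names are mine, not the video's):

```python
import numpy as np

# Transition matrix P from the diagram: column = current state, row = next state.
#              from A  from B
P = np.array([[0.75,  0.40],   # to A
              [0.25,  0.60]])  # to B

s0 = np.array([1.0, 0.0])      # start in state A with certainty
s1 = P @ s0                    # one iteration: s1 = P s0
print(s1)                      # -> [0.75 0.25], matching the diagram

# Sanity check: each column of a transition matrix sums to 1.
assert np.allclose(P.sum(axis=0), 1.0)
```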
This is matrix-vector multiplication. If you don't know what matrix-vector multiplication is, that's totally fine: I have a video for you that introduces it. It has nothing to do with Markov chains specifically, because this little piece of algebra is useful all over the place, so I encourage you to check that video out and make sure you're comfortable doing matrix-vector multiplication; I'm just going to assume it for the rest of this video. For our purposes, all that's worth noting is that when you do this multiplication on the right, you get exactly the vector that we predicted on the left.

Thus far we haven't really gained a big improvement, because you were capable of coming up with s₁ just by looking at the diagram; we didn't need all this formalism. But what about going to s₂? The second state is again just the transition matrix multiplied into s₁. In other words, this is a new multiplication: the same transition matrix, but now multiplied by (0.75, 0.25). If you do that matrix-vector multiplication, you get approximately (0.66, 0.34) for s₂. I want to note that in the first video on Markov processes we actually did this computation via a tree diagram, and I showed how to justify it. So this matrix method gives us the same answer that we've gotten in the past, but now it generalizes.

What have I just done? s₂ = P s₁, and s₁ was P s₀, so in other words s₂ = P² s₀. In general, if I want to figure out the nth state of my Markov process, I just start at s₀, the initial state, and multiply by n transition matrices: sₙ = Pⁿ s₀. This is my final answer for how to do manipulations in Markov chains: the nth state is just the transition matrix multiplied n times into the initial state.
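The general formula sₙ = Pⁿ s₀ can be sketched as follows; this is a NumPy illustration under the same assumed encoding, using `np.linalg.matrix_power` for the repeated multiplication:

```python
import numpy as np

P = np.array([[0.75, 0.40],
              [0.25, 0.60]])
s0 = np.array([1.0, 0.0])

# Two iterations: s2 = P @ (P @ s0) = P^2 @ s0.
s2 = P @ (P @ s0)
print(s2)  # -> [0.6625 0.3375], the 0.66 and 0.34 from the video, unrounded

# The nth state in one shot: s_n = P^n @ s0.
s10 = np.linalg.matrix_power(P, 10) @ s0
assert np.isclose(s10.sum(), 1.0)  # still a probability vector
```

Computing Pⁿ once and applying it to s₀ is exactly the "multiply by n transition matrices" step, done efficiently.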
So if you're given a problem about a Markov process, the main steps are: one, write down the transition diagram so you get all the numbers; two, put that transition diagram into matrix form; and three, figure out any future state just by multiplying by that matrix a sufficient number of times.

All right, I hope you enjoyed this video on how to use transition matrices to solve problems in Markov chains. If you have any questions about the video, please leave them down in the comments. If you liked the video, give it a like for that YouTube algorithm, and we'll do some more math in the next video.
Info
Channel: Dr. Trefor Bazett
Views: 103,327
Keywords: Math, Solution, Example, Markov Chain, Markov Process, Transition Diagram, Transition Matrix, Initial State, Vector, Probability, Statistics, Future Probabilities, Present, linear algebra
Id: 1GKtfgwf3ig
Length: 6min 54sec (414 seconds)
Published: Mon Sep 14 2020