What is a Moment Generating Function (MGF)?

Captions
So what is a moment generating function? When we're talking about random variables, let's think about moments. We know the equation for the first moment, which is the expected value of the random variable: if we have a random variable X, then E[X] = integral from negative infinity to infinity of x f_X(x) dx, where f_X is the pdf of X. This is the equation for the mean, the first moment. Of course we also have the second moment, E[X^2], and so on for all the higher-order moments E[X^n], each of which can be written as an integral in terms of the probability density function. So you can calculate all of these moments using the pdf.

So what is the moment generating function? It's another function which lets you generate all of these moments more directly than calculating each of these integrals. Let's look at its definition. The moment generating function, written with a capital M and a subscript x for the random variable X, is a function of a variable t, defined as M_X(t) = E[e^(tX)]. Here we have a relationship similar to the moments: it's the expectation of a function of X (for the moments the function was x, x^2, and so on; here it's e^(tx)), so we can calculate it the same way, using the pdf directly with the formula for expectation: M_X(t) = integral from negative infinity to infinity of e^(tx) f_X(x) dx.

Why is this called the moment generating function? Because e^(tX) has a series expansion: e^(tX) = 1 + tX + t^2 X^2 / 2! + t^3 X^3 / 3! + ... Taking the expectation inside gives M_X(t) = 1 + t E[X] + (t^2 / 2!) E[X^2] + (t^3 / 3!) E[X^3] + ... This is an infinite series, and each term includes one of the moments: this term includes the first moment (the mean), this one the second moment, and so on.

So not only is the moment generating function made up of the moments, we can also recover each moment from it by taking derivatives and setting t = 0. If we want the first moment, we take the first derivative and set t = 0. In the first derivative, the constant 1 disappears, the t E[X] term simply becomes E[X], the next term becomes (2t / 2!) E[X^2], which vanishes when t = 0, and likewise the next term becomes (3t^2 / 3!) E[X^3], which also vanishes at t = 0, and so on. So after one derivative, setting t = 0 leaves you with just E[X]. If you take a second derivative, the E[X] term disappears, the E[X^2] term loses its remaining factor of t, and every higher term still carries a power of t and drops out at t = 0, leaving E[X^2].

Let's see an example of that: the Gaussian. For a Gaussian with mean mu and variance sigma^2, the moment generating function is M_X(t) = e^(t mu + (1/2) sigma^2 t^2). Let's take the first derivative and see if we can get the mean. To make it easier to see, split the exponential into two factors, e^(t mu) times e^((1/2) sigma^2 t^2), and use the product rule: take the derivative of the first factor holding the second, then the derivative of the second factor. That gives dM_X(t)/dt = mu e^(t mu) e^((1/2) sigma^2 t^2) + e^(t mu) sigma^2 t e^((1/2) sigma^2 t^2). When we set t = 0, the second term disappears because of the factor of t in it, and in the first term e^0 = 1 in both exponentials, so you've got 1 times mu. So the first derivative at t = 0 gives you mu. You can do the same thing for the second derivative, and you'll see (you can work it through for yourself) that d^2 M_X(t)/dt^2 at t = 0 equals mu^2 + sigma^2, which is the second moment, the expected value of X^2.

So from the moment generating function, that's the procedure to generate all the moments, and for many pdfs finding the moments this way is a much more direct and simpler process than calculating those infinite integrals. So if you can calculate the moment generating function, it makes it easier to find the moments. Another thing is that there are useful results and bounds that relate to the moment generating function, for example the Chernoff bound (there's another video coming on the channel related to the Chernoff bound). One result we can state directly here comes from Jensen's inequality: the moment generating function is bounded below by the exponential of t times the mean, that is M_X(t) >= e^(t E[X]) = e^(mu t). So we have some results about the moment generating function which can be used in stochastic calculations.

So that was the moment generating function: a compact way to calculate all of the moments for the distribution of a random variable. If you found this helpful, give the video a thumbs up, it helps others to find the video. Subscribe to the channel for more videos, and check out the links in the description below the video, where there's a webpage with a full categorised list of all the videos on the channel.
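As a quick numerical sketch (not from the video), we can check that the integral definition M_X(t) = ∫ e^(tx) f_X(x) dx really does reproduce the closed-form Gaussian MGF e^(t mu + sigma^2 t^2 / 2). The parameter values here are arbitrary choices for the check:

```python
import math

# Arbitrary Gaussian parameters and a test value of t (assumptions for this check)
mu, sigma, t = 1.0, 2.0, 0.3

def gaussian_pdf(x):
    """pdf of a Gaussian with mean mu and variance sigma^2."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Numerically evaluate M_X(t) = integral of e^(t x) f_X(x) dx.
# The integrand e^(t x) f_X(x) peaks at x = mu + t sigma^2, so centre a
# wide trapezoid grid there; the tails are negligible beyond ~10 sigma.
centre = mu + t * sigma ** 2
lo, hi, n = centre - 10 * sigma, centre + 10 * sigma, 20000
h = (hi - lo) / n
xs = [lo + i * h for i in range(n + 1)]
ys = [math.exp(t * x) * gaussian_pdf(x) for x in xs]
mgf_numeric = h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))  # trapezoid rule

# Closed-form Gaussian MGF from the video: M_X(t) = e^(t mu + sigma^2 t^2 / 2)
mgf_closed = math.exp(t * mu + 0.5 * sigma ** 2 * t ** 2)
print(mgf_numeric, mgf_closed)
```

The two numbers agree to many digits, which is the "expectation of e^(tX)" definition and the closed form saying the same thing.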
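The series-expansion argument can also be checked numerically. For a standard Gaussian (mu = 0, sigma = 1) the moments are known in closed form (odd moments vanish, E[X^(2k)] = (2k)!/(2^k k!)), so summing the series 1 + t E[X] + (t^2/2!) E[X^2] + ... should rebuild M_X(t) = e^(t^2/2). A sketch:

```python
import math

t = 0.5  # arbitrary test value

def even_moment(k):
    # E[X^(2k)] for a standard Gaussian: (2k - 1)!! = (2k)! / (2^k k!)
    return math.factorial(2 * k) / (2 ** k * math.factorial(k))

# Partial sum of M_X(t) = sum over n of t^n E[X^n] / n!
# (only the even-n terms survive because odd moments are zero)
series = sum(t ** (2 * k) * even_moment(k) / math.factorial(2 * k)
             for k in range(20))

# Closed form for the standard Gaussian MGF
closed = math.exp(t ** 2 / 2)
print(series, closed)
```

Twenty terms are far more than enough here; the series converges very quickly for moderate t.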
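Finally, the derivative-at-zero procedure from the video can be mimicked with finite differences: approximate M'(0) and M''(0) of the Gaussian MGF numerically and compare against mu and mu^2 + sigma^2. The step size and parameters are arbitrary choices for this sketch:

```python
import math

# Arbitrary Gaussian parameters (assumptions for this sketch)
mu, sigma = 2.0, 3.0

def mgf(t):
    # Gaussian MGF from the video: M_X(t) = e^(t mu + sigma^2 t^2 / 2)
    return math.exp(t * mu + 0.5 * sigma ** 2 * t ** 2)

h = 1e-4  # step for central finite differences

# First derivative at t = 0 should give the first moment E[X] = mu
first_moment = (mgf(h) - mgf(-h)) / (2 * h)

# Second derivative at t = 0 should give the second moment E[X^2] = mu^2 + sigma^2
second_moment = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h ** 2

print(first_moment)                        # ≈ mu = 2
print(second_moment)                       # ≈ mu^2 + sigma^2 = 13
print(second_moment - first_moment ** 2)   # variance ≈ sigma^2 = 9
```

The last line is the familiar identity Var[X] = E[X^2] - (E[X])^2, computed entirely from the MGF.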
Info
Channel: Iain Explains Signals, Systems, and Digital Comms
Views: 6,682
Keywords: Moment Generating Function, m.g.f., MGF, Gaussian, Probability Density Function, p.d.f., PDF, Expectation, Moments, Mean, Variance
Id: wjwLTNYOuI4
Length: 8min 51sec (531 seconds)
Published: Sun Jun 06 2021