Lesson 16 Bernoulli and Binomial Distribution Part 1

Video Statistics and Information

Captions
Hi, this is lesson number 16 on probability, and in this lesson we study the Bernoulli and binomial distributions. Let's start with the Bernoulli distribution. A Bernoulli random variable is a random variable which has only two possible outcomes. Consider a random variable X which can take only two possible values, 1 or 0: it takes the value 1 with probability p and the value 0 with probability 1 - p, where p is between 0 and 1. I can give you examples of such a random variable. One is a single coin toss: when you toss a coin you get heads or tails, heads with probability p and tails with probability 1 - p. Another example: when you take an exam, say a probability exam, you may pass with probability p or fail with probability 1 - p. So you have a single trial which has only two possible outcomes, and such a random variable is called a Bernoulli random variable, or a Bernoulli trial.

Now let's find the expected value, the variance, and the moment generating function of this Bernoulli random variable X. As you can see, X is a discrete random variable: it takes the values 0 and 1, and the probability that X takes the value 0 is 1 - p, while the probability that X takes the value 1 is p. Since this is a discrete random variable with two possible values, its expectation is simply the sum, for x from 0 to 1, of x times p_X(x). That equals 0 times P(X = 0) plus 1 times P(X = 1), which is 0 + p = p. Therefore the expected value of a Bernoulli random variable is equal to p, the probability that X takes the value 1, or the probability of success. Say passing the exam is a success and failing is a failure: the probability of passing is p, the probability of failing is 1 - p, and the probability of success equals the expected value of X.

Now let's find the variance of X. The variance is the expected value of X^2 (the second moment) minus the square of the expected value of X. We already know E[X] = p, so I can write the variance as E[X^2] - p^2. To find E[X^2], take the sum, for x from 0 to 1, of x^2 times p_X(x): that is 0^2 times P(X = 0) plus 1^2 times P(X = 1), which is 0 + p = p. So the second moment of a Bernoulli random variable is again equal to the probability of success, p. Therefore the variance of X is E[X^2] - (E[X])^2 = p - p^2, and factoring out p gives p(1 - p). So the variance of a Bernoulli random variable is the probability of success times the probability of failure; I think that's easy to remember. A quick simulation check of both results follows.
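To make these two results concrete, here is a minimal simulation sketch in Python (my own illustration, not part of the lecture; the values p = 0.3 and n_sim = 100_000 are arbitrary choices) that estimates the mean and variance of a Bernoulli random variable and compares them against p and p(1 - p):

```python
import random

def simulate_bernoulli(p, n_sim=100_000, seed=1):
    """Return the sample mean and sample variance of n_sim Bernoulli(p) draws."""
    rng = random.Random(seed)
    draws = [1 if rng.random() < p else 0 for _ in range(n_sim)]
    mean = sum(draws) / n_sim
    var = sum((x - mean) ** 2 for x in draws) / n_sim
    return mean, var

p = 0.3
mean, var = simulate_bernoulli(p)
print(f"sample mean     {mean:.4f}  vs  p        = {p}")
print(f"sample variance {var:.4f}  vs  p(1 - p) = {p * (1 - p):.4f}")
```

Both printed pairs should agree to roughly two decimal places for a sample of this size.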
Now let's find the MGF, the moment generating function, of the random variable X. By definition, M_X(t) is the expected value of e^(tX). X is a discrete random variable, so I can write this as the sum, for x from 0 to 1 (because the support is {0, 1}), of e^(tx) times p_X(x). That equals e^(t*0) times P(X = 0) plus e^(t*1) times P(X = 1). Since e^0 = 1, this is p_X(0) + e^t * p_X(1). The probability that X takes the value 0 is 1 - p and the probability that X takes the value 1 is p, so the MGF of a Bernoulli random variable X is M_X(t) = 1 - p + p*e^t. Now we have the expected value, the variance, and the MGF of a Bernoulli random variable. Let me give you an exercise here: try to find the expected value and variance of this Bernoulli random variable X using the MGF; one way to carry it out is sketched below.
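As a sketch of how that exercise goes (this relies on the third-party sympy library, which the lecture does not mention): differentiate the MGF and evaluate the derivatives at t = 0 to read off the moments.

```python
import sympy as sp

t, p = sp.symbols('t p')
M = 1 - p + p * sp.exp(t)  # Bernoulli MGF derived above

# The k-th moment E[X^k] is the k-th derivative of M evaluated at t = 0.
EX = sp.diff(M, t, 1).subs(t, 0)   # first moment: p
EX2 = sp.diff(M, t, 2).subs(t, 0)  # second moment: p
variance = sp.factor(EX2 - EX**2)  # simplifies to p(1 - p)

print(EX, EX2, variance)
```

This reproduces E[X] = p and Var(X) = p(1 - p) from the MGF alone.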
That's what I have for the Bernoulli random variable; now let's go ahead and discuss what we mean by a binomial random variable. To explain what a binomial distribution is, I'm going to start with an example: toss a coin three times. Note that each of the three trials is independent of the others, meaning that if I observe heads on my first coin toss, that outcome does not affect the probability of observing heads or tails on the next toss. With that, I'm going to say that the probability of heads on each of the three trials is equal to p; of course, for a fair coin p is one half. What is the sample space of this experiment of tossing three coins? Heads on the first, heads on the second, heads on the third, and so on: HHH, HHT, HTH, THH, HTT, THT, TTH, TTT. Counting them, I have eight possible outcomes in my sample space. Define the random variable X to be the number of heads, that is, the number of successes, in the three Bernoulli trials. I say Bernoulli trials because each coin toss is a Bernoulli random variable, and those trials are independent. When I have such a random variable, I say X follows a binomial distribution with number of trials n = 3 and probability of success p.

Let's examine each outcome of the experiment. For the outcome HHH, what is the probability that it happens? It is the probability of heads on the first toss, intersected with heads on the second, intersected with heads on the third. The trials are independent, so the probability of heads on the first is p, and I can multiply by p for the second and by p for the third. So the probability of observing heads on all three trials is p*p*p = p^3, and the value of X for this outcome is 3, because X is the number of heads and I have three heads. For HHT, the probability is p^2(1 - p): heads on the first with probability p, heads on the second with probability p, giving p^2, and tails on the third with probability 1 - p; the number of heads in this event is 2. Let me list the other outcomes. HTH happens with probability p^2(1 - p), because I have two heads and one tail, and the number of heads is 2. THH happens with probability p^2(1 - p) and has two heads. HTT happens with probability p(1 - p)^2 and has one head. THT happens with probability p(1 - p)^2 and the number of heads is 1. TTH has one head and happens with probability p(1 - p)^2. And finally TTT happens with probability (1 - p)^3 and has zero heads.

With the outcomes all listed, look carefully: the probability that X takes the value 3 is p^3. For the probability that X takes the value 2, I have three outcomes in my sample space giving X a value of 2, each happening with the same probability p^2(1 - p), so P(X = 2) = 3*p^2(1 - p). Likewise, three outcomes give X a value of 1, each with probability p(1 - p)^2, so P(X = 1) = 3*p(1 - p)^2. And P(X = 0) = (1 - p)(1 - p)(1 - p) = (1 - p)^3.

I can rewrite these probabilities using combinations. For three heads, I am choosing 3 of the 3 tosses to be heads and 0 of the 3 to be tails, which can be done in C(3, 3) ways, so P(X = 3) = C(3, 3) * p^3 * (1 - p)^(3 - 3). Remember that C(n, k), n choose k, equals n! / (k!(n - k)!). For two heads, I choose 2 of the 3 tosses to be heads: the first and second, the first and third, or the second and third, which can be done in C(3, 2) ways, so P(X = 2) = C(3, 2) * p^2 * (1 - p)^(3 - 2). Similarly, choose 1 of the 3 tosses (the first, the second, or the third) to be heads: P(X = 1) = C(3, 1) * p^1 * (1 - p)^(3 - 1). And finally choose 0 of the 3 to be heads: P(X = 0) = C(3, 0) * p^0 * (1 - p)^(3 - 0). Therefore, if X follows a binomial distribution with number of trials n = 3 and probability of success p, the PMF is p_X(x) = C(3, x) * p^x * (1 - p)^(3 - x) for x = 0, 1, 2, 3. A short script verifying this enumeration follows.
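Here is a brief check of that derivation (my own illustration, not from the lecture): enumerate all eight outcomes, accumulate their probabilities by the number of heads, and compare against the closed-form PMF.

```python
from itertools import product
from math import comb

p = 0.5  # fair coin; any p between 0 and 1 works

# Enumerate the 8 outcomes of three independent tosses.
pmf = {k: 0.0 for k in range(4)}
for outcome in product("HT", repeat=3):
    heads = outcome.count("H")
    # Independence lets us multiply the per-toss probabilities.
    pmf[heads] += p**heads * (1 - p)**(3 - heads)

# Compare with C(3, x) * p^x * (1 - p)^(3 - x).
for x in range(4):
    closed_form = comb(3, x) * p**x * (1 - p)**(3 - x)
    assert abs(pmf[x] - closed_form) < 1e-12

print(pmf)  # for p = 0.5: {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
```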
In general, if X follows a binomial distribution with number of trials n and probability of success p, the PMF is given by p_X(x) = C(n, x) * p^x * (1 - p)^(n - x), and the support of X is the integers 0, 1, 2, ..., n. In the second part of this lesson we're going to see that the expected value is E[X] = n*p, the number of trials times the probability of success; that the variance of the random variable X is Var(X) = n*p*(1 - p); and that the MGF, the moment generating function of X, is M_X(t) = (1 - p + p*e^t)^n. That's what I'm going to do next in the second part of this lesson: prove that the expected value is n*p, the variance is n*p*(1 - p), and the MGF is as stated. In fact, I'm going to start with the MGF and then use the MGF technique to find the first and second moments, and of course the variance. A quick numerical check of these formulas appears below.
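As a final sketch (again my own check, not the lecture's), here is the general PMF in code, with the previewed mean and variance formulas verified by direct summation over the support; n = 10 and p = 0.3 are arbitrary example values.

```python
from math import comb, isclose

def binom_pmf(x, n, p):
    """P(X = x) for a binomial(n, p) random variable."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.3
support = range(n + 1)

total = sum(binom_pmf(x, n, p) for x in support)
mean = sum(x * binom_pmf(x, n, p) for x in support)
second_moment = sum(x**2 * binom_pmf(x, n, p) for x in support)
variance = second_moment - mean**2

assert isclose(total, 1.0)                 # PMF sums to 1 over the support
assert isclose(mean, n * p)                # E[X] = np
assert isclose(variance, n * p * (1 - p))  # Var(X) = np(1 - p)
print(mean, variance)  # 3.0 and 2.1 (up to floating point)
```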
Info
Channel: Stat Courses
Views: 45,163
Keywords: Actuarial Science, Bernoulli Distribution, Binomial Distribution, Expected Value of Bernoulli, MGF of Bernoulli, Variance of Bernoulli, Actuarialpath, SOA Exam P, CAS Exam 1, Actuarial Exams
Id: 8RbXCXVCRcA
Length: 17min 49sec (1069 seconds)
Published: Sun Mar 03 2013