GEOMETRIC DISTRIBUTION
DEFINITION
The Geometric distribution is a discrete distribution in which the random variable measures the number of trials required for the first success to occur. Each trial is a Bernoulli trial with probability of success \(\theta\) (or \(p\)). It is a special case of the Negative Binomial distribution in which the number of successes required is only 1.
There is a second type of this distribution, in which the random variable measures the number of failures before the first success occurs.
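As a quick numerical illustration of the definition, the following minimal Python sketch (assuming SciPy is installed; the value \(p=0.3\) is chosen purely for illustration) evaluates the geometric probabilities and confirms that they coincide with those of a Negative Binomial distribution requiring exactly one success:

# Geometric probabilities versus the negative binomial with one required success.
from scipy.stats import geom, nbinom

p = 0.3                                     # illustrative probability of success
for x in range(1, 6):                       # trial on which the first success occurs
    geometric = geom.pmf(x, p)              # P(X = x) = q^(x-1) * p
    neg_binomial = nbinom.pmf(x - 1, 1, p)  # x - 1 failures before the single required success
    print(x, geometric, neg_binomial)       # the two values agree for every x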
CALCULATOR

[Interactive calculator: enter the number of the trial on which the first success is required and the probability of success for each trial; the calculator returns \(P(X=x)\), \(P(X\le x)\), \(P(X>x)\), the mean and the variance of the specified geometric distribution, together with a table of the probability distribution.]
FORMULA AND DERIVATION
Type 1 Geometric distribution

Let us first consider the first kind of geometric distribution that measures the number of trials required for the first success to occur.

Probability Distribution:-

If \(x\) is the number of trials required for the first success, then there are \(x-1\) failures followed by one success. If each trial is a Bernoulli trial with probability of success \(p\) and probability of failure \(q=1-p\), then the probability that the first success occurs on trial number \(x\) is \(q^{x-1}\times p\). So, the probability function can be written as follows:-
\[P\left(X=x\right)=q^{x-1}p\ for\ x=1,2,3\dots \dots \]
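The probability function can be checked directly from this formula. Below is a minimal Python sketch (the value \(p=0.25\) and the helper name geometric_pmf are illustrative assumptions, not part of the text above); the individual probabilities are positive and their total approaches 1 as more terms are included:

# Evaluate P(X = x) = q^(x-1) * p directly from the formula.
def geometric_pmf(x, p):
    # Probability that the first success occurs on trial number x (x = 1, 2, 3, ...)
    return (1 - p) ** (x - 1) * p

p = 0.25
print(geometric_pmf(1, p), geometric_pmf(2, p), geometric_pmf(3, p))
print(sum(geometric_pmf(x, p) for x in range(1, 200)))  # very close to 1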
Mean:-

The mean or expected value for the geometric distribution can be calculated from first principles, as follows:-
\[E\left(X\right)=\mu =\sum{xP\left(X=x\right)}\] \[\mu =\sum^{\infty }_{x=1}{xpq^{x-1}}\] \[\mu =p\sum^{\infty }_{x=1}{xq^{x-1}}\]
(In the next step, \(x\) is written as \(\sum^x_{k=1}{1}\) )
\[\mu =p\sum^{\infty }_{x=1}{\left[\left(\sum^x_{k=1}{1}\right)q^{x-1}\right]}\] \[\mu =p\sum^{\infty }_{x=1}{\sum^x_{k=1}{q^{x-1}}}\]
(In the next step, the order of summation is changed)
\[\mu =p\sum^{\infty }_{k=1}{\sum^{\infty }_{x=k}{q^{x-1}}}\] \[\mu =p\sum^{\infty }_{k=1}{\sum^{\infty }_{x=k}{q^{k-1+x-k}}}\] \[\mu =p\sum^{\infty }_{k=1}{\sum^{\infty }_{x=k}{q^{k-1}q^{x-k}}}\] \[\mu =p\sum^{\infty }_{k=1}{q^{k-1}}\sum^{\infty }_{x=k}{q^{x-k}}\]
(In the next step, \(x-k=j\))
\[\mu =p\sum^{\infty }_{k=1}{q^{k-1}}\sum^{\infty }_{j=0}{q^j}\]
(In the next step, \(\sum^{\infty }_{j=0}{q^j}=\frac{1}{1-q}\), using the sum of an infinite geometric progression)
\[\mu =p\sum^{\infty }_{k=1}{q^{k-1}}\frac{1}{1-q}\] \[\mu =p\sum^{\infty }_{k=1}{q^{k-1}}\frac{1}{p}\] \[\mu =\sum^{\infty }_{k=1}{q^{k-1}}\]
(In the next step, \(k-1=j\) )
\[\mu =\sum^{\infty }_{j=0}{q^j}\]
Using the sum of an infinite geometric progression, we get
\[\mu =\frac{1}{1-q}\] \[\mu =\frac{1}{p}\]
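This result can be verified numerically by truncating the series \(\sum{xq^{x-1}p}\) after a large number of terms; a minimal Python sketch (with an illustrative value of \(p\)) follows:

# Numerical check that the truncated series for E(X) agrees with 1/p.
p = 0.2
q = 1 - p
mean_by_sum = sum(x * q ** (x - 1) * p for x in range(1, 1000))  # truncated series
print(mean_by_sum, 1 / p)  # both print 5.0 (up to truncation error)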
Variance:-

Variance of the Geometric distribution can be derived from first principles using the formula:
\(Var\left(X\right)=E\left[{\left(X-\mu \right)}^2\right]=\sum{{\left(x-\mu \right)}^2P\left(X=x\right)}\)
or, using a simpler formula:
\(Var\left(X\right)=E\left(X^2\right)-E^2\left(X\right)\)

In order to be able to calculate \(E\left(X^2\right)\), we need to bring it to a different form.
\[E\left(X^2\right)=E\left(X^2\right)+E\left(X\right)-E\left(X\right)\] \[\ \ \ \ \ \ \ \ \ \ \ \ \ =E\left(X^2-X\right)+E\left(X\right)\] \[\ \ \ \ \ \ \ \ \ \ \ \ \ =E\left[X\left(X-1\right)\right]+E\left(X\right)\]
We must first calculate \(E[X(X-1)]\) and then substitute its value into the above equation to find \(E\left(X^2\right)\). Before calculating \(E[X(X-1)]\), it is necessary to consider the following result:-
\[\sum^n_{i=1}{i}=\frac{n\left(n+1\right)}{2}\] \[\Rightarrow \sum^{x-1}_{k=1}{k=}\frac{\left(x-1\right)\left(x-1+1\right)}{2}\] \[\sum^{x-1}_{k=1}{k}=\frac{\left(x-1\right)x}{2}\] \[\Rightarrow \left(x-1\right)x=2\sum^{x-1}_{k=1}{k}\]
Now, \(E[X(X-1)]\) can be calculated as follows:-

\[E\left[X\left(X-1\right)\right]=\sum^{\infty }_{x=1}{x\left(x-1\right)P\left(X=x\right)}\] \[E\left[X\left(X-1\right)\right]=\sum^{\infty }_{x=1}{x\left(x-1\right)q^{x-1}p}\] \[E\left[X\left(X-1\right)\right]=p\sum^{\infty }_{x=1}{\left[\left(2\sum^{x-1}_{k=1}{k}\right)q^{x-1}\right]}\] \[E\left[X\left(X-1\right)\right]=2p\sum^{\infty }_{x=1}{\sum^{x-1}_{k=1}{kq^{x-1}}}\]
(In the next step, the order of summation is changed)
\[E\left[X\left(X-1\right)\right]=2p\sum^{\infty }_{k=1}{\sum^{\infty }_{x=k+1}{kq^{x-1}}}\] \[E\left[X\left(X-1\right)\right]=2p\sum^{\infty }_{k=1}{\sum^{\infty }_{x=k+1}{kq^{x-1+k-k}}}\] \[E\left[X\left(X-1\right)\right]=2p\sum^{\infty }_{k=1}{\sum^{\infty }_{x=k+1}{kq^kq^{x-1-k}}}\] \[E\left[X\left(X-1\right)\right]=2p\sum^{\infty }_{k=1}{kq^k}\sum^{\infty }_{x=k+1}{q^{x-1-k}}\]
(In the next step, \(x-1-k=j\))
\[E\left[X\left(X-1\right)\right]=2p\sum^{\infty }_{k=1}{kq^k}\sum^{\infty }_{j=0}{q^j}\]
(The second summation above is equal to \(\frac{1}{1-q}\), using the sum of an infinite geometric progression)
\[E\left[X\left(X-1\right)\right]=2p\sum^{\infty }_{k=1}{kq^k}\frac{1}{1-q}\] \[E\left[X\left(X-1\right)\right]=2p\sum^{\infty }_{k=1}{kq^k}\frac{1}{p}\] \[E\left[X\left(X-1\right)\right]=2\sum^{\infty }_{k=1}{kq^k}\]
\[E\left[X\left(X-1\right)\right]=2q\sum^{\infty }_{k=1}{kq^{k-1}}\] \[E\left[X\left(X-1\right)\right]=\frac{2q}{p}\sum^{\infty }_{k=1}{kpq^{k-1}}\]
(The summation in the above equation is the expression for the mean of a geometric distribution \(E\left(K\right)=\sum{{kpq}^{k-1}}\) )
\[E\left[X\left(X-1\right)\right]=\frac{2q}{p}\times \frac{1}{p}\] \[E\left[X\left(X-1\right)\right]=\frac{2q}{p^2}\]
This is the value of \(E[X(X-1)]\). This value can be substituted in the equation for \(E(X^2)\), as follows:-

\[E\left(X^2\right)=E\left[X\left(X-1\right)\right]+E\left(X\right)\] \[\ \ \ \ \ \ \ \ \ \ \ \ \ =\frac{2q}{p^2}+\frac{1}{p}\] \[\ \ \ \ \ \ \ \ \ \ \ \ \ =\frac{2q+p}{p^2}\]
Finally, we can find the variance by substituting this value obtained, as follows:-
\[Var\left(X\right)={\sigma }^2=E\left(X^2\right)-E^2\left(X\right)\] \[{\sigma }^2=\frac{2q+p}{p^2}-\frac{1}{p^2}\] \[\ \ \ \ =\frac{2q-\left(1-p\right)}{p^2}\] \[\ \ \ \ =\frac{2q-q}{p^2}\] \[\ \ \ \ =\frac{q}{p^2}\]
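The variance formula can be checked in the same way, by computing \(E\left(X^2\right)-E^2\left(X\right)\) from a truncated series; a short Python sketch with an illustrative \(p\):

# Numerical check that E(X^2) - [E(X)]^2 from a truncated series equals q / p^2.
p = 0.2
q = 1 - p
e_x = sum(x * q ** (x - 1) * p for x in range(1, 1000))
e_x2 = sum(x * x * q ** (x - 1) * p for x in range(1, 1000))
print(e_x2 - e_x ** 2, q / p ** 2)  # both print 20.0 (up to truncation error)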
Probability Generating Function (PGF):-

The PGF of a geometric distribution is
\[P_X\left(t\right)=\frac{pt}{1-qt} \ for \ \frac{-1}{q}<t<\frac{1}{q}\]
Moment Generating Function (MGF):-

The MGF of a geometric distribution is
\[M_X\left(t\right)=\frac{{pe}^t}{1-{qe}^t} \ for \ {qe}^t<1\]
Cumulant Generating Function (CGF):-

The CGF of a geometric distribution is
\[C_X\left(t\right)={\mathrm{ln} p\ }+t-{\mathrm{ln} \left(1-{qe}^t\right)\ }\]
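These generating functions can be sanity-checked symbolically: the first and second derivatives of the CGF at \(t=0\) should return the mean \(\frac{1}{p}\) and the variance \(\frac{q}{p^2}\) derived above. A minimal sketch using the SymPy library (assuming it is installed):

# Cumulants from the CGF: C'(0) is the mean, C''(0) is the variance.
import sympy as sp

t, p = sp.symbols('t p', positive=True)
q = 1 - p
cgf = sp.ln(p) + t - sp.ln(1 - q * sp.exp(t))
mean = sp.simplify(sp.diff(cgf, t).subs(t, 0))         # simplifies to 1/p
variance = sp.simplify(sp.diff(cgf, t, 2).subs(t, 0))  # simplifies to (1 - p)/p**2, i.e. q/p**2
print(mean, variance)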
Type 2 Geometric distribution

Let us now consider the second type of Geometric distribution, which measures the number of failures before the first success.

Probability distribution:-

Let \(Y\) be the random variable taking the values \(y=0,1,2,3\dots \dots \) which counts the number of failures before the first success. In the first type of the geometric distribution, where \(x\) trials are required for the first success, the number of failures is \(x-1\). Therefore, the relationship between the two types of geometric random variables can be written as \(Y=X-1\). So, the probability function of the second type can be obtained by substituting \(y\) in place of \(x-1\) in the probability function of the first type.

\[P\left(Y=y\right)=q^yp \ for \ y=0,1,2,3\dots \dots \]
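In SciPy, this second type corresponds exactly to the negative binomial distribution with one required success, and it can also be obtained by shifting the first type. A minimal sketch (with an illustrative \(p\)):

# P(Y = y) = q^y * p, cross-checked against nbinom and against the first type.
from scipy.stats import geom, nbinom

p = 0.3
for y in range(0, 5):
    by_formula = (1 - p) ** y * p      # q^y * p
    by_nbinom = nbinom.pmf(y, 1, p)    # y failures before the single required success
    by_shift = geom.pmf(y + 1, p)      # P(X = y + 1) under the first type
    print(y, by_formula, by_nbinom, by_shift)  # all three values agree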
Mean:-

The mean or expected value of the second type of Geometric distribution can easily be found, as follows:-
\[E\left(Y\right)=E\left(X-1\right)\] \[\ \ \ \ \ \ \ \ \ \ =E\left(X\right)-E\left(1\right)\] \[\ \ \ \ \ \ \ \ \ \ =E\left(X\right)-1\] \[\ \ \ \ \ \ \ \ \ \ =\frac{1}{p}-1\] \[\ \ \ \ \ \ \ \ \ \ =\frac{1-p}{p}\] \[\ \ \ \ \ \ \ \ \ \ =\frac{q}{p}\]
Variance:-

The variance too can be obtained easily, as follows:-

\[Var\left(Y\right)=Var\left(X-1\right)\] \[\ \ \ \ \ \ \ \ \ \ \ \ \ \ =Var\left(X\right).1^2\] \[\ \ \ \ \ \ \ \ \ \ \ \ \ \ =Var\left(X\right)\] \[\ \ \ \ \ \ \ \ \ \ \ \ \ \ =\frac{q}{p^2}\]
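As a further sketch, SciPy's negative binomial distribution with one required success reproduces the mean \(\frac{q}{p}\) and the variance \(\frac{q}{p^2}\) obtained above (again with an illustrative \(p\)):

# Mean and variance of the second type via scipy's nbinom with one required success.
from scipy.stats import nbinom

p = 0.3
q = 1 - p
print(nbinom.mean(1, p), q / p)      # both print 2.333...
print(nbinom.var(1, p), q / p ** 2)  # both print 7.777...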
EXAMPLES
Example 1

A statistics teacher is trying to explain a complicated problem on the geometric distribution to her student. From past experience, she knows that the probability that the student understands a complicated problem in one attempt is 0.2. She wishes to determine the probability that the student will understand the problem on the 5th attempt.

In this case each attempt can be considered as a Bernoulli trial, with probability of success equal to 0.2 and probability of failure equal to \(1-0.2=0.8\) .

\(p=0.2\)
\(q=0.8\)
\(x=5\)

\[P\left(X=x\right)=q^{x-1}p\] \[P\left(X=5\right)={0.8}^{5-1}\times 0.2\] \[\ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ =0.08192\]
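The same value can be obtained with SciPy's geometric distribution; a one-line check (assuming SciPy is installed):

from scipy.stats import geom
print(geom.pmf(5, 0.2))  # 0.08192, matching the calculation above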
Example 2

A boy sets a new password on his mobile phone and challenges his friend to unlock the phone by typing the right password. The friend says that, given three attempts, he would get the right password in one of them. He wishes to know the probability that he would win this challenge, if the probability of getting the right password on a single try is 0.4.

In this case, each try can be considered as a Bernoulli trial with probability of success equal to 0.4 and probability of failure equal to \(1-0.4=0.6\). The friend wins the challenge if he gets the right password on the first, second or third attempt.
So, the total probability required here is the probability of getting the right password on the first or second or third attempt, which is
\(P\left(X=1\right)+P\left(X=2\right)+P\left(X=3\right)\)

In this example,
\(p=0.4\)
\(q=1-0.4=0.6\)

\[P\left(X=x\right)=q^{x-1}p\] \[P\left(X=1\right)={0.6}^0\times 0.4=0.4\] \[P\left(X=2\right)={0.6}^1\times 0.4=0.24\] \[P\left(X=3\right)={0.6}^2\times 0.4=0.144\]
Therefore, the probability that the friend would win the challenge will be \(0.4+0.24+0.144=0.784\)

This means that the friend has a \(78.4\%\) chance of winning the challenge.
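Equivalently, the required probability is the cumulative probability \(P\left(X\le 3\right)\), which SciPy computes directly; a quick sketch:

from scipy.stats import geom
print(geom.cdf(3, 0.4))  # 0.784, i.e. a 78.4% chance of winning the challenge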

Mean:-
\[\mu =\frac{1}{p}\] \[\mu =\frac{1}{0.4}=2.5\]
This means that if the challenge were performed many times, then on average the friend would take 2.5 attempts to get the right password.

Variance:-
\[{\sigma }^2=\frac{q}{p^2}\] \[{\sigma }^2=\frac{0.6}{{0.4}^2}=3.75\]
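As a final check, SciPy returns the same mean and variance for this example:

from scipy.stats import geom
print(geom.mean(0.4))  # 2.5 attempts on average
print(geom.var(0.4))   # 3.75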