BERNOULLI DISTRIBUTION
DEFINITION
The Bernoulli distribution is a discrete probability distribution in which the random variable can take only two possible values, 0 or 1: the value 1 is assigned in the case of a success (occurrence of the desired event) and 0 in the case of a failure (non-occurrence).
Thus, the probability of success is the probability that the random variable takes the value 1, and the probability of failure is the probability that the random variable takes the value 0.
This can be mathematically represented as \(P\left(X=1\right)=\theta \) and \(P\left(X=0\right)=1-\theta \) where θ is the probability of success and \(1-\theta \) is the probability of failure. θ and \(1-\theta \) are sometimes written as 'p' and 'q' respectively, where \(q=1-p\) .
The Bernoulli distribution is a special case of the Binomial distribution when the number of trials is 1.
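As a quick illustration, the sketch below evaluates the two probabilities and simulates a few Bernoulli trials using SciPy. This is a minimal sketch assuming SciPy is available; the success probability p = 0.3 is an arbitrary illustrative choice, not a value from the text above.

# Minimal sketch of a Bernoulli random variable using SciPy.
# The success probability p = 0.3 is an illustrative choice.
from scipy.stats import bernoulli

p = 0.3
rv = bernoulli(p)

print(rv.pmf(1))        # P(X = 1) = p      -> 0.3
print(rv.pmf(0))        # P(X = 0) = 1 - p  -> 0.7
print(rv.rvs(size=10))  # ten simulated trials, each either 0 or 1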
FORMULA AND DERIVATION
Probability distribution:-

As stated above, a Bernoulli random variable takes the value 1 in case of a success, with probability p, and takes the value 0 in case of a failure, with probability \(q=1-p\). This can be written as follows:-

\(P\left(X=1\right)=p\) and \(P\left(X=0\right)=q\)

Therefore, the probability function can be written as follows:-
\[P\left(X=x\right)=p^xq^{1-x}\quad \mathrm{for}\ x=0,1\]
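As a check, the probability function above can be evaluated directly in a few lines of Python. Here bernoulli_pmf is a hypothetical helper written only for illustration, and p = 0.25 is an arbitrary choice.

def bernoulli_pmf(x, p):
    # P(X = x) = p**x * (1 - p)**(1 - x), valid only for x = 0 or 1
    if x not in (0, 1):
        raise ValueError("x must be 0 or 1")
    return p**x * (1 - p)**(1 - x)

p = 0.25
print(bernoulli_pmf(1, p))                        # 0.25  (= p)
print(bernoulli_pmf(0, p))                        # 0.75  (= q)
print(bernoulli_pmf(0, p) + bernoulli_pmf(1, p))  # 1.0   (probabilities sum to 1)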
Mean:-

The mean or expected value for the Bernoulli distribution can be calculated from first principles as follows:-
\[E\left(X\right)=\mu =\sum{xP\left(X=x\right)}\] \[\mu =\left(0\times q\right)+\left(1\times p\right)\] \[\mu =p\]
The mean of the Bernoulli distribution is simply the probability of success, p.

Variance:-

Variance of the Bernoulli distribution can be derived from first principles using the formula:
\(Var\left(X\right)=E\left[{\left(X-\mu \right)}^2\right]=\sum{{\left(x-\mu \right)}^2P\left(X=x\right)}\)
or, using a simpler formula:
\(Var\left(X\right)=E\left(X^2\right)-E^2\left(X\right)\)

\(E\left(X^2\right)\) can be calculated as follows:-
\[E\left(X^2\right)=\sum{x^2P\left(X=x\right)}\] \[=\left(0^2\times q\right)+\left(1^2\times p\right)\] \[=p\]
Substituting this into the formula for variance, we get:-
\[Var\left(X\right)={\sigma }^2=E\left(X^2\right)-E^2\left(X\right)\] \[{\sigma }^2=p-p^2\] \[=p\left(1-p\right)\] \[=pq\]
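The results \(E\left(X\right)=p\) and \(Var\left(X\right)=pq\) can also be checked by simulation. The sketch below, assuming NumPy is available, draws a large Bernoulli sample (as a Binomial with one trial) and compares the sample mean and variance against p and pq; p = 0.25 is again an arbitrary illustrative value.

import numpy as np

p = 0.25
rng = np.random.default_rng(0)
samples = rng.binomial(n=1, p=p, size=1_000_000)  # Bernoulli = Binomial with n = 1

print(samples.mean())  # close to p        = 0.25
print(samples.var())   # close to p(1 - p) = 0.1875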
Probability Generating Function (PGF):-

The PGF of a Bernoulli random variable is \(G_X\left(t\right)=E\left(t^X\right)=\left(t^0\times q\right)+\left(t^1\times p\right)\), which simplifies to
\[G_X\left(t\right)=pt+q\]
Moment Generating Function (MGF):-

The MGF of a Bernoulli random variable is \(M_X\left(t\right)=E\left(e^{tX}\right)=\left(e^{0\times t}\times q\right)+\left(e^{1\times t}\times p\right)\), which simplifies to
\[M_X\left(t\right)={pe}^t+q\]
Cumulant Generating Function (CGF):-

The CGF of a Bernoulli random variable is the natural logarithm of its MGF:
\[C_X\left(t\right)={\mathrm{ln} \left({pe}^t+q\right)\ }\]
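The three generating functions above can be verified symbolically. The sketch below, assuming SymPy is available, writes out the PGF, MGF and CGF and differentiates them: \(G'_X\left(1\right)\) and \(M'_X\left(0\right)\) recover the mean p, and the second derivative of the CGF at t = 0 recovers the variance pq.

import sympy as sp

t, p = sp.symbols('t p', positive=True)
q = 1 - p

G = p*t + q                  # PGF
M = p*sp.exp(t) + q          # MGF
C = sp.log(p*sp.exp(t) + q)  # CGF = ln(MGF)

print(sp.diff(G, t).subs(t, 1))        # G'(1)  = p             (mean)
print(sp.diff(M, t).subs(t, 0))        # M'(0)  = p             (first moment)
print(sp.diff(C, t, 2).subs(t, 0))     # C''(0) = p - p**2 = pq (variance)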
EXAMPLES
Example 1

There are four cards marked A, B, C and D. A gambler bets that, on drawing one of the four cards at random, he will get the card marked D.
In this case, getting the card marked D is considered a success. The probability of success is simply the probability of drawing card D, which is ¼.

Therefore,
\[p=\frac{1}{4}=0.25\]
\[q=1-\frac{1}{4}=\frac{3}{4}=0.75\]
The probability distribution can be written as follows:-

\(x\) \(P(X=x)\)
\(0\) \(\frac{3}{4}=0.75\)
\(1\) \(\frac{1}{4}=0.25\)

Mean:-
\[E\left(X\right)=\mu =p=0.25\]
Variance:-
\[{\sigma }^2=pq=0.25\times 0.75=0.1875\]
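These figures can be confirmed in a couple of lines, again assuming SciPy is available; card_D is just an illustrative variable name for the frozen distribution with p = ¼.

from scipy.stats import bernoulli

card_D = bernoulli(1/4)              # success = drawing the card marked D

print(card_D.pmf(0), card_D.pmf(1))  # 0.75 0.25
print(card_D.mean())                 # 0.25
print(card_D.var())                  # 0.1875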