# Bernoulli Distribution

## Parametric Random Variables

There are many classic and commonly-seen random variable abstractions that show up in the world of probability. At
this point in the class, you will learn about several of the most significant parametric discrete distributions.
When solving problems, if you can recognize that a random variable fits one of these formats, then you can
use its pre-derived probability mass function (PMF), expectation, variance, and other properties. Random variables
of this sort are called **parametric** random variables. If you can argue that a random variable falls under one
of the studied parametric types, you simply need to provide its parameters. A good analogy is a `class` in programming: creating a parametric random variable is very similar to calling a constructor with input parameters.
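To make the class analogy concrete, here is a minimal Python sketch. The `Bernoulli` class name and its methods are illustrative only (not a standard library API): the constructor takes the parameter $p$, and the pre-derived properties become methods.

```python
import random

class Bernoulli:
    """A parametric random variable: the constructor takes the parameter p."""

    def __init__(self, p):
        self.p = p  # the probability that the variable equals 1

    def pmf(self, x):
        # The "smooth" form of the PMF: P(X = x) = p^x (1-p)^(1-x) for x in {0, 1}
        return self.p ** x * (1 - self.p) ** (1 - x)

    def expectation(self):
        return self.p  # E[X] = p

    def variance(self):
        return self.p * (1 - self.p)  # Var(X) = p(1-p)

    def sample(self):
        # Run the underlying experiment once: success with probability p
        return 1 if random.random() < self.p else 0

# "Calling the constructor" creates one instance of the parametric type:
X = Bernoulli(p=0.6)
```

Different instances of the class can hold different values of $p$, just as different Bernoulli random variables can have different parameters.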

## Bernoulli Random Variables

A Bernoulli random variable (also called a *boolean* or *indicator* random variable) is the simplest kind
of parametric random variable. It can take on two values, 1 and 0. It takes on a 1 if an experiment with probability
$p$ resulted in success and a 0 otherwise. Some example uses include a coin flip, a random binary digit, whether a
disk drive crashed, and whether someone likes a Netflix movie. Here $p$ is the parameter, but different instances of
Bernoulli random variables might have different values of $p$.

Here is a full description of the key properties of a Bernoulli random variable. If $X$ is declared to be a Bernoulli random variable with parameter $p$, denoted $X \sim \Ber(p)$:

**Bernoulli Random Variable**

| Property | Value |
|---|---|
| Notation | $X \sim \Ber(p)$ |
| Description | A boolean variable that is 1 with probability $p$ |
| Parameters | $p$, the probability that $X = 1$ |
| Support | $x$ is either 0 or 1 |
| PMF equation | $\p(X=x) = \begin{cases} p & \text{if }x = 1\\ 1-p & \text{if }x = 0 \end{cases}$ |
| PMF (smooth) | $\p(X=x) = p^x(1-p)^{1-x}$ |
| Expectation | $\E[X] = p$ |
| Variance | $\var(X) = p(1-p)$ |

Because Bernoulli random variables are parametric, as soon as you declare a random variable to be of type Bernoulli, you automatically know all of these pre-derived properties! Some of these properties are straightforward to prove for a Bernoulli. For example, you could have solved for the expectation and variance yourself:
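Before walking through the proofs, a quick simulation can build intuition for why $\E[X] = p$ and $\var(X) = p(1-p)$. This sketch (the variable names and the choice $p = 0.3$ are illustrative) draws many Bernoulli samples and compares the sample mean and sample variance to the pre-derived formulas:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

p = 0.3
n = 100_000

# Each sample is 1 with probability p and 0 otherwise
samples = [1 if random.random() < p else 0 for _ in range(n)]

sample_mean = sum(samples) / n  # should be close to E[X] = p = 0.3
sample_var = sum((s - sample_mean) ** 2 for s in samples) / n
# sample_var should be close to Var(X) = p(1-p) = 0.21
```

By the law of large numbers, both estimates converge to the theoretical values as $n$ grows.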

**Proof**: Expectation of a Bernoulli. If $X$ is a Bernoulli with parameter $p$, $X \sim \Ber(p)$:

$$\E[X] = \sum_x x \cdot \P(X=x) = 1 \cdot p + 0 \cdot (1-p) = p$$

**Proof**: Variance of a Bernoulli. If $X$ is a Bernoulli with parameter $p$, $X \sim \Ber(p)$:

$$\E[X^2] = \sum_x x^2 \cdot \P(X=x) = 1^2 \cdot p + 0^2 \cdot (1-p) = p$$

$$\var(X) = \E[X^2] - \E[X]^2 = p - p^2 = p(1-p)$$

## Indicator

Bernoulli random variables and indicator variables are two aspects of the same concept. A random variable $I$ is an indicator variable for an event $A$ if $I = 1$ when $A$ occurs and $I = 0$ when $A$ does not occur. It follows that $\P(I=1)=\P(A)$ and therefore $\E[I]=\P(A)$. Indicator random variables are Bernoulli random variables, with $p=\P(A)$.
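As a small illustration (the event and helper name are made up for this example), take the event $A$ = "a fair six-sided die roll is even", so $\P(A) = 1/2$. The indicator of $A$ is then a $\Ber(1/2)$ random variable, and its empirical mean estimates $\P(A)$:

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

def indicator_even(roll):
    # I = 1 when the event A = "roll is even" occurs, 0 otherwise
    return 1 if roll % 2 == 0 else 0

n = 100_000
rolls = [random.randint(1, 6) for _ in range(n)]

# E[I] = P(A) = 1/2, so the sample mean of I should be close to 0.5
mean_I = sum(indicator_even(r) for r in rolls) / n
```

This trick, estimating a probability as the expectation of an indicator, comes up repeatedly later in the class.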