$\DeclareMathOperator{\p}{P}$ $\DeclareMathOperator{\P}{P}$ $\DeclareMathOperator{\c}{^C}$ $\DeclareMathOperator{\or}{ or}$ $\DeclareMathOperator{\and}{ and}$ $\DeclareMathOperator{\var}{Var}$ $\DeclareMathOperator{\Var}{Var}$ $\DeclareMathOperator{\Std}{Std}$ $\DeclareMathOperator{\E}{E}$ $\DeclareMathOperator{\std}{Std}$ $\DeclareMathOperator{\Ber}{Bern}$ $\DeclareMathOperator{\Bin}{Bin}$ $\DeclareMathOperator{\Poi}{Poi}$ $\DeclareMathOperator{\Uni}{Uni}$ $\DeclareMathOperator{\Geo}{Geo}$ $\DeclareMathOperator{\NegBin}{NegBin}$ $\DeclareMathOperator{\Beta}{Beta}$ $\DeclareMathOperator{\Exp}{Exp}$ $\DeclareMathOperator{\N}{N}$ $\DeclareMathOperator{\R}{\mathbb{R}}$ $\DeclareMathOperator*{\argmax}{arg\,max}$ $\newcommand{\d}{\, d}$

Uniform Distribution


The most basic of all the continuous random variables is the uniform random variable, which is equally likely to take on any value in its range ($\alpha, \beta$). $X$ is a uniform random variable ($X \sim \Uni(\alpha, \beta)$) if it has PDF: \begin{align*} f(x) = \begin{cases} \frac{1}{\beta - \alpha} &\text{when } \alpha \leq x \leq \beta \\ 0 & \text{otherwise} \end{cases} \end{align*}

Notice how the density $1/(\beta - \alpha)$ is exactly the same regardless of the value of $x$. That is what makes the density uniform. So why is the PDF $1/(\beta - \alpha)$ and not 1? It is the normalizing constant: the value that makes the integral of the density over all possible inputs evaluate to 1.
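As a quick check, integrate the density over its support $[\alpha, \beta]$ (it is 0 everywhere else): \begin{align*} \int_{-\infty}^{\infty} f(x) \d x = \int_{\alpha}^{\beta} \frac{1}{\beta - \alpha} \d x = \frac{x}{\beta - \alpha}\bigg\rvert_{\alpha}^{\beta} = \frac{\beta - \alpha}{\beta - \alpha} = 1 \end{align*}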

Uniform Random Variable

Notation: $X \sim \Uni(\alpha, \beta)$
Description: A continuous random variable that takes on values, with equal likelihood, between $\alpha$ and $\beta$
Parameters: $\alpha \in \R$, the minimum value of the variable.
$\beta \in \R$, $\beta > \alpha$, the maximum value of the variable.
Support: $x \in [\alpha, \beta]$
PDF equation: $$f(x) = \begin{cases} \frac{1}{\beta - \alpha} && \text{for }x \in [\alpha, \beta]\\ 0 && \text{else} \end{cases}$$
CDF equation: $$F(x) = \begin{cases} \frac{x - \alpha}{\beta - \alpha} && \text{for }x \in [\alpha, \beta]\\ 0 && \text{for } x < \alpha \\ 1 && \text{for } x > \beta \end{cases}$$
Expectation: $\E[X] = \frac{1}{2}(\alpha + \beta)$
Variance: $\var(X) = \frac{1}{12}(\beta - \alpha)^2$
PDF graph: an interactive plot of the uniform PDF, a rectangle of height $1/(\beta - \alpha)$ over $[\alpha, \beta]$, with adjustable parameters $\alpha$ and $\beta$.
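If you want to compute these quantities numerically, here is a minimal sketch using scipy.stats.uniform. Note that SciPy parameterizes the uniform by loc $= \alpha$ and scale $= \beta - \alpha$; the values $\alpha = 0$ and $\beta = 30$ are chosen only for illustration.

```python
from scipy import stats

# Illustrative parameters
alpha, beta = 0, 30

# SciPy's uniform is Uni(loc, loc + scale), so loc = alpha, scale = beta - alpha
X = stats.uniform(loc=alpha, scale=beta - alpha)

print(X.pdf(10))   # 1/(beta - alpha) = 1/30, the same at any x in [0, 30]
print(X.cdf(20))   # F(20) = (20 - 0)/(30 - 0) = 2/3
print(X.mean())    # E[X] = (alpha + beta)/2 = 15
print(X.var())     # Var(X) = (beta - alpha)^2 / 12 = 75
```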

Example: You are running to the bus stop. You don't know exactly when the bus arrives, but you believe all arrival times between 2pm and 2:30pm are equally likely. You show up at 2:15pm. What is P(wait < 5 minutes)?

Let $T$ be the time, in minutes after 2pm, at which the bus arrives. Because we think all arrival times in this range are equally likely, $T \sim \Uni(\alpha = 0, \beta = 30)$. The probability that you wait less than 5 minutes is the probability that the bus shows up between 2:15pm and 2:20pm. In other words, $\p(15 < T < 20)$: \begin{align*} \p(\text{Wait under 5 mins}) &= \p(15 < T < 20) \\ &= \int_{15}^{20} f_T(x) \d x \\ &= \int_{15}^{20} \frac{1}{\beta - \alpha} \d x \\ &= \int_{15}^{20} \frac{1}{30} \d x \\ &= \frac{x}{30}\bigg\rvert_{15}^{20} \\ &= \frac{20}{30} - \frac{15}{30} = \frac{5}{30} = \frac{1}{6} \end{align*}
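Here is a small sketch that checks this answer with scipy.stats.uniform (again using loc $= \alpha = 0$ and scale $= \beta - \alpha = 30$):

```python
from scipy import stats

# T ~ Uni(0, 30): minutes after 2pm at which the bus arrives
T = stats.uniform(loc=0, scale=30)

# P(15 < T < 20) = F(20) - F(15)
p_wait_under_5 = T.cdf(20) - T.cdf(15)
print(p_wait_under_5)  # 0.1666..., which matches 5/30 = 1/6
```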

We can come up with a closed form for the probability that a uniform random variable $X$ is in the range $a$ to $b$, assuming that $\alpha \leq a \leq b \leq \beta$: \begin{align*} \P(a \leq X \leq b) &= \int_a^b f(x) \d x \\ &= \int_a^b \frac{1}{\beta - \alpha} \d x \\ &= \frac{ b - a }{ \beta - \alpha } \end{align*}
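As a sanity check on this closed form, the sketch below compares it against a Monte Carlo estimate; the specific values of $\alpha$, $\beta$, $a$, and $b$ are made up for illustration:

```python
import numpy as np

# Illustrative values satisfying alpha <= a <= b <= beta
alpha, beta = 2, 10
a, b = 3, 7

closed_form = (b - a) / (beta - alpha)   # (7 - 3)/(10 - 2) = 0.5

# Monte Carlo estimate: fraction of uniform samples that land in [a, b]
samples = np.random.uniform(alpha, beta, size=1_000_000)
estimate = np.mean((samples >= a) & (samples <= b))

print(closed_form, estimate)  # the estimate should be close to 0.5
```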