# Probability of **and**

The probability of the **and** of two events, say $E$ and $F$, written $\p(E \and F)$, is the probability of both events happening. You might see the equivalent notations $\p(EF)$, $\p(E \cap F)$ and $\p(E,F)$ for the probability of and. How you calculate the probability of event $E$ and event $F$ happening depends on whether or not the events are "independent". In the same way that mutual exclusion makes it easy to calculate the probability of the **or** of events, independence is a property that makes it easy to calculate the probability of the **and** of events.

## **And** with Independent Events

If events are **independent** then calculating the probability of **and** becomes simple multiplication:

**Definition**: Probability of **and** for independent events.

If two events $E$ and $F$ are independent, then the probability of $E$ **and** $F$ occurring is:
$$
\p(E \and F) = \p(E) \cdot \p(F)
$$

This property applies regardless of how the probabilities of $E$ and $F$ were calculated and whether or not the events are mutually exclusive.
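As a quick sanity check (a sketch with hypothetical events, not from the text), the multiplication rule can be verified by simulation. Here the two events are a fair coin landing heads, with probability $1/2$, and a fair six-sided die rolling a 6, with probability $1/6$; since they are independent, the probability of both should be close to $1/12$:

```python
import random

random.seed(109)  # arbitrary seed, for reproducibility

# Hypothetical independent events:
#   E = a fair coin lands heads,  P(E) = 1/2
#   F = a fair die rolls a 6,     P(F) = 1/6
# By independence, P(E and F) = (1/2) * (1/6) = 1/12

trials = 100_000
both = 0
for _ in range(trials):
    heads = random.random() < 0.5    # did E occur on this trial?
    six = random.randint(1, 6) == 6  # did F occur on this trial?
    if heads and six:
        both += 1

print(both / trials)  # close to 1/12 ≈ 0.0833
```

The empirical frequency of both events occurring together converges to the product of the individual probabilities.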

The independence principle extends to more than two
events. For $n$ events $E_1, E_2, \dots, E_n$ that are **mutually** independent of one another (meaning the independence equation holds for every subset of the events):
$$
\p(E_1 \and E_2 \and \dots \and E_n) = \prod_{i=1}^n \p(E_i)
$$

We can prove this equation by combining the definition of conditional probability and the definition of independence.

**Proof**: If $E$ is independent of $F$ then $\p(E \and F) = \p(E) \cdot \p(F)$.

By the definition of conditional probability, $\p(E \and F) = \p(E | F) \cdot \p(F)$. By the definition of independence, conditioning on $F$ does not change the probability of $E$, so $\p(E | F) = \p(E)$. Substituting gives $\p(E \and F) = \p(E) \cdot \p(F)$.

See the chapter on independence to learn about when you can assume that two events are independent.
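For example (an illustration, not from the text), the probability that ten mutually independent fair coin flips all come up heads is the product of ten factors of $1/2$:

```python
from math import prod

# Ten mutually independent events E_1, ..., E_10:
#   E_i = flip i of a fair coin is heads, each with probability 1/2.
# The probability of the "and" of all ten is the product of the
# individual probabilities.
probs = [0.5] * 10
p_all_heads = prod(probs)

print(p_all_heads)  # 0.0009765625, i.e. (1/2)**10 = 1/1024
```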

## **And** with Dependent Events

Events which are not independent are called **dependent** events. How can you calculate the probability of the **and** of dependent events? If your events are mutually exclusive, you might be able to use a technique called DeMorgan's law, which we cover in a later chapter. For the probability of **and** with dependent events there is a direct formula, called the chain rule, which can be derived from the definition of conditional probability:

**Definition**: The chain rule.

The formula in the definition of conditional probability can be re-arranged to derive a general way of calculating the probability of the **and** of any two events:
$$
\p(E \and F) = \p(E | F) \cdot \p(F)
$$

Of course there is nothing special about $E$ that says it should go first. Equivalently:
$$
\p(E \and F) = \p(F \and E) = \p(F | E) \cdot \p(E)
$$
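As a concrete illustration (example mine, not from the text), consider drawing two cards without replacement from a standard 52-card deck, with $E$ = the first card is an ace and $F$ = the second card is an ace. The draws are dependent, and the chain rule gives $\p(E \and F) = \p(F|E) \cdot \p(E) = \frac{3}{51} \cdot \frac{4}{52}$:

```python
from fractions import Fraction

# Draw two cards without replacement from a 52-card deck.
# E = first card is an ace, F = second card is an ace (dependent events).
p_e = Fraction(4, 52)          # P(E): 4 aces among 52 cards
p_f_given_e = Fraction(3, 51)  # P(F | E): 3 aces left among 51 cards

# Chain rule: P(E and F) = P(F | E) * P(E)
p_both = p_f_given_e * p_e

print(p_both)  # 1/221
```

Note that multiplying the unconditional probabilities, $\frac{4}{52} \cdot \frac{4}{52}$, would be wrong here: the first draw changes the deck, so the events are not independent.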

We call this formula the "chain rule." Intuitively it states that the probability of observing events $E$ **and** $F$ is the
probability of observing $F$, multiplied by the probability of observing $E$ given that you have observed $F$.
It generalizes to more than two events:
$$
\begin{align}
\p(E_1 \and E_2 \and \dots \and E_n) = &\p(E_1) \cdot \p(E_2|E_1) \cdot \p(E_3 |E_1 \and E_2) \cdots \\ &\p(E_n|E_1 \and \dots \and E_{n-1})
\end{align}
$$
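The general chain rule can be applied one factor at a time. As a sketch (example mine, not from the text), the probability that the first three cards drawn from a shuffled 52-card deck are all aces is $\p(E_1) \cdot \p(E_2|E_1) \cdot \p(E_3|E_1 \and E_2) = \frac{4}{52} \cdot \frac{3}{51} \cdot \frac{2}{50}$:

```python
from fractions import Fraction

# P(first three draws are all aces), built one chain-rule factor at a time:
#   P(E_1) * P(E_2 | E_1) * P(E_3 | E_1 and E_2)
aces, cards = 4, 52
p = Fraction(1)
for i in range(3):
    # Given the first i draws were aces, (aces - i) aces remain
    # among (cards - i) remaining cards.
    p *= Fraction(aces - i, cards - i)

print(p)  # 1/5525
```

Each pass through the loop conditions on everything drawn so far, mirroring the nested conditional probabilities in the formula above.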