Expectation of Sum Proof
Now that we have learned about joint probabilities, we have all the tools we need to prove one of the most useful properties of expectation: the fact that the expectation of a sum of random variables is equal to the sum of their expectations (even if the variables are not independent). In other words:
For any two random variables $X$ and $Y$:

$$E[X + Y] = E[X] + E[Y]$$
The proof is going to use the Law of the Unconscious Statistician (LOTUS), where the function is addition: $g(X, Y) = X + Y$!
Proof: Expectation of Sum
Let $X$ and $Y$ be discrete random variables with joint probability mass function $P(X = x, Y = y)$. Applying LOTUS with the function $g(X, Y) = X + Y$:

$$\begin{aligned}
E[X + Y] &= \sum_x \sum_y (x + y) \cdot P(X = x, Y = y) && \text{LOTUS} \\
&= \sum_x \sum_y \big[ x \cdot P(X = x, Y = y) + y \cdot P(X = x, Y = y) \big] && \text{Distribute} \\
&= \sum_x \sum_y x \cdot P(X = x, Y = y) + \sum_x \sum_y y \cdot P(X = x, Y = y) && \text{Split the sum} \\
&= \sum_x x \sum_y P(X = x, Y = y) + \sum_y y \sum_x P(X = x, Y = y) && \text{Factor} \\
&= \sum_x x \cdot P(X = x) + \sum_y y \cdot P(Y = y) && \text{Marginalization} \\
&= E[X] + E[Y] && \text{Definition of expectation}
\end{aligned}$$

At no point in the proof do we need to assume that $X$ and $Y$ are independent.
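To make the computation in the proof concrete, here is a minimal Python sketch of LOTUS for two discrete random variables. The helper name `lotus` and the dictionary representation of the joint PMF are illustrative choices for this sketch, not part of the original text:

```python
def lotus(joint, g):
    """E[g(X, Y)] for a joint PMF stored as a dict {(x, y): P(X=x, Y=y)}.

    LOTUS for two discrete random variables: weight g(x, y) by the
    probability of each (x, y) pair, then sum over all pairs.
    """
    return sum(g(x, y) * p for (x, y), p in joint.items())
```

Calling `lotus(joint, lambda x, y: x + y)` computes $E[X + Y]$ directly from a joint table, which is exactly the double sum the proof decomposes into $E[X] + E[Y]$.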
Demonstration of the Proof
Here is an example to show the idea behind the proof. This table shows the joint probabilities $P(X = x, Y = y)$ for two random variables $X$ and $Y$:
|         | $Y = 4$ | $Y = 5$ |
|---------|---------|---------|
| $X = 1$ | 0.1     | 0.3     |
| $X = 2$ | 0.2     | 0.4     |
Aside: These two random variables can each take on only two values. Having only four entries in the joint table will make it easier to gain intuition.
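The joint table above is small enough to write out directly. Here it is as a Python dictionary, continuing the sketch from the proof section (the dictionary representation is an assumption of this sketch):

```python
# Joint PMF from the table: keys are (x, y) pairs, values are P(X=x, Y=y).
joint = {
    (1, 4): 0.1,
    (1, 5): 0.3,
    (2, 4): 0.2,
    (2, 5): 0.4,
}

# Sanity check: a valid joint PMF sums to 1 (within floating point error).
assert abs(sum(joint.values()) - 1.0) < 1e-9
```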
Computing $E[X]$ using joint probabilities:

A key insight from the proof is that we can compute $E[X]$ directly from the joint table: weight each value $x$ by the probability of the cell it appears in, and sum over all cells:

$$E[X] = \sum_x \sum_y x \cdot P(X = x, Y = y)$$
| $x$ | $y$ | $P(X = x, Y = y)$ | $x \cdot P(X = x, Y = y)$ |
|-----|-----|-------------------|---------------------------|
| 1   | 4   | 0.1               | 1 × 0.1 = 0.1             |
| 1   | 5   | 0.3               | 1 × 0.3 = 0.3             |
| 2   | 4   | 0.2               | 2 × 0.2 = 0.4             |
| 2   | 5   | 0.4               | 2 × 0.4 = 0.8             |
$$E[X] = 0.1 + 0.3 + 0.4 + 0.8 = 1.6$$
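With the `lotus` helper and `joint` dictionary defined above, this computation becomes a one-liner: the function $g(x, y) = x$ picks out the $x$ value in each cell, reproducing the table above.

```python
E_X = lotus(joint, lambda x, y: x)
print(E_X)  # 1.6 (up to floating point rounding)
```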
Computing $E[Y]$ using joint probabilities:

Similarly, we can compute $E[Y]$ by weighting each value $y$ by the probability of the cell it appears in:

$$E[Y] = \sum_x \sum_y y \cdot P(X = x, Y = y)$$
| $x$ | $y$ | $P(X = x, Y = y)$ | $y \cdot P(X = x, Y = y)$ |
|-----|-----|-------------------|---------------------------|
| 1   | 4   | 0.1               | 4 × 0.1 = 0.4             |
| 1   | 5   | 0.3               | 5 × 0.3 = 1.5             |
| 2   | 4   | 0.2               | 4 × 0.2 = 0.8             |
| 2   | 5   | 0.4               | 5 × 0.4 = 2.0             |
$$E[Y] = 0.4 + 1.5 + 0.8 + 2.0 = 4.7$$
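The corresponding one-liner for $E[Y]$ uses $g(x, y) = y$:

```python
E_Y = lotus(joint, lambda x, y: y)
print(E_Y)  # 4.7 (up to floating point rounding)
```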
Computing $E[X + Y]$ using joint probabilities:

We can rewrite each term $(x + y) \cdot P(X = x, Y = y)$ as $x \cdot P(X = x, Y = y) + y \cdot P(X = x, Y = y)$, so every cell of the joint table contributes one term to the $E[X]$ calculation and one term to the $E[Y]$ calculation:
| $x$ | $y$ | $P(X = x, Y = y)$ | $x \cdot P(X = x, Y = y)$ | $y \cdot P(X = x, Y = y)$ | $(x + y) \cdot P(X = x, Y = y)$ |
|-----|-----|-------------------|---------------------------|---------------------------|---------------------------------|
| 1   | 4   | 0.1               | 0.1                       | 0.4                       | 0.1 + 0.4 = 0.5                 |
| 1   | 5   | 0.3               | 0.3                       | 1.5                       | 0.3 + 1.5 = 1.8                 |
| 2   | 4   | 0.2               | 0.4                       | 0.8                       | 0.4 + 0.8 = 1.2                 |
| 2   | 5   | 0.4               | 0.8                       | 2.0                       | 0.8 + 2.0 = 2.8                 |
Recall that by LOTUS,

$$E[X + Y] = \sum_x \sum_y (x + y) \cdot P(X = x, Y = y)$$

Using the above derivation of the formula for $E[X + Y]$, we sum the last column of the table:

$$E[X + Y] = 0.5 + 1.8 + 1.2 + 2.8 = 6.3$$
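And $E[X + Y]$ itself uses $g(x, y) = x + y$, matching the last column of the table:

```python
E_sum = lotus(joint, lambda x, y: x + y)
print(E_sum)  # 6.3 (up to floating point rounding)
```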
We can observe that each of the values $x \cdot P(X = x, Y = y)$ and $y \cdot P(X = x, Y = y)$ showed up exactly once when calculating $E[X]$ and $E[Y]$:

$$E[X] = 0.1 + 0.3 + 0.4 + 0.8 = 1.6$$

$$E[Y] = 0.4 + 1.5 + 0.8 + 2.0 = 4.7$$
Because they are summing the same values, it is no surprise that the sum of the expectations is equal to the expectation of the sum:

$$E[X] + E[Y] = 1.6 + 4.7 = 6.3 = E[X + Y]$$
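As a final numerical check (continuing the sketch above), the two quantities agree within floating point error. Note that $X$ and $Y$ in this example really are dependent: $P(X = 1) = 0.4$ and $P(Y = 4) = 0.3$, but $P(X = 1, Y = 4) = 0.1 \neq 0.4 \times 0.3$, so the result is not a consequence of independence.

```python
# The expectation of the sum equals the sum of the expectations,
# even though this joint table describes dependent random variables:
# P(X=1) = 0.4 and P(Y=4) = 0.3, but P(X=1, Y=4) = 0.1 != 0.12.
assert abs((E_X + E_Y) - E_sum) < 1e-9
```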