$\DeclareMathOperator{\p}{P}$ $\DeclareMathOperator{\P}{P}$ $\DeclareMathOperator{\c}{^C}$ $\DeclareMathOperator{\or}{ or}$ $\DeclareMathOperator{\and}{ and}$ $\DeclareMathOperator{\var}{Var}$ $\DeclareMathOperator{\Var}{Var}$ $\DeclareMathOperator{\Std}{Std}$ $\DeclareMathOperator{\E}{E}$ $\DeclareMathOperator{\std}{Std}$ $\DeclareMathOperator{\Ber}{Bern}$ $\DeclareMathOperator{\Bin}{Bin}$ $\DeclareMathOperator{\Poi}{Poi}$ $\DeclareMathOperator{\Uni}{Uni}$ $\DeclareMathOperator{\Geo}{Geo}$ $\DeclareMathOperator{\NegBin}{NegBin}$ $\DeclareMathOperator{\Beta}{Beta}$ $\DeclareMathOperator{\Exp}{Exp}$ $\DeclareMathOperator{\N}{N}$ $\DeclareMathOperator{\R}{\mathbb{R}}$ $\DeclareMathOperator*{\argmax}{arg\,max}$ $\newcommand{\d}{\, d}$

Bridge Card Game

Bridge is one of the most popular collaborative card games. It is played by four players in two teams of two. A few interesting probability problems come up in this game. You do not need to know the rules of bridge to follow this example; I focus on the set of probability problems that matter most for game strategy.

Distribution of Hand Strength

The way folks play bridge is that they calculate their "hand strength" and then make decisions based on that number. The strength of your hand is a number equal to 4 times the number of aces, plus 3 times the number of kings, plus 2 times the number of queens, plus 1 times the number of jacks in your hand. No other cards contribute to your hand strength. Let's consider your hand strength to be a random variable and compute its distribution. It seems complex to compute by hand -- but perhaps we could run a simulation? Here we simulate a million deals of bridge hands and calculate the hand strengths. Let $X$ be the strength of a hand. From the Definition of Probability: $$ \p(X=x) \approx \frac{\text{count}(x)}{1{,}000{,}000} $$
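Here is one way such a simulation might be sketched in Python. The function name `simulate_strength_pmf` is my own, and the trial count in the final line is reduced from the text's million deals so it runs quickly:

```python
import random
from collections import Counter

# Point values for the honor cards; all other ranks are worth 0.
POINTS = {"A": 4, "K": 3, "Q": 2, "J": 1}

# Build a 52-card deck as a list of ranks (suits don't affect strength).
RANKS = ["A", "K", "Q", "J"] + [str(r) for r in range(2, 11)]
DECK = [rank for rank in RANKS for _ in range(4)]

def hand_strength(hand):
    """Sum of honor points in a 13-card hand."""
    return sum(POINTS.get(card, 0) for card in hand)

def simulate_strength_pmf(n_trials=1_000_000):
    """Estimate P(X = x) by dealing n_trials random 13-card hands."""
    counts = Counter()
    for _ in range(n_trials):
        hand = random.sample(DECK, 13)  # deal without replacement
        counts[hand_strength(hand)] += 1
    return {x: c / n_trials for x, c in sorted(counts.items())}

pmf = simulate_strength_pmf(100_000)  # fewer trials than a million, for speed
```

With enough trials, the estimated probabilities settle near the true PMF, and the sample mean of the strengths lands close to 10.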

Wait! Is that a Poisson?

If you pay very close attention, you might notice that this PMF looks a lot like a Poisson PMF with rate $\lambda = 10$. There is a nice explanation for why the rate might be 10. Let $H$ be the value of your hand, and let $X_i$ be the points of the $i$th card in your 13-card hand, so that $H = \sum_{i=1}^{13} X_i$.

First we compute $\E[X_i]$, the expectation of points for the $i$th card in your hand, without considering the other cards. A card can take on four nonzero values, $X_i \in \{1, 2, 3, 4\}$. For each value there are four cards out of 52 with that value, e.g., $\p(X_i=1) = \frac{4}{52} = \frac{1}{13}$. Thus \begin{align*} \E[X_i] &= \sum_x x \cdot \p(X_i=x) \\ &= (1+2+3+4)\frac{1}{13} \\ &= \frac{10}{13} \end{align*} We can then calculate $\E[H]$ using the fact that the expectation of a sum of random variables is the sum of the expectations, regardless of independence: \begin{align}\E[H] &= \sum_{i=1}^{13} \E[X_i] \\ &= 13 \cdot \E[X_i] \\ &= 13 \cdot \frac{10}{13} \\ &= 10\end{align} Saying that $H$ is approximately $\sim \Poi(\lambda=10)$ is an interesting claim. It suggests that points in a hand arrive at a constant rate, and that the next point in your hand is independent of when you got your last point. Of course, this second part of the assumption is mildly violated: there is a fixed set of cards, so receiving one card changes the probabilities of the others. For this reason the Poisson is a close, but not perfect, approximation.
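The two numbers in this derivation can be checked with a few lines of Python (a small sketch; `poisson_pmf` is a helper defined here, not a library call):

```python
from math import exp, factorial

# E[X_i]: each of the four nonzero point values appears on 4 of the 52 cards.
e_xi = sum(v * 4 / 52 for v in (1, 2, 3, 4))  # 10/13
e_h = 13 * e_xi                               # expected hand strength

def poisson_pmf(x, lam=10):
    """PMF of a Poisson with rate lam, evaluated at x."""
    return lam ** x * exp(-lam) / factorial(x)
```

Here `e_h` comes out to 10, matching the derivation, and `poisson_pmf(10)` is about 0.125, the peak of the approximating distribution.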

Joint Distribution of Hand Strength Among Two Hands

It doesn't just matter how strong your hand is; what matters is the relative strength of your hand and your partner's hand (recall that in bridge you play with a partner). We know that the two hands are not independent of each other: if I tell you that your partner has a strong hand, there are fewer high-value cards that can be in your hand, and as such my belief about your strength has changed. If you think of each player's hand strength as a random variable, we care about the joint distribution of hand strength. In the joint distribution below, the x-axis is your partner's hand strength and the y-axis is your hand strength. The value is $\p(\text{Partner} = x, \text{YourPoints} = y)$. This joint distribution was calculated by simulating a million randomly dealt hands:

From this joint distribution we can compute conditional probabilities. For example we can compute the conditional distribution of your partner's points given your points using lookups from the joint: \begin{align*} \p(&\text{Partner} = x | \text{YourPoints} = y) \\ &= \frac { \p(\text{Partner} = x, \text{YourPoints} = y) } {\p(\text{YourPoints} = y)} && \text{Cond. Prob.} \\ &= \frac { \p(\text{Partner} = x, \text{YourPoints} = y) } {\sum_z \p(\text{Partner} = z, \text{YourPoints} = y)} && \href{ ../../part1/law_total/}{\text{LOTP} } \end{align*}
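As a sketch of how this lookup might be implemented, assuming the simulated joint PMF is stored as a dictionary keyed by $(x, y)$ pairs (a hypothetical representation; the function name is my own):

```python
def conditional_partner_given_yours(joint, y):
    """P(Partner = x | YourPoints = y), computed from a joint PMF.

    joint: dict mapping (partner_points, your_points) -> probability.
    """
    # LOTP: the marginal P(YourPoints = y) sums the joint over partner values.
    marginal_y = sum(p for (_, yy), p in joint.items() if yy == y)
    return {x: p / marginal_y for (x, yy), p in joint.items() if yy == y}

# Toy joint distribution (made-up numbers, just to show the lookup):
toy_joint = {(0, 0): 0.1, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.4}
cond = conditional_partner_given_yours(toy_joint, y=0)
```

Each conditional distribution returned this way sums to 1, since the denominator is exactly the total joint mass in the row where $\text{YourPoints} = y$.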


Distribution of Suit Splits

When playing the game, there are many times when one player will know exactly how many cards of a certain suit are between their two opponents' hands (call the opponents A and B). However, the player won't know the "split": how many cards of that suit are in opponent A's hand and how many are in opponent B's hand.

Both opponents have equal-sized hands with $k$ cards each. Across the two hands there is a known number $n$ of cards of a particular suit (e.g., spades), and you want to know how many are in one hand and how many are in the other. A split is represented as a tuple. For example, $(0, 5)$ would mean 0 cards of the suit in opponent A's hand and 5 in opponent B's. Feel free to choose specific values for $k$ and $n$:

$k$, the number of cards in each player's hand:
$n$, the number of cards of particular suit among the two hands:

A few notes: If there are $k$ cards in each of the 2 hands, there are $2k$ cards total between the two players. At the start of a game of bridge, $k=13$. It must be the case that $n \leq 2k$ because you can't have more cards of the suit than cards remaining! If there are $n$ cards of the suit, then there are $2k - n$ cards of other suits. This problem assumes that the cards are properly shuffled.

Probability of different splits of the suit:

Let $Y$ be a random variable representing the number of the suit in opponent A's hand. We can calculate the probability that $Y$ equals different values $i$ by counting equally likely outcomes. $$\p(Y = i) = \frac { {n \choose i} \cdot {2\cdot k-n \choose k-i} } { {2\cdot k \choose k}}$$

Each outcome in the sample space is a set of $k$ distinct cards dealt to one player (out of the $2k$ cards), and all such sets are equally likely. To count the outcomes in the event, we first choose the $i$ cards from the $n$ cards of the given suit, and then choose the remaining $k-i$ cards from the $2k-n$ cards of other suits. For $k=13$ and $n=5$, here is the PMF over splits:
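This counting formula is a few lines of Python using `math.comb` (a sketch; the function name `prob_split` is my own):

```python
from math import comb

def prob_split(i, n, k=13):
    """P(Y = i): opponent A holds i of the n suit cards when each hand has k cards."""
    return comb(n, i) * comb(2 * k - n, k - i) / comb(2 * k, k)

# PMF over splits of a 5-card suit between two 13-card hands.
split_pmf = {i: prob_split(i, n=5) for i in range(6)}
```

The PMF is symmetric ($\p(Y=i) = \p(Y=n-i)$), and `prob_split(2, n=5) + prob_split(3, n=5)` is about 0.68, the familiar bridge fact that a 3-2 split of a 5-card suit happens roughly two-thirds of the time.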

If we want to think about the probability of a given split, it is sufficient to consider one hand (call it "hand one"). If I tell you how many cards of the suit are in one hand, you can automatically figure out how many are in the other: recall that the counts of the suit sum to $n$.

Probability that either hand has at least $j$ cards of suit

Let $X$ be a random variable representing the larger number of cards of the suit in either hand, $X = \max(Y, n - Y)$. Note that $X \geq j$ exactly when $Y \geq j$ or $Y \leq n - j$, and for $j > n/2$ these two events are mutually exclusive. We can therefore calculate the probability using the probability of or, via the complement: $$\p(X \geq j) = \p(Y \geq j) + \p(Y \leq n-j) = 1 - \sum_{i=n-j+1}^{j - 1}\p(Y= i)$$
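The complement calculation can be cross-checked against a direct sum over the extreme splits (a sketch under the same $j > n/2$ assumption; function names are my own):

```python
from math import comb

def prob_split(i, n, k=13):
    """P(Y = i): the hypergeometric split probability from above."""
    return comb(n, i) * comb(2 * k - n, k - i) / comb(2 * k, k)

def prob_max_at_least(j, n, k=13):
    """P(X >= j) for j > n/2, via the complement of the 'middle' splits."""
    return 1 - sum(prob_split(i, n, k) for i in range(n - j + 1, j))

# Chance some opponent holds 4+ cards of a 5-card suit (a 4-1 or 5-0 break):
p = prob_max_at_least(4, n=5)  # about 0.32
```

Summing $\p(Y=i)$ directly over $i \in \{0, 1, 4, 5\}$ gives the same answer, which is a quick sanity check on the complement formula.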