Bridge Card Game

Stub: This section is not complete. Parts might not be fully written.

Bridge is one of the most popular collaborative card games. It is played with four players in two teams. A few interesting probability problems come up in this game. You do not need to know the rules of bridge to follow this example.

Distribution of Suit Splits

When playing the game, there are many times when one player knows exactly how many cards of a certain suit lie between their two opponents' hands (call the opponents A and B). However, the player won't know the "split": how many cards of that suit are in opponent A's hand and how many are in opponent B's.

Both opponents have equal-sized hands with $k$ cards left. Across the two hands there are a known number $n$ of cards of a particular suit (e.g. spades), and you want to know how many are in one hand and how many are in the other. A split is represented as a tuple. For example, $(0, 5)$ would mean 0 cards of the suit in opponent A's hand and 5 in opponent B's. Feel free to choose specific values for $k$ and $n$:

$k$, the number of cards in each player's hand.
$n$, the number of cards of the particular suit among the two hands.

A few notes: if there are $k$ cards in each of the two hands, there are $2k$ cards total. At the start of a game of bridge, $k=13$. It must be the case that $n \leq 2 k$ because you can't have more cards of the suit than cards left! If there are $n$ cards of the suit, then there are $2k - n$ cards of other suits. This problem assumes that the cards are properly shuffled.

Probability of different splits of the suit:

Let $Y$ be a random variable representing the number of cards of the suit in opponent A's hand. We can calculate the probability that $Y$ equals different values $i$ by counting equally likely outcomes. $$\p(Y = i) = \frac { {n \choose i} \cdot {2k-n \choose k-i} } { {2k \choose k}}$$

Can you figure out how we came up with that formula? It uses equally likely outcomes where each element in the sample space is a chosen set of $k$ cards to be dealt to one player (out of the $2k$ cards which go to both). For $k = 13$ and $n = 5$, here is the PMF over splits:
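The formula above translates directly into a few lines of Python. This is a sketch (the helper name `split_pmf` is my own), using `math.comb` for the binomial coefficients:

```python
from math import comb

def split_pmf(k, n):
    """P(Y = i) for i = 0..n: the chance opponent A holds exactly i
    of the n suit cards, when each opponent is dealt k of the 2k cards."""
    total = comb(2 * k, k)
    return [comb(n, i) * comb(2 * k - n, k - i) / total
            for i in range(n + 1)]

pmf = split_pmf(k=13, n=5)
for i, p in enumerate(pmf):
    print(f"P(Y = {i}) = {p:.4f}")
```

Note that the PMF is symmetric: a $(1, 4)$ split of the suit is exactly as likely as a $(4, 1)$ split, since neither opponent is special.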

If we want to think about the probability of a given split, it is sufficient to choose one hand (call it "hand one") and think about the probability of the number of cards of the given suit in that hand. Though there are two hands, if I tell you how many cards of the suit are in one hand, you can automatically figure out how many are in the other hand: recall that the counts in the two hands sum to $n$.

Probability that either hand has at least $j$ cards of suit

Let $X$ be a random variable representing the larger number of cards of the suit in either hand, so $X = \max(Y, n - Y)$. Note that $X \geq j$ exactly when at least one hand has $j$ or more cards of the suit: $Y \geq j$ or $Y \leq n - j$. We can calculate the probability of that "or" event using the complement, $n - j + 1 \leq Y \leq j - 1$: $$\p(X \geq j) = 1 - \sum_{i=n-j+1}^{j - 1}\p(Y= i)$$
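Here is a sketch of that complement calculation in Python (function names are my own; `split_pmf` implements the PMF formula from the previous section). The `range` bounds are clamped so the sum is empty, and the probability is 1, whenever $j \leq n/2$:

```python
from math import comb

def split_pmf(k, n):
    """P(Y = i): hypergeometric PMF over splits of the n suit cards."""
    total = comb(2 * k, k)
    return [comb(n, i) * comb(2 * k - n, k - i) / total
            for i in range(n + 1)]

def p_max_at_least(k, n, j):
    """P(X >= j): probability that some hand has j or more of the suit.
    Complement event: both hands have fewer than j, i.e. n-j+1 <= Y <= j-1."""
    pmf = split_pmf(k, n)
    middle = range(max(0, n - j + 1), min(n, j - 1) + 1)
    return 1 - sum(pmf[i] for i in middle)

print(p_max_at_least(13, 5, 4))
```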

Distribution of Hand Strength

The way folks play bridge is that they make a calculation about their "hand strength" and then make decisions based on that number. The strength of your hand is a number equal to 4 times the number of "aces", 3 times the number of "kings", 2 times the number of "queens" and 1 times the number of "jacks" in your hand. No other cards contribute to your hand strength. Let's consider your hand strength to be a random variable and compute its distribution. It seems complex to compute by hand -- but perhaps we could run a simulation? Here we simulate a million deals of bridge hands, calculate the hand strengths, and use that to approximate the distribution of hand strengths:
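The simulation described above can be sketched as follows (names like `hand_strength` are my own, and I use 100,000 deals rather than a million to keep the runtime short). Only a card's rank matters for strength, so the deck can be represented by ranks alone:

```python
import random
from collections import Counter

random.seed(0)  # reproducible sketch

POINTS = {'A': 4, 'K': 3, 'Q': 2, 'J': 1}
# 4 each of A, K, Q, J plus 36 cards worth nothing
DECK = ['A', 'K', 'Q', 'J'] * 4 + ['x'] * 36

def hand_strength(hand):
    return sum(POINTS.get(card, 0) for card in hand)

trials = 100_000
counts = Counter(hand_strength(random.sample(DECK, 13))
                 for _ in range(trials))
pmf = {h: counts[h] / trials for h in sorted(counts)}
for h, p in pmf.items():
    print(f"P(strength = {h:2d}) approx {p:.4f}")
```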

You might notice that at first blush this looks a lot like a Poisson with rate $\lambda = 10$. First, let's consider why the rate might be 10. Let $X_i$ be the points of a given card $i$. Since each of the 13 ranks is equally likely, $\p(X_i=x) = \frac{1}{13}$ for each point value $x \in \{1, 2, 3, 4\}$. The expected points for each card is $\E[X_i] = \sum_x x \cdot \p(X_i=x) = (1+2+3+4)\frac{1}{13}$. Let $H$ be the value of a hand. The value of a hand is the sum of the values of its 13 cards: \begin{align}\E[H] &= \sum_{i\in \{1 \dots 13\}} \E[X_i] \\&= 13 \cdot \E[X_i] \\&= 13 \cdot (1+2+3+4)\frac{1}{13} = 10\end{align} Saying that $H$ is approximately $\Poi(\lambda=10)$ is an interesting claim. It suggests that points in a hand come at a constant rate, and that the next point in your hand is independent of when you got your last point. Of course this second part of the assumption is mildly violated: there is a fixed set of cards, so getting one card changes the probabilities of the others. For this reason the Poisson is a close, but not perfect, approximation.
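We can check how close the approximation is by putting the simulated frequencies next to the $\Poi(\lambda=10)$ PMF. A sketch (helper names are my own):

```python
import math
import random
from collections import Counter

random.seed(1)

POINTS = {'A': 4, 'K': 3, 'Q': 2, 'J': 1}
DECK = ['A', 'K', 'Q', 'J'] * 4 + ['x'] * 36

def poi_pmf(lam, i):
    """PMF of a Poisson random variable with rate lam."""
    return math.exp(-lam) * lam ** i / math.factorial(i)

trials = 100_000
counts = Counter(sum(POINTS.get(c, 0) for c in random.sample(DECK, 13))
                 for _ in range(trials))

for h in [5, 10, 15]:
    simulated = counts[h] / trials
    print(f"strength {h}: simulated {simulated:.4f}, Poisson {poi_pmf(10, h):.4f}")
```

The numbers are in the same ballpark but visibly different, which matches the "close, but not perfect" conclusion above.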

Joint distribution of hand strength among two hands

In most card games it doesn't just matter how strong your hand is, but also the relative strength of your hand and another hand. In bridge, you play with a partner. We know that the two hands are not independent of each other: if I tell you that your partner has a strong hand, that means there are fewer "high value" cards that can be in your hand, and as such my belief in your strength has changed. If you think about each player's hand strength as a random variable, we care about the joint distribution of the hand strengths.
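One way to see this dependence is to simulate your hand and your partner's hand from the same deal and estimate the covariance of the two strengths. A sketch (variable names are my own):

```python
import random

random.seed(2)

POINTS = {'A': 4, 'K': 3, 'Q': 2, 'J': 1}
DECK = ['A', 'K', 'Q', 'J'] * 4 + ['x'] * 36

def strength(hand):
    return sum(POINTS.get(c, 0) for c in hand)

trials = 100_000
pairs = []
for _ in range(trials):
    deal = random.sample(DECK, 26)  # your 13 cards, then your partner's 13
    pairs.append((strength(deal[:13]), strength(deal[13:])))

mean_a = sum(a for a, _ in pairs) / trials
mean_b = sum(b for _, b in pairs) / trials
cov = sum((a - mean_a) * (b - mean_b) for a, b in pairs) / trials
print(cov)  # negative: a strong partner hand makes yours weaker on average
```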

Finally, let's consider the conditional distribution of your partner's points given your points:
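We can approximate this conditional distribution by simulation as well: deal many hands, group the partner's strength by your own, and look at each group. A sketch (names are my own):

```python
import random
from collections import defaultdict

random.seed(3)

POINTS = {'A': 4, 'K': 3, 'Q': 2, 'J': 1}
DECK = ['A', 'K', 'Q', 'J'] * 4 + ['x'] * 36

def strength(hand):
    return sum(POINTS.get(c, 0) for c in hand)

partner_given_yours = defaultdict(list)
for _ in range(200_000):
    deal = random.sample(DECK, 26)
    partner_given_yours[strength(deal[:13])].append(strength(deal[13:]))

# conditional mean of your partner's strength given your strength
for yours in [5, 10, 15]:
    partners = partner_given_yours[yours]
    print(yours, sum(partners) / len(partners))
```

By symmetry, the 40 total points split across all four players, so given that you hold $p$ points, each of the other three hands holds $(40 - p)/3$ points in expectation; the simulated conditional means should hover around that line.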