Are Two Hands Better than One?

16 June 1999

    A while ago I was playing blackjack at a downtown casino in Las Vegas.  As I often do, I was playing two hands, $25 each.  This particular casino allows players to keep their own rating cards.  When I finished playing and got my validated rating card, I was surprised to see that I had been rated not at $50 but at $35.  When I asked the pit boss about this, she told me that one hand often pays the other, thereby increasing the playing time for a given buy-in, and that the reduced dollar figure compensates for this.  Well, I'll have to admit that the reasoning sounded plausible.  Nevertheless, I wasn't sure, I wasn't happy, and what is more, where did this $35 figure come from?

    This same trip I stopped at another casino just off the Strip to ding my credit and, before playing, asked the pit boss how two $25 hands are rated.  He told me they are rated as $50.  He went on to say that if two different people were each playing $25, each would receive $25 credit for their time, so it is the same as $50 to the casino.  Indeed, that sounded right also.  Well, almost.  It is conceivable that if each player bought in for the same amount, then one player could go broke while the other kept playing.  If the same person were playing two hands, money would effectively be shifted from the winning hand to the losing hand, keeping the losing hand in the game longer (and the winning hand in the game for less time?).  Confusing, isn't it?  It is certainly true, however, that one casino was right and the other was wrong.  Which was which?  Let's see if we can sort this out with a bit (quite a bit) of mathematics.

    To analyze the above question, I am going to make a few simplifying assumptions.  For one, I am going to ignore the fact that not all payoffs are even money; that is, there are splits, doubles, and blackjacks.  I am also going to ignore the fact that on single hands ties can occur.  Finally, in the two-hand case, I am going to assume that the hands are independent.  Essentially, we are taking the blackjack game and replacing it with a sequence of Bernoulli trials that have the same house edge as the blackjack game.  That is to say, if the house edge at a particular blackjack game is e, we find probabilities p and q such that q - p = e, p is the probability of winning, q is the probability of losing, p + q = 1, and the probabilities do not change from trial to trial.  An example of a situation wherein this model would be exactly correct is a bet on odd or even in roulette.  Nevertheless, this model is a close enough approximation to our blackjack game to answer the question we are addressing.
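
    To make that substitution concrete, here is a minimal Python sketch; the function name bernoulli_probs is mine, not part of the column, and the 0.44% edge is the one used in the example later on.

# Replace the game by even-money Bernoulli trials with the same house edge e:
# q - p = e and p + q = 1, so p = (1 - e)/2 and q = (1 + e)/2.
def bernoulli_probs(edge):
    p = (1.0 - edge) / 2.0   # probability the player wins a trial
    q = (1.0 + edge) / 2.0   # probability the player loses a trial
    return p, q

p, q = bernoulli_probs(0.0044)   # the 0.44% blackjack game used below
print(p, q)                      # 0.4978 0.5022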

    I will need two mathematical facts.  The first involves the ruin probability R_z.  This represents the probability that the player will go broke when starting with a stake of z units, winning or losing one unit per trial, and playing until either he goes broke or reaches a stake of a.  Here is the first fact.

Fact 1:

If the function R_z satisfies the difference equation

R_z = p R_{z+1} + q R_{z-1}    (1)
with boundary conditions
R_0 = 1  and  R_a = 0    (2)
then the function R_z necessarily has the form
R_z = [(q/p)^a - (q/p)^z] / [(q/p)^a - 1]    (3)
Note that the condition in (2) says that if your stake is zero then ruin is certain and if your stake is a then ruin is impossible.

    Is Fact 1 obvious?  Not at all.  But once you have the result in (3), it is just a matter of high school algebra to check that (1) and (2) hold.  But where did (1) come from?  Simple.  If you start with a stake of z, then with probability p you win the next trial and your chance of ruin becomes R_{z+1}, and with probability q you lose the next trial and your chance of ruin becomes R_{z-1}; equation (1) simply averages these two cases, and that is the essence of R_z.
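
    For readers who like to experiment, here is a short Python sketch of formula (3); the function name ruin_probability is mine, and the formula requires p and q to be unequal.

def ruin_probability(z, a, p, q):
    # Formula (3): probability of going broke starting from a stake of z,
    # playing one-unit trials until the stake reaches 0 or a (requires p != q).
    r = q / p
    return (r**a - r**z) / (r**a - 1.0)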

    Fact 2 involves the notion of the expected duration of play D_z, starting with a stake of z units and playing until ruin or stake level a.  This quantity will be equal to 1 + D_{z+1} with probability p of winning on the next play, or 1 + D_{z-1} with probability q of losing on the next play, the 1 added to account for that single play.  Hence D_z = p(1 + D_{z+1}) + q(1 + D_{z-1}).  Since p + q = 1, this reduces to D_z = 1 + p D_{z+1} + q D_{z-1}.  Here is Fact 2.

Fact 2:

If the function D_z satisfies the difference equation

D_z = c + p D_{z+1} + q D_{z-1}    (4)
with boundary conditions
D_0 = 0  and  D_a = 0    (5)
then the solution is necessarily of the form
D_z = [c(z - a)/(q - p)] + [ac/(q - p)] R_z    (6)
Note that (5) says that once the stake reaches either 0 or a, the expected number of plays left is zero.  The reason for the constant c rather than the number 1 will be made clear later.  Again, once (6) is given, it is a simple matter of high school algebra to check that (4) and (5) hold.  Write to me if you would like the details.
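
    Likewise, here is a small Python sketch of formula (6); again the function name is my own, and it simply recomputes R_z from formula (3) internally.

def expected_duration(z, a, p, q, c=1.0):
    # Formula (6): expected number of plays from stake z, quitting at 0 or a.
    r = q / p
    ruin = (r**a - r**z) / (r**a - 1.0)            # formula (3)
    return c * (z - a) / (q - p) + (a * c / (q - p)) * ruin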

    Very well, we are ready to look at our question.  Let us hypothesize a 4-deck game with the dealer standing on soft 17, resplitting twice for a total of three hands, Aces split only once, no Surrender, double on any two cards, and double after split.  This game carries roughly a 0.44% house edge, as a percentage of the ante, for a Basic Strategy player.  This means that p = 0.4978 and q = 0.5022.  Let us further hypothesize that we buy into the game for 10 units and play until we reach 20 units or go broke.  Using the formulas (3) and (6) presented above we find that

R_{10} = 0.521985951    (7)
D_{10} = 99.936 hands
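
    A quick numerical check of these two figures, using nothing but formulas (3) and (6) in Python (the variable names are mine):

p, q = 0.4978, 0.5022
z, a = 10, 20
r = q / p
R10 = (r**a - r**z) / (r**a - 1.0)               # formula (3)
D10 = (z - a) / (q - p) + (a / (q - p)) * R10    # formula (6) with c = 1
print(R10, D10)                                  # about 0.521985951 and 99.936
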
Under these conditions, what is our expected loss?  Well, the casino can expect to win z units with probability R_z and lose a - z units (our win) with probability 1 - R_z.  The expected return for the casino is, therefore,
House Exp = z R_z - (a - z)(1 - R_z)    (8)
          = z R_z - (a - z - a R_z + z R_z)
          = z - a + a R_z
Using the formula (8) that we just derived we find that the expected return to the casino is
House Exp = 0.43972 units (9)
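
    Again as a check, here is formula (8) applied to the numbers above (a sketch of my own, not part of the original column):

p, q, z, a = 0.4978, 0.5022, 10, 20
r = q / p
R10 = (r**a - r**z) / (r**a - 1.0)    # formula (3)
print(z - a + a * R10)                # formula (8): about 0.43972 units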

    Very well, now let us suppose that we play 1/2 unit on each of two hands.  What can happen?  Well, we can win both hands for a gain of one unit; this happens with probability p^2 (here is where the assumption of independence is used).  We can win hand 1 and lose hand 2, or win hand 2 and lose hand 1, for no gain; this happens with probability 2pq.  Finally, we can lose both hands for a loss of one unit; this happens with probability q^2.  The difference equation for the ruin probability in this case looks like (I'll use RT_z for the ruin probability for two hands)

RT_z = p^2 RT_{z+1} + 2pq RT_z + q^2 RT_{z-1}    (10)
which can be rewritten as
(1 - 2pq) RT_z = p^2 RT_{z+1} + q^2 RT_{z-1}    (11)
Now since p + q = 1 it follows that (p + q)^2 = 1, or p^2 + 2pq + q^2 = 1.  From this last relation it follows that 1 - 2pq = p^2 + q^2.  Substituting this last expression for 1 - 2pq in (11) we have
(p^2 + q^2) RT_z = p^2 RT_{z+1} + q^2 RT_{z-1}    (12)
Dividing through by p^2 + q^2 we have
RT_z = [p^2/(p^2 + q^2)] RT_{z+1} + [q^2/(p^2 + q^2)] RT_{z-1}    (13)

But look!  The quantities p^2/(p^2 + q^2) and q^2/(p^2 + q^2) add up to 1, so these act just like the probabilities in Fact 1.  Using this observation, the solution to (13) with the same boundary conditions as (2) can be obtained from (3) by replacing p with p^2/(p^2 + q^2) and q with q^2/(p^2 + q^2).  The result, after a bit of algebraic cleanup, is:

RT_z = [(q/p)^{2a} - (q/p)^{2z}] / [(q/p)^{2a} - 1]    (14)
We will use (14) in a moment.
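
    Formula (14) translates into a couple of lines of Python; the helper name two_hand_ruin is mine.

def two_hand_ruin(z, a, p, q):
    # Formula (14): formula (3) with q/p replaced by (q/p)**2.
    r2 = (q / p) ** 2
    return (r2**a - r2**z) / (r2**a - 1.0)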

    What about the duration for two hands?  I'll use the symbol DT_z for the expected duration of play using two hands and a stake of z.  Here DT_z is equal to 1 + DT_{z+1} with probability p^2 (both hands win on the next deal), 1 + DT_z with probability 2pq (one hand wins and the other loses), or 1 + DT_{z-1} with probability q^2 (both hands lose).  In symbols:

DT_z = p^2(1 + DT_{z+1}) + 2pq(1 + DT_z) + q^2(1 + DT_{z-1})    (15)
Carrying out the indicated multiplications, equation (15) can be rewritten as
DT_z = p^2 + 2pq + q^2 + p^2 DT_{z+1} + 2pq DT_z + q^2 DT_{z-1}    (16)
As noted above, p^2 + 2pq + q^2 = 1, so that (16) becomes
DT_z = 1 + p^2 DT_{z+1} + 2pq DT_z + q^2 DT_{z-1}    (17)
or
(1 - 2pq) DT_z = 1 + p^2 DT_{z+1} + q^2 DT_{z-1}    (18)
As noted earlier, the term 1 - 2pq in (18) is just p^2 + q^2.  Making that substitution and dividing through by it we finally have
DT_z = 1/(p^2 + q^2) + [p^2/(p^2 + q^2)] DT_{z+1} + [q^2/(p^2 + q^2)] DT_{z-1}    (19)
Note that (19) has the form of equation (4) with c = 1/(p^2 + q^2), and the probabilities are exactly the same as those used in deriving (13) and (14).  Thus we can use (6) to solve (19), noting that the R_z appearing in (6) is now the two-hand ruin probability RT_z, and obtain
DT_z = (z - a)/(q^2 - p^2) + [a/(q^2 - p^2)] RT_z    (20)
We can make one further simplification to (20) by noting that q^2 - p^2 = (q + p)(q - p) and that q + p = 1, so that q^2 - p^2 = q - p.  Using this observation, (20) becomes
DT_z = (z - a)/(q - p) + [a/(q - p)] RT_z    (21)
which, coincidentally, is exactly the same form as (6).
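
    Here is the corresponding Python sketch of formula (21); the helper name two_hand_duration is mine, and it recomputes RT_z from formula (14) internally.

def two_hand_duration(z, a, p, q):
    # Formula (21): same shape as (6) with c = 1, but using the
    # two-hand ruin probability RT_z from formula (14).
    r2 = (q / p) ** 2
    ruin = (r2**a - r2**z) / (r2**a - 1.0)
    return (z - a) / (q - p) + (a / (q - p)) * ruin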

    Let us use these new results to examine the two-handed game in our example.  Recall that z = 10 and a = 20; p = 0.4978 and q = 0.5022.  Using these numbers formula (14) gives us

RT_{10} = 0.543887052
and (21) produces
DT_{10} = 199.4866 hands    (22)

Comparing (22) with the single-hand duration D_{10} = 99.936 hands, we see that the expected number of hands has almost doubled, lending some credence to the downtown casino's point of view.  On the other hand, the ruin probability has also increased.  What really matters, though, is the expected return to the casino in the two-hand scenario.  Using formula (8) with RT_{10} = 0.543887052 we calculate

House Expectation = 0.87774104 units (23)
This number is almost double the number in (9).  In other words, playing two hands with 1/2 unit on each, the house expectation is almost double what it was with one hand.  Likewise, the duration of play is approximately double the duration in the single-hand case.
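
    The three two-hand figures above can be reproduced directly from formulas (14), (21), and (8); here is a self-contained Python check (variable names mine):

p, q, z, a = 0.4978, 0.5022, 10, 20
r2 = (q / p) ** 2
RT10 = (r2**a - r2**z) / (r2**a - 1.0)             # formula (14): about 0.543887052
DT10 = (z - a) / (q - p) + (a / (q - p)) * RT10    # formula (21): about 199.4866 hands
print(z - a + a * RT10)                            # formula (8): about 0.87774 units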

    So there is the answer.  The casino should rate the two-hand play as the sum of the units on each hand; playing twice as long, I expect to lose twice as much.  By rating my two $25 hands as a $35 average bet, the casino's estimate of my expected loss is 30% less than my actual expected loss.  Put another way, playing two hands, my actual expected loss is about 43% greater than the downtown casino is giving me credit for.  Where did they get $35?  I haven't the slightest idea.
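
    For the record, the arithmetic behind those two percentages is nothing more than the ratio of the two ratings:

35/50 = 0.70, a 30% understatement;  50/35 ≈ 1.43, about 43% more than the $35 rating reflects.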

    What about other games?  Earlier I mentioned roulette.  One could, for example, place one half unit on Red/Black and one half unit on Odd/Even.  Here the expected return to the house is 7.568 units (out of 10), whereas the return to the house playing one unit on (say) Odd/Even is only 4.824 units.  Notice that, unlike blackjack, the return playing two spots is less than double the return playing one spot.  Does this mean the two-spot player should be rated at less than one unit?  No indeed.  The figure is less than double the one-spot figure simply because the duration of play is less than twice the duration of the one-spot player.  In fact, now that we have all of our formulas above, let me show you exactly what is at work here.

    Recall that the formula for the duration of play is the same for two hands as it is for one except that the ruin probability is different in each case.  Taking either formula (6) (with c = 1) or formula (21) and multiplying it through by q - p, the house expectation per hand, we have

(q - p) D_z = z - a + a R_z    (24)
which is exactly the same as the right-hand side of (8), the formula for the expected return to the house.  In other words
House Expectation = (q - p) D_z    (25)
that is, the house expectation is simply the expected return per hand times the expected number of hands, a fact that agrees with common ordinary horse sense.
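
    As a numerical sanity check on (25), using the example figures already computed above (a Python sketch of my own):

p, q, z, a = 0.4978, 0.5022, 10, 20
r, r2 = q / p, (q / p) ** 2
# Single hand: (q - p) times the duration reproduces the house expectation.
R10 = (r**a - r**z) / (r**a - 1.0)
D10 = (z - a) / (q - p) + (a / (q - p)) * R10
print((q - p) * D10, z - a + a * R10)       # both about 0.43972
# Two hands: (q - p) times the two-hand duration does the same.
RT10 = (r2**a - r2**z) / (r2**a - 1.0)
DT10 = (z - a) / (q - p) + (a / (q - p)) * RT10
print((q - p) * DT10, z - a + a * RT10)     # both about 0.87774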

    In summary, if the correct house expectation is to be realized, playing two hands at one half unit each must be credited with an expected return per deal to the casino of q - p, the same as one hand at one unit.  What is true is that playing two hands rather than one changes both the ruin probability and the duration of play, but the expected return to the house is still just q - p times this duration.  Rating the two-handed player at less than one unit estimates his expected loss at less than it really is and is, well, a rip-off.  So if you want to play two hands, which I believe is more fun, you should realize that for a given buy-in and quitting goal you will, on average, lose more per playing session than if you play a single hand; you'll get almost twice the playing time per session; and unless you are rated at the full sum of your two hands, you are being shortchanged when it comes to comps.  The lesson is: before you play two hands of anything, ask the floor person how you will be rated.

    The irony in the downtown casino's rating policy is that it discourages two-handed play yet, as we have seen above, a two-handed player playing to double his stake or go broke will, on average, produce a larger win for the casino than the corresponding one-handed player.  Of course, there has been ample evidence through the years that not all casinos understand the business they're in.

    See you next month; I'll try to lighten up on the algebra!

Donald Catlin

Don Catlin is a retired professor of mathematics and statistics from the University of Massachusetts.  His original research area was stochastic estimation applied to submarine navigation problems, but he has spent the last several years doing gaming analysis for gaming developers and writing about gaming.  He is the author of The Lottery Book: The Truth Behind the Numbers, published by Bonus Books.

Books by Donald Catlin:

Lottery Book: The Truth Behind the Numbers