Are Two Hands Better than One?
by Donald Catlin
16 June 1999
A while ago I was playing blackjack at a downtown casino in Las Vegas. As I often do, I was playing two hands at $25 each. This particular casino allows players to keep their own rating cards. When I finished playing and got my validated rating card, I was surprised to see that I had been rated not at $50 but at $35. When I asked the pit boss about this, she told me that since one hand often pays the other, thereby increasing the playing time for a given buy-in, the reduced dollar figure compensates for this. Well, I'll have to admit that the reasoning sounded plausible. Nevertheless, I wasn't sure, I wasn't happy, and what is more, where did this $35 figure come from?

On this same trip I stopped at another casino just off the Strip to ding my credit and, before playing, asked the pit boss how two $25 hands are rated. He told me they are rated as $50. He went on to say that if two different people were each playing $25, each would receive $25 credit for their time, so it is the same as $50 to the casino. Indeed, that sounded right also. Well, almost. It is conceivable that if each player bought in for the same amount, one player could go broke while the other kept playing. If the same person were playing two hands, money would effectively be shifted from the winning hand to the losing hand, keeping the losing hand in the game longer (and the winning hand in the game for less time?). Confusing, isn't it? It is certainly true, however, that one casino was right and the other was wrong. Which was which? Let's see if we can sort this out with a bit (quite a bit) of mathematics.

To analyze the above question, I am going to make a few simplifying assumptions. For one, I am going to ignore the fact that not all payoffs are even money; that is, there are splits, doubles, and blackjacks. I am also going to ignore the fact that on single hands ties can occur. Finally, in the two-hand case, I am going to assume that the hands are independent. Essentially we are taking the blackjack game and replacing it with a sequence of Bernoulli trials that have the same house edge as the blackjack game. That is to say, if the house edge at a particular blackjack game is e, we find probabilities p and q such that q - p = e, p is the probability of winning, q is the probability of losing, p + q = 1, and the probabilities do not change from trial to trial. An example of a situation in which this model would be exactly correct is a bet on Odd or Even in roulette. Nevertheless, this model is a close enough approximation to our blackjack game to answer the question we are addressing.

I will need two mathematical facts. The first involves the ruin probability Rz. This represents the probability that the player will go broke when starting with a stake of z units, winning or losing one unit per trial, and playing until he either goes broke or reaches a stake of a units. Here is the first fact.

Fact 1: If the function Rz satisfies the difference equation

    Rz = pRz+1 + qRz-1        (1)

with the boundary conditions

    R0 = 1 and Ra = 0        (2)

then

    Rz = [(q/p)^a - (q/p)^z] / [(q/p)^a - 1]        (3)
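For readers who like to check such formulas numerically, here is a small Python sketch (my own illustration, not part of the original argument; the name ruin_probability is just a convenient label). It evaluates formula (3) and confirms that the result satisfies the difference equation (1) and the boundary conditions (2):

```python
# A minimal sketch of Fact 1: evaluate formula (3) and check (1) and (2).
# Assumes p != 1/2, so the ratio q/p is not 1.

def ruin_probability(p, z, a):
    """Probability of going broke starting from z, quitting at a, win prob p per trial."""
    q = 1.0 - p
    r = q / p
    return (r**a - r**z) / (r**a - 1.0)

p, a = 0.4978, 20
q = 1.0 - p
R = [ruin_probability(p, z, a) for z in range(a + 1)]

print(R[0], R[a])  # boundary conditions (2): 1.0 and 0.0
# difference equation (1): Rz = p*Rz+1 + q*Rz-1 for every interior z
print(all(abs(R[z] - (p * R[z + 1] + q * R[z - 1])) < 1e-12 for z in range(1, a)))
```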
Is Fact 1 obvious? Not at all. But once you have the result in (3), it is just a matter of high school algebra to check that (1) and (2) hold. But where did (1) come from? Simple. If you start with stake z, then your chance of ruin is Rz+1 if you win the next trial (probability p) or Rz-1 if you lose it (probability q); averaging these two possibilities gives exactly relation (1).

Fact 2 involves the notion of the expected duration of play Dz, starting with a stake of z units and playing until ruin or stake level a. This quantity will be equal to 1 + Dz+1 with probability p of winning on the next play or 1 + Dz-1 with probability q of losing on the next play, the 1 added to account for that single play. Hence Dz = p(1 + Dz+1) + q(1 + Dz-1). Since p + q = 1 this reduces to Dz = 1 + pDz+1 + qDz-1. Here is Fact 2, stated with a general constant c in place of the 1; we will need the general form later.

Fact 2: If the function Dz satisfies the difference equation

    Dz = c + pDz+1 + qDz-1        (4)

with the boundary conditions

    D0 = 0 and Da = 0        (5)

then

    Dz = c[z/(q - p) - (a/(q - p))(1 - (q/p)^z)/(1 - (q/p)^a)]        (6)
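The same kind of numerical check works for Fact 2. The sketch below (again my own illustration; expected_duration is just a label) evaluates formula (6) and verifies the difference equation (4) and boundary conditions (5):

```python
# A minimal sketch of Fact 2: evaluate formula (6) and check (4) and (5).
# c is the constant term of equation (4); c = 1 is the ordinary one-unit-per-trial game.

def expected_duration(p, z, a, c=1.0):
    """Expected number of trials before ruin or reaching a, starting from z (p != 1/2)."""
    q = 1.0 - p
    r = q / p
    return c * (z / (q - p) - (a / (q - p)) * (1.0 - r**z) / (1.0 - r**a))

p, a, c = 0.4978, 20, 1.0
q = 1.0 - p
D = [expected_duration(p, z, a, c) for z in range(a + 1)]

print(D[0], D[a])  # boundary conditions (5): both 0.0
# difference equation (4): Dz = c + p*Dz+1 + q*Dz-1 for every interior z
print(all(abs(D[z] - (c + p * D[z + 1] + q * D[z - 1])) < 1e-9 for z in range(1, a)))
```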
Very well, we are ready to look at our question. Let us hypothesize a 4-deck game with the dealer standing on soft 17, resplitting twice for three hands, Aces split once, no Surrender, double on any two cards, and double after split. This game has a house edge of about 0.44% of the ante for a Basic Strategy player, which means that p = 0.4978 and q = 0.5022. Let us further hypothesize that we buy into the game for 10 units and play until we reach 20 units or go broke. Using formulas (3) and (6) presented above we find that

    R10 ≈ 0.522 and D10 ≈ 100 hands        (7)

Since the player either loses his 10-unit stake (probability R10) or wins 10 units (probability 1 - R10), the expected return to the casino is

    zRz - (a - z)(1 - Rz)        (8)

which for one hand works out to 10(0.522) - 10(0.478), or about 0.44 units.
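If you would rather let the computer do the arithmetic, a self-contained sketch along these lines reproduces the numbers in (7) and (8) (the variable names are mine, and the printed values are rounded):

```python
# One-hand example: formulas (3), (6) with c = 1, and (8) for z = 10, a = 20.
p, q, z, a = 0.4978, 0.5022, 10, 20
r = q / p

R10 = (r**a - r**z) / (r**a - 1)                                # formula (3): about 0.522
D10 = z / (q - p) - (a / (q - p)) * (1 - r**z) / (1 - r**a)     # formula (6): about 100 hands
loss = z * R10 - (a - z) * (1 - R10)                            # formula (8): about 0.44 units

print(round(R10, 3), round(D10, 1), round(loss, 2))
```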
Very well, now let us suppose that we play 1/2 unit on each of two hands. What can happen? We can win both hands for a gain of one unit; this happens with probability p^2 (here is where the assumption of independence is used). We can win hand 1 and lose hand 2, or win hand 2 and lose hand 1, for no net gain; this happens with probability 2pq. Finally, we can lose both hands for a loss of one unit; this happens with probability q^2. The difference equation for the ruin probability in this case looks like (I'll use RTz for the ruin probability for two hands)

    RTz = p^2 RTz+1 + 2pq RTz + q^2 RTz-1

Since 2pq = 1 - (p^2 + q^2), moving the 2pq RTz term to the left side and dividing by p^2 + q^2 gives

    RTz = [p^2/(p^2 + q^2)] RTz+1 + [q^2/(p^2 + q^2)] RTz-1        (13)
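A quick simulation makes the three step probabilities concrete. The sketch below is purely illustrative and assumes, as the text does, that the two half-unit hands are independent even-money bets:

```python
# Illustrative simulation: per deal, the net result of two independent half-unit
# hands is +1 unit with probability p^2, 0 with probability 2pq, -1 with probability q^2.
import random

random.seed(1)
p = 0.4978
q = 1.0 - p
N = 200_000
wins = pushes = losses = 0
for _ in range(N):
    h1 = random.random() < p      # hand 1 wins
    h2 = random.random() < p      # hand 2 wins
    if h1 and h2:
        wins += 1
    elif h1 or h2:
        pushes += 1
    else:
        losses += 1

print(wins / N, p * p)            # both near 0.248
print(pushes / N, 2 * p * q)      # both near 0.500
print(losses / N, q * q)          # both near 0.252
```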
But look! The quantities p^2/(p^2 + q^2) and q^2/(p^2 + q^2) add up to 1, so they act just like the probabilities in Fact 1. Using this observation, the solution to (13) with the same boundary conditions as (2) can be obtained from (3) by replacing p with p^2/(p^2 + q^2) and q with q^2/(p^2 + q^2). The result, after a bit of algebraic cleanup, is

    RTz = [(q/p)^(2a) - (q/p)^(2z)] / [(q/p)^(2a) - 1]        (14)
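As a sanity check, the following sketch (mine, not the article's) evaluates formula (14) directly and also by plugging the substitute probability p^2/(p^2 + q^2) into formula (3); the two agree:

```python
# Two-hand ruin probability: formula (14) versus formula (3) with substitute probabilities.
p, z, a = 0.4978, 10, 20
q = 1.0 - p
r = q / p

rt_direct = ((r**2)**a - (r**2)**z) / ((r**2)**a - 1)        # formula (14)

p_eff = p * p / (p * p + q * q)                              # p^2/(p^2 + q^2)
q_eff = 1.0 - p_eff                                          # q^2/(p^2 + q^2)
r_eff = q_eff / p_eff
rt_via_fact1 = (r_eff**a - r_eff**z) / (r_eff**a - 1)        # formula (3)

print(rt_direct, rt_via_fact1)                               # both about 0.544
```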
What about the duration for two hands? I'll use the symbol DTz for the expected duration of play using two hands and a stake of z. Here DTz is equal to 1 + DTz+1 with probability p^2 (we win both hands on the next deal), 1 + DTz with probability 2pq (we split the two hands), or 1 + DTz-1 with probability q^2 (we lose both). In symbols:

    DTz = p^2(1 + DTz+1) + 2pq(1 + DTz) + q^2(1 + DTz-1)

Since p^2 + 2pq + q^2 = 1, this reduces to

    DTz = 1 + p^2 DTz+1 + 2pq DTz + q^2 DTz-1

and, moving the 2pq DTz term to the left and dividing by p^2 + q^2, to

    DTz = 1/(p^2 + q^2) + [p^2/(p^2 + q^2)] DTz+1 + [q^2/(p^2 + q^2)] DTz-1

This is just the equation in Fact 2 with c = 1/(p^2 + q^2) and the same substitute probabilities as before, so formula (6) applies. After the algebraic dust settles (the factors of p^2 + q^2 cancel), the result is

    DTz = z/(q - p) - (a/(q - p))[1 - (q/p)^(2z)]/[1 - (q/p)^(2a)]        (21)
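The corresponding check for formula (21) looks like this (again an illustration of mine, using c = 1/(p^2 + q^2) in formula (6)):

```python
# Two-hand expected duration: formula (21) versus formula (6) with substitute
# probabilities and c = 1/(p^2 + q^2).
p, z, a = 0.4978, 10, 20
q = 1.0 - p

r2 = (q / p) ** 2
dt_direct = z / (q - p) - (a / (q - p)) * (1 - r2**z) / (1 - r2**a)        # formula (21)

s = p * p + q * q
p_eff, q_eff, c = p * p / s, q * q / s, 1.0 / s
dt_via_fact2 = c * (z / (q_eff - p_eff)
                    - (a / (q_eff - p_eff)) * (1 - r2**z) / (1 - r2**a))   # formula (6)

print(dt_direct, dt_via_fact2)                                             # both about 199.5
```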
Let us use these new results to examine the two-handed game in our example. Recall that z = 10 and a = 20, with p = 0.4978 and q = 0.5022. Using these numbers, formulas (14) and (21) give us

    RT10 ≈ 0.544 and DT10 ≈ 199.5 deals        (22)
Comparing (22) with (7), we see that the expected playing time has almost doubled from the single-hand case, lending some credence to the downtown casino's point of view. On the other hand, the ruin probability has increased. What really matters, though, is the expected return to the casino in the two-hand scenario. Using formula (8) with the value of RT10 in (22), we calculate

    10(0.544) - 10(0.456) ≈ 0.88 units
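Here is a short sketch (my own illustration) that puts the one-hand and two-hand expected returns side by side and also spells out the 30%/43% rating arithmetic mentioned below:

```python
# Expected return to the casino, one hand versus two, via formula (8).
p, z, a = 0.4978, 10, 20
q = 1.0 - p
r = q / p

R10  = (r**a - r**z) / (r**a - 1)                    # one-hand ruin prob, about 0.522
RT10 = ((r**2)**a - (r**2)**z) / ((r**2)**a - 1)     # two-hand ruin prob, about 0.544

one_hand  = z * R10  - (a - z) * (1 - R10)           # about 0.44 units
two_hands = z * RT10 - (a - z) * (1 - RT10)          # about 0.88 units
print(one_hand, two_hands, two_hands / one_hand)     # the ratio is essentially 2

# Rating $50 of action as $35 understates the expected loss by 30% (35/50),
# so the true loss is about 43% more than the rating implies (50/35).
print(1 - 35 / 50, 50 / 35 - 1)
```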
So there is the answer. The casino should rate the two-hand play as the sum of the units on each hand; playing twice as long, I expect to lose twice as much. By rating my two $25 hands as a $35 average bet, the casino estimates my expected loss at 30% less than my actual expected loss. Put another way, playing two hands, my actual expected loss is about 43% greater than the downtown casino is giving me credit for. Where did they get $35? I haven't the slightest idea.

What about other games? Earlier I mentioned roulette. One could, for example, place one half unit on Red/Black and one half unit on Odd/Even. Here the expected return to the house is 7.568 units (out of 10), whereas the return to the house playing one unit on (say) Odd/Even is only 4.824 units. Notice that, unlike blackjack, the return playing two spots is less than double the return playing one spot. Does this mean the two-spot player should be rated at less than one unit? No indeed. The figure is smaller simply because the duration of play is less than twice the duration of the one-spot player.

In fact, now that we have all of our formulas above, let me show you exactly what is at work here. Recall that the formula for the duration of play is the same for two hands as it is for one, except that the ruin probability is different in each case. Taking either formula (6) (with c = 1) or formula (21) and multiplying it through by q - p, the house expectation per hand, we have

    (q - p)Dz = z - a(1 - Rz) and (q - p)DTz = z - a(1 - RTz)

and the right-hand sides are exactly the expected returns to the house from formula (8), since zRz - (a - z)(1 - Rz) = z - a(1 - Rz).
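And here is a final numerical check (again my own sketch) of that identity for both the one-hand and the two-hand games:

```python
# (q - p) times the expected duration equals the expected return to the house,
# so the return per deal is exactly q - p whether one hand or two are played.
p, z, a = 0.4978, 10, 20
q = 1.0 - p
r = q / p

R10  = (r**a - r**z) / (r**a - 1)
D10  = z / (q - p) - (a / (q - p)) * (1 - r**z) / (1 - r**a)
RT10 = ((r**2)**a - (r**2)**z) / ((r**2)**a - 1)
DT10 = z / (q - p) - (a / (q - p)) * (1 - (r**2)**z) / (1 - (r**2)**a)

print((q - p) * D10,  z - a * (1 - R10))     # both about 0.44
print((q - p) * DT10, z - a * (1 - RT10))    # both about 0.88
```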
In summary, playing two hands at one half unit each must have an expected return per deal to the casino of q - p, the same as that for one hand at one unit, if the correct house expectation is to be realized. What is true is that playing two hands rather than one changes both the ruin probability and the duration of play, but the expected return to the house is still just q - p times that duration. Rating the two-handed player at less than one unit is to estimate his expected loss at less than it really is and is, well, a rip-off.

So if you want to play two hands, which I believe is more fun, you should realize that for a given buy-in and quitting goal you will lose more per playing session on average than if you play a single hand, you'll get almost twice the playing time per session, and unless you are rated at the full sum of your two hands you are getting shortchanged when it comes to comps. The lesson is: before you play two hands of anything, ask the floor person how you will be rated.

The irony in the downtown casino's rating policy is that it discourages two-handed play, yet, as we have seen above, a two-handed player playing to double his stake or go broke will, on average, produce a larger win for the casino than the corresponding one-handed player. Of course, there has been ample evidence through the years that not all casinos understand the business they're in.

See you next month; I'll try to lighten up on the algebra!