How much reward does one need to expect in order to take a "one-time angle shot"?
I'm using a poker term here. angle shot -- (n.) poker table slang meaning "an ethically questionable gesture, motion, word phrasing, or other action, intended to deliberately take advantage of a rules loophole or a moderator's poor judgment, in the hopes of gaining decisive information or a risk-free shot at winning." And now I'll use it in a sentence: "I can't believe that degenerate scumbag tried to pull an angle shot like that!"
But sometimes "angle shot" types of opportunities just fall into your lap, with no attempt at deception or unethical play. Occasionally, fate just smiles on you. Which is what a particular advantage player experienced. Long story short: he was given the opportunity to play the next blackjack hand at any stakes he wanted, almost certainly without angering the pit crew or general surveillance, knowing his first card would be an ace. Now, under those circumstances, the mathematically correct play is obviously to "jam," using another poker term -- i.e., shove as many chips into the betting circle as you can get away with and let the gods of variance pass judgment.
But I think in reality, away from the immaculate chalkboards of probability and statistics, it's a little more complicated than that. How do we know for sure that "jamming" won't contribute to an immediate or future backoff? After all, it's a clear sign of intelligent gambling, which is generally not welcome in casinos.
Here's a hypothetical for your consideration:
Suppose you play a certain number of hours at a certain casino, with a positive expectation because you're a legitimate advantage player, and your estimated expectation for a given year is ... oh, say ... $20,000. And beyond that, you estimate, based on what you've heard from other APs and your various inside sources, that your lifetime winnings can get up to around $125,000 before this particular casino will back you off, suspicion of advantage play or not. But then one day, suddenly, you get an opportunity to take a free angle shot, and it's really juicy. You have a 70% probability of winning, and you can wager as much as $200,000 on it. But win or lose, making this bet runs a 50% risk of getting you instantly barred from the casino forevermore. Would you do it?
Pretty complicated, when you think about it.
First of all, do you have the kind of bankroll where losing a $200,000 bet is acceptable? Probably most of us don't.
Suppose your bankroll isn't an issue. Then it's clear that, if it's worth wagering, it's worth wagering the maximum. The $200,000 bet with a 70% chance of winning has a huge positive expectation: E = 0.7(+$200,000) + 0.3(-$200,000) = +$80,000. Not bad for a few minutes of work!
But now there's a 50% chance your expected lifetime winnings at this casino just got capped at that $80,000, because you're no longer welcome on the premises. The other 50% of the time, you're still welcome, but probably a major blip on the radar for the rest of your career, however long that lasts. Maybe making a giant play extends your longevity, maybe it contracts it. Hard to say. It's complicated.
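Under some simplifying assumptions of my own (you haven't won anything at this casino yet, getting barred simply zeroes out the remaining $125,000 of lifetime expectation, and we stick to pure dollar EV for now), the whole decision can be sketched in a few lines of Python:

```python
# Back-of-the-envelope EV model of the angle-shot hypothetical.
# Assumptions (mine, not the author's): the full $125,000 lifetime cap
# is still unearned future EV, and being barred zeroes it out.

P_WIN = 0.70          # probability of winning the angle-shot hand
STAKE = 200_000       # maximum wager allowed
P_BARRED = 0.50       # chance the bet gets you barred, win or lose
FUTURE_EV = 125_000   # assumed remaining lifetime EV at this casino

ev_bet = P_WIN * STAKE - (1 - P_WIN) * STAKE   # EV of the bet itself
ev_take = ev_bet + (1 - P_BARRED) * FUTURE_EV  # bet EV + surviving future EV
ev_pass = FUTURE_EV                            # walk away, keep grinding

print(f"EV of the bet alone:   ${ev_bet:,.0f}")
print(f"EV of taking the shot: ${ev_take:,.0f}")
print(f"EV of passing:         ${ev_pass:,.0f}")
```

Under these particular assumptions, taking the shot still comes out ahead in raw EV even after the barring risk -- but that says nothing yet about whether you can stomach the variance, which is where the next section comes in.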
There's a branch of financial mathematics known as utility theory that tries to take into consideration the effects of winning and losing money on future actions. A classic example: suppose you have a million dollars and are offered a game that pays even money and gives you a 60% chance of winning. But the catch is you only get to play once, and you have to bet your entire million. Would you play?
The mathematics clearly indicates you should play -- the expectation is +$200,000. But there's no chance I would play this game, and I suspect most of the people reading this would feel the same. It's because your first million dollars radically improves your life, and to have it and then lose it would be devastating. The odds favor that million doubling, but the improvement to my life of doubling the million is not nearly worth the potential devastation of going from millionaire to pauper. The first million is "worth more" than the second; it has a greater "utility."
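This intuition can be made numerical with any concave utility function. The square-root utility below is my illustrative choice, not anything from the original discussion; the point is that expected utility can say "decline" even while expected value says "play":

```python
import math

# The million-dollar example: +$200,000 in EV, yet a concave utility
# function (here sqrt, chosen purely for illustration) says don't play.

WEALTH = 1_000_000
P_WIN = 0.60

def utility(x: float) -> float:
    return math.sqrt(x)

u_decline = utility(WEALTH)                                     # keep the million
u_play = P_WIN * utility(2 * WEALTH) + (1 - P_WIN) * utility(0) # risk it all

ev_play = P_WIN * 2 * WEALTH + (1 - P_WIN) * 0 - WEALTH         # dollar EV of playing

print(f"EV of playing:             ${ev_play:+,.0f}")
print(f"Expected utility, play:    {u_play:,.1f}")
print(f"Expected utility, decline: {u_decline:,.1f}")
```

The expected utility of playing comes out well below the utility of standing pat, which matches the gut feeling: the first million is "worth more" than the second.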
Utility theory -- how does the value of something change as the circumstances of owning it change?
In blackjack, we address this sort of thing when we consider "certainty equivalence." And this article is already long enough, so I'll stop now and let any interested readers pursue the study of this on their own. Or better yet, let's talk about it on Green Chip and educate the AP masses.
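As a teaser for that study, here's one way to compute a certainty equivalent for the angle-shot bet: take the expected utility of the bet, then invert the utility function to find the guaranteed amount you'd trade it for. The log utility and the $500,000 bankroll are my assumptions for illustration, not figures from anyone's actual play:

```python
import math

# Certainty equivalent of the $200,000 angle shot under log utility.
# The $500,000 bankroll is a hypothetical figure chosen for illustration.

BANKROLL = 500_000
STAKE = 200_000
P_WIN = 0.70

# Expected log-wealth after the bet, then invert the utility (exp).
exp_log = (P_WIN * math.log(BANKROLL + STAKE)
           + (1 - P_WIN) * math.log(BANKROLL - STAKE))
ce_gain = math.exp(exp_log) - BANKROLL          # certainty-equivalent gain

ev_gain = P_WIN * STAKE - (1 - P_WIN) * STAKE   # raw dollar EV of the bet

print(f"Expected value of the bet:  ${ev_gain:+,.0f}")
print(f"Certainty equivalent (log): ${ce_gain:+,.0f}")
```

With this bankroll, the certainty equivalent lands at roughly half the raw EV -- the bet is still clearly worth taking, but it's "worth" meaningfully less than $80,000 to a player who can feel the swings.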
Originally published on bj21.com Green Chip, edited for this format.