First I'll say something about positive and negative expectation. The sum of the expectations of the two sides in a contest must equal zero. If the house has a disadvantage of 1% on a bet then you must have an edge of 1%. That's all there is to it.
Start with the most extreme case:
The player starts with one unit. The first round of any shoe is
negative expectation. The probability is greater than one half that he
will go bust and stop playing on the first hand. If he doesn't go bust,
he continues to the second round -- depending on the size of the
shoe, this round may also be necessarily negative expectation, and it
is certainly much more likely to be negative than positive, etc.
Now, mathematically, if every time this player goes bust a new player,
also with a one-unit bankroll and the same conditions, starts playing on
a new shoe, the long-term result for the casino will be different
(better for the casino) than the long-term result for players
starting with big banks.
Yes, the result will be different from that for players with big banks. For the players with big banks the results will be more equally distributed among the group, while among a group of small banks the results are skewed in favor of a lucky few.
Even in your extreme example, a finite number of players will lose one unit before the lucky player comes along who wins an infinite amount; thus the players have an edge over the casino. Some guy will get lucky enough on negative expectation bets that his bank grows to a point where he can play a winning spread. Even if only one in 200 wins enough to achieve a 50% risk of ruin after a few shoes, some of those will grow a bank to the point where the ROR is essentially nil. This means that they are favored to acquire all the wealth in the universe before going bust... in fact, you are such a favorite even with a 49% ROR. If even one guy out of 1,000, or one million, is a favorite to win an infinite amount, and the number of guys who go broke is finite, then the players, as a group, have the advantage. The casino has the disadvantage. If the casino has the disadvantage, you must have the edge.
Basically, if the players have an average risk of ruin of 1%, then 1% will go broke and 99% will share the winnings. If the players have an average risk of ruin of 99%, then 99% will go broke and the lucky 1% will enjoy all the winnings. In either case, it is infinite winnings vs finite losses, thus advantage to the players.
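The "finite losses versus unbounded winnings" argument above can be checked with the classic gambler's ruin formula. This is a toy model added purely for illustration (even-money bets with an assumed 1% player edge, not a real blackjack game): a one-unit player's chance of reaching a target bank, times the target, exceeds his one-unit stake, and grows without bound as the target grows.

```python
# Gambler's ruin sketch (toy model, assumed numbers): each player has a
# 1-unit bank and makes even-money bets that win with probability p.
def p_reach_target(p, start, target):
    """Probability of reaching `target` units before going broke,
    starting with `start` units and betting 1 unit per round."""
    q = 1.0 - p
    if p == q:                      # fair game
        return start / target
    r = q / p
    return (1 - r ** start) / (1 - r ** target)

p = 0.505  # assumed 1% player edge on even-money bets
for target in (10, 100, 1000):
    win_prob = p_reach_target(p, 1, target)
    group_net = win_prob * target - 1   # expected net units per player
    print(target, round(win_prob, 4), round(group_net, 2))
```

The expected net per player is positive for every target and keeps growing as the target does, which is the sense in which the group of tiny-bank players still holds the edge even though most of them bust.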
Yes, I'm with you on all that.
And I'm really just splitting hairs
here. But "infinite winnings" are limited by human lifespan.
So we could concoct a situation, say, where one in ten thousand is
expected to make it to a small ROR situation and where his running
edge is so small that he is not expected to live long enough to
win back the whole 10,000 units. Basically, after the first hand the
casino will be up about 100 units (if the game is such that the first
hand of the shoe gives the house a 1% edge). The house net win will
increase in subsequent hands until only players with big enough banks
to spread nominally are left playing. Then the house net win will
begin gradually decreasing. Until the last AP dies :)
Also, if say 1,000 players start and a new
one starts each time one goes bust (and they all have these tiny
bankrolls), then the casino will always have 1,000 players who
know how to AP, yet the casino will be constantly making money.
This is really an artifact of the first few hands of all shoes being
negative expectation, not just of ROR.
The problem is that you don't understand what positive expectation means. Swallow that first, understand what is theoretical and what is practical, and then perhaps you can move on to some more complex concepts. You don't clearly differentiate between win probability and win expectation. Don't confuse the two concepts or you will be unable to advance as an advantage player. N0 is a more advanced concept, which has to do with the length of time one must play to have a high probability of success. You are mixing concepts because you don't have a grasp of the basics. I'd advise you to join greenchip, where you can read much archived material and get up to speed.
Repeat: Wow, you are way off base.
I respect your experience, and appreciate the good advice that you
often give. But this message is nuts. You did not respond to the
actual hypothetical arithmetic that I proposed. Several messages
ago, I acknowledged that this is only a pedagogical question, and
not relevant to how best to implement advantage play. On the other
hand, as anyone who has passed post-baccalaureate science knows, it IS
relevant to understand the logical extremes and hypotheticals of all
situations. Personal attacks in lieu of logical argument set the
whole field backwards.
And another thing, in response to this kind of slight:
"The problem is that you don't understand what positive expectation means"
This is an anonymous forum, but if you would like to exchange a list
of peer-reviewed, first-authored scientific publications (which would
provide a measure as to whether one "understands" advantage arithmetic),
I would be happy to exchange it with you.
and I am sorry if you feel that there was. I truly feel that you should join greenchip and become further educated. It would help you. I believe I answered your questions.
1. If you are attempting to say that underfunded players take a big risk playing because their chances of success are small, then you're right.
2. If you are attempting to say that the casino is a net winner against underfunded advantage players then you are wrong.
3. If you are attempting to say that it takes a long time before some player or players have a strong probability of success, that has to do with the long-run index and the concept of N0. Such things are used to determine game favorability.
4. If your point is something other than the three things I listed above then articulate it more clearly and I will respond.
When someone, after reading your posts, suggests that you need to learn more about the topic and gives you a suggestion as to where you can find good material, it is NOT a personal attack.
but most of the long-time regular contributors know each other. In fact, their handles, which are well known, are more valid than their relatively unknown real names. But you can find much in the way of academic publications on related topics on bjmath.com.
OK, mostly I agree with you. But I think you should also consider
phraseology; wording such as "you don't understand..." can sound like
a personal attack.
In this case, yes, I have no idea what the "Nzero" terminology is.
But that doesn't mean I don't have a basic understanding of the
relevant fundamentals.
I may join Green Chip, but only when I re-start actively playing
a lot, because I question whether it contains anything significantly
more than is available in the books that I already have.
Back to your points 1,2,3:
My specific hypothetical was stated clearly in previous messages.
I don't know what "Nzero" is, but I'm sure it shifts the focus from
the hypothetical that I proposed to the question of how best to
advantage play -- which I have no problem with. But I believe the
simple example I described stands. (And yes, it stands as irrelevant
to actual play, but as a useful articulation of logical extremes.)
I agree that suggestions as to where to learn more are a good thing
and are not personal attacks. But language like "your problem is..."
and "what you don't understand..." can sound like a personal attack,
even if not meant as such.
OK, again you have good points, but missed my point.
I meant standard peer-reviewed publications in an independent math or
science field -- as independent evidence of ability to understand and
do original research in a math-related discipline.
But hey, I put
no stock in such credentials anyhow -- you just got me pissed enough
to mention it. And yes, I do know the identities of some of the
regulars posting here (though surely far less than you know), and
that semi-non-anonymity is why I have always used my same name in
my posts, but I also think it is important to maintain the veneer of
anonymity here for obvious reasons.
You will find things written on all different levels, including works presented at national gaming conferences from distinguished professors, etc. Even Dr Thorp himself sometimes visits. And our own MathProf, who is a regular poster in this forum, is widely recognized as the foremost blackjack mathematician. Much of his work is available on bjmath.com also.
OK, good new info about who is who on bjmath.com.
But back to my goofy little hypothetical case of APs with tiny bankrolls
-- I think it still stands.
Probably if we review all the literature, we'll find that I only
reinvented a wheel that was covered by some previous math treatise.
But it should be acknowledged that way, not as "you don't understand..." or "your problem is..."
Otherwise, what's the whole point of this forum, or even Green Chip
for that matter? Why not just put up a sign that says, "refer to
fundamental mathematics"? Similarly, instead of doing electrical
engineering, we could suffice with a sign that says, "refer to
Maxwell's Equations".
Questions:
1) Even if playing with a small theoretical advantage via counting and
bet spread, if one has such a small bankroll that one's risk of ruin
is large (say greater than 50%), is it really correct to say that one
is playing with an advantage over the house? I think the answer may
be "no" -- in this case the house really has an advantage over you.
It is correct to say that you have the advantage, as the term is used in gambling. Your personal definition of the word "advantage" may be something different. In this world, "advantage" is synonymous with positive expectation.
2) [This one, I think, has not really been addressed.] Consider the
perspective of the casino. Suppose a large number (say 1,000) of
players start playing the game with a count and bet-spread that gives
them a theoretical advantage (say 0.5%), but they all have small
bankrolls giving them 50% risk of ruin. Question: will the casino make or
lose money? [Clearly the answer will involve time -- how long can the
500 players who don't go broke play? And, actuarially, how long will
they live on average...]
The casino loses money, again, on average. What happens in a specific case is of no concern. They may win on this particular thousand because some don't live long enough and lose on another particular thousand. Being underfunded is synonymous with overbetting. Among a large number of overbettors most will go broke and a small minority will win a huge amount. In the hypothetical you give above only half the players will go broke. Given that they started with very small banks the other 50% who win will very easily, and quickly, win more than the less fortunate group. When you play to a 50% risk of ruin, double or bust does not take long at all. And this will be proportional for other risks of ruin. A guy with a 90% risk of ruin will go bust or increase the bank 10 fold pretty rapidly, thus the one guy in ten who does not go bust quickly overtakes the losses of the other nine.
I pulled up a rather poor bj game on the bj calculator, BJRM, to use as an example. This game is S17, DAS with 1.5 decks cut, and only a 1:10 spread used, play-all. The desirability index is barely over 4, yet it is good enough to illustrate the point. We will give the players a 10 unit bank. The spread is $5 to $50, so the bank is $50. This gives a risk of ruin of 98.06%.
An equation that BJRM uses, one that I actually came up with myself, is expectation adjusted for ruin. That is, what does the player expect to win on average given that he will sometimes, or often, go broke. We will assume 2000 planned hours of play. Only 2 players per 100 will make it. The others go bust.
In this example, if each player had an infinite bank and could not go bust, they would have an expected win of $17,700 each. But because of the ruin factor it is much less. The 2000 hour expectation, adjusted for ruin, is $383.41. That's how much the group wins, on average, per player. For 100 players it means that 98 go broke, losing a net of $4900. The remaining 2 players who survive win a combined $43,241.
Of course, the losing players will go broke at various points within the 2000 hours. Some will go broke quickly. Others will go broke after getting off to a rather good start. But, in the end, if you divide the win achieved by the two successful players among the total hours played by all you arrive at the theoretical hourly expectation.
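As a sanity check on the figures quoted above (taking the $50 figure as each player's full $5 x 10-unit bank), the group arithmetic works out to the stated $383.41 per player:

```python
# Check the group arithmetic from the BJRM example (figures from the post).
players = 100
survivors = 2
bank = 50.0                                  # $5 unit x 10-unit bank
busted_loss = (players - survivors) * bank   # 98 players lose $50 each
survivor_win = 43241.0                       # combined win of the 2 survivors
per_player = (survivor_win - busted_loss) / players
print(busted_loss, per_player)               # 4900.0 383.41
```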
Having all of these players with small banks is logically equivalent to a single player with a big bank who employs tight stop-losses, which is a gambling system, or a system of money management. We all know that no system of money management can alter the expectation.
The concept is important to understand as it has real-world implications. One of them is how to play midway through a shoe when you are almost broke and have a high count. Should you scale back bets to finish the shoe? What should you do halfway through a 20-hour trip when x% of the bank has been lost? These are not the exact same questions as those you posed, but they are answered the same way and based on the same principles. Once one has a firm grasp of the basic principles, which are often counterintuitive, it is easier to answer these questions.
Good Morning DD,
Excellent post. I have two points:
1) "In this example, if each player had an infinite bank and could not go bust, they would have an expected win of $17,700 each. But because of the ruin factor it is much less. The 2000 hour expectation, adjusted for ruin, is $383.41."
--I believe we could crank the tiny bankrolls down small enough to
make the adjusted expectation go negative - no?
2) I agree with your points about real world applicability. In fact,
as I was pondering how to get myself out of this mess I started, I
came to the following idea that I think is related to your point on
this, though more simplistic. Regarding this issue, the relevant
"unit" can be seen not as a round or as a max bet but as "a shoe"
-- the number of units of bank necessary to play through a whole shoe
that reveals high positive counts requiring large bets, lose every
hand, and still not go broke.
If the players in these examples considered "a shoe" of bank as the
minimum -- considering themselves bust, and not starting a new shoe,
if their bank was smaller than that amount -- then the amount on
average that the players will win (per hand played) will not be
reduced, because this removes the bias effect of the first rounds of
the shoe starting with negative expectation.
And "a shoe's" worth of bank can be sizeable. In this example with a
$50 max bet, a rough estimate for a six-deck shoe's worth of bank is
about $2,000.
Good Morning DD,
Excellent post. I have two points:
1) "In this example, if each player had an infinite bank and could not go bust, they would have an expected win of $17,700 each. But because of the ruin factor it is much less. The 2000 hour expectation, adjusted for ruin, is $383.41."
--I believe we could crank the tiny bankrolls down small enough to
make the adjusted expectation go negative - no?
No, it can't become negative. The smaller the bankroll of the unsuccessful players the smaller the win needed by the successful ones to overcome their combined losses. If they started with only one unit then the one lucky guy out of a few hundred who wins his first bet and never looks back has to win only a very small amount, a few hundred units, in order to negate all of their losses.
Believe it or not, even if the underfunded players (I'm talking about a bank of only 1 unit) walked up and made a very negative expectation bet, such as a single number at roulette, and then parlayed it once if it won, before moving on to bj, the one guy who wins recoups all the other losses pretty quickly. One person out of 1444 will succeed. He will have 1225 units to play bj and can play with a very small risk of ruin. By the time he merely gets an 18% return on his 1225-unit bj bankroll he has already recouped the losses of the other 1443 losers. The same would be true if the one-unit players all played a positive progression from the start of a bj shoe. One guy wins enough to play a legitimate positive expectation bj game and quickly recovers the cumulative losses of the rest of the group.
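The roulette-parlay numbers above check out with simple arithmetic (assuming a double-zero wheel, where a single-number bet pays 35:1 and hits with probability 1/38):

```python
# One-unit players bet a single roulette number and parlay the 35-unit win.
n_players = 38 * 38            # 1 in 1444 hits twice in a row
parlay_bank = 35 * 35          # 35 units parlayed at 35:1 -> 1225 units
losers_total = n_players - 1   # the other 1443 each lose their 1 unit
# The winner's parlay profit is already 1224 units, so he needs only
# 219 more from blackjack, about an 18% return on his 1225-unit bank.
needed_from_bj = losers_total - (parlay_bank - 1)
print(n_players, parlay_bank, needed_from_bj,
      round(needed_from_bj / parlay_bank, 3))   # 1444 1225 219 0.179
```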
2) I agree with your points about real world applicability. In fact,
as I was pondering how to get myself out of this mess I started, I
came to the following idea that I think is related to your point on
this, though more simplistic. Regarding this issue, the relevant
"unit" can be seen not as a round or as a max bet but as "a shoe"
-- the number of units of bank necessary to play through a whole shoe
that reveals high positive counts requiring large bets, lose every
hand, and still not go broke.
That is actually not the best way to go, and it is also non-intuitive. When running low on funds during a high-count shoe it is best not to scale back bets in order to make it to the finish. You should make your system bets, provided they don't exceed half your money on hand. When your system bet is > half your available cash on hand, you bet half your chips on every bet until you recover sufficiently (to properly play your system) or go bust. The result is that you will go broke much more frequently than you would by cutting back but, over the long run, the few times when you fully recover you will net more than you would have by cutting back. This has been proven using several different methods over all sorts of different scenarios. The thing is that the reduced bets never lower the risk of ruin sufficiently to offset the sacrifice in EV. For example, if you cut bets in half to avoid ruin then you would have to double the expected number of hours played at the reduced stakes to offset cutting the EV in half. You may increase expected hours by 30% or 40%, etc., but you never increase it quite enough to make up the lost ground.
If the players in these examples considered "a shoe" of bank as the
minimum -- considering themselves bust, and not starting a new shoe,
if their bank was smaller than that amount -- then the amount on
average that the players will win (per hand played) will not be
reduced, because this removes the bias effect of the first rounds of
the shoe starting with negative expectation.
While this is what most people do, from a practical standpoint, it is not necessarily mathematically optimal. Given a sufficient intended spread, I suspect the entire stochastic analysis would yield a positive result in most cases. My roulette example above probably logically proves this point.
This could be simulated very easily, but I don't think there is much point. When down to a single bet or two, if one is to continue playing, I'd look to backcount and plop this last-ditch effort down with positive EV.
And "a shoe's" worth of bank can be sizeable. In this example with a
$50 max bet, a rough estimate for a six-deck shoe's worth of bank is
about $2,000.
Yes, again as a practical matter, a shoe-game player must bring a bigger portion of his total bank with him as a session stake than a player of handhelds. It is less a matter of ruin affecting the return than it is the fact that if you consistently go broke before a shoe's end you will cut a large portion of favorable counts out of your lifetime true-count distribution, which will lower your lifetime hourly. When we have sufficient funds we will generally end our session when the count is negative, thus cutting poor counts out of our lifetime true-count distribution. When underfunded we more generally go broke during positive counts, thus walking away from juicy opportunities. This will cause one's overall hourly return to be less than one sees in simulations where it is assumed that bets are placed in all of the high-count situations, which more generally occur near the end of the shoe.
--I believe we could crank the tiny bankrolls down small enough to
make the adjusted expectation go negative - no?
You can't reach a negative amount of time. The reason that the expectation adjusted for ruin is a smaller figure than the infinite-bank figure is that the infinite bank gets the full hourly expectation for all hours of play. With a finite bank the expected win is adjusted downward to match the expected number of hours of play, which is different from the planned hours of play. On a trip where 20 hours of play are planned, for example, a player with a finite ROR will sometimes go broke, such that his average trip hours, over all trips, will equal less than 20 hours. His trip expectation, adjusted for ruin, is equal to the regular hourly theoretical multiplied by the adjusted number of expected hours of play. So you are always multiplying a positive number by another positive number, and the outcome cannot be negative.
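The relation described here can be written as a product of two positive quantities. A minimal sketch with toy numbers (the $8.85 hourly figure is simply $17,700 / 2000 hours from the earlier example; the ~43.3 expected hours is my own back-calculation from the $383.41 adjusted figure, not a number from the post):

```python
# Expectation adjusted for ruin = hourly EV x expected (not planned) hours.
def adjusted_ev(hourly_ev, planned_hours, expected_hours):
    """Both factors are positive, so the result can never be negative."""
    assert 0 < expected_hours <= planned_hours
    return hourly_ev * expected_hours

print(adjusted_ev(8.85, 2000, 2000))    # unbustable bank: full $17,700
print(adjusted_ev(8.85, 2000, 43.32))   # 98% ROR bank: about $383
```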
I'm ready to surrender this hand, especially if I can get half my
wager back.
I figure your math must be correct. So I must be looking at something
different from what you are indicating. Your points about the practicality
of play I understand completely.
But I still do not see one theoretical point as follows. If we consider
both tiny bankrolls and limited duration of play, how can you NOT get
to negative expectation in some extreme cases?
Consider the most extreme case: Ten thousand players, each has
bankroll of one unit, and all players die after one round. The house
will be up 100 units or whatever the first round house advantage yields.
Example 2:
Now consider all non-busted players die after second round of play.
The house will be up around 150 units.
Example 3:
Now consider all non-busted players die after the end of their first
shoe. (These are very slow dealers.) This would be a little more
complicated to show explicitly, but I would bet that the house will
still be up in this case.
So, there is clearly some finite duration of player life --
greater than two rounds -- below which the collective expectation is
negative for the players.
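The arithmetic behind these extreme cases, using the assumed 1% first-round house edge and a roughly just-under-half survival rate (toy figures, not exact blackjack probabilities):

```python
players = 10_000
edge = 0.01                        # assumed house edge on the first round
round1 = players * 1 * edge        # 10,000 one-unit bets -> 100 units
survivors = int(players * 0.49)    # a bit under half survive round one
round2 = survivors * 1 * edge      # roughly 49 more units
print(round1, round1 + round2)     # about 100 and 149 units for the house
```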
I'm sure your math is also correct and you weren't talking about cases
like this. If your formula is multiplying two necessarily positive
numbers, then it must not apply to cases like this.
Consider the most extreme case: Ten thousand players, each has
bankroll of one unit, and all players die after one round. The house
will be up 100 units or whatever the first round house advantage yields.
Note here that the size of the bankroll does not matter. It could be a million units or one unit. But the first round of the shoe is negative. If you place this wager and then die, you will have been playing with negative EV. This means that those who stand to inherit your wealth will have a little less. Now that you have that one, you will do well to completely forget it, as it will hamper your ability to learn more complex things about advantage gambling, where all that has been discussed in this thread is taken as a given... very elementary, not rocket science.
But everything here is based on averages. You can't designate that everyone will die. That would be like me saying that half of them will win their first 100 bets. What you will do is take the number of people participating and determine what percentage of them will die within a given period of time. Out of 10K people, it is most likely that one will survive with sufficient winnings to negate the losses of all others. This is what will happen, on average, which is what positive and negative expectation are all about. Expectations are not based on the worst case scenario, they are based on the weighted average of all possible scenarios.
With a single-unit bankroll at bj: even though you are starting a shoe with a disadvantage, some players will accrue 10 or 20 units by the time they encounter their first positive count, will bet half their money, win a double down or two, and never look back. I have been down to 2 or 3 black chips at a $100 table before and left with several thousand. This has happened a minority of the time, but the time or two when it has been successful has more than offset the losing attempts.