I've read that the SD equation is 1.1/square root of N, where N is the number of hands. One standard deviation covers about 68.3% of outcomes, two standard deviations cover about 95%, and three standard deviations cover over 99.7%, I believe. With that said, here is a question broken down to an extremely small sample to make my point.
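To put that formula in concrete terms, here is a minimal Python sketch of it as I've described it (the 1.1 per-hand SD in betting units is the assumed figure from above, and the function name is just for illustration):

```python
import math

def sd_per_hand_pct(n_hands, per_hand_sd=1.1):
    # SD of the average result per hand, as a fraction of the bet,
    # assuming a per-hand SD of about 1.1 betting units
    return per_hand_sd / math.sqrt(n_hands)

# ~68.3% of results fall within 1 SD, ~95% within 2 SD, ~99.7% within 3 SD
print(sd_per_hand_pct(4))      # 0.55, i.e. 55%
print(sd_per_hand_pct(100))    # 0.11, i.e. 11%
print(sd_per_hand_pct(10000))  # 0.011, i.e. 1.1%
```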
Let's say, for mathematical purposes only, that you play just 4 hands and the average bet is $10 per hand. The total wagered is $40. Also assume there are no splits, doubles, or blackjacks.
SD for the above play is 1.1/square root of 4 = 0.55, or 55%. 55% times the 4 hands played = 2.2 hands. 2.2 times the average bet of $10 = $22, so $22 is the SD. With a 1% advantage your expected win is $0.40 ($40 x 1%). So roughly 68% of the time (about 2/3) the result falls between winning $22.40 and losing $21.60. 2 SD is $44 and 3 SD is $66.
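The same arithmetic in a short sketch (the dollar figures and the 1% edge are the assumptions from the example above):

```python
import math

n_hands, avg_bet, edge = 4, 10.0, 0.01                      # example assumptions
total_wagered = n_hands * avg_bet                           # $40
sd_dollars = 1.1 / math.sqrt(n_hands) * n_hands * avg_bet   # $22 (same as 1.1*sqrt(4)*$10)
ev = total_wagered * edge                                   # $0.40

print(f"EV = ${ev:.2f}, 1 SD = ${sd_dollars:.2f}")
print(f"1 SD range: ${ev - sd_dollars:.2f} to ${ev + sd_dollars:.2f}")        # -21.60 to +22.40
print(f"2 SD range: ${ev - 2*sd_dollars:.2f} to ${ev + 2*sd_dollars:.2f}")    # -43.60 to +44.40
print(f"3 SD range: ${ev - 3*sd_dollars:.2f} to ${ev + 3*sd_dollars:.2f}")    # -65.60 to +66.40
```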
Here is my question: how does 2 SD or 3 SD work out when the SD spread is greater than the total amount wagered? 2 SD is $44, which would mean you could win $44.40 or lose $43.60. Those numbers are not possible considering only $40 was wagered in total, and the most you can win is $40 even if you won 100% of the hands.
3 SD is $66, which is far greater than the $40 wagered. Is the math wrong? I know that as you play more and more hands, the percentage for the SD goes down, i.e., 100 hands played would be 1.1/10 = 11%, and 10,000 hands played would give 1 SD of 1.1%. 1.1% of 10,000 = 110 hands, and 110 x $10 = $1,100. Obviously the total amount wagered (10,000 x $10 = $100,000) is vastly greater than that.
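Just to lay out the numbers behind the question, here is a small sketch comparing 1, 2, and 3 SD against the total wagered for the hand counts above (same assumed $10 bet and 1.1 per-hand SD):

```python
import math

avg_bet, per_hand_sd = 10.0, 1.1    # assumptions from the example

for n in (4, 100, 10_000):
    total = n * avg_bet
    sd = per_hand_sd * math.sqrt(n) * avg_bet   # SD of the total result in dollars
    print(f"{n:>6} hands: wagered ${total:>9,.0f}, "
          f"1 SD ${sd:>7,.0f}, 2 SD ${2*sd:>7,.0f}, 3 SD ${3*sd:>7,.0f}")

# At 4 hands, the 2 SD and 3 SD figures ($44 and $66) exceed the $40 wagered;
# at 10,000 hands, even 3 SD (~$3,300) is a small fraction of the $100,000 wagered.
```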
Any thoughts on the calculation based on the 4-hand exercise?