# The Grey Labyrinth is a collection of puzzles, riddles, mind games, paradoxes and other intellectually challenging diversions.

Persona
Daedalian Member

Posted: Wed Oct 13, 2010 1:51 am    Post subject: 1

Player B offers player A a d6 and the following game:
 Quote: You get to roll a die, and choose to keep or pass on the roll. If you pass, you get up to two more chances to roll, for a total of three rolls. Once you keep your roll (or on the third roll), I will do the same. In the end, if my die is higher than yours, I win; if yours is higher, you win; and if they tie, neither of us wins and we each keep our bet.
Find the dominant strategy for A (fairly easy), the dominant strategies for B given each possible kept roll from A (harder), and the expected value of the game for A, given a 100 dollar bet (Oh the algebra).
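To make the rules concrete, here is a minimal sketch of one round in Python (an editorial addition, not part of the original post). The `a_keep` threshold and B's keep-rolling-until-ahead policy are placeholder assumptions for illustration, not the solved strategies the puzzle asks for.

```python
import random

def play_game(a_keep, rng=None):
    """Play one round: A rolls up to three times, then B does the same.

    a_keep: the set of values A stands on before the forced third roll
            (a placeholder strategy, not the solved optimum).
    B here naively rerolls until his die beats A's kept value.
    Returns +1 if A wins, -1 if B wins, 0 on a tie.
    """
    rng = rng or random.Random()
    a = b = 0
    for i in range(3):           # A's up-to-three rolls
        a = rng.randint(1, 6)
        if a in a_keep:          # stand early on a value A likes
            break
    for i in range(3):           # B's up-to-three rolls
        b = rng.randint(1, 6)
        if b > a:                # stop as soon as B is ahead
            break
    return (a > b) - (a < b)

outcome = play_game({5, 6}, rng=random.Random(2010))
```

Replacing the placeholder policies with the strategies worked out below is exactly the exercise the post sets.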
Jack_Ian
Big Endian

Posted: Wed Oct 13, 2010 9:02 am    Post subject: 2

Sounds like a simpler version of this.

 Jack_Ian (when he hadn't thought it through) wrote: Unless I misunderstood, the strategy for B is just to keep going until he beats A. A hasn't got much of a chance. Let Pb(X) be the probability of being beaten if your score is X.
Pb(1) = probability of B not throwing 3 ones in a row = 1 - 1/(6x6x6) = 0.9954
Pb(2) = 1 - 1/(3x3x3) = 0.962963
Pb(3) = 1 - 1/(2x2x2) = 0.875
Pb(4) = 1 - (2x2x2)/(3x3x3) = 0.7037
Pb(5) = 1 - (5x5x5)/(6x6x6) = 0.4213
Pb(6) = 1 - 1 = 0
Ahh! It's more complicated than that!

Scratch that!
I forgot that B might stop if he has a draw and little chance of winning.
This will take more time (at least more than I have atm).

Back later, hopefully, when I get a break.
Zag
Tired of his old title

Posted: Wed Oct 13, 2010 12:36 pm    Post subject: 3

I'm still not clear on the game. If I roll and get a 3, decide to try again, get a 2, and try once more and get a 1, is my score 1, 3, or 6? (It was your use of the word 'total' that led me to 6, but I don't think you meant it.) I assume it is 1, right? Jack_Ian, B's strategy is more complex because he has to decide whether or not to keep a tie. I suspect that it only gets interesting when A has a 3 or 4. No time left, though, and I want clarification before putting in significant work.
Persona
Daedalian Member

Posted: Wed Oct 13, 2010 1:18 pm    Post subject: 4

Either player's score is only the last die rolled, so 3, 2, 1 results in a 1. There's a LoGD implementation I have in mind that has an algorithm for B which I'm pretty sure isn't ideal. I'll share it if we don't stumble across it for B naturally.
groza528
No Place Like Home

 Posted: Wed Oct 13, 2010 1:47 pm    Post subject: 5 I don't understand why you say B's strategy is harder. Player B doesn't even need to think about his strategy; he stops if his die is higher than A's. I suppose there are times he may want to stop at a tie, but those don't seem like they'd be hard to figure either.
Persona
Daedalian Member

Posted: Wed Oct 13, 2010 1:58 pm    Post subject: 6

More complicated, then. A's strategy is simply to maximize the expected value [citation needed]. B probably has 6 separate strategies, only 2 or 3 of which are degenerate.
Zag
Tired of his old title

Posted: Wed Oct 13, 2010 5:53 pm    Post subject: 7

I'll bet that A's strategy is not that simple, either. For instance, you roll 1, then 4. The expected value of a die roll (3.5) suggests you should not reroll. However, what you want to maximize is the expected value of the bet, not of the die roll, so it depends on the complete analysis for B. I suspect that the EV of your 1/3 chance of improving might be sufficiently better. For instance, consider if, instead of 3 rolls, it were 50. In that case, A's best strategy would be to keep rerolling anything less than a 6, even if he's managed to get to roll 49 without hitting one. With 50 rolls, B is nearly guaranteed to beat anything else -- certainly more than the 1/12 chance you'd need for keeping a 5 vs. trying one more time to be worthwhile.
Changabooniggiwan*
Guest

Posted: Fri Oct 15, 2010 3:31 pm    Post subject: 8

This is very interesting, but strategies will probably be unresolvable without a context for the game. In Persona's original post it appears that a single game is to be played for a $100 stake - this shouldn't be too hard to resolve (I haven't done it yet). It is possible, however, that a series of games could be played for fractions of that stake, and that either player could discontinue the game at any point. There are other variants too, each of which would demand subtly different strategies:

- The players play once for a $100 stake.
- The players play an infinite number of times for a $100 stake per game.
- The players play as many games as they can, for $1 stakes (or some other fixed stake), until one of them has lost $100.
- The players play as many games as they like, for $1 stakes (or some other fixed stake), until one of them has lost $100 or until one chooses to discontinue the game.
- The players play a fixed number of games (say 10), and whoever wins the most games out of the 10 receives $100 from the other player.
- The players play with $100 stakes until the cops burst in, ending their series of games after a random number have been played.

It gets even more interesting if we introduce a harsh penal system and a ruthless guard:

- The players play once to see who lives and who dies.
- The players play as many games as they need until one has won 10 games and is freed; the other gets shot.
- The players play a fixed number of games (say 10), and whoever wins the most games out of the 10 is freed; the other gets shot.
- The players play until the guard gets bored and shoots the one with the lowest number of wins, freeing the other. He may do this after any random number of games.
Trojan Horse
Daedalian Member

Posted: Fri Oct 15, 2010 8:45 pm    Post subject: 9

 Changabooniggiwan* wrote: The players play once for a $100 stake. The players play an infinite number of times for a $100 stake per game. The players play with $100 stakes until the cops burst in, ending their series of games after a random number have been played.

These three seem to require identical strategies.

 Changabooniggiwan* wrote: The players play once to see who lives and who dies.

This one too, assuming that they would play again if there is a tie.
L'lanmal
Daedalian Member

Posted: Fri Oct 15, 2010 10:09 pm    Post subject: 10

You never say whether B gets to see what A rolls. This would seem to make a difference in B's strategy, and whether bluffing strategies for A would need to be considered.

 Persona wrote: Player B offers player A a d6 and the following game: . . . Find the dominant strategy for A (fairly easy), the dominant strategies for B given each possible kept roll from A (harder), and the expected value of the game for A, given a 100 dollar bet (Oh the algebra).

Is the value of the d6 considered in the expected value? It would seem that A would have to put substantial value on the d6 in order to accept.
L'lanmal
Daedalian Member

Posted: Fri Oct 15, 2010 11:18 pm    Post subject: 11

This was more interesting than I thought. Given that B will be told A's final roll, you are incorrect about A maximizing his expected value. Here are my expected payouts for B given A's result, in units of hundreds of dollars per 216 games. Done by hand; corrections welcome.

If A has a 6: -125 (i.e., 125 losses, 91 ties per 216 games)
5: +12 (standing on a 5 if ever rolled, as rerolling would lose 16 times for every 10 won)
4: +117 (rerolling a 4 on the first roll, but standing if it is the 2nd roll; rerolling on the first gives 18 wins for every 9 losses and 9 ties)
3: +171
2: +204
1: +215

Therefore, A should always reroll 4s, as the expected payoff for B at that point (+117) is greater than the average of all 6 scenarios (+99). The interesting calculation is if A rolls a 5 on the first roll. Then he has negative expected value, but should stand anyway, as rerolling only makes things worse. Standing gives away only 12 (hundreds of dollars per 6^3 games), whereas rerolling gives B 47 = (-125 + 12 + 99*4)/6, since we know a 1-4 will be rerolled and a 5-6 stood on from that point on. Combining this all up into an expected value (A always stands on 5/6, always rerolls 1-4s), I get that B is expected to win 2724 hundreds of dollars per 6^6 games played. Apologies for the inelegance of the calculation; a decision-tree approach seemed quickest, and it doesn't translate conveniently to a forum post.
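For anyone who wants to check this table by machine, here is a small exact recursion (an editorial addition, not L'lanmal's). It reproduces the figures above, except that it gives +108 rather than +117 for a kept 4 -- the same correction lexprod's decision tree arrives at further down the thread.

```python
from fractions import Fraction
from functools import lru_cache

@lru_cache(maxsize=None)
def b_value(a, rolls_left):
    """Exact expected value to B (per unit stake) before a roll, when A
    stands on `a` and B has `rolls_left` rolls remaining, playing optimally."""
    total = Fraction(0)
    for b in range(1, 7):
        if b > a:
            total += 1                       # B beats A and stops
        elif rolls_left > 1:
            cont = b_value(a, rolls_left - 1)
            # On a tie B stands (value 0) or rerolls, whichever is better;
            # when behind, rerolling always beats keeping a losing die.
            total += max(Fraction(0), cont) if b == a else cont
        else:
            total += 0 if b == a else -1     # the last roll must be kept
    return total / 6

# B's expected gain, in hundreds of dollars per 216 games, by A's kept value.
per_216 = {a: b_value(a, 3) * 216 for a in range(1, 7)}
```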
Zag
Tired of his old title

 Posted: Sat Oct 16, 2010 12:07 am    Post subject: 12 Nicely done, L'lanmal. I'm glad that my guess in post 7 was correct.
ralphmerridew
Daedalian Member

Posted: Sat Oct 16, 2010 12:12 am    Post subject: 13

This is the sort of problem that falls pretty nicely to dynamic programming. Let S be the number of sides on the die.

Define B(a, n) := expected value to B before rolling (in terms of unit stake), if A shows 'a' and B has 'n' rolls (including this one).
Define A(a, m, n) := expected value to A if he shows an 'a', has 'm' rolls, and B has 'n' rolls.

If I've got my equations right:

B(a, 0) := -1
B(a, n > 0) := ((S - a) + (a - 1) B(a, n-1) + max(0, B(a, n-1))) / S
A(a, 0, n) := -B(a, n)
A(a, m > 0, n) := max(-B(a, n), sum over k = 1 to S of A(k, m-1, n) / S)

Memoize and compute recursively.
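A direct transcription of these equations into Python (an editorial addition; exact arithmetic via `fractions`, function names following ralphmerridew's notation). It reproduces L'lanmal's per-value payouts with lexprod's corrected +108 for a kept 4, and puts the overall value of the game at 2580/6^6 in B's favour rather than 2724, consistent with that correction.

```python
from fractions import Fraction
from functools import lru_cache

S = 6  # sides on the die

@lru_cache(maxsize=None)
def B(a, n):
    """Expected value to B before rolling: A shows `a`, B has `n` rolls left."""
    if n == 0:
        return Fraction(-1)
    cont = B(a, n - 1)
    # (S - a) winning faces, (a - 1) faces below `a` (reroll), one tie face
    # where B keeps the better of standing (0) or rolling on.
    return (Fraction(S - a) + (a - 1) * cont + max(Fraction(0), cont)) / S

@lru_cache(maxsize=None)
def A(a, m, n):
    """Expected value to A showing `a` with `m` rerolls left; B gets `n` rolls."""
    stand = -B(a, n)
    if m == 0:
        return stand
    reroll = sum(A(k, m - 1, n) for k in range(1, S + 1)) / S
    return max(stand, reroll)

# A's first roll is forced, leaving two rerolls; B then gets three rolls.
value_to_A = sum(A(k, 2, 3) for k in range(1, S + 1)) / S
```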
Persona
Daedalian Member

Posted: Sat Oct 16, 2010 1:12 am    Post subject: 14

Very nice work indeed! Now for the Casino Variant (and I see why A needs the help!). After careful observation, A determines that B plays by the following policy:

1) If A stands on a 6, stand on the first 6 rolled.
2) If A stands on any other number, accept only a win on the first roll, and any win or tie on the second.

Specifically, if A stands on a 5, B won't stand on the first 5 rolled, but will on the 2nd. Likewise, if A stands on a 1, B won't stand on the first 1 rolled, but will on the 2nd (silly casino). How much does this tilt away from the ideal play?
lexprod
NOT not a title

Posted: Sat Oct 16, 2010 1:55 am    Post subject: 15

I was working it out the same way, though I did not get as far as L'lanmal. I did get to B's strategy playing against 4s, and had that B would gain 50 for every 100, or 108 for every 216. For me it came out to 2/3 wins - 1/6 losses. And 152 for every 216 for 3s (7/8 - 1/12), and 184 for 2s (26/27 - 1/216). Am I doing something wrong? Likely? XD I got the same scores for 5s and 6s.
_________________
I'll have a P please...
lexprod
NOT not a title

Posted: Sat Oct 16, 2010 3:00 am    Post subject: 16

Casino rules, per 216 games:
A has 6, casino loses 125
A has 5, casino gains 6
A has 4, casino gains 108
A has 3, casino gains 168
A has 2, casino gains 198
A has 1, casino gains 210

Overall unweighted average house gain: 94 1/6 for every 216. This would make A's best strategy the same as before, I believe, since sticking with a 4 is worse than the average loss. I haven't adjusted the average house gain for the player being more likely to end on higher numbers, but as it stands that's an almost 45% house edge. For comparison, the largest house edge among standard casino games is keno, at 25-30 percent; craps bets vary from a 1-12% edge, and roulette is 3-5%. Basically what I'm saying is that winning should have a much bigger payout to make anyone at a casino play this. That or maybe the dealer rolls on anything except a win for the first two rolls, and still pushes ties on the third. Crunching numbers...
_________________
I'll have a P please...
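These casino-rules figures can be checked by brute force over the 216 equally likely house roll sequences. A sketch (an editorial addition, assuming the fixed policy Persona describes):

```python
from itertools import product

def sign(x):
    return (x > 0) - (x < 0)

def casino_gain_per_216(a):
    """House's net gain, summed over the 216 equally likely roll triples,
    when the player stands on `a` and the house follows the stated policy:
    chase a 6 outright if a == 6; otherwise keep only a win on roll 1,
    a win or tie on roll 2, and whatever comes on roll 3."""
    total = 0
    for r1, r2, r3 in product(range(1, 7), repeat=3):
        if a == 6:
            final = r1 if r1 == 6 else (r2 if r2 == 6 else r3)
        elif r1 > a:          # first roll: keep only an outright win
            final = r1
        elif r2 >= a:         # second roll: keep a win or a tie
            final = r2
        else:
            final = r3        # third roll: forced
        total += sign(final - a)
    return total

gains = {a: casino_gain_per_216(a) for a in range(1, 7)}
```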
lexprod
NOT not a title

Posted: Sat Oct 16, 2010 5:42 am    Post subject: 17

Actually, given A (the gambler) holding 5s and 6s and rerolling 1-4s, the house edge is only 4.1%, which actually makes it a viable casino game. Who wants to set up a booth in Vegas?
_________________
I'll have a P please...
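The 4.1% figure follows from weighting the per-216 house gains from the previous post by how often a stand-on-5s-and-6s player ends on each value. A quick exact check (editorial arithmetic, assuming that player strategy):

```python
from fractions import Fraction

# House gain per 216 games for each value the player stands on (post 16).
gain = {1: 210, 2: 198, 3: 168, 4: 108, 5: 6, 6: -125}

# A player who stands on 5s and 6s and rerolls 1-4s (keeping the forced
# third roll) ends on 5 or 6 with probability 19/54 each, and on each of
# 1-4 with probability (4/6)*(4/6)*(1/6) = 2/27.
p = {a: Fraction(2, 27) for a in range(1, 5)}
p[5] = p[6] = Fraction(19, 54)
assert sum(p.values()) == 1

house_edge = sum(p[a] * Fraction(gain[a], 216) for a in range(1, 7))
# house_edge works out to 475/11664, about 4.07%
```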
L'lanmal
Daedalian Member

Posted: Sat Oct 16, 2010 1:17 pm    Post subject: 18

 lexprod wrote: I was working it out the same way, though did not get as far as l'lanmal I did get to B's strategies on playing against 4s, and had that B would gain 50 for every 100, or 108 for every 216. For me it came out 2/3 wins - 1/6 losses. And 152 for every 216 for 3s (7/8-1/12), and 184 for 2s (26/27-1/216). Am I doing something wrong? Likely? XD I got the same scores for 5s and 6s.

 Code:
First roll:
  5-6 -> 72 W
  1-3 -> Second roll:
           5-6 -> 36 W
           4   -> 18 D
           1-3 -> Third roll:
                    5-6 -> 18 W
                    4   ->  9 D
                    1-3 -> 27 L
  4   -> 36 D, or reroll:
           Second roll:
             5-6 -> 12 W
             4   ->  6 D
             1-3 -> Third roll:
                      5-6 ->  6 W
                      4   ->  3 D
                      1-3 ->  9 L

Total (rerolling the first 4): 144 W, 36 D, 36 L = +108  >  126 W, 63 D, 27 L = +99 (standing on it)

That makes my payout wrong, but I don't believe it changes any of the strategy decisions.

Re 3s: 7/8s is 189/216, 1/12 is 18/216, so that's 171 right? It has to be an odd payoff since the only way to tie is roll 1-3, 1-3, then 3... the product of 3 odd cardinalities.

Re 2s: You are undercounting losses. (2,2,1) would be a loss (not a tie), as you are counting (2,2,3) as a win.
Zag
Tired of his old title

Posted: Sat Oct 16, 2010 5:03 pm    Post subject: 19

If you're trying to devise a casino game, your target should be that (1) the player gets to make several decisions, (2) it is fairly hard to figure out, (3) some of the decisions are non-intuitive, and (4) correct play by the player results in a house edge of 1.5% to 3%. (Also, the house play must be completely mechanical, like what lexprod suggests.) I was thinking something along these lines. I haven't worked out the house edge, but my gut tells me that this is close enough that it can be brought into the right range with the tweaks that I suggest. It definitely meets my first two qualifications, and I suspect the third as well.

Player makes a bet and BOTH the house and player roll their first die. (The player can see both results.)
- Player can stand, making no additional bet, and he is done. (The house still has two more rolls if it needs them.) OR
- Player can increase his wager by half and roll again. (i.e., he must pay to roll again this first time; he doesn't have to pay for the third roll.)
- Player can then choose to stand and make no additional bets. (If the house has too much edge, I was considering that the player may choose to increase his bet here, by half the original amount.) OR
- Player can roll again and must keep this result. He may, if he wants, increase his bet by half the original amount.
The house then rolls twice more, according to lexprod's algorithm.

If the house is behind here, I would say that the house wins all ties except 6-6 ties, which are pushes. If the house is too far ahead, then I would add that the player wins double on a 6-1 win. I suspect that the player's best play is to sacrifice on the first roll when the house rolls a 6 first, unless he also has a 6. I'm not sure on a 5.

-----------

If I were designing this game for sale to casinos, I would create a die-roller mechanism where the die is inside a cage that is shaken when the player pushes a button. The player's button is not active unless the dealer pushes a button in his space to activate it (for one push only). The biggest problem is that the dealer's mechanic is based on the results of the player, which means that you can't have 6 players playing against a single dealer, like in blackjack (or Caribbean stud, or any of a hundred other games of this sort). So I'd also be interested in a dealer decision rule that is even simpler, something like the following, which would enable him to play one set against any number of customers.

First roll, stands on any 6 (or maybe 5 or 6).
Second roll, stands on any 5 or 6 (or maybe 4, 5, or 6).

Note that you can have the same effect as the extra bets by making the player put it all out up front and then allowing him to take some back. This is how a number of new table games do it, and I suspect that it is better, psychologically, for the house if they do. In other words, the player position has three circles in front, arranged vertically toward him. He puts $10 in the highest circle and $5 in each of the other two. If he wants to surrender after the first roll, he can take back the two $5 bets. After the second roll, he can take back only the closer of the two $5 bets, or leave it there. You might say that he always takes it back to get the third roll. Notice how that is the same as the game I described above, but it "feels" very different, as if the player is 'saving' some money by taking the third roll.
lexprod
NOT not a title

Posted: Sat Oct 16, 2010 9:39 pm    Post subject: 20

One way to make player rolls more "fair" is the craps-style rule of having to toss past a certain distance. Of course, raking back each player's roll could take forever (though you could take the high-roller [Trebek game show] method and use a conveyor belt to whip the dice back, and have the other half of the table felted). I think the game works better where 90% of the decisions are just roll again or not, rather than bet fiddling. I tried curbing the house to 2 rolls, standing on only 5 and 6 on the first roll, but with perfect play it was a negative edge :X Maybe we can make it more like blackjack: the dealer rolls two dice, one in a clear cup, the other in an opaque cup. The players all see one die, and each roll their own pair (somehow securely). Everyone can then take up to two "hits" of rerolling both dice. (Maybe add some BJ options like "double down" if you only reroll one die once, split pairs, etc.) The idea here is you make your roll-or-stop decisions based on the die you can see. Dealer behaviour would have to be something like: 1st roll stand on 9+, 2nd roll stand on 7+, 3rd roll take whatever's there. Players would simply have up to two rerolls to try to get as high as possible. So seeing a 1 or 2, they know the dealer is going to roll at least once more, since there's no chance they have a nine, while seeing a 6 means there's a 50-50 chance of a 9 or better. I'm actually gonna crunch the numbers on a simpler version first to see how it shakes out: same as above but only 1 reroll max for players and dealer, and the dealer stands on 8+. Ties push, but the dealer wins if he has doubles and you do not (just to add a little flair into the house edge). PS: L'lanmal, you are right about my numbers, thanks ^_^
_________________
I'll have a P please...
Changabooniggiwan*
Guest

Posted: Mon Oct 25, 2010 9:33 am    Post subject: 21

It's looking as if this one is exhausted as it stands, and I can confirm L'lanmal's results (having brute-forced it in Excel). I have a couple of variants on the original that might prove interesting:

Variant A (the 4-roll limit)
A maximum of 4 rolls are permitted in the game. Player A may make 1, 2 or 3 rolls to make his score, and Player B is permitted as many of the leftover rolls as he likes. A's final score is disclosed. So, if A rolls a single die, B gets to roll up to three times. If A rolls twice, B can roll up to two times. And if A rolls three times, B can only roll once. This one is pretty easy to analyse, but there are some interesting decisions for A to make where he weighs the strength of his score against B's advantage of having more rolls.

Variant B (reroll disclosure)
Like the original, Player A makes up to three rolls, then B makes up to three rolls. Unlike the original, A does not disclose his score, only the number of rolls he made. This one has given me problems, to say the least. It has a nasty habit of going round in circles.

For each variant, find the dominant strategies for both players and their expectation from 216 games with $1 stakes (to make the maths nicer).
ralphmerridew
Daedalian Member

Posted: Mon Oct 25, 2010 1:15 pm    Post subject: 22

Define a "base strategy" as any combination of:
- Reroll if the first die shows X or lower
- Reroll if the second die shows X or lower

There are 36 base strategies. A can have any of the 36. B potentially has 36^3 = 46656 strategies (choose base strategy X if A doesn't reroll, base strategy Y if A rerolls once, and base strategy Z if A rerolls twice). For each of the 36 * 36^3 strategy combinations, calculate the expected value to A. Put those in a grid and feed it to the simplex algorithm to determine appropriate mixed strategies for A and B.

----

That is making some assumptions on my part:
- Whether or not a person should reroll the second die is independent of what value the first die held (which I'm pretty sure of).
- If a person should reroll a particular die when it holds a particular value, s/he should also reroll in a situation where it holds a lower value but is otherwise identical. (Given the result of the poker bluffing problem, I definitely doubt this one as far as A goes. I think it's true for B, but I wouldn't be shocked to be proven wrong.)

That can be somewhat remedied by treating A as having 2^12 = 4096 strategies (any subset of { "Reroll when the Xth die shows a Y" } for Xth in (first, second), Y in {1, 2, 3, 4, 5, 6}). I think it's infeasible to handle all of B's 2^36 = 69 billion strategies; even removing all "reroll when the Xth die shows a 6" choices only knocks it down to 1 billion. But since B has no need to bluff, I think analyzing the 46656 strategies (or even 25^3) will be sufficient.
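For the final step -- turning the payoff grid into mixed strategies -- the simplex method is the standard tool, but any zero-sum solver works. As a lightweight stand-in, here is a fictitious-play sketch (an editorial addition; the 2x2 matching-pennies matrix is a toy placeholder for the real 36 x 46656 grid):

```python
def fictitious_play(M, iters=20000):
    """Approximate optimal mixed strategies for a zero-sum matrix game.

    M[i][j] is the payoff to the row player. Each side repeatedly
    best-responds to the opponent's empirical mixture; in zero-sum games
    the empirical frequencies converge to optimal strategies (Robinson, 1951).
    Returns the row player's empirical mixed strategy.
    """
    rows, cols = len(M), len(M[0])
    row_counts = [0] * rows
    col_counts = [0] * cols
    row_counts[0] = col_counts[0] = 1          # arbitrary opening plays
    for _ in range(iters):
        # Row maximizes against the column player's empirical mixture.
        i = max(range(rows),
                key=lambda r: sum(M[r][c] * col_counts[c] for c in range(cols)))
        # Column minimizes the row player's payoff against row's mixture.
        j = min(range(cols),
                key=lambda c: sum(M[r][c] * row_counts[r] for r in range(rows)))
        row_counts[i] += 1
        col_counts[j] += 1
    total = iters + 1
    return [c / total for c in row_counts]

# Toy check: matching pennies, whose optimal mixture is (1/2, 1/2).
strategy = fictitious_play([[1, -1], [-1, 1]])
```

An exact LP/simplex solve would give the mixture precisely; fictitious play trades precision for a few lines of code, which is enough to sanity-check a grid before committing to the full 46656-strategy computation.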