10

You and your friend are playing a game. You both start with a score of $0$, and you both start with $\$1$. At each step, you're allowed to bet a fraction of your remaining money, and whoever bets more money wins that round. However, whoever wins the round (bets more money) loses whatever they bet, and whoever loses the round (bets less money) keeps whatever they bet. If you win, your score increases by $1$; if you lose, your score decreases by $1$. The game terminates when a player reaches a score of $-3$ (they lose) or $+3$ (they win). If you bet the same amount of money as your opponent, then your opponent wins the round.

What's the optimal amount of money you should bet in the first round?

I was asked this question for a quantitative research position, but I couldn't solve the problem. They seemed to suggest that the answer was irrational, but I still can't figure it out. Does anyone have any ideas?

An extreme strategy like betting everything in one round isn't feasible, since you lose the money if you win.

Alex Ravsky
  • 106,166
  • 3
    "However, whoever wins the round loses whatever they bet, and whoever wins that round keeps whatever they bet. " Which player keeps his money and which loses it? – Moko19 Sep 01 '20 at 21:21
  • @Moko19 "whoever bets more money wins that round" – saulspatz Sep 01 '20 at 21:22
  • @OP I disagree with your last sentence. Why would betting everything make you lose? You know that at best, your opponent can tie you, so there is no downside. – Rushabh Mehta Sep 01 '20 at 21:25
  • Huh?? "whoever wins the round loses whatever they bet, and whoever wins that round keeps whatever they bet." – David G. Stork Sep 01 '20 at 21:31
  • 1
    Yep, one of those "whoever wins" should be "whoever loses", but which? – Daniel Fischer Sep 01 '20 at 21:32
  • 1
    Also, what happens if both bet the same amount? – Daniel Fischer Sep 01 '20 at 21:35
  • @DanielFischer Whoever bets more money wins the round, and loses the amount that he bet; the other person, who bet less money, loses the round, but keeps the money he bet. A player wins the game when he wins three rounds. I don't see the problem. – saulspatz Sep 01 '20 at 21:36
  • 1
    @saulspatz The OP should clarify that their mistype is interpreted in that way (and still - what happens in case of a tie) – Hagen von Eitzen Sep 01 '20 at 21:39
  • 1
    This sounds more like an auction for the score point than a bet – Hagen von Eitzen Sep 01 '20 at 21:40
  • @HagenvonEitzen I read right over the typo, several times, just mentally supplying an interpretation that made sense. I just couldn't understand what people were objecting to. I agree about ties. – saulspatz Sep 01 '20 at 21:46
  • Hi all. Really sorry about the confusion. I just updated the post. Whoever bets more money loses their money. Whoever bets less money keeps their money. When there's a tie, the opponent wins, so you're at a disadvantage (the game isn't symmetric). –  Sep 01 '20 at 22:03
  • What are your thoughts on the problem? If you've been asked this question in an interview, surely you should have been able to come up with something? – Servaes Sep 01 '20 at 22:25
  • Also, it seems like poor form towards the company to publish a question from a job interview in such a public forum. – Servaes Sep 01 '20 at 22:36
  • So if you bet $0.999999$ a few million times, you're almost certain to reach the $+3$? (if the opponent plays randomly between $(0,1)$) – Déjà vu Sep 01 '20 at 22:46
  • I don't understand: "If you bet the same amount of money as your opponent, then your opponent wins" - If we bet the same, then from my opponent's perspective, he has also bet the same as his opponent (me), so his opponent (me) should win. Thus both win and lose at the same time, no? :) – AntimatterHedgehog Sep 02 '20 at 13:07
  • @e2-e4: If you bet that much and win, you have almost no money left, so you can't bet that much again. – Ross Millikan Sep 06 '20 at 03:02

2 Answers

5

I would argue that the optimal initial bet is $\frac{2 - \sqrt{2}}{2} \approx 0.29$.

To see this, we need a few observations:

  1. Whatever bet you use to go from $0$ to $+1$, the bet you use to go from $+1$ to $+2$ should be the same. Suppose not: let $x$ be the bet you made to go from $0$ to $+1$, and let $y = x + \epsilon$ (with $\epsilon > 0$) be the bet you make to go from $+1$ to $+2$. Your opponent can then bet $x + \epsilon/4$ in the first round and $x + \epsilon/2$ in the second. Now, either:

a) You let your opponent win the first round and then you win the second, in which case the money in the game is $(1 - (x + \epsilon/2), 1 - (x + \epsilon /4))$, so your opponent gained a net advantage. You are doomed to lose if you repeat this strategy.

b) Your opponent lets you win both rounds, so you are now at $+2$ with $(1 - (x + y), 1)$.

c) You let your opponent win both rounds, so your opponent is now at $+2$ with $(1, 1-(2x + 3 \epsilon / 4))$.

Since you want to make sure you are doing as well as your opponent, you need to pick $y$ arbitrarily close to $x$.

Once you reach $+2$ with $(1 - 2x, 1)$, your opponent has to counter this by paying $(1 - 2x)$, so now, we are back at $+1$ with $(1 - 2x, 2x)$.

Note that the game is scale invariant. If you ended up strictly better off (up to scaling), your opponent could simply have copied your strategy; and if you ended up strictly worse off, you will just have to repeat the same game and eventually lose. So we assume the proportions of money are equal, i.e. $(1 - x, 1) \propto (1 - 2x, 2x)$. Setting $1 - x = \frac{1 - 2x}{2x}$ and solving the quadratic for the root less than $1$ gives the answer mentioned before.
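The proportionality condition reduces to a quadratic, which can be sanity-checked numerically. A quick sketch in Python (my addition, not part of the original argument):

```python
import math

# Condition from the answer: 1 - x = (1 - 2x) / (2x),
# which rearranges to 2x^2 - 4x + 1 = 0.
a, b, c = 2.0, -4.0, 1.0
disc = math.sqrt(b * b - 4 * a * c)
x = (-b - disc) / (2 * a)  # the root less than 1

print(x)                       # ~0.292893
print((2 - math.sqrt(2)) / 2)  # the claimed value, same number
```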

E-A
  • 6,075
  • 1
  • 11
  • 21
  • @RossMillikan But I will bet $x$, so if they bet less than $x$, I would win? (Since if I am willing to lose to a bet $y < x$, then we would be back at the origin with me having less money than my opponent) With that said, it is quite possible (likely, even) that there are mistakes in this answer, so thank you for reading through it! – E-A Sep 06 '20 at 03:30
  • If I know you are playing this, I bet $0.3$ the first round, win the point, and bet $0.35$ every round thereafter. I claim I will win. – Ross Millikan Sep 06 '20 at 04:06
  • @RossMillikan Ok, so we get to $+2$ with $(0.35, 1)$, since I will for sure let you win. Now, I have to pay $0.35+\epsilon$ to take us back to $+1$, so that is $(0.35, 0.65 - \epsilon)$. If you pay your remaining $0.35$ to jump back up to $+2$, you are out of money, and I win. Going through an example helps by the way; thanks! – E-A Sep 06 '20 at 04:36
  • 2
    You have made me realize that my claim of equivalence to the winner getting one point and the loser getting zero is wrong. You need to win by three, not to get three. I will think some more. – Ross Millikan Sep 06 '20 at 04:40
0

Denote by $(a, b, \sigma)$ the game state where you have $\$a$, your opponent has $\$b$, and your score is $\sigma$. Clearly the game is scale-independent, so the ratio of your money to your opponent's is all that matters to determine the winner. Also, clearly it can only help you to have more money at any point; so there must be exactly one transition (as the ratio is increased) from losing to winning for each score. Let $R_\sigma$ be the critical ratio when your score is $\sigma$; that is, if the ratio is larger, you win, and if the ratio is smaller, you lose. By symmetry, $R_0=1$. (The winner at the critical ratio for each score could still go either way; certainly it depends on the tie-breaker rules, while the rest of this analysis does not.)

First consider $\sigma=+2$. You win games $(r,1,+2)$ with $r > 1$; for $r \le 1$, your opponent can (must) match your bet, which should therefore be as large as possible, leading to the game $(r,1-r,+1)$, which goes from losing to winning at $r/(1-r)=R_1$. So $R_2/(1-R_2)=R_1$, or $R_2=R_1/(1+R_1)$.

Now consider $\sigma=+1$; you again clearly win for $r > 1$, but for $r \le 1$, your opponent can choose whether to match your bet or not. Let $s(r) \le r$ be your bet. Then your opponent can choose whether the next state is $(r-s(r), 1, +2)$ or $(r, 1-s(r), 0)$. The latter is a loser for you if $r < 1-s(r)$, so you must have $s(r) \ge 1-r$. The former is a loser for you if $r-s(r)< R_2$, so you must also have $s(r) \le r-R_2$. These conditions become incompatible (that is, your opponent can always win) if $1-r>r-R_2$, or $r<(1+R_2)/2$. We conclude that $R_1=(1+R_2)/2$. Combining this with the previous relation between $R_1$ and $R_2$, we find that $R_2=-1+\sqrt{2}$, and that $R_1=\frac{1}{2}\sqrt{2}$.
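The two relations $R_2 = R_1/(1+R_1)$ and $R_1 = (1+R_2)/2$ can also be checked numerically by iterating them to a fixed point. A small Python sanity check (my addition; the iteration is not part of the answer's argument):

```python
import math

# Iterate the relation from the sigma = +2 analysis, R2 = R1 / (1 + R1),
# and the relation from the sigma = +1 analysis, R1 = (1 + R2) / 2,
# until they stabilize, then compare with the closed forms.
R1 = 1.0
for _ in range(200):
    R2 = R1 / (1 + R1)
    R1 = (1 + R2) / 2

print(R1, math.sqrt(2) / 2)  # both ~0.70711
print(R2, math.sqrt(2) - 1)  # both ~0.41421
```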

Finally, let's determine your optimal bet from $(1,1,0)$. If you bet $s$, your opponent can choose whether the outcome is $(1-s,1,+1)$ or $(1,1-s,-1)$. One of these is a clear loser unless $1-s=R_1$, or $$s=1-R_1=1-\frac{1}{2}\sqrt{2}.$$ This is your optimal initial play.
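For what it's worth, this value agrees with the $\frac{2-\sqrt{2}}{2}$ claimed in the other answer; a one-line cross-check (my addition):

```python
import math

s = 1 - math.sqrt(2) / 2        # optimal first bet derived above
other = (2 - math.sqrt(2)) / 2  # value claimed in the other answer

print(s)      # ~0.292893
print(other)  # same number
```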

Note that because of the unfair tiebreaker rules, your opponent will win this game... if you bet exactly the critical amount each round, then your opponent will exactly match your bets and win in three rounds. However, there is a silver lining: you can draw out your defeat, making your opponent spend an arbitrarily large number of rounds reaching her inevitable victory. This can be done by betting your opponent's wealth plus a sufficiently small $\varepsilon$ when the score reaches $-2$.

mjqxxxx
  • 43,344
  • You can upgrade your arbitrarily large to infinitely long if you add $\epsilon/2, \epsilon/4, \epsilon/8, \ldots$ instead. – E-A Sep 14 '20 at 04:00
  • @E-A: I don't believe that's so (though it might be an interesting follow-up question to fill in the details). I think after you win your first point with a bid that is too large by $\varepsilon$, you are doomed to lose in $O(1/\varepsilon)$ additional rounds even if the tiebreaker rules were switched to be in your favor. – mjqxxxx Sep 14 '20 at 20:10
  • @mjqxxxx Nice solution. Preparing for quant research, I was wondering if this method belongs to some well known literature. If so, could you provide some references or pointers? – ASA Aug 07 '24 at 15:07