
Let us say that there are two independent events $A$ and $B$. Event $A$ occurs with probability $0.3$ and event $B$ occurs with probability $0.4$. You can bet on these events with odds of $3:1$ and $7:3$, respectively.

You are given $\$10$ to bet on these two events once. How should you split your money?


Calculating the expected values, I found that A gives you better bang for your buck, so to maximize EV one should put all of their money on A. However, within the context of the problem, I believe one is expected to hedge their bets somewhat, as there is a 70% chance of losing everything by going all in on A.

However, when splitting the money, I am unsure of the best approach: no matter how much you put into event B, you are lowering your expected value in exchange for lower variance. Is there an optimal spread, or does it just come down to one's risk tolerance?

Now let's say that you can bet on these two events as many times as you want. How does your strategy change?

In this case I think that one would use the Kelly Criterion to maximize the growth of their money. But is there any merit to splitting your bet? Or is it better to just keep betting on A according to the Kelly Criterion?

Any help is appreciated!

    Without specifying an objective function it's hard to say more. Generally one wants to have high expectation with low variance, so you need a blend. – lulu Aug 02 '23 at 13:00
  • What would a sample objective function look like? – Danjx Aug 02 '23 at 13:02
  • It should be monotonically increasing because having more money is nicer. It might be concave downwards because the benefit of having another $\$x$ is reduced the more you have. Unless otherwise specified we use expected value and you should bet all on A. – Ross Millikan Aug 02 '23 at 13:56
  • Ratio of expectation to standard deviation is fairly standard. – lulu Aug 02 '23 at 18:47
  • How did you work out that "A gives you better bang for your buck"? If you bet $\$1$ on A you will lose your $\$1$ with probability $0.7$ or win $\$3$ with probability $0.3$, for an expected return of $$0.3\times3-0.7\times1=0.2$$ dollars. On the other hand, if you bet $\$1$ on B you will lose your $\$1$ with probability $0.6$ or win $\$\frac{7}{3}$ with probability $0.4$, for an expected return of $$0.4\times\frac{7}{3}-0.6\times1=\frac{1}{3}$$ dollars, which is greater than the expected return from betting on A. – lonza leggiera Aug 03 '23 at 16:44

1 Answer


For simultaneous, independent events one can (usually) just bet the same fraction of their bankroll on each event that the Kelly Criterion would prescribe if that event were bet on in isolation. I say usually because this breaks down when the individual Kelly fractions are close to $1$: betting each event at its isolated fraction could then commit more than your actual bankroll, so the fractions must be scaled down (see: Kelly Criterion for simultaneous independent bets).

For your question we have:

$$f_a = p_a - \frac{1-p_a}{B_a}, \quad f_b = p_b - \frac{1-p_b}{B_b}$$

$$f_a = \frac{3}{10} - \frac{\frac{7}{10}}{3}, \quad f_b = \frac{4}{10} - \frac{\frac{6}{10}}{\frac{7}{3}}$$

$$f_a = \frac{1}{15}, \quad f_b = \frac{1}{7}$$
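These fractions are easy to sanity-check numerically with exact rational arithmetic; a minimal sketch (the `kelly_fraction` helper name is my own, not from any library):

```python
from fractions import Fraction

def kelly_fraction(p, b):
    """Kelly fraction f = p - (1 - p) / b for a bet that wins
    with probability p and pays net odds b : 1."""
    return p - (1 - p) / b

# Event A: p = 3/10 at 3:1 odds; Event B: p = 4/10 at 7:3 odds.
f_a = kelly_fraction(Fraction(3, 10), Fraction(3, 1))
f_b = kelly_fraction(Fraction(4, 10), Fraction(7, 3))

print(f_a, f_b)  # 1/15 1/7
```

Note that $\frac{1}{15} + \frac{1}{7} = \frac{22}{105} < 1$, so betting both fractions simultaneously stays well within the bankroll here and the "usually" caveat above does not bite.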