I was looking at the solution to one of the Code Jam 2014 qualification round questions, but its proof of correctness seems incomplete, and I was wondering if anyone could help me with it. The full question can be found here, but to summarize:
You start out earning cookies at a rate of 2 cookies/second. Your goal is to reach X cookies in the shortest possible time. You can build farms, each of which costs C cookies to build and adds F cookies/second to your rate. How do you find the shortest time needed to get X cookies, given X, C, and F?
Let T(n) be the time it takes to get X cookies if you build exactly n farms (each one as soon as you can afford it) and then stop building. The solution states that to find the shortest time, you calculate T(n) for n = 0, 1, 2, ... and, once you find T(n+1) > T(n), you know T(n) is the answer.
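To make sure I'm describing the procedure correctly, here is a rough sketch of it in Python (the function name and variable names are my own; the only givens from the problem are C, F, X and the starting rate of 2 cookies/second):

```python
def shortest_time(C, F, X):
    """Sketch of the claimed procedure: compute T(0), T(1), ...
    and return T(n) as soon as T(n+1) > T(n)."""
    build_time = 0.0        # total time spent waiting for the first n farms
    rate = 2.0              # production rate once those n farms exist
    best = X / rate         # T(0): build nothing, just wait for X cookies
    while True:
        build_time += C / rate       # wait for the next farm at the current rate
        rate += F                    # the new farm raises the rate by F
        t = build_time + X / rate    # this is T(n+1)
        if t > best:                 # first increase -> claimed to be the minimum
            return best
        best = t
```

(I keep a running sum of the build times instead of recomputing the whole sum for every n, but that is just a convenience and doesn't change the claim I'm asking about.)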
Unfortunately the solution omits a proof that this T(n) is a global minimum. So far, the approach I've been trying is this: since we know T(n) < T(n+1), and also T(n) ≤ T(n-1) (otherwise the algorithm would have stopped earlier), T(n) is a local minimum. However, I don't know how to show that there is only one local minimum, and hence that this local minimum is also the global minimum. I also have
$$T(n) = \sum_{i=0}^{n-1} \frac{C}{2 + iF} + \frac{X}{2 + nF}$$
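(The i-th summand is the time spent saving up for the next farm while running at rate 2 + iF, and the last term is the final wait for X cookies at rate 2 + nF.) As a concrete check with numbers I made up, C = 30, F = 2, X = 100:

$$T(0) = \frac{100}{2} = 50,\qquad T(1) = \frac{30}{2} + \frac{100}{4} = 40,\qquad T(2) = \frac{30}{2} + \frac{30}{4} + \frac{100}{6} \approx 39.17,\qquad T(3) = \frac{30}{2} + \frac{30}{4} + \frac{30}{6} + \frac{100}{8} = 40$$

so the first increase happens at n = 3 and the procedure returns T(2) ≈ 39.17.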
Any help would be appreciated! Thanks!