Considering the recent question "Is there an algorithm whose time complexity is between polynomial time and exponential time?", some commenters observed that merely providing a function with growth strictly between polynomial and exponential answers the question, because it is easy to construct an algorithm with the required complexity. For example, if you choose $f(n) = 2^{\sqrt{n}}$, then the following algorithm has complexity $O(f(n))$:
- Compute $x = 2^{\sqrt{n}}$.
- Increment an integer variable until you reach $x$.
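The two steps above can be sketched as follows (a minimal sketch; I use $2^{\lfloor\sqrt{n}\rfloor}$ so that $x$ is an exact integer, which doesn't change the growth class):

```python
import math

def count_to_f(n):
    """Take Theta(2^floor(sqrt(n))) time by counting up to that value."""
    # Step 1: compute x = 2^floor(sqrt(n)). Here this step is cheap
    # relative to step 2, which is what makes the construction work.
    x = 2 ** math.isqrt(n)
    # Step 2: increment a counter until it reaches x, doing Theta(x) work.
    i = 0
    while i < x:
        i += 1
    return i
```

The point of the construction is that step 2 honestly burns $\Theta(f(n))$ time, so the whole procedure runs in time $\Theta(f(n))$ as long as step 1 is not the bottleneck.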
Sounds easy - but for many functions, step 1 takes longer than step 2. For example, let $f(n)$ be the value of $A(n, n) \bmod 37$, where $A$ is the Ackermann function. Assuming that computing this requires actually computing $A(n, n)$ (and there is no theorem like "$A(n, n)$ is always divisible by $37$"), the procedure above will not produce an algorithm with the required complexity: step 1 alone takes time wildly exceeding $O(f(n))$.
Of course, since this $f$ is $O(1)$, it's easy to construct an algorithm with the required complexity. But it's conceivable that there is some slow-growing hard-to-compute function which does not have the same "cheat".
My intuition, however, is that there is a cleverer technique that will always work. Is there?