In computational number theory, one often computes $f(N)$ using identities of the form
$$F(N) = \sum_{m=1}^N f \left(\left\lfloor\frac N m \right\rfloor\right) $$
where $F(N)$ is easy to compute, say in $O(1)$ time, so the identity can be rearranged as (see Efficient computation of $\sum_{k=1}^n \left\lfloor \frac{n}{k}\right\rfloor$)
$$f(N) = F(N) - \sum_{m=2}^N f \left(\left\lfloor\frac N m \right\rfloor\right)$$
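The memoized recursion can be sketched as follows. Here `F` is a hypothetical placeholder for the "easy" function; choosing $F(N) = N = \sum_{m=1}^N 1$ forces $f \equiv 1$, which gives a cheap sanity check:

```python
from functools import lru_cache

def F(N):
    # Placeholder "easy" function; with F(N) = N the identity forces f(N) = 1.
    return N

@lru_cache(maxsize=None)
def f(N):
    if N == 1:
        return F(1)
    # f(N) = F(N) - sum over m >= 2 of f(floor(N/m)), memoized via lru_cache.
    return F(N) - sum(f(N // m) for m in range(2, N + 1))
```

This naive version still spends $O(N)$ per call on the sum; the rewrites below reduce that.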
There are more elaborate ways to rewrite this, including using $\sqrt{N}$ as a cutoff for switching from computing $f(\lfloor N/m\rfloor)$ to iterating through $f(m)$, and pulling the even-$m$ terms out as $F(\lfloor N/2\rfloor)$.
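One such rewrite, sketched here under the same placeholder $F(N) = N$, groups consecutive $m$ sharing the same quotient $\lfloor N/m\rfloor$ into blocks, so each distinct quotient costs only one recursive call (this shows the block-grouping idea only, not the even-$m$ trick):

```python
from functools import lru_cache

def F(N):
    # Placeholder "easy" function; with F(N) = N the identity forces f(N) = 1.
    return N

@lru_cache(maxsize=None)
def f(N):
    if N == 1:
        return F(1)
    total = F(N)
    m = 2
    while m <= N:
        q = N // m
        m_hi = N // q                   # largest m' with N // m' == q
        total -= (m_hi - m + 1) * f(q)  # the whole block [m, m_hi] shares quotient q
        m = m_hi + 1
    return total
```

Since $\lfloor N/m\rfloor$ takes only $O(\sqrt N)$ distinct values, each call now does $O(\sqrt N)$ work over the blocks instead of $O(N)$ over individual $m$.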
What is the time complexity of computing $f(N)$ if all values are memoized?
For example, $f(100)$ depends on the values $f(\lfloor 100/m \rfloor)$ for $2 \le m \le 100$, namely $f(50), f(33), f(25), f(20), f(16), \dots$, down to $f(1)$, which occurs $50$ times.
How can I analyze this? Clearly at least half of the values are never needed: $f(m)$ is never called for $N/2 < m < N$. The cost of the recursion depends on how many distinct values $\lfloor N/m\rfloor$ takes and on their magnitudes. The sequence $a(N) = |\{\lfloor N/m \rfloor : 2 \le m \le N\}|$ doesn't seem to be on OEIS, unless I computed it incorrectly:
```python
[len(set([N // m for m in range(2, N + 1)])) for N in range(2, 100)]
```
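For what it's worth, it is a standard fact that $\lfloor N/m\rfloor$ takes roughly $2\sqrt N$ distinct values as $m$ ranges over $1 \le m \le N$ (the quotients for $m \le \sqrt N$ are all distinct, and the quotients themselves are at most $\sqrt N$ for $m > \sqrt N$). A quick empirical check, using `a` for the counting function above, confirms the $m \ge 2$ variant tracks $2\sqrt N$ closely:

```python
from math import isqrt

def a(N):
    """Number of distinct values of N // m over 2 <= m <= N."""
    return len({N // m for m in range(2, N + 1)})

# a(N) should stay within a small additive constant of 2*sqrt(N).
for N in range(2, 1000):
    assert abs(a(N) - 2 * isqrt(N)) <= 2
```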