Automatic differentiation is a set of techniques for numerically evaluating the derivative of a function specified by a computer program, without resorting to either symbolic manipulation or finite-difference approximation.
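To make the idea concrete, here is a minimal sketch of forward-mode automatic differentiation using dual numbers. This is my own toy illustration, not the implementation of any particular library; real systems support far more operations and modes:

```python
# Toy forward-mode AD via dual numbers: each value carries its
# derivative alongside, and the chain rule is applied per operation.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val = val   # primal value
        self.dot = dot   # derivative, propagated in lockstep

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def derivative(f, x):
    """f'(x), exact to machine precision, in a single forward pass."""
    return f(Dual(x, 1.0)).dot

# f(x) = 3x^2 + 2x, so f'(2) = 6*2 + 2 = 14
print(derivative(lambda x: 3 * x * x + 2 * x, 2.0))  # -> 14.0
```

The point is that no symbolic expression is built and no step size is chosen: the derivative falls out of overloaded arithmetic at roughly a constant-factor overhead over evaluating f itself.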
Quoting from Wikipedia:
These classical methods run into problems: symbolic differentiation leads to inefficient code (unless done carefully) and faces the difficulty of converting a computer program into a single expression, while numerical differentiation can introduce round-off errors in the discretization process and cancellation. Both classical methods have problems with calculating higher derivatives, where the complexity and errors increase. Finally, both classical methods are slow at computing the partial derivatives of a function with respect to many inputs, as is needed for gradient-based optimization algorithms. Automatic differentiation solves all of these problems, at the expense of introducing more software dependencies.
However, I haven't been able to find precise statements about the differences in computational cost. I understand that this depends somewhat on the particular implementation, but can anything general be said? In other words, how does the cost of estimating the derivative of a function by numerical differentiation compare, in general, to the cost of computing it with automatic differentiation?