Below, assume we're working with an infinite-tape Turing machine.

When explaining the notion of time complexity to someone, and why it is measured relative to the input size of an instance, I stumbled across the following claim:

[..] For example, it's natural that you'd need more steps to multiply two integers with 100000 bits than, say, to multiply two integers with 3 bits.

The claim is convincing, but somewhat hand-wavy. In every algorithm I have come across, a larger input size requires more steps. In more precise words: the time complexity is a monotonically increasing function of the input size.

Is it the case that time complexity is always an increasing function in the input size? If so, why is it the case? Is there a proof for that beyond hand-waving?

Kaveh

3 Answers

Is it the case that time complexity is always an increasing function in the input size? If so, why is it the case?

No. Consider a Turing machine that halts after $n$ steps when the input size $n$ is even, and halts after $n^2$ steps when $n$ is odd.
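Such a machine's step count can be sketched as a function of the input size (a hypothetical Python stand-in for the machine's running time, not an actual Turing machine):

```python
def steps_taken(n):
    """Steps the hypothetical machine takes on an input of size n:
    n steps when n is even, n**2 steps when n is odd."""
    return n if n % 2 == 0 else n * n

# The running time is not monotone: an input of size 5 costs
# 25 steps, while the *larger* input of size 6 costs only 6.
```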

If you mean the complexity of a problem, the answer is still no. The complexity of primality testing is much smaller for even numbers than for odd numbers.
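To illustrate (a simple trial-division sketch, not an optimized primality test): on even inputs greater than 2 the algorithm rejects after a single division, no matter how large the input is, while odd inputs can require many more divisions.

```python
def is_prime(n):
    if n < 2:
        return False
    # Even numbers are settled in constant time, regardless of size.
    if n % 2 == 0:
        return n == 2
    # Odd inputs may need up to ~sqrt(n) trial divisions.
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True
```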

JeffE

Is it the case that time complexity is always an increasing function in the input size? If so, why is it the case? Is there a proof for that beyond hand-waving?

Let $n$ denote the input size. To read the entire input, a Turing machine already needs $n$ steps. So if you assume that an algorithm has to read its entire input (or at least $n/c$ of it, for some constant $c$), you always end up with at least linear running time.


The problem with defining algorithms with a "monotonically decreasing running-time function" is that you have to define the running time for $n = 1$ somehow, and you have to set it to some finite value. But there are infinitely many values $n > 1$, and a non-increasing function into the naturals can decrease only finitely often below that starting value, so you end up with a function that is constant for all but finitely many values.
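The argument can be illustrated with a toy example (an arbitrary made-up function, not any real algorithm's running time): a non-increasing function from the naturals to the naturals can only decrease finitely many times before it must stay constant.

```python
def f(n):
    # A monotonically non-increasing "running time": it decreases
    # on n = 1..9 and is then forced to stay constant forever.
    return max(10 - n, 1)

# f is non-increasing everywhere, yet constant (= 1) for all n >= 9.
```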


Sublinear algorithms, which do not read the entire input, may be of interest to you. See for example http://www.dcs.warwick.ac.uk/~czumaj/PUBLICATIONS/DRAFTS/Sublinear-time-Survey-BEATCS.pdf.
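A classic example in this vein (assuming random access to a sorted input, an illustration not taken from the survey above) is binary search: it answers membership queries in $O(\log n)$ comparisons and inspects only a handful of the $n$ elements.

```python
import bisect

def contains(sorted_list, x):
    """Binary search on a sorted list: O(log n) comparisons,
    touching only a few elements of the input."""
    i = bisect.bisect_left(sorted_list, x)
    return i < len(sorted_list) and sorted_list[i] == x
```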

Christopher

The relation $(\mathbb{N},\leq)$ is well-founded, i.e. there are no infinite strictly decreasing sequences of natural numbers. Since (worst-case) running-time functions map into the naturals, every running-time function is in $\Omega(1)$; in particular, no running-time function can keep decreasing forever.

That said, average-case running times can contain oscillating components; Mergesort is an example.

Raphael