7

In his textbook "Introduction to the Theory of Computation, Second Edition," Sipser defines nondeterministic time complexity as follows:

Let $N$ be a nondeterministic Turing machine that is a decider. The running time of $N$ is the function $f : \mathbb{N} \rightarrow \mathbb{N}$, where $f(n)$ is the maximum number of steps that $N$ uses on any branch of its computation on any input of length $n$ [...].

Part of this definition says that the running time of the machine $N$ is the maximum number of steps taken by that machine on any branch. Is there a reason that all branches are considered? It seems like the length of the shortest accepting computation would be a better measure (assuming, of course, that the machine halts), since you would never need to run the machine any longer than this before you could conclude whether the machine was going to accept or not.

Raphael
templatetypedef

3 Answers

6

Because you don't know ahead of time whether or not any given input is a 'yes' instance — that is, whether there exists any accepting path — it makes sense for the sake of uniformity to bound the run-time independently of any particular feature of the computational paths. Thus, it makes sense to require the worst-case behaviour to be polynomial time, regardless of whether or not any accepting paths exist.
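This can be made concrete with a toy deterministic simulation of a nondeterministic computation. The tree encoding below (nested lists of branches, with `"accept"`/`"reject"` leaves) is invented for illustration, not Sipser's formalism: on a 'no' instance the simulator learns the answer only after following every branch, so the longest branch is what bounds the cost.

```python
# Hypothetical sketch: a node of a nondeterministic computation tree is
# either a leaf outcome ("accept" / "reject") or a list of successor nodes.

def simulate(tree):
    """Deterministically simulate an NTM run by exploring every branch.

    Returns (accepts, steps_on_longest_branch)."""
    if tree in ("accept", "reject"):
        return tree == "accept", 0
    accepts, longest = False, 0
    for branch in tree:
        a, d = simulate(branch)
        accepts = accepts or a          # accept iff some branch accepts
        longest = max(longest, d + 1)   # worst-case branch length
    return accepts, longest

# A 'no' instance: every branch rejects, so there is no early stopping --
# we must follow even the longest branch (here, 3 steps) to be sure.
no_instance = [["reject", ["reject", "reject"]], "reject"]
print(simulate(no_instance))  # (False, 3)
```

The point of the sketch is that any bound we state in advance must hold for all branches, since we cannot know before running whether an accepting branch exists at all.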

Niel de Beaudrap
3

Ah, but you don't know that the machine will select the shortest accepting path. All nondeterminism gives you is that the machine will select some accepting path, if at least one exists. So in the worst case it selects the longest.

Luke Mathieson
2

The machine in this definition is a decider, which means that it halts on all inputs and accepts if and only if the input is in its language. Hence the maximum over the lengths of all computations is well-defined, and it is the measure that should be investigated.

You are probably thinking of acceptors. Those are Turing machines that halt and accept on inputs in the language, and do anything (other than halt and accept) on inputs outside of the language. In particular, they may never terminate; some definitions even demand (equivalently) that acceptors loop whenever they do not accept. In this case, runtime only makes sense for positive inputs, but you still have to take the longest accepting branch.

You could, of course, consider empowering nondeterminism to choose not only a path towards acceptance, but a shortest such path. The standard notion does not include this, however, so the machine might take the longest path.
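The gap between the two notions can be seen on a small example. The tree encoding below (nested lists with `"accept"`/`"reject"` leaves) is a made-up illustration, not a standard formalism: on a 'yes' instance the shortest accepting branch can be much shorter than the longest branch, which is what the standard definition measures.

```python
# Hypothetical sketch: enumerate all branches of a toy computation tree
# and compare the shortest accepting branch with the longest branch.

def branch_depths(tree, depth=0):
    """Yield (outcome, depth) for every leaf of the computation tree."""
    if tree in ("accept", "reject"):
        yield tree, depth
    else:
        for branch in tree:
            yield from branch_depths(branch, depth + 1)

# A 'yes' instance: one branch accepts after 1 step,
# but the longest branch runs for 3 steps.
yes_instance = ["accept", ["reject", ["reject", "reject"]]]
leaves = list(branch_depths(yes_instance))
shortest_accept = min(d for o, d in leaves if o == "accept")  # 1
longest_branch = max(d for o, d in leaves)                    # 3
```

Under the standard definition, the running time on this input is 3, even though a machine lucky enough to pick a shortest accepting path would finish in 1 step.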

Raphael