
I was reading the Cormen, Leiserson, Rivest and Stein textbook, Introduction to Algorithms.

The book explains the three asymptotic notations very well. However, there was this paragraph:

Technically, it is an abuse to say that the running time of insertion sort is $O(n^2)$, since for a given $n$, the actual running time varies, depending on the particular input of size $n$.

and this one:

It is not contradictory, however, to say that the worst-case running time of insertion sort is $Ω(n^2)$, since there exists an input that causes the algorithm to take $Ω(n^2)$ time.

  1. I understand why the author said that "it is an abuse to say $O(n^2)$ is the running time": there are best-case inputs that take linear time and worst-case inputs that take quadratic time.

  2. I don't understand why it is not contradictory to say that the worst-case running time of insertion sort is $Ω(n^2)$. Isn't $\Omega(g(n))$ supposed to describe the best-case running time? So shouldn't it be $Ω(n)$?

This confuses me. Can you please explain why $Ω(n^2)$ is correct for insertion sort when it seems it should be $Ω(n)$?

Yuval Filmus
rsonx

1 Answer


Isn't $\Omega(g(n))$ supposed to describe the best-case running time?

Not at all. It just means that for all large enough $n$, there exist inputs of size $n$ where the algorithm takes time at least proportional to $g(n)$.
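In symbols (my restatement, not a quotation from the book): write $T(x)$ for the running time on input $x$, and define the worst-case running time as the maximum over all inputs of a given size. The claim then unpacks as

```latex
W(n) = \max_{|x| = n} T(x),
\qquad
W(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : \forall n \ge n_0,\ W(n) \ge c\, g(n).
```

Note that $W(n)$ is a function of $n$ alone, so a single bad input of each size $n$ is enough to push $W(n)$ up to $c\,g(n)$.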

Can you please explain why $\Omega(n^2)$ is possible for insertion sort when it should be $\Omega(n)$?

The worst-case running time of insertion sort is $\Omega(n^2)$ because for all $n$, there exist inputs of size $n$ where it takes time proportional to $n^2$.

The running time of insertion sort in general is not always $\Omega(n^2)$, as you observed; it depends on the input. But the worst-case running time (which does not depend on the input, only on $n$) is $\Omega(n^2)$.
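To see both facts concretely, here is a small Python sketch (my own, not from the book) that counts the element comparisons insertion sort makes. An already-sorted input of size $n$ costs $n-1$ comparisons (linear), while a reverse-sorted input costs $n(n-1)/2$ (quadratic); the reverse-sorted inputs are the witnesses behind the worst-case $\Omega(n^2)$ bound.

```python
def insertion_sort_comparisons(values):
    """Sort a copy of `values` with insertion sort; return the number
    of element comparisons performed."""
    a = list(values)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements one slot right; each loop test below
        # is one element comparison.
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 100
best = insertion_sort_comparisons(range(n))           # sorted input
worst = insertion_sort_comparisons(range(n, 0, -1))   # reverse-sorted input
print(best)   # n - 1 = 99 comparisons
print(worst)  # n(n-1)/2 = 4950 comparisons
```

The first call shows why calling the running time plainly "$O(n^2)$" is loose (some size-$n$ inputs are far cheaper); the second shows why the worst case is genuinely $\Omega(n^2)$.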

Vincenzo