
Robert Sedgewick, in his Algorithms, Part I course on Coursera, states that people usually misunderstand big-O notation when using it to describe the order of growth of algorithms. Instead, he advocates the tilde notation.

I understand that big-O gives an upper bound for a certain problem under certain conditions.

What is the difference between the tilde and big-O notations?

thyago stall

2 Answers


The $\sim$ notation is similar to the more conventional $\Theta$ notation. There are two main differences between $\sim$ and $O$:

  1. $O$ only provides an upper bound, while $\sim$ is simultaneously an upper bound and a lower bound. When we say that the running time of an algorithm is $O(n^2)$, this doesn't preclude the possibility that it is $O(n)$. In contrast, if the running time is $\sim n^2$ then it cannot be $\sim n$.
    Another notation with these properties is $\Theta$.

  2. $O$ only holds up to a constant: $f = O(g)$ if $f(n) \leq Cg(n)$ for some $C > 0$ (and large enough $n$). In contrast, for $\sim$ the implied constant is always $1$: if $f \sim g$ then $f/g \to 1$. This contrasts with $\Theta$ in which the implied constant is arbitrary, and indeed there could be different constants for the lower and upper bounds.
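To make both points concrete, consider $f(n) = 2n^2 + 3n$ (my own illustration, not from the course): $f(n) = O(n^2)$ and also $f(n) = O(n^3)$, $f(n) = \Theta(n^2)$, and $f(n) \sim 2n^2$; but $f(n) \not\sim n^2$, since $f(n)/n^2 \to 2 \neq 1$. So $\sim$ pins down both the growth rate and the leading constant.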

Exact constants are impractical in general, for many reasons: they are machine dependent, hard to compute, and could fluctuate depending on $n$. The first problem can be mitigated by measuring some proxy for the actual time complexity, such as the number of comparisons in a sorting algorithm.
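Here is a minimal sketch of that idea (my own, in Python): it counts comparisons in insertion sort, a machine-independent proxy for running time, and checks that on a reversed (worst-case) input the count is $\sim n^2/2$.

```python
# Minimal sketch (not from the answer): count comparisons in insertion
# sort as a machine-independent proxy for running time. On a reversed
# (worst-case) input the count is exactly n(n-1)/2, which is ~ n^2/2.

def insertion_sort_compares(a):
    """Sort a in place; return the number of element comparisons made."""
    compares = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            compares += 1              # one comparison per inner-loop step
            if a[j - 1] <= a[j]:       # already in order: stop sifting
                break
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return compares

for n in [100, 1000, 4000]:
    c = insertion_sort_compares(list(range(n, 0, -1)))  # reversed input
    print(n, c, c / (n * n / 2))       # ratio approaches 1: compares ~ n^2/2
```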

From sampling the course, it seems that they are using $\Theta$ but calling it the order of growth.

Yuval Filmus

The definitions are as follows: $f(n) = O(g(n))$ if there are constants $c > 0$ and $N_0$ such that $f(n) \le c\, g(n)$ whenever $n \ge N_0$; i.e., $g$ gives an upper bound (up to an unspecified constant). Similarly, $f(n) = \Omega(g(n))$ if there are constants $c' > 0$ and $N_0'$ such that $f(n) \ge c'\, g(n)$ whenever $n \ge N_0'$; i.e., a lower bound (again up to a constant). We say that $f(n) = \Theta(g(n))$ if both of the above hold.

Note that e.g. $n^2 = O(n^3)$, and $n^2 = \Omega(n)$, while $200 n^{1 + 1/n} = \Theta(n)$.

The definition of $f(n) \sim g(n)$ is more demanding:

$$\lim_{n \to \infty} \frac{g(n)}{f(n)} = 1$$

There are no arbitrary constants: the functions grow together. In this sense $\sim$ is similar to $\Theta$, but sharper.
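As a quick numerical check (my own sketch, not part of the answer), one can watch the ratio for the $\Theta$ example above: $200\, n^{1+1/n} / n = 200\, n^{1/n} \to 200$, so the function is $\Theta(n)$ (and in fact $\sim 200n$) but not $\sim n$.

```python
# Minimal sketch: evaluate f(n)/n for f(n) = 200 * n^(1 + 1/n), the
# Theta(n) example above. The ratio settles at 200, not 1, so
# f = Theta(n) and f ~ 200n, but f is NOT ~ n.

def f(n):
    return 200 * n ** (1 + 1 / n)

for n in [10, 100, 10**4, 10**6]:
    print(n, f(n) / n)   # 251.8, 209.4, 200.2, 200.003 -> tends to 200, not 1
```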

vonbrand