
I first learned about Big O notation in an introductory algorithms class, where the professor wrote statements of the form $g \in O(f(n))$. Afterwards, in Discrete Math, another professor, without knowing of the first, told us that we should never write it that way and that it should instead be $g \in O(f)$, where $g$ and $f$ are functions. Which of these is right, why, and if both are, what is the difference?


1 Answer


There are two views on this: a formal one and a pragmatic one.

Formally, you always have $O(f(n)) = O(1)$ as long as $f : \mathbb{N} \to \mathbb{N}$, since $f(n)$ is just a number, i.e. a constant function. $O(f)$, on the other hand, does what you want.
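
For concreteness, here is one common set-based definition of $O$ (a sketch; conventions differ between textbooks):

$\qquad\displaystyle O(f) = \{\, g : \mathbb{N} \to \mathbb{N} \mid \exists c > 0.\ \exists n_0 \in \mathbb{N}.\ \forall n \geq n_0.\ g(n) \leq c \cdot f(n) \,\}$.

Under this reading, $O$ takes a function as its argument; handing it the number $f(n)$ yields $O$ of a constant, which is exactly $O(1)$.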

However, you often want to indicate which variable goes to infinity¹. If you let $f : x \mapsto x^2$ and state

$\qquad\displaystyle 2n + k \in O(f)$,

which variable should be identified with $x$? Writing

$\qquad\displaystyle 2n + k \in O(f(n))$

can be used (given mutual understanding) to clarify.
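
For instance, with $f : x \mapsto x^2$ as above, $2n + k \in O(f(n))$ suggests the reading $2n + k \in O(n^2)$ with $k$ held constant, whereas $2n + k \in O(f(k))$ would instead hold $n$ constant.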

Personally, I'd propose to use

$\qquad\displaystyle 2n + k \in O(f)$ for $n \to \infty$

or

$\qquad\displaystyle 2n + k \in O_n(f)$.

However, Landau notation is (sadly) not used rigorously most of the time², but rather as a lazy shortcut.


  1. Never mind that Landau notation gets into trouble when two independent variables are around, anyway. For instance, whether $nm \in O(n)$ holds depends on whether $m$ is a constant or also tends to infinity.

  2. Meaning that many people use it in a sloppy way; I am not saying that that is a good thing. Many questions on this site stem from a wrong understanding of terms, which in turn is rooted in teachers promoting sloppiness in the name of (alleged) clarity (imho).

Raphael