13

Does saying $f(x) = \Theta(1)$ provide any extra information over saying $f(x) = O(1)$?

Intuitively, nothing grows more slowly than a constant, so there should be no extra information in specifying Big Theta over Big O in this case.

Caleb Stanford
MattCochrane

5 Answers

17

Remember your definitions! As $n \to \infty$ (the usual setting in CS), $O(\cdot)$ is an upper bound (within a constant multiple, for large $n$), $\Omega(\cdot)$ is a lower bound (within a constant multiple, for large $n$), and $\Theta(\cdot)$ is both of the previous. To see the difference, as $n \to \infty$ you have:

$$\begin{align*} 1 + \lvert \sin n \rvert &= O(1) \\ 1 + \lvert \sin n \rvert &= \Omega(1) \\ 1 + \lvert \sin n \rvert &= \Theta(1) \\ e^{-n} &= O(1) \\ e^{-n} &\ne \Omega(1) \\ e^{-n} &\ne \Theta(1) \end{align*}$$

For the first three, as $\sin n$ moves between $-1$ and $1$, the expression fluctuates between $1$ and $2$.

To see why $e^{-n} \ne \Omega(1)$ (and so $e^{-n} \ne \Theta(1)$), note that we are looking for a constant $c > 0$ and an $N_0$ so that for all $n \ge N_0$ we have $e^{-n} \ge c \cdot 1$. For any $c$ you pick, taking $n > \ln(1/c)$ makes this false. So no $c$ works.
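Both examples are easy to check numerically. The following Python snippet is a finite sanity check, not a proof: it samples $1 + \lvert \sin n \rvert$ to confirm it stays in $[1, 2]$, and then, for one concrete choice of $c$, finds an $n > \ln(1/c)$ where $e^{-n}$ drops below $c$.

```python
import math

# 1 + |sin n| stays in [1, 2], so c = 1 is a lower bound and c = 2 an upper
# bound: the function is Theta(1). (A spot check over many n, not a proof.)
values = [1 + abs(math.sin(n)) for n in range(1, 10001)]
print(min(values) >= 1 and max(values) <= 2)  # True

# e^{-n} fails Omega(1): whatever c > 0 you pick, any n > ln(1/c)
# gives e^{-n} < c, so no constant lower bound survives.
c = 0.001
n = math.ceil(math.log(1 / c)) + 1
print(math.exp(-n) < c)  # True
```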

vonbrand
15

vonbrand's answer is correct in general, but let me add that if $\boldsymbol{f(n)}$ is the running time of an algorithm, then you are correct: $\boldsymbol{O(1)}$ and $\boldsymbol{\Theta(1)}$ are the same. This is because running times of algorithms are positive integers, so a counterexample like $f(n) = e^{-n}$ is impossible. As you said, intuitively, nothing grows more slowly than a constant.

In particular, if $f$ is the running time of an algorithm, then $f(n) \ge 1$ for all $n$, because it takes at least one step for the machine to halt and return its output. From this we can deduce that $f(n) = \Omega(1)$, that is, every function grows at least as fast as a constant.
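To make the deduction concrete: for any function mapping inputs to positive-integer step counts, $c = 1$ and $N_0 = 0$ witness $f(n) = \Omega(1)$. The sketch below spot-checks this for two made-up step-count functions (the names and formulas are illustrations, not real measurements); a finite check like this cannot prove the bound, only fail to refute it.

```python
# Spot-check the Omega(1) witness c = 1, N0 = 0 for positive-integer
# "running time" functions. A finite sample, not a proof.
def is_omega_1_witness(f, c=1, n0=0, trials=1000):
    """Check f(n) >= c for all sampled n >= n0."""
    return all(f(n) >= c for n in range(n0, n0 + trials))

# Hypothetical step counts, both positive integers for every input:
binary_search_steps = lambda n: max(1, n.bit_length())  # roughly log2(n) steps
constant_steps = lambda n: 3                            # a constant-time algorithm

print(is_omega_1_witness(binary_search_steps))  # True
print(is_omega_1_witness(constant_steps))       # True
```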

This assumes that an algorithm's running time can't be zero. If we allow that (i.e., halting doesn't count as a step), then there is exactly one algorithm that is $O(1)$ but not $\Theta(1)$ -- the "do nothing and halt" algorithm. This has running time $0$, which is $\Theta(0)$, not $\Theta(1)$.

Caleb Stanford
4

The Big O notation is not only used to express algorithm complexity. It is also commonly used to express the precision of approximations in mathematics. See, for example, Stirling's approximation, which uses an $O(1/n)$ term to quantify how good the approximation is. There, you are not necessarily dealing only with functions that grow.

Obviously an $f(n) \in O(1/n)$ is also $O(1)$, and any $f(n) \in \Theta(1)$ is also $O(1)$, but if $f(n) \in O(1/n)$, then it's not in $\Theta(1)$.

If in your context it is clear that $f$ has a positive lower bound (e.g. $1$), then whether you use $O(1)$ or $\Theta(1)$ doesn't matter. But in general, there is a difference.
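The Stirling example can be checked numerically. Assuming the usual form $\ln n! \approx n \ln n - n + \tfrac{1}{2}\ln(2\pi n)$, the error term is $O(1/n)$ (in fact it approaches $\frac{1}{12n}$), so multiplying the error by $n$ should hover near the constant $1/12$. This makes the error $O(1)$ but clearly not $\Theta(1)$, since it vanishes. A quick Python sketch:

```python
import math

# Error of Stirling's approximation to ln(n!). If the error is ~1/(12n),
# then n * error should approach 1/12 ~ 0.0833 as n grows.
def stirling_error(n):
    exact = math.lgamma(n + 1)  # ln(n!)
    approx = n * math.log(n) - n + 0.5 * math.log(2 * math.pi * n)
    return exact - approx

for n in (10, 100, 1000):
    print(n, n * stirling_error(n))  # hovers near 1/12
```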

Caleb Stanford
liori
1

Yes, it has additional information.

Consider the following algorithm: "If the input N is odd, do something that takes constant time. If N is even, do nothing." For odd inputs the running time is bounded above by a constant, so the algorithm is $O(1)$. For even inputs the running time is zero, so the only lower bound is $0$. That, in turn, means there is no Big Theta: no single function bounds the running time both above and below up to constant factors.

This sort of situation is easier to see when you're dealing with Big O factors larger than 1: for instance, an algorithm that said "If N is odd, add all numbers between 1 and N together and return the result; if N is even, return N" would have a Big O of N and a Big Omega of 1 - and it wouldn't have a Big Theta.
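The second example can be sketched with a step counter. The function below is a hypothetical model of that odd/even algorithm (the step-counting convention is an assumption for illustration): odd inputs cost about $n$ steps, even inputs cost $1$, so the running time is $O(n)$ and $\Omega(1)$ but matches no single $\Theta$ bound.

```python
def steps_for(n):
    """Model step count: odd n sums 1..n (~n steps); even n returns at once (1 step)."""
    if n % 2 == 1:
        count = 0
        total = 0
        for i in range(1, n + 1):  # one "step" per loop iteration
            total += i
            count += 1
        return count
    return 1

# Costs alternate between ~n and 1, so no Theta bound fits:
print([steps_for(n) for n in range(1, 8)])  # [1, 1, 3, 1, 5, 1, 7]
```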

nick012000
-5

I went and looked this up for this question. "Big Theta" and "Big O" are defined slightly differently, and I found that "Big O" has different definitions depending on where you look.

Depending on who you ask, "Big O" may be used in an amortized sense: an operation can be labeled O(1) even if, every n operations, it has to run a linear-time step rather than a constant-time one. We used it this way when I was in college. More information here: https://stackoverflow.com/questions/200384/constant-amortized-time
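The classic instance of that pattern is a doubling array. This sketch assumes the textbook doubling model (it is not any particular library's internals): most appends cost one step, but whenever capacity runs out, a resize copies everything, costing a linear number of steps. The worst single append is linear, yet the average cost per append stays bounded, which is what "amortized O(1)" means.

```python
# Cost model for appending to a doubling array (illustrative assumption):
# usually 1 step, but a full array triggers a copy of all existing elements.
class DoublingArray:
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.last_cost = 0

    def append(self, _item):
        if self.size == self.capacity:
            self.capacity *= 2
            self.last_cost = self.size + 1  # copy everything, then write
        else:
            self.last_cost = 1              # the common constant-time case
        self.size += 1

arr = DoublingArray()
costs = []
for i in range(16):
    arr.append(i)
    costs.append(arr.last_cost)
print(costs)                    # occasional linear spikes, 1 elsewhere
print(sum(costs) / len(costs))  # average (amortized) cost stays small
```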

"Big Theta" is more precise in that everybody agrees on its definition, and such amortized steps are not allowed.

Joshua