
I'd like to double-check my understanding of Big-Oh.

The definition is that $f(x) = O(g(x))$ if there is a positive constant $M$ such that $|f(x)| \leq M\,|g(x)|$ for all sufficiently large values of $x$.

Now, if $g(n) = O(n^2) - O(n^2)$, may we conclude that $g(n) = 0$? If not, what can we say about $g(n)$?

David Richerby

4 Answers


We have $g(x) = f(x) - h(x)$ for some functions $f,h\in O(x^2)$. Since the functions are measuring the running time of some algorithm, I'll assume explicitly that $f(x)\geq 0$ and $h(x)\geq 0$, for all $x$. So there are constants $c_1$ and $c_2$ such that, for all large enough $x$, $f(x)\leq c_1x^2$ and $h(x)\leq c_2x^2$.

Since $h(x)\geq 0$ for all $x$, we certainly know that $g(x)\leq f(x)$, since $g(x)$ is "$f(x)$ minus something non-negative." So, in particular, we have $g(x)\leq c_1x^2$ for all large enough $x$, i.e., $g(x) = O(x^2)$.
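To square this with the absolute-value definition quoted in the question, it is worth spelling out the lower bound as well; a short check using the constants $c_1$ and $c_2$ above:

$$-c_2x^2 \leq -h(x) \leq g(x) \leq f(x) \leq c_1x^2 \quad \text{for all large enough } x,$$

so $|g(x)| \leq \max(c_1, c_2)\,x^2$, which is exactly $g(x) = O(x^2)$.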

However, we cannot conclude anything stronger in the general case. It is possible, for example, that $f(x) = x^2$ and $h(x) = 0$ for all $x$. Then $g(x) = x^2$. So, without more information about $f$ and $h$, we can't conclude that $g$ is any smaller than $x^2$.

David Richerby

Let's say we are talking about runtimes of algorithms, not functions. Then we can use the definition of $O(f(n))$ the way it is given here. But where would we encounter something like $O(f(n)) - O(g(n))$?

Here's a practical situation: I have a complex algorithm. One step in that algorithm has a runtime of $O(f(n))$. I could modify that step so that it now takes $O(g(n))$. How much faster will my algorithm be?

We can say naively that the improvement is $O(f(n)) - O(g(n))$. If $f(n), g(n) = O(n^2)$ then we can say equally naively that the improvement is $O(n^2) - O(n^2)$. But what is the actual improvement?

The fact is that, with the information given, we have no idea. $f(n)$ could always be 100 times larger than $g(n)$, or always 100 times smaller, or sometimes larger and sometimes smaller; that's absolutely allowed by the definition of $O(\cdot)$. All we can say is that the runtime of the algorithm will not improve by more than $O(n^2)$, and will not get worse by more than $O(n^2)$. We can't say whether it will improve or get worse, and it might be different depending on $n$.
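To make that concrete, here are some hypothetical running times, chosen only so that both are $O(n^2)$ (none of this comes from any particular algorithm):

$$f(n) = 100n^2,\ g(n) = n^2: \quad \text{the change saves } 99n^2;$$

$$f(n) = n^2,\ g(n) = 100n^2: \quad \text{the change costs } 99n^2;$$

$$f(n) = \bigl(2+(-1)^n\bigr)n^2,\ g(n) = \bigl(2-(-1)^n\bigr)n^2: \quad f(n) - g(n) = 2(-1)^n n^2 \text{ flips sign with } n.$$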

But if we talk about functions, and not algorithms, then we must use a definition of $O(f(n))$ that takes negative values into account and works in a meaningful way for them.
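One standard way to do that is the absolute-value form of the definition already quoted in the question:

$$f(n) = O(g(n)) \iff \text{there exist } M > 0 \text{ and } n_0 \text{ such that } |f(n)| \leq M\,|g(n)| \text{ for all } n \geq n_0.$$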

gnasher729

It is OK to say $$O(n^2) - O(n^2) = O(n^2)$$ which means that subtracting a function in $O(n^2)$ from another one in $O(n^2)$ gives a function in $O(n^2)$. Note that in algorithm analysis, we often assume that the functions considered take on positive values.
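Unpacking the set notation (one common reading of such equations; $c_1$ and $c_2$ below denote the witnessing constants for the two functions):

$$O(n^2) - O(n^2) := \{\, f - h : f, h \in O(n^2) \,\},$$

and any such difference satisfies $|f(n) - h(n)| \leq |f(n)| + |h(n)| \leq (c_1 + c_2)\,n^2$ for all large enough $n$, so it lies in $O(n^2)$.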

Be careful, because we don't have $$\Theta(n^2) - \Theta(n^2) = \Theta(n^2).$$ In contrast, we do have $$\Theta(n^2) + \Theta(n^2) = \Theta(n^2).$$
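A one-line counterexample for the subtraction case:

$$f(n) = n^2 \in \Theta(n^2), \quad h(n) = n^2 \in \Theta(n^2), \quad f(n) - h(n) = 0 \notin \Theta(n^2).$$

For the sum, the positivity assumption above is what saves us: if $f(n) \geq d_1 n^2$ and $h(n) \geq d_2 n^2$ for all large enough $n$, the lower bounds add, so $f + h$ is still $\Theta(n^2)$.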


We don't have $g(n) = O(n^2) - O(n^2) \implies g(n) = 0$. For example, $$f_1(n) = n^2 + n = O(n^2), \;f_2(n) = n = O(n^2),$$ $$f_1(n) - f_2(n) = n^2 = \Theta(n^2) \neq 0.$$

hengxin

My (proposed) solution: write $g(n) = f(n) - h(n)$, where $|f(n)| \leq M\cdot n^2$ and $|h(n)| \leq M'\cdot n^2$ for all sufficiently large $n$. Then

$|g(n)| \leq |f(n)| + |h(n)| \leq M\cdot n^2 + M'\cdot n^2 = (M+M')\cdot n^2 = M''\cdot n^2$

that is:

$g(n) = O(n^2)$

gnasher729