
What are some examples of $O$ subtleties? I'm not only thinking of the asymmetry of the $O$ relation, but of the ways in which $O$ constants can depend on nearby parameters, and the fact that the notation suppresses this dependency. For example, if $g_k(x) = O(f_k(x))$ for $1 \leq k \leq n$, then $\sum_{1\leq k \leq n}g_k(x) = O(\sum_{1\leq k \leq n} |f_k(x)|)$, but only if the $O$ constants don't depend on the index of summation $k$.
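A minimal worked instance of this failure mode (my own illustration, not taken from any particular text): take $f_k(x) = x$ and $g_k(x) = kx$, so the $O$ constant for $g_k$ is $k$ itself.

```latex
% Each g_k(x) = kx satisfies g_k(x) = O(x) as x \to \infty,
% but the implied constant is c_k = k, which depends on k.
\sum_{1 \leq k \leq n} g_k(x) = \frac{n(n+1)}{2}\, x,
\qquad
\sum_{1 \leq k \leq n} |f_k(x)| = n x.
% The smallest constant that works for the sum is (n+1)/2, which is
% unbounded in n: the sum rule holds for each fixed n, but not uniformly.
```

With $n$ fixed the conclusion is true (with constant $(n+1)/2$), but any argument that lets $n$ grow with $x$ silently breaks.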

2 Answers


but of the ways in which O constants can depend on nearby parameters, and the fact that the notation suppresses this dependency.

Tarjan's Algorithm runs in $O(V + E)$ time, where $V$ is the number of vertices in the graph and $E$ is the number of edges. Here Big-O notation does not suppress the dependency: both parameters appear explicitly in the bound.

In your example, $A(x) \in O(1)$. That is, as $x \to \infty$, $A(x)$ is asymptotically bounded by a constant. I think your confusion lies in the variable usage, so let's define $A(n)$ to be the partial sum $A(n) = \sum_{j \leq n} a(j)$. Then $A(n) \in O(1)$.
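As a quick numerical sanity check (my own illustration, with the hypothetical choice $a(n) = (-1)^n$), the partial sums $A(n)$ stay bounded even though the sequence $a(n)$ never dies out:

```python
# Sanity check (illustrative choice, not from the answer): a(n) = (-1)^n
# has bounded partial sums, so A(n) = O(1).
def a(n):
    return (-1) ** n

def A(n):
    """Partial sum A(n) = sum_{j <= n} a(j)."""
    return sum(a(j) for j in range(1, n + 1))

# A(n) only ever takes the values -1 and 0, hence is bounded.
assert max(abs(A(n)) for n in range(1, 501)) <= 1
```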

In particular, the O constant is independent of y.

The function $f$ is decreasing, so $f(x) \geq f(y)$ for $y > x$. Note that $a(n) \in O(1)$, since $a(n) = A(n) - A(n-1)$ and $A(n) \in O(1)$. By the definition of Big-O, $\left|\sum_{x < n \leq y} a(n)f(n)\right| \leq c\,|f(x)|$ for some constant $c$, and so $\sum_{x < n \leq y} a(n) f(n) = O(f(x))$.

In the first case, I think we're allowed to just change letters (how can I be more confident of this?).

What is the first case here? I'm not exactly sure which equation you are referring to specifically.

Edit:

Case (1): $O(f(y)) = O(f(x))$. Since $f$ is decreasing, $f(y) \leq f(x)$, and since $f$ is nonnegative by assumption, the result follows immediately from the definition of Big-O. Strictly speaking, Big-O describes a set of functions, so $f(y) \in \Theta(f(x))$ would be a stronger statement.

I don't really like how Big-O is being used here, but the convention is probably just that we have some constant multiplied by the function at $y$; that is, $O(f(y)) = c \cdot f(y)$ for some constant $c$.

ml0105
  • $A(x) \in O(1)$ just means that $A(x)$ is bounded, not that it converges to a constant. – Antonio Vargas Dec 05 '14 at 03:13
  • Poor wording on my part! I fixed it. Thanks for the correction. :-) – ml0105 Dec 05 '14 at 03:15
  • Case 1: show $O(f(y)) = O(f(x))$. Case 2: show $\int_x^yO(-f'(t))dt = O(f(x))$. – Owen Colman Dec 05 '14 at 03:19
  • I made an edit regarding case (1). Let me know if this clarifies. – ml0105 Dec 05 '14 at 03:28
  • No, not really… Not that it isn't intuitively clear; I'm just a little hesitant about the logical status of $y$ on the RHS of the 'equation'. Never mind. – Owen Colman Dec 05 '14 at 03:54
  • Also, by $O(f(x))$ I mean $\{g(x) : \exists c> 0, \exists x_0> 0 \text{ such that } x\geq x_0 \implies |g(x)| \leq c |f(x)| \}$. – Owen Colman Dec 05 '14 at 03:54
  • Yes, I'm familiar with the definition. Notice that Big-O is an asymptotic statement: for every $x$ greater than $x_{0}$, the inequality is satisfied. So $f(y)$ is clearly $O(f(x))$. Pick $x_{0} = y$. I would assume the Number Theory text is using Big-O as shorthand (and abusing notation a bit) to say $O(f(y)) := c \cdot f(y)$, for some constant $c$. – ml0105 Dec 05 '14 at 03:56

Here's another illustration, taken from Introduction to Analytic Number Theory by Tom Apostol (ml0105's answer refers to this). What I find potentially confusing about this example is the fact that there are two variables in play, $x$ and $y$, but the $O$ constant depends only on $x$; it also took me a while to see why the assumption that $f$ is decreasing is necessary. I've attempted to provide the details Apostol leaves out, and I'll be interpreting the = in an $O$ expression as $\subset$, as discussed in What are the rules for equals signs with big-O and little-o?

Theorem Suppose $f$ is a nonnegative, decreasing function defined on the positive real axis, and that $a(n)$ is an arithmetic function with partial sums $A(x) = \sum_{n\leq x}a(n)$ satisfying $A(x) = O(1)$. Then if $y>x$, $$ \sum_{x<n \leq y} a(n)f(n) = O(f(x)).$$

Proof By Abel's identity, we have

\begin{aligned} \sum_{x<n \leq y} a(n)f(n) & = f(y)A(y) - f(x)A(x) - \int_x^yA(t)f'(t)dt \\ & = f(y)O(1) - f(x)O(1) + \int_x^yO(1)(-f'(t))dt \\ & = O(f(y)) + O(f(x)) + \int_x^yO(-f'(t))dt. \\ \end{aligned}

We will be done if we can show that $O(f(y))$ and $\int_x^yO(-f'(t))dt$ are $O(f(x))$. In the first case, I think we're allowed to just change letters: since $f$ is decreasing and $y > x$, anything bounded by a constant times $f(y)$ is bounded by the same constant times $f(x)$. To prove the second case, suppose $k(t) \in O(-f'(t))$, so that $\left|k(t)\right| \leq c(-f'(t))$ for some constant $c >0$ and $t$ sufficiently large. Then

$$ \left|\int_x^y k(t) dt\right| \leq \int_x^y \left|k(t)\right| dt \leq \int_x^y c(-f'(t)) dt = c(f(x) - f(y)) \leq c|f(x)|,$$

since $f$ is nonnegative and decreasing, so that $0 \leq f(y) \leq f(x)$. This shows $\int_x^y k(t) dt \in O(f(x))$, so $\int_x^yO(-f'(t))dt = O(f(x)).$
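To make the theorem concrete, here is a numerical check (my own illustration, not from Apostol) with $a(n) = (-1)^n$, so that $A(x) = O(1)$, and $f(t) = 1/t$, which is nonnegative and decreasing. The theorem predicts $\left|\sum_{x<n\leq y} a(n)f(n)\right| \leq c\, f(x)$ with $c$ independent of $y$:

```python
# Numerical check of the theorem for a(n) = (-1)^n and f(t) = 1/t.
def tail_sum(x, y):
    """sum_{x < n <= y} a(n) f(n) with a(n) = (-1)^n, f(n) = 1/n."""
    return sum((-1) ** n / n for n in range(x + 1, y + 1))

x = 10
f_x = 1.0 / x
# The ratio |tail| / f(x) stays bounded (here by c = 1) no matter
# how large y gets: the bound involves f(x) only, never y.
ratios = [abs(tail_sum(x, y)) / f_x for y in range(x + 1, 2000)]
assert max(ratios) <= 1.0
```

By the alternating series bound, the tail is at most $1/(x+1) < f(x)$, so $c = 1$ works here; the point is that the bound depends on $x$ alone, matching the claim that the $O$ constant is independent of $y$.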