
$f(n) = 3n+3$;
$f(n) = O(n)$
By definition:
$3n+3 \leq c_1 \cdot n$
Dividing both sides by $n$:
$3+\frac{3}{n} \leq c_1$
This gives a constraint on the constant $c_1$ for every $n$; in particular, $c_1$ must be strictly greater than $3$.
E.g. if we take $c_1 = 3.5$, then $3 + \frac{3}{n} \leq 3.5$ requires $\frac{3}{n} \leq 0.5$, i.e. $n \geq 6$, so $n_0 = 6$.
Now if we plot a graph (because I want to understand this concept through the graph), the $c_1 \cdot g(n)$ graph goes below the $f(n)$ graph. I have taken the following values for both functions (a small script that reproduces them follows the tables):
$f(n)=3n+3$

$ \begin{matrix} n & f(n)\\ 1 & 6\\ 2 & 9\\ 3 & 12\\ -2 & -3 \end{matrix} $

for $c_1 \cdot g(n) = 3.5n$
$ \begin{matrix} n & g(n)\\ 1 & 3.5\\ 2 & 7\\ 3 & 10.5\\ -2 & -7 \end{matrix} $
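For reference, here is a small sketch (assuming Python is acceptable; the helper names are mine) that reproduces the two tables above:

```python
# Tabulate f(n) = 3n + 3 and c1 * g(n) = 3.5 * n at the sample points above.
def f(n):
    return 3 * n + 3

def c1_g(n):          # c1 = 3.5, g(n) = n
    return 3.5 * n

for n in [1, 2, 3, -2]:
    print(n, f(n), c1_g(n))
```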

If we plot a graph with these values, $f(n)$, i.e. $3n+3$, is not bounded above by $g(n)$, i.e. $3.5n$.
Can anyone explain this concept to me with a graph?


3 Answers


Here is a picture and some description:

The formal definitions associated with the Big Oh notation are as follows:

• $f(n) = O(g(n))$ means $c \cdot g(n)$ is an upper bound on $f(n)$. Thus there exists some constant $c$ such that $f(n)$ is always $\leq c \cdot g(n)$ for large enough $n$ (i.e. $n \geq n_0$ for some constant $n_0$).

• $f(n) = \Omega(g(n))$ means $c \cdot g(n)$ is a lower bound on $f(n)$. Thus there exists some constant $c$ such that $f(n)$ is always $\geq c \cdot g(n)$, for all $n \geq n_0$.

• $f(n) = \Theta(g(n))$ means $c_1 \cdot g(n)$ is an upper bound on $f(n)$ and $c_2 \cdot g(n)$ is a lower bound on $f(n)$, for all $n \geq n_0$. Thus there exist constants $c_1$ and $c_2$ such that $f(n) \leq c_1 \cdot g(n)$ and $f(n) \geq c_2 \cdot g(n)$. This means that $g(n)$ provides a nice, tight bound on $f(n)$ (see the numeric check below).

[figure: the $O$, $\Omega$, and $\Theta$ bounds on $f(n)$]
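As a concrete illustration (my own sketch, not from the book), the question's $f(n) = 3n+3$ satisfies this $\Theta(n)$ definition with, for example, $c_1 = 4$, $c_2 = 3$ and $n_0 = 3$:

```python
# Check c2 * g(n) <= f(n) <= c1 * g(n) for f(n) = 3n + 3 and g(n) = n,
# using the hand-picked constants c1 = 4, c2 = 3, n0 = 3.
def f(n):
    return 3 * n + 3

c1, c2, n0 = 4, 3, 3
for n in range(n0, 1000):
    assert c2 * n <= f(n) <= c1 * n, f"bound fails at n = {n}"
print("3n <= 3n + 3 <= 4n holds for every tested n >= 3")
```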

You can find more details in Chapter 2 of The Algorithm Design Manual by Steven S. Skiena.


Big $O$ notation describes the limiting behavior of a function. That means the property you are looking for is true for $n$ large enough.

$c \cdot g(n)$ may be smaller than $f(n)$ for small $n$, but as $n$ goes to infinity, $c \cdot g(n)$ will at some point become, and then stay, greater than $f(n)$.

What big $O$ notation looks like on a graph: if you look far enough toward the right end of both curves, $f(n)$ is lower than $c \cdot g(n)$. But it's hard to look at infinity with a graph...
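For the functions in the question, a sketch like the following (assuming Python with matplotlib; it is not part of the original answer) makes this visible: to the right of $n_0 = 6$, the line $3.5n$ stays above $3n+3$.

```python
# Plot f(n) = 3n + 3 against 3.5 * n over a range wide enough to show the
# crossover at n = 6; to the right of that point the upper bound holds.
import matplotlib.pyplot as plt

ns = range(0, 21)
plt.plot(ns, [3 * n + 3 for n in ns], label="f(n) = 3n + 3")
plt.plot(ns, [3.5 * n for n in ns], label="3.5 n")
plt.axvline(6, linestyle="--", color="gray", label="n0 = 6")
plt.xlabel("n")
plt.legend()
plt.show()
```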


From Introduction to the Design and Analysis of Algorithms:

a function $t(n)$ is $O(g(n))$ if $t(n)$ is bounded above by some constant multiple of $g(n)$ for all large $n$.

So, $t(n) \le c \cdot g(n)$ for all $n > n_0$.

[figure: $t(n)$ bounded above by $c \cdot g(n)$ for $n \ge n_0$]

This image is from here.
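A rough way to test a candidate pair $(c, n_0)$ numerically over a finite range (my own sketch, not from the book or the linked page):

```python
# Numerically test whether t(n) <= c * g(n) for all n0 < n <= limit.
# A finite check like this can refute a claimed bound but never prove it.
def looks_bounded(t, g, c, n0, limit=10_000):
    return all(t(n) <= c * g(n) for n in range(n0 + 1, limit + 1))

# The question's f(n) = 3n + 3 with g(n) = n:
print(looks_bounded(lambda n: 3 * n + 3, lambda n: n, c=3.5, n0=5))  # True
print(looks_bounded(lambda n: 3 * n + 3, lambda n: n, c=3.0, n0=5))  # False
```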

Edit: I meant to say that the referenced book, as well as the webpage, have graphical explanations for Oh, Omega, and Theta notation.
