Questions tagged [landau-notation] (271 questions)
Questions about asymptotic notations such as Big-O, Omega, etc.
102 votes · 3 answers
How does one know which notation of time complexity analysis to use?
In most introductory algorithm classes, notations like $O$ (Big O) and $\Theta$ are introduced, and a student would typically learn to use one of these to find the time complexity.
However, there are other notations, such as $o$, $\Omega$ and…
Miles
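For reference, the notations the question contrasts are commonly defined as follows (one standard textbook formulation, stated for eventually positive functions):

```latex
\begin{align*}
f \in O(g)      &\iff \exists c > 0,\ n_0 : \forall n \ge n_0,\ f(n) \le c\,g(n) \\
f \in \Omega(g) &\iff \exists c > 0,\ n_0 : \forall n \ge n_0,\ f(n) \ge c\,g(n) \\
f \in \Theta(g) &\iff f \in O(g) \text{ and } f \in \Omega(g) \\
f \in o(g)      &\iff \lim_{n \to \infty} f(n)/g(n) = 0 \\
f \in \omega(g) &\iff \lim_{n \to \infty} f(n)/g(n) = \infty
\end{align*}
```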
54 votes · 4 answers
What is the meaning of $O(m+n)$?
This is a basic question, but I'm thinking that $O(m+n)$ is the same as $O(\max(m,n))$, since the larger term should dominate as we go to infinity? Also, that would be different from $O(\min(m,n))$. Is that right? I keep seeing this notation,…
Zeus
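The intuition in the question can be made precise with a short inequality chain (assuming $m, n \ge 0$):

```latex
\max(m, n) \le m + n \le 2\max(m, n)
\quad\Longrightarrow\quad
O(m + n) = O(\max(m, n)),
```

whereas $\min(m,n)$ can be arbitrarily smaller than $m + n$ (fix $m = 1$ and let $n$ grow), so $O(\min(m,n))$ is a genuinely different class.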
49 votes · 4 answers
How do O and Ω relate to worst and best case?
Today we discussed in a lecture a very simple algorithm for finding an element in a sorted array using binary search. We were asked to determine its asymptotic complexity for an array of $n$ elements.
My idea was that it is obviously $O(\log n)$,…
Smajl
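A minimal executable sketch of such a binary search (an illustration, not the lecture's code): the probe counter shows why the worst case over a sorted array of $n$ elements is about $\log_2 n$ probes, while the best case is a single probe — which is why $O(\log n)$ and the best/worst-case distinction are separate questions.

```python
def binary_search(arr, target):
    """Return (index or None, number of midpoint probes made)."""
    lo, hi = 0, len(arr) - 1
    probes = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1
        if arr[mid] == target:
            return mid, probes
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return None, probes

arr = list(range(1024))
# Best case: the target sits at the first midpoint -- one probe.
_, best = binary_search(arr, arr[511])
# Worst case: the target is absent -- log2(1024) = 10 probes.
_, worst = binary_search(arr, -1)
print(best, worst)
```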
48 votes · 10 answers
O(·) is not a function, so how can a function be equal to it?
I totally understand what big $O$ notation means. My issue is when we say $T(n)=O(f(n))$, where $T(n)$ is the running time of an algorithm on input of size $n$.
I understand the semantics of it. But $T(n)$ and $O(f(n))$ are two different things.
$T(n)$ is…
doubleE
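One common way to resolve this, sketched as a worked definition: $O(f)$ is a *set* of functions, and the equals sign in $T(n) = O(f(n))$ is a conventional abuse of notation for set membership:

```latex
O(f) = \{\, g \mid \exists c > 0,\ n_0 : \forall n \ge n_0,\ g(n) \le c\,f(n) \,\},
\qquad
T(n) = O(f(n)) \ \text{ is read as } \ T \in O(f).
```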
47 votes · 2 answers
Order of growth definition from Reynolds & Tymann
I am reading a book called Principles of Computer Science (2008), by Carl Reynolds and Paul Tymann (published by Schaum's Outlines).
The second chapter introduces algorithms with an example of a sequential search which simply iterates through a list…
JW.
41 votes · 6 answers
Sorting functions by asymptotic growth
Assume I have a list of functions, for example
$\qquad n^{\log \log(n)}, 2^n, n!, n^3, n \ln n, \dots$
How do I sort them asymptotically, i.e. after the relation defined by
$\qquad f \leq_O g \iff f \in O(g)$,
assuming they are indeed pairwise…
JAN
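As an illustrative (not rigorous) way to check such an ordering, one can compare $\log f(n)$ at a large $n$ and sort. This is only a heuristic — a correct ordering needs limit arguments, and crossovers can happen arbitrarily late — but it demonstrates the idea on four of the question's functions:

```python
import math

# Work with log f(n) instead of f(n) to avoid overflow:
# log is monotone, so comparing log f suffices.
log_funcs = {
    "n ln n":        lambda n: math.log(n) + math.log(math.log(n)),
    "n^3":           lambda n: 3 * math.log(n),
    "n^(log log n)": lambda n: math.log(math.log(n)) * math.log(n),
    "2^n":           lambda n: n * math.log(2),
}

# n must be large enough: n^(log log n) overtakes n^3 only once
# log log n > 3, i.e. n > e^(e^3) ~ 5e8.
n = 1e10
order = sorted(log_funcs, key=lambda name: log_funcs[name](n))
print(order)
```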
35 votes · 1 answer
What does tilde mean, in big-O notation?
I'm reading a paper, and it says in its time complexity description that time complexity is $\tilde{O}(2^{2n})$.
I have searched the internet and wikipedia, but I can't find what this tilde signifies in big-O/Landau notation. In the paper itself I…
Johannes Schaub - litb
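Conventions vary by author, but the most common reading is that the tilde suppresses polylogarithmic factors:

```latex
\tilde{O}(g(n)) = \bigcup_{k \ge 0} O\!\left(g(n) \cdot \log^k n\right).
```

Some authors instead allow factors $\log^k g(n)$ of the bound itself; for $g(n) = 2^{2n}$ that reading would suppress factors polynomial in $n$. Papers usually state which convention they use.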
29 votes · 5 answers
Is O(mn) considered "linear" or "quadratic" growth?
If I have some function whose time complexity is O(mn), where m and n are the sizes of its two inputs, would we call its time complexity "linear" (since it's linear in both m and n) or "quadratic" (since it's a product of two sizes)? Or something…
user541686
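A common way to sharpen the question: $O(mn)$ is linear in each input separately but quadratic when both sizes grow together, so the label depends on which parameter you let grow:

```latex
m = \Theta(n) \;\Rightarrow\; O(mn) = O(n^2),
\qquad
m = O(1) \;\Rightarrow\; O(mn) = O(n).
```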
26 votes · 7 answers
Justification for neglecting constant factors in Big O
Often, when a complexity has a constant factor, such as 3n, we neglect the constant and say O(n) rather than O(3n). I am unable to understand how we can neglect such a three-fold change. One thing is varying 3 times more rapidly than the other! Why do…
gpuguy
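A worked instance of the definition shows where the factor 3 goes: the constant $c$ in the definition of $O$ absorbs it.

```latex
3n \le c \cdot n \ \text{ for } c = 3 \text{ and all } n \ge 1
\quad\Longrightarrow\quad
3n \in O(n).
```

The notation deliberately ignores constant factors because they depend on machine and implementation details; a fixed three-fold constant never changes which of two functions dominates as $n \to \infty$.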
24 votes · 2 answers
Changing variables in recurrence relations
Currently, I am self-studying Intro to Algorithms (CLRS) and there is one particular method they outline in the book to solve recurrence relations.
The following method can be illustrated with this example. Suppose we have the recurrence
$$T(n) =…
erickg
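The excerpt is cut off, but CLRS's standard illustration of changing variables is (assuming this is the recurrence meant) $T(n) = 2T(\sqrt{n}) + \lg n$. Setting $m = \lg n$ and $S(m) = T(2^m)$ turns it into a familiar form:

```latex
\begin{align*}
T(2^m) &= 2\,T\!\left(2^{m/2}\right) + m
  && \text{substitute } n = 2^m \\
S(m)   &= 2\,S(m/2) + m
  && \text{with } S(m) = T(2^m) \\
S(m)   &= \Theta(m \lg m)
  && \text{merge-sort-shaped recurrence} \\
T(n)   &= \Theta(\lg n \,\lg \lg n)
  && \text{back-substitute } m = \lg n.
\end{align*}
```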
20 votes · 1 answer
Can a Big-Oh time complexity contain more than one variable?
Let us say for instance I am doing string processing that requires some analysis of two strings. I have no given information about what their lengths might end up being, so they come from two distinct families. Would it be acceptable to call the…
corsiKa
20 votes · 2 answers
Construct two functions $f$ and $g$ satisfying $f \ne O(g), g \ne O(f)$
Construct two functions $f, g: \mathbb{R}^+ \to \mathbb{R}^+$ satisfying:
$f, g$ are continuous;
$f, g$ are monotonically increasing;
$f \ne O(g)$ and $g \ne O(f)$.
Jessie
19 votes · 3 answers
Why is there the regularity condition in the master theorem?
I have been reading Introduction to Algorithms by Cormen et al. and I'm reading the statement of the Master theorem starting on page 73. In case 3 there is also a regularity condition that needs to be satisfied to use the theorem:
... 3. If
$\qquad…
GrowinMan
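For context, since the statement in the excerpt is cut off: the regularity condition in CLRS's case 3 reads

```latex
a\,f\!\left(\frac{n}{b}\right) \le c\,f(n)
\quad \text{for some constant } c < 1 \text{ and all sufficiently large } n.
```

It rules out functions $f$ that meet the lower-bound condition $f(n) = \Omega(n^{\log_b a + \varepsilon})$ yet oscillate so badly that the work does not shrink geometrically down the recursion; without it, the conclusion $T(n) = \Theta(f(n))$ can fail.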
15 votes · 4 answers
What does $\log^{O(1)}n$ mean?
What does $\log^{O(1)}n$ mean?
I am aware of big-O notation, but this notation makes no sense to me.
I can't find anything about it either, because there is no way a search engine interprets this correctly.
For a bit of context, the sentence where I…
Oebele
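The trick to reading this notation is that the $O(1)$ sits in the exponent: $\log^{O(1)} n$ stands for $(\log n)^c$ for some constant $c$, i.e. a polylogarithmic factor:

```latex
f(n) = \log^{O(1)} n
\quad\iff\quad
\exists c > 0 : f(n) \le (\log n)^c \ \text{for all sufficiently large } n.
```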
15 votes · 6 answers
n*log n and n/log n against polynomial running time
I understand that $\Theta(n)$ is faster than $\Theta(n\log n)$ and slower than $\Theta(n/\log n)$. What is difficult for me to understand is how to actually compare $\Theta(n \log n)$ and $\Theta(n/\log n)$ with $\Theta(n^f)$ where $0 < f < 1$.
For…
mihsathe
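The comparison with $\Theta(n^f)$ for $0 < f < 1$ can be settled with limits: any such polynomial is asymptotically below both $n/\log n$ and $n \log n$, since $1 - f > 0$ makes the denominators win:

```latex
\lim_{n \to \infty} \frac{n^f}{\,n/\log n\,}
= \lim_{n \to \infty} \frac{\log n}{n^{1-f}} = 0,
\qquad
\lim_{n \to \infty} \frac{n^f}{n \log n} = 0,
```

so $n^f \in o(n/\log n)$ and $n^f \in o(n \log n)$ whenever $0 < f < 1$.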