I'm studying for an algorithms exam and came upon this problem:

    for (i = 0; i < n * n; i++) {
        i++;
        for (j = 0; j < (log(n) * log(n) * log(n)); j++) {
            j++;
        }
    }

I figure that line 1 costs $ c_1(n^2) $, line 2 costs $ c_2(n^2 - 1) $, line 3 costs $ \sum_{j=0}^{n^2} \log(j)^3 $, and line 4 costs $\sum_{j=0}^{n^2} \log(j)^3 - 1 $. Am I on the right track? If so, what do the summations get simplified to?

Raphael • 73,212
Batman • 1

2 Answers

-1

The lines with the extra increments matter only for the constant factors in the final complexity function, not for its asymptotic order, so don't bother counting them; that only makes things messy. The function can be simplified as follows:

The outer loop runs $\frac{n^2}{2}$ times, and for each of its iterations the inner loop runs $\frac{\log^3 n}{2}$ times.

We can call the complexity function $T(n) = \frac{n^2}{2} \cdot \frac{\log^3 n}{2} = \frac{n^2 \log^3 n}{4}$, which leads to $O(n^2 \log^3 n)$.

aajjbb • 121
-1

(The original question started with `for (i = 0; i = n*n; ...)`.) The runtime is zero if n = 0, and the algorithm runs forever otherwise.

I strongly suspect that the algorithm you wrote in your question is not what you wanted to ask about. aajjbb's answer probably addresses what you intended to ask.

(`i = n * n` in the loop header is most likely not what you wanted, and writing `i++` and `j++` twice is probably unintended as well. If you change `i = n * n` to `i < n * n`, a good compiler may notice that the loop bodies have no observable effect and reduce the runtime massively.)

gnasher729 • 32,238