
So I have a function, written using recursion, that returns the maximum value between index i and index j of a given array:

def FindMax(A, i, j):
    # Base case: a single element is its own maximum
    if i == j:
        return A[i]
    else:
        # Split the range in half and combine the two halves' results
        k = (i + j) // 2
        return max(FindMax(A, i, k), FindMax(A, k + 1, j))
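
To illustrate how I use it, the initial call covers the whole array (this is just a small example, assuming a 0-indexed Python list so the call spans indices 0 to len(A) - 1):

A = [3, 7, 2, 9, 4]
print(FindMax(A, 0, len(A) - 1))   # prints 9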

What is the correct way to calculate the big O asymptotic time complexity of this algorithm? I would appreciate it if someone could document a general way to calculate the complexity, one that also works for other recursive functions. I have spent a lot of time watching videos and reading up on the web, but sadly I come up with a different complexity each time.


1 Answer


Let's say the running time of FindMax is $T(n)$, where $n$ is the size of the part of array A between indices $i$ and $j$; notice that FindMax only ever accesses that part. Now walk through the code step by step. The first if is just a comparison, so it costs $O(1)$. If the comparison is true, returning is $O(1)$ (this is the base case that exits the recursion). Otherwise, the else branch does some arithmetic, which is again $O(1)$, and then calls three functions: max, which is $O(1)$, and FindMax twice, each call taking $T(n/2)$ because each one works on half of the original range. This gives the recurrence $T(n) = 2T(n/2) + O(1)$. By the Master theorem, $T(n) = O(n)$. You can also prove this without the Master theorem, by unrolling the recurrence or by a straightforward induction on this relation.
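
To make that concrete, here is a sketch of the unrolling argument (assuming, for simplicity, that $n$ is a power of two and writing the $O(1)$ work as a constant $c$):

$$T(n) = 2T(n/2) + c = 4T(n/4) + 3c = \dots = 2^k\,T(n/2^k) + (2^k - 1)c.$$

Stopping at $k = \log_2 n$ gives $T(n) = n\,T(1) + (n-1)c = O(n)$. In Master theorem terms: $a = 2$, $b = 2$, $f(n) = O(1)$, and since $f(n) = O(n^{\log_2 2 - \varepsilon})$ for $\varepsilon = 1$, case 1 applies and $T(n) = \Theta(n^{\log_2 2}) = \Theta(n)$.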
