
I would appreciate an intuitive way to find the time complexity of dynamic programming problems. Can anyone explain “#subproblems * time/subproblem” to me? I am not able to grok it.

Code for LCS -

public static String findLCS(String str1, String str2 ) {
    // If either string is empty, return the empty string
    if(null == str1 || null == str2)
        return "";
    if("".equals(str1) || "".equals(str2)) {
        return "";
    }
    // are the last characters identical?
    if(str1.charAt(str1.length()-1) == str2.charAt(str2.length()-1)) {
        // yes, so strip off the last character, recurse, and append that character to the result
        return findLCS(str1.substring(0, str1.length() - 1),
                       str2.substring(0, str2.length() - 1))
               + str1.substring(str1.length() - 1, str1.length());
    } else {
       // no, so recurse independently on (str1_without_last_character, str2)
       // and (str1, str2_without_last_character)
       String opt1 = findLCS(str1.substring(0, str1.length() -1), str2); 
       String opt2 = findLCS(str1, str2.substring(0, str2.length()-1));
       // return the longest LCS found
       if(opt1.length() >= opt2.length())
           return opt1;
       else
           return opt2;
    }
}

I am just providing the actual code instead of pseudocode (I hope the algorithm is self-explanatory from the above).


2 Answers


I assume you are interested in worst-case complexity.

A dynamic programming approach (recursively) breaks a problem into "smaller" subproblems, and records the solutions of these subproblems so that no subproblem is computed more than once. So when the time required to combine subproblems is constant (as is the case for LCS), you only have to find an upper bound on the number of distinct subproblems that can be solved during the run of the program.

  • Here the subproblems are of the form findLCS(s1,s2) where s1 is a prefix of str1, and s2 a prefix of str2. There are (1+str1.length()) possible prefixes for str1, and (1+str2.length()) for str2.

  • The number of possible subproblems is therefore (1+str1.length()) * (1+str2.length()). Furthermore, each solution is obtained in constant time from the solutions of its subproblems. Hence the complexity is O(str1.length() * str2.length()).

N.B. On some inputs all subproblems will indeed be computed, e.g., when str1 and str2 have no characters in common. Consequently, the quadratic bound is "tight".

N.B.2 You may find (short) explanations on the Wikipedia page for LCS.
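
To make the counting argument concrete, here is a minimal memoized sketch of the recursion from the question (the class, helper method, and cache names are mine, not from the original post). Keying the cache on the pair of prefix lengths works because every recursive call receives prefixes of the original inputs, so each of the (1+str1.length()) * (1+str2.length()) possible pairs is solved at most once, which is exactly the subproblem count above (building the result strings adds some extra cost, but the count of subproblems is the point):

import java.util.HashMap;
import java.util.Map;

public class LcsMemo {

    public static String findLCS(String str1, String str2) {
        if (str1 == null || str2 == null)
            return "";
        // one fresh cache per top-level call; keys are prefix-length pairs "i,j"
        return lcs(str1, str2, new HashMap<String, String>());
    }

    private static String lcs(String s1, String s2, Map<String, String> cache) {
        if (s1.isEmpty() || s2.isEmpty())
            return "";
        String key = s1.length() + "," + s2.length();
        String cached = cache.get(key);
        if (cached != null)
            return cached;                      // this (i, j) subproblem was already solved
        String result;
        if (s1.charAt(s1.length() - 1) == s2.charAt(s2.length() - 1)) {
            // last characters match: solve the (i-1, j-1) subproblem and append that character
            result = lcs(s1.substring(0, s1.length() - 1),
                         s2.substring(0, s2.length() - 1), cache)
                     + s1.charAt(s1.length() - 1);
        } else {
            // otherwise take the longer answer of the (i-1, j) and (i, j-1) subproblems
            String opt1 = lcs(s1.substring(0, s1.length() - 1), s2, cache);
            String opt2 = lcs(s1, s2.substring(0, s2.length() - 1), cache);
            result = opt1.length() >= opt2.length() ? opt1 : opt2;
        }
        cache.put(key, result);                 // record the solution so it is never recomputed
        return result;
    }
}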


The code posted doesn't implement dynamic programming, so the time complexity is in fact exponential (on the order of 2^(m+n) recursive calls in the worst case, where m and n are the string lengths).

See this Wikipedia article and this GeeksforGeeks post for pseudocode and specific implementations. This post also shows how to compute the LCS both recursively and iteratively using DP.

To implement dynamic programming, it is necessary to add a caching (memoization) mechanism that prevents the same sub-results from being calculated multiple times.
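
For completeness, here is a sketch of the standard bottom-up (iterative) table formulation; it is the usual textbook version rather than the exact code from the linked posts, and the names are mine. Filling the table touches each of the (m+1)*(n+1) cells once with constant work, which is precisely the "#subproblems * time/subproblem" bound asked about in the question:

public class LcsTable {

    // dp[i][j] = length of the LCS of the first i characters of a and the first j characters of b
    public static String findLCS(String a, String b) {
        if (a == null || b == null)
            return "";
        int m = a.length(), n = b.length();
        int[][] dp = new int[m + 1][n + 1];     // (m+1)*(n+1) subproblems, O(1) work each
        for (int i = 1; i <= m; i++) {
            for (int j = 1; j <= n; j++) {
                if (a.charAt(i - 1) == b.charAt(j - 1))
                    dp[i][j] = dp[i - 1][j - 1] + 1;
                else
                    dp[i][j] = Math.max(dp[i - 1][j], dp[i][j - 1]);
            }
        }
        // walk back through the table to recover one longest common subsequence
        StringBuilder sb = new StringBuilder();
        int i = m, j = n;
        while (i > 0 && j > 0) {
            if (a.charAt(i - 1) == b.charAt(j - 1)) {
                sb.append(a.charAt(i - 1));
                i--;
                j--;
            } else if (dp[i - 1][j] >= dp[i][j - 1]) {
                i--;
            } else {
                j--;
            }
        }
        return sb.reverse().toString();
    }
}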