Dynamic Programming

This summary is mainly based on Chapter 15 of "Introduction to Algorithms", Third Edition (CLRS).

Application:

  • Usually used for optimization problems (maximization, minimization, finding an optimal choice, …)

Characteristics:

  • The original problem is composed of optimal solutions to sub-problems (optimal substructure)
  • The sub-problems overlap, so each one only needs to be solved once (overlapping sub-problems)

Examples:

  1. Rod-cutting problem: given a rod of length n inches and an array p with p[i] storing the revenue from a rod of length i, the goal is to cut the rod so that the total revenue from the pieces is maximized. (A runnable Python sketch follows the pseudocode at the end of this example.)
  • Top-down with memoization:

Memoized-cut-rod(p,n):
    let r[0...n] be a new array
    for i = 0 to n:
        r[i] = -infinity
    return Memoized-cut-rod-aux(p,n,r)

Memoized-cut-rod-aux(p,n,r):
    if r[n] >= 0:
        return r[n]
    if n==0:
        result = 0
    else:
        result = -infinity
        for i = 1 to n:
            result = max(result, p[i]+Memoized-cut-rod-aux(p,n-i,r))
    r[n] = result
    return result
  • Bottom-up version:
Bottom-up-cut-rod(p,n):
    let r[0...n] be a new array
    r[0] = 0
    for i = 1 to n:
        q = -infinity
        for j = 1 to i:
            q = max(q, p[j]+r[i-j])
        r[i] = q
    return r[n]

Note that the methods above only output the maximum revenue without giving us the actual cuts, so the algorithm below also records the cutting pattern.

Extended-bottom-up-cut-rod(p,n):
    let r[0...n] and s[0...n] be new arrays
    r[0]=0
    for i = 1 to n:
        q = -infinity
        for j = 1 to i:
            if q < p[j]+r[i-j]:
                q = p[j]+r[i-j]
                s[i] = j
        r[i] = q
    return r and s
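
To make this concrete, here is a minimal runnable sketch of the bottom-up version with cut reconstruction in Python. The function name cut_rod and the variable names are my own, not from the book; prices is assumed to be indexed so that prices[i] is the revenue for a piece of length i, with prices[0] = 0.

def cut_rod(prices, n):
    # Bottom-up rod cutting with cut reconstruction.
    # prices[i] is the revenue for a piece of length i (prices[0] = 0).
    # Returns (maximum revenue, list of piece lengths achieving it).
    r = [0] * (n + 1)      # r[i]: best revenue for a rod of length i
    s = [0] * (n + 1)      # s[i]: length of the first piece in an optimal cut of a rod of length i
    for i in range(1, n + 1):
        best = float("-inf")
        for j in range(1, i + 1):             # first piece has length j
            if prices[j] + r[i - j] > best:
                best = prices[j] + r[i - j]
                s[i] = j
        r[i] = best
    cuts, length = [], n                      # walk s to recover the actual cuts
    while length > 0:
        cuts.append(s[length])
        length -= s[length]
    return r[n], cuts

# Price table from the CLRS example (lengths 1..10):
# cut_rod([0, 1, 5, 8, 9, 10, 17, 17, 20, 24, 30], 9) returns (25, [3, 6])
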
  2. Matrix-chain multiplication problem: A1A2A3 = (A1A2)A3 = A1(A2A3) = … since matrix multiplication is associative, but the parenthesization affects how many scalar multiplications are performed. The goal is to find the parenthesization that minimizes the number of scalar multiplications. (A Python sketch follows at the end of this example.)
  • Bottom-up method, using m[i,j] to represent the minimum number of scalar multiplications needed to compute Ai..Aj (matrix Ai has dimensions p[i-1] x p[i])
Matrix-chain-order(p):
    n = p.length - 1
    let m[1..n,1..n] and s[1..n,1..n] be new tables
    for i from 1 to n:
        m[i,i]=0
    for l from 2 to n: //the chain length
        for i from 1 to n-l+1:
            j=i+l-1
            m[i,j] = infinity
            for k=i to j-1:
                q = m[i,k]+m[k+1,j]+p[i-1]*p[k]*p[j]
                if q < m[i,j]:
                    m[i,j] = q
                    s[i,j] = k
    return m and s

Print-optimal-parens(s,i,j):
    if i==j:
        print "A_i"
    else:
        print "("
        Print-optimal-parens(s,i,s[i,j])
        Print-optimal-parens(s,s[i,j]+1,j)
        print ")"
  • There is also a top-down memoized recursive method, similar to the previous problem, but the bottom-up method is usually faster in practice by a constant factor.
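
As a concrete illustration, here is a minimal Python sketch of the bottom-up algorithm; the names matrix_chain_order and optimal_parens are mine, and dims plays the role of p, so matrix A_i has dimensions dims[i-1] x dims[i].

import sys

def matrix_chain_order(dims):
    # dims has length n+1; matrix A_i is dims[i-1] x dims[i].
    # m[i][j]: minimum number of scalar multiplications for A_i..A_j.
    # s[i][j]: index k at which the optimal solution splits the product.
    # Tables are sized (n+1) x (n+1) so indices 1..n match the pseudocode.
    n = len(dims) - 1
    m = [[0] * (n + 1) for _ in range(n + 1)]
    s = [[0] * (n + 1) for _ in range(n + 1)]
    for length in range(2, n + 1):             # chain length
        for i in range(1, n - length + 2):
            j = i + length - 1
            m[i][j] = sys.maxsize
            for k in range(i, j):              # split into A_i..A_k and A_{k+1}..A_j
                q = m[i][k] + m[k + 1][j] + dims[i - 1] * dims[k] * dims[j]
                if q < m[i][j]:
                    m[i][j] = q
                    s[i][j] = k
    return m, s

def optimal_parens(s, i, j):
    # Return the optimal parenthesization of A_i..A_j as a string.
    if i == j:
        return "A" + str(i)
    k = s[i][j]
    return "(" + optimal_parens(s, i, k) + optimal_parens(s, k + 1, j) + ")"

# CLRS example: dims = [30, 35, 15, 5, 10, 20, 25]
# m, s = matrix_chain_order(dims); m[1][6] == 15125
# optimal_parens(s, 1, 6) == "((A1(A2A3))((A4A5)A6))"
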
  3. Longest-common-subsequence (LCS) problem: given sequences X = {x1, x2, …, xm} and Y = {y1, y2, …, yn}, the goal is to find the longest common subsequence of X and Y. (Note that a subsequence is different from a substring: its elements need not be contiguous in the original sequence.)
  • If we use c[i,j] to represent the length of an LCS of X[x1…xi] and Y[y1…yj], then the subproblems are c[i-1,j-1], c[i,j-1], and c[i-1,j]. The relationship between xi and yj decides which one to use: if xi == yj then c[i,j] = c[i-1,j-1]+1, otherwise c[i,j] = max(c[i-1,j], c[i,j-1]), with c[i,j] = 0 when i = 0 or j = 0.
LCS-LENGTH(X,Y):
    let c[0...m,0...n] be a new table
    for i=1 to m:
        c[i,0] = 0
    for j=0 to n:
        c[0,j] = 0
    for i=1 to m:
        for j=1 to n:
            if X[i]==Y[j]:
                c[i,j] = c[i-1,j-1]+1
            else:
                c[i,j] = max(c[i-1,j],c[i,j-1])
    return c
  • We can also recover the actual LCS from the table c output by the algorithm above, by walking back from c[m,n], as sketched below.
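
Here is a minimal Python sketch of LCS-LENGTH together with the reconstruction from the c table alone; the name lcs is mine, and the inputs are assumed to be Python strings.

def lcs(x, y):
    # Returns (length of an LCS of x and y, one such LCS as a string).
    m, n = len(x), len(y)
    # c[i][j]: length of an LCS of x[:i] and y[:j]; row 0 and column 0 stay 0.
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    # Walk back through c to recover one LCS; no extra table is needed.
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1])
            i -= 1
            j -= 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return c[m][n], "".join(reversed(out))

# CLRS example: lcs("ABCBDAB", "BDCABA") returns (4, "BCBA")
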
  4. Optimal binary search tree problem: given a sequence of keys K = {k1, k2, …, kn} (k1 < k2 < … < kn) with search probabilities p1, p2, …, pn, and dummy keys D = {d0, d1, …, dn} forming the leaves of the tree with corresponding probabilities q0, q1, …, qn, the goal is to construct a binary search tree whose expected search cost is minimized.
  • An optimal tree is composed of a root key plus left and right subtrees that are themselves optimal.
  • Denote by e[i,j] the expected cost of searching in an optimal tree containing the keys ki, …, kj. If kr is chosen as the root, then e[i,j] = e[i,r-1] + e[r+1,j] + w(i,j), where w(i,j) = p_i + … + p_j + q_{i-1} + q_i + … + q_j is the total probability mass of that subtree; taking the minimum over all r in [i,j] gives the recurrence. (A Python sketch follows the pseudocode below.)
Optimal-BST(p,q,n):
    let e[1..n+1,0..n], w[1..n+1,0..n], and root[1..n,1..n] be new tables
    for i=1 to n+1:
        e[i,i-1] = q_{i-1}
        w[i,i-1] = q_{i-1}
    for l=1 to n: //length (range of the keys)
        for i=1 to n-l+1:
            j=i+l-1
            e[i,j] = infinity
            w[i,j] = w[i,j-1]+p_j+q_j
            for r = i to j:
                t = e[i,r-1] + e[r+1,j] + w[i,j]
                if t < e[i,j]:
                    e[i,j] = t
                    root[i,j] = r
    return e and root
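
For concreteness, here is a minimal Python sketch of the same algorithm; the name optimal_bst is mine, p[1..n] holds the key probabilities (p[0] is unused padding), and q[0..n] holds the dummy-key probabilities, so the indices match the pseudocode.

import sys

def optimal_bst(p, q, n):
    # p[1..n]: probabilities of keys k_1 < ... < k_n (p[0] is unused).
    # q[0..n]: probabilities of the dummy keys d_0..d_n.
    # e[i][j]: expected search cost of an optimal tree on k_i..k_j.
    # root[i][j]: index r of the key chosen as that tree's root.
    e = [[0.0] * (n + 1) for _ in range(n + 2)]     # rows 1..n+1, columns 0..n
    w = [[0.0] * (n + 1) for _ in range(n + 2)]
    root = [[0] * (n + 1) for _ in range(n + 1)]
    for i in range(1, n + 2):        # empty subtrees consist only of the dummy key d_{i-1}
        e[i][i - 1] = q[i - 1]
        w[i][i - 1] = q[i - 1]
    for length in range(1, n + 1):   # number of keys in the subtree
        for i in range(1, n - length + 2):
            j = i + length - 1
            e[i][j] = sys.float_info.max
            w[i][j] = w[i][j - 1] + p[j] + q[j]
            for r in range(i, j + 1):                # try each key k_r as the root
                t = e[i][r - 1] + e[r + 1][j] + w[i][j]
                if t < e[i][j]:
                    e[i][j] = t
                    root[i][j] = r
    return e, root

# CLRS example (5 keys):
# p = [0, 0.15, 0.10, 0.05, 0.10, 0.20]
# q = [0.05, 0.10, 0.05, 0.05, 0.05, 0.10]
# e, root = optimal_bst(p, q, 5)   # e[1][5] is about 2.75, root[1][5] == 2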

  • Note that for both the matrix-chain multiplication problem and the optimal BST problem, both "end points" of the input range can move when constructing subproblems, while in the other problems only one end point moves.
  • The key step is to observe the structure of the original problem: how it is composed of optimal solutions to subproblems, and how to derive a recurrence relating the optimal solution of the problem to the optimal solutions of its subproblems.