# The Master Method

The *master method* is a cookbook method for solving recurrences. Although it cannot solve all recurrences, it is nevertheless very handy for dealing with many recurrences seen in practice. Suppose you have a recurrence of the form

*T(n) = aT(n/b) + f(n)*,

where *a* and *b* are arbitrary constants and *f* is some function of *n*. This recurrence would arise in the analysis of a recursive algorithm that for large inputs of size *n* breaks the input up into *a* subproblems each of size *n/b*, recursively solves the subproblems, then recombines the results. The work to split the problem into subproblems and recombine the results is *f(n)*.
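For instance, merge sort fits this pattern with *a = 2*, *b = 2*, and *f(n) = O(n)*: it splits the list in two, recursively sorts each half, and merges the results in linear time. A minimal OCaml sketch:

```ocaml
(* Merge sort as an instance of the pattern T(n) = 2 T(n/2) + O(n):
   splitting and merging are the f(n) work; the two halves are the
   a = 2 subproblems of size n/b = n/2. *)
let rec split = function
  | [] -> ([], [])
  | [x] -> ([x], [])
  | x :: y :: t -> let (l1, l2) = split t in (x :: l1, y :: l2)

let rec merge xs ys = match (xs, ys) with
  | ([], l) | (l, []) -> l
  | (x :: xt, y :: yt) ->
    if x <= y then x :: merge xt ys else y :: merge xs yt

let rec merge_sort = function
  | ([] | [ _ ]) as l -> l
  | l -> let (l1, l2) = split l in merge (merge_sort l1) (merge_sort l2)
```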

We can visualize this as a recursion tree, where the nodes in the tree have a branching factor of *a*. The top node has work *f(n)* associated with it, the next level has work *f(n/b)* associated with each of *a* nodes, the next level has work *f(n/b^{2})* associated with each of *a^{2}* nodes, and so on. At the leaves are the base cases, corresponding to some *1 ≤ n < b*. The tree has *log_b n* levels, so the total number of leaves is *a^{log_b n} = n^{log_b a}*.

The total time taken is just the sum of the time taken at each level. The time taken at the *i*-th level is *a^{i}f(n/b^{i})*, and the total time is the sum of this quantity as *i* ranges from 0 to *log_b n − 1*, plus the time taken at the leaves, which is constant for each leaf times the number of leaves, or *O(n^{log_b a})*. Thus

*T(n) = Σ_{0 ≤ i < log_b n} a^{i}f(n/b^{i}) + O(n^{log_b a})*.

What this sum looks like depends on how the asymptotic growth of *f(n)* compares to the asymptotic growth of the number of leaves. There are three cases:

- Case 1: *f(n)* is *O(n^{log_b a − ε})* for some constant *ε > 0*. Since the leaves grow faster than *f*, asymptotically all of the work is done at the leaves, so *T(n)* is *Θ(n^{log_b a})*.
- Case 2: *f(n)* is *Θ(n^{log_b a})*. The leaves grow at the same rate as *f*, so the same order of work is done at every level of the tree. The tree has *O(log n)* levels, times the work done on one level, yielding *T(n)* is *Θ(n^{log_b a} log n)*.
- Case 3: *f(n)* is *Ω(n^{log_b a + ε})* for some constant *ε > 0*. In this case *f* grows faster than the number of leaves, which means that asymptotically the total amount of work is dominated by the work done at the root node. For the upper bound, we also need an extra smoothness condition on *f* in this case, namely that *af(n/b) ≤ cf(n)* for some constant *c < 1* and large *n*. In this case *T(n)* is *Θ(f(n))*.
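For the common special case *f(n) = n^{d}*, choosing among the three cases reduces to comparing *d* with *log_b a*. A small helper illustrating this (the function and its interface are our own, not standard):

```ocaml
(* For T(n) = a T(n/b) + n^d, report which case of the master method
   applies by comparing d with log_b a. A small tolerance absorbs
   floating-point error in computing the logarithm. *)
let master_case ~a ~b ~d =
  let c = log (float_of_int a) /. log b in  (* log_b a *)
  if d < c -. 1e-9 then 1       (* leaves dominate: Theta(n^(log_b a))  *)
  else if d > c +. 1e-9 then 3  (* root dominates:  Theta(n^d)          *)
  else 2                        (* balanced:        Theta(n^d log n)    *)
```

With *a = 4* and *b = 2*, exponents *d = 1, 2, 3* land in Cases 1, 2, and 3 respectively, matching the three worked examples in this section.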

As mentioned, the master method does not always apply. For example, the second example considered above, where the subproblem sizes are unequal, is not covered by the master method.

Let’s look at a few examples where the master method does apply.

**Example 1** Consider the recurrence

*T(n) = 4T(n/2) + n*.

For this recurrence, there are *a = 4* subproblems, each dividing the input by *b = 2*, and the work done on each call is *f(n) = n*. Thus *n^{log_b a}* is *n^{2}*, and *f(n)* is *O(n^{2−ε})* for *ε = 1*, so Case 1 applies. Thus *T(n)* is *Θ(n^{2})*.

**Example 2** Consider the recurrence

*T(n) = 4T(n/2) + n ^{2}*.

For this recurrence, there are again *a = 4* subproblems, each dividing the input by *b = 2*, but now the work done on each call is *f(n) = n^{2}*. Again *n^{log_b a}* is *n^{2}*, and *f(n)* is thus *Θ(n^{2})*, so Case 2 applies. Thus *T(n)* is *Θ(n^{2} log n)*. Note that increasing the work on each recursive call from linear to quadratic has increased the overall asymptotic running time only by a logarithmic factor.

**Example 3** Consider the recurrence

*T(n) = 4T(n/2) + n ^{3}*.

For this recurrence, there are again *a = 4* subproblems, each dividing the input by *b = 2*, but now the work done on each call is *f(n) = n^{3}*. Again *n^{log_b a}* is *n^{2}*, and *f(n)* is thus *Ω(n^{2+ε})* for *ε = 1*. Moreover, *4(n/2)^{3} ≤ kn^{3}* for *k = 1/2*, so Case 3 applies. Thus *T(n)* is *Θ(n^{3})*.
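All three answers can be spot-checked against exact closed forms. Taking *T(1) = 1* and *n* a power of two, unrolling the recurrence gives *T(n) = 2n^{2} − n* for *f(n) = n*, *T(n) = n^{2}(log_2 n + 1)* for *f(n) = n^{2}*, and *T(n) = 2n^{3} − n^{2}* for *f(n) = n^{3}* (our derivation, easily verified by induction). A sketch:

```ocaml
(* T(n) = 4 T(n/2) + f(n), with T(1) = 1, evaluated exactly on powers of 2. *)
let rec t f n = if n <= 1 then 1 else 4 * t f (n / 2) + f n

let () =
  let n = 1024 in
  assert (t (fun n -> n) n = 2 * n * n - n);                (* Theta(n^2)       *)
  assert (t (fun n -> n * n) n = n * n * 11);               (* n^2 (log2 n + 1) *)
  assert (t (fun n -> n * n * n) n = 2 * n * n * n - n * n) (* Theta(n^3)       *)
```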

### Example: Yet Another Sorting Algorithm

The following function sorts the first two-thirds of a list, then the second two-thirds, then the first two-thirds again:

```ocaml
let rec sort3 (a : 'a list) : 'a list =
  match a with
  | [] -> []
  | [x] -> [x]
  | [x; y] -> [min x y; max x y]
  | _ ->
    let n = List.length a in
    let m = (2 * n + 2) / 3 in
    let res1 = sort3 (take a m) @ drop a m in
    let res2 = take res1 (n - m) @ sort3 (drop res1 (n - m)) in
    sort3 (take res2 m) @ drop res2 m
```

Here `take a m` is the list consisting of the first `m` elements of `a`, and `drop a m` is the list consisting of all but the first `m` elements of `a`. Perhaps surprisingly, this algorithm does sort the list. We leave the proof that it sorts correctly as an exercise. The key is to observe that the first two passes ensure that the last third of the list contains the correct elements in the correct order.
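The code above takes `take` and `drop` as given; one possible implementation (a sketch, with out-of-range `m` handled by clamping):

```ocaml
(* take a m: the first m elements of a (or all of a if it is shorter). *)
let rec take a m =
  if m <= 0 then []
  else match a with [] -> [] | x :: t -> x :: take t (m - 1)

(* drop a m: everything but the first m elements of a. *)
let rec drop a m =
  if m <= 0 then a
  else match a with [] -> [] | _ :: t -> drop t (m - 1)
```

Note that both run in time linear in `m`, which is part of why the non-recursive work in `sort3` is linear.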

We can derive the running time of the algorithm from its recurrence using the master method. The routine does O(*n*) work in addition to three recursive calls on lists of length 2*n*/3. Therefore its recurrence is:

*T(n) = cn + 3T(2n/3)*

If we apply the master method to the `sort3` algorithm, we see that we are in Case 1, so the algorithm is O(*n*^{log_{3/2} 3}) = O(n^{2.71}), making it even slower than insertion sort! Note that being in Case 1 means that improving *f(n)* will not improve the overall time. For instance, replacing lists with arrays improves *f(n)* from linear to constant time, but the overall asymptotic complexity is still O(n^{2.71}).
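The exponent can also be observed numerically: evaluating the recurrence at two values of *n* whose ratio is a power of *b = 3/2* (so the recursion depths align) and fitting a power law recovers roughly *log_{3/2} 3 ≈ 2.71*. A rough sketch, using floats and a unit base case of our own choosing:

```ocaml
(* T(n) = n + 3 T(2n/3), with T(n) = 1 for n < 1, evaluated with floats. *)
let rec t n = if n < 1.0 then 1.0 else n +. 3.0 *. t (2.0 *. n /. 3.0)

(* Fit T(n) ~ C * n^e from two sample points n1 and n2 = n1 * 1.5^10. *)
let measured_exponent =
  let n1 = 1000.0 in
  let n2 = n1 *. (1.5 ** 10.0) in
  log (t n2 /. t n1) /. log (n2 /. n1)

let () = Printf.printf "measured exponent = %.3f\n" measured_exponent
(* close to log_{3/2} 3 = 2.7095 *)
```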